Evaluation of advanced air bag deployment algorithm performance using event data recorders.
Gabler, Hampton C; Hinch, John
2008-10-01
This paper characterizes the field performance of occupant restraint systems designed with advanced air bag features including those specified in the US Federal Motor Vehicle Safety Standard (FMVSS) No. 208 for advanced air bags, through the use of Event Data Recorders (EDRs). Although advanced restraint systems have been extensively tested in the laboratory, we are only beginning to understand the performance of these systems in the field. Because EDRs record many of the inputs to the advanced air bag control module, these devices can provide unique insights into the characteristics of field performance of air bags. The study was based on 164 advanced air bag cases extracted from NASS/CDS 2002-2006 with associated EDR data. In this dataset, advanced driver air bags were observed to deploy with a 50% probability at a longitudinal delta-V of 9 mph for the first stage, and at 26 mph for both inflator stages. In general, advanced air bag performance was as expected, however, the study identified cases of air bag deployments at delta-Vs as low as 3-4 mph, non-deployments at delta-Vs over 26 mph, and possible delayed air bag deployments.
PMID:19026234
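The reported 50% thresholds (9 mph for the first inflator stage, 26 mph for both stages) suggest a logistic deployment-probability curve over longitudinal delta-V. A minimal sketch; the logistic functional form and the slope value are illustrative assumptions, not fitted values from the paper:

```python
import math

def deployment_probability(delta_v_mph, v50, slope=0.5):
    """Logistic model of stage-deployment probability versus longitudinal
    delta-V, reaching 50% probability at v50 mph. The slope is a made-up
    placeholder; a real fit would estimate it from field EDR data."""
    return 1.0 / (1.0 + math.exp(-slope * (delta_v_mph - v50)))

# Illustrative use of the abstract's thresholds: v50 = 9 mph for the
# first stage, v50 = 26 mph for both inflator stages.
p_first_stage = deployment_probability(9.0, v50=9.0)   # = 0.5 by construction
p_both_stages = deployment_probability(26.0, v50=26.0)  # = 0.5 by construction
```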
Automated Deployment of Advanced Controls and Analytics in Buildings
NASA Astrophysics Data System (ADS)
Pritoni, Marco
Buildings use 40% of primary energy in the US. Recent studies show that developing energy analytics and enhancing control strategies can significantly improve their energy performance. However, the deployment of advanced control software applications has been mostly limited to academic studies. Larger-scale implementations are prevented by the significant engineering time and customization required, owing to substantial differences among buildings. This study demonstrates how physics-inspired data-driven models can be used to develop portable analytics and control applications for buildings. Specifically, I demonstrate the application of these models in all phases of the deployment of advanced controls and analytics in buildings: in the first phase, "Site Preparation and Interface with Legacy Systems," I used models to discover or map relationships among building components, automatically gathering the metadata (information about data points) necessary to run the applications. During the second phase, "Application Deployment and Commissioning," models automatically learn system parameters that are then used for advanced controls and analytics. In the third phase, "Continuous Monitoring and Verification," I utilized models to automatically measure the energy performance of a building that has implemented advanced control strategies. In the conclusions, I discuss future challenges and suggest potential strategies for these innovative control systems to be widely deployed in the market. This dissertation provides useful new tools in terms of procedures, algorithms, and models to facilitate the automation of deployment of advanced controls and analytics and accelerate their wide adoption in buildings.
Advanced GPS Technologies (AGT)
2015-05-01
Distribution A. GPS III developmental optical clock; deployable antenna concept. Science and technology for GPS spacecraft: AFRL has funded a...digital waveform generators, new antenna concepts, supporting electronics, algorithms and new signal combining methods, satellite bus technologies...GPS military high-gain antenna; developing options for ground testing: 1) deployable phased array, low-profile element, high-efficiency phase
NASA Astrophysics Data System (ADS)
Yang, Chen; Zhang, Xuepan; Huang, Xiaoqi; Cheng, ZhengAi; Zhang, Xinghua; Hou, Xinbin
2017-11-01
The space solar power satellite (SSPS) is an advanced system concept for collecting solar energy in space and transmitting it wirelessly to Earth. However, because of the long service life, in-orbit damage may occur in the structural system of an SSPS, so sensor placement layouts for structural health monitoring should be considered from the start in this concept. In this paper, an optimal sensor placement method based on a genetic algorithm is proposed for health monitoring of the deployable antenna module in SSPS. According to the characteristics of the deployable antenna module, candidate sensor placement designs are listed. Furthermore, based on the effective independence method and an effective interval index, a combined fitness function is defined to maximize linear independence in targeted modes while simultaneously avoiding redundant information at nearby positions. In addition, another fitness function is constructed by considering the reliability of sensors located at deployable mechanisms. Moreover, the solution process for optimal sensor placement using the genetic algorithm is clearly demonstrated. Finally, a numerical example of the sensor placement layout in a deployable antenna module of an SSPS is presented, which synthetically considers all of the above-mentioned performance measures. The results illustrate the effectiveness and feasibility of the proposed sensor placement method for SSPS.
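The selection step described above, choosing sensor locations so the targeted modes stay linearly independent, can be sketched with a toy genetic algorithm. Everything here (the two-mode shapes, the GA parameters, the determinant-based score) is an illustrative assumption in the spirit of effective independence, not the paper's implementation:

```python
import random

def efi_fitness(selection, modes):
    """Effective-independence-style score for two targeted modes:
    det(Phi_s^T Phi_s) over the selected sensor rows, so larger values
    mean the modes are more linearly independent at those sensors."""
    rows = [modes[i] for i in selection]
    a = sum(r[0] * r[0] for r in rows)
    b = sum(r[0] * r[1] for r in rows)
    d = sum(r[1] * r[1] for r in rows)
    return a * d - b * b

def ga_place_sensors(modes, k, pop_size=30, generations=40, seed=1):
    """Pick k sensor positions out of len(modes) candidates with a basic
    genetic algorithm (elitism + union crossover + single-index mutation)."""
    rng = random.Random(seed)
    n = len(modes)

    def random_individual():
        return tuple(sorted(rng.sample(range(n), k)))

    population = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=lambda s: efi_fitness(s, modes), reverse=True)
        elite = population[: pop_size // 2]
        children = []
        while len(elite) + len(children) < pop_size:
            p1, p2 = rng.sample(elite, 2)
            pool = sorted(set(p1) | set(p2))      # crossover: union of parents
            child = set(rng.sample(pool, k))
            if rng.random() < 0.2:                # mutation: replace one index
                child.discard(rng.choice(sorted(child)))
                child.add(rng.randrange(n))
            while len(child) < k:                 # repair duplicates
                child.add(rng.randrange(n))
            children.append(tuple(sorted(child)))
        population = elite + children
    return max(population, key=lambda s: efi_fitness(s, modes))
```

A real implementation would also fold in the paper's effective interval index and the reliability term for sensors on deployable mechanisms; this sketch only shows the GA-over-subsets skeleton.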
2006-01-01
enabling technologies such as built-in-test, advanced health monitoring algorithms, reliability and component aging models, prognostics methods, and...deployment and acceptance. This framework and vision are consistent with onboard PHM (Prognostic and Health Management) as well as advanced...monitored. In addition to the prognostic forecasting capabilities provided by monitoring system power, multiple confounding errors by electronic
NASA Astrophysics Data System (ADS)
Longmore, S. P.; Knaff, J. A.; Schumacher, A.; Dostalek, J.; DeMaria, R.; Chirokova, G.; Demaria, M.; Powell, D. C.; Sigmund, A.; Yu, W.
2014-12-01
The Colorado State University (CSU) Cooperative Institute for Research in the Atmosphere (CIRA) has recently deployed a tropical cyclone (TC) intensity and surface wind radii estimation algorithm that utilizes the Suomi National Polar-orbiting Partnership (S-NPP) satellite Advanced Technology Microwave Sounder (ATMS) and the Advanced Microwave Sounding Unit (AMSU) on the NOAA-18, NOAA-19, and MetOp-A polar-orbiting satellites for testing, integration, and operations under the Product System Development and Implementation (PSDI) projects at NOAA's National Environmental Satellite, Data, and Information Service (NESDIS). This presentation discusses the evolution of the CIRA NPP/AMSU TC algorithms internally at CIRA and their migration and integration into the NOAA Data Exploitation (NDE) development and testing frameworks. The discussion will focus on 1) the development cycle of internal NPP/AMSU TC algorithm components by scientists and software engineers, 2) the exchange of these components into the NPP/AMSU TC software systems using the Subversion version control system and other exchange methods, 3) testing, debugging, and integration of the NPP/AMSU TC systems at CIRA/NESDIS, and 4) the update cycle of new releases through continuous integration. Lastly, the methods that were effective and those that need revision will be detailed for the next iteration of the NPP/AMSU TC system.
Advanced processing for high-bandwidth sensor systems
NASA Astrophysics Data System (ADS)
Szymanski, John J.; Blain, Phil C.; Bloch, Jeffrey J.; Brislawn, Christopher M.; Brumby, Steven P.; Cafferty, Maureen M.; Dunham, Mark E.; Frigo, Janette R.; Gokhale, Maya; Harvey, Neal R.; Kenyon, Garrett; Kim, Won-Ha; Layne, J.; Lavenier, Dominique D.; McCabe, Kevin P.; Mitchell, Melanie; Moore, Kurt R.; Perkins, Simon J.; Porter, Reid B.; Robinson, S.; Salazar, Alfonso; Theiler, James P.; Young, Aaron C.
2000-11-01
Compute performance and algorithm design are key problems of image processing and scientific computing in general. For example, imaging spectrometers are capable of producing data in hundreds of spectral bands with millions of pixels. These data sets show great promise for remote sensing applications, but require new and computationally intensive processing. The goal of the Deployable Adaptive Processing Systems (DAPS) project at Los Alamos National Laboratory is to develop advanced processing hardware and algorithms for high-bandwidth sensor applications. The project has produced electronics for processing multi- and hyper-spectral sensor data, as well as LIDAR data, while employing processing elements using a variety of technologies. The project team is currently working on reconfigurable computing technology and advanced feature extraction techniques, with an emphasis on their application to image and RF signal processing. This paper presents reconfigurable computing technology and advanced feature extraction algorithm work and their application to multi- and hyperspectral image processing. Related projects on genetic algorithms as applied to image processing will be introduced, as will the collaboration between the DAPS project and the DARPA Adaptive Computing Systems program. Further details are presented in other talks during this conference and in other conferences taking place during this symposium.
ERIC Educational Resources Information Center
Osler, James Edward
2013-01-01
This paper discusses the implementation of the Tri-Squared Test as an advanced statistical measure used to verify and validate the research outcomes of Educational Technology software. A mathematical and epistemological rationale is provided for the transformative process of qualitative data into quantitative outcomes through the Tri-Squared Test…
NASA Astrophysics Data System (ADS)
Ozana, Stepan; Pies, Martin; Docekal, Tomas
2016-06-01
REX Control System is a professional advanced tool for the design and implementation of complex control systems that belongs to the softPLC category. It covers the entire process, starting from simulation of the application's functionality before deployment, through implementation on a real-time target, to analysis, diagnostics, and visualization. It consists of two parts: the development tools and the runtime system. It is also compatible with the Simulink environment, and control algorithms are implemented in a very similar way. The control scheme is finally compiled (using the RexDraw utility) and uploaded to a chosen real-time target (using the RexView utility). A wide variety of hardware platforms and real-time operating systems are supported by REX Control System, for example Windows Embedded, Linux, and Linux/Xenomai deployed on SBC, IPC, PAC, Raspberry Pi, and other devices with many I/O interfaces. It is a modern system designed for both measurement and control applications, offering many additional functions for data archiving, visualization based on HTML5, and communication standards. The paper sums up the possibilities of its use in education, focusing on the control of case-study physical models with classical and advanced control algorithms.
Wang, Xue; Wang, Sheng; Ma, Jun-Jie
2007-01-01
The effectiveness of wireless sensor networks (WSNs) depends on the coverage and target detection probability provided by dynamic deployment, which is usually supported by the virtual force (VF) algorithm. However, in the VF algorithm, the virtual force exerted by stationary sensor nodes hinders the movement of mobile sensor nodes. Particle swarm optimization (PSO) has been introduced as another dynamic deployment algorithm, but in this case the required computation time is the big bottleneck. This paper proposes a dynamic deployment algorithm named "virtual force directed co-evolutionary particle swarm optimization" (VFCPSO), which combines co-evolutionary particle swarm optimization (CPSO) with the VF algorithm: the CPSO uses multiple swarms to cooperatively optimize different components of the solution vectors for dynamic deployment, and the velocity of each particle is updated according to not only the historical local and global optimal solutions but also the virtual forces of sensor nodes. Simulation results demonstrate that the proposed VFCPSO is effective for dynamic deployment in WSNs and has better performance with respect to computation time and effectiveness than the VF, PSO, and VFPSO algorithms.
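The hybrid velocity update described above (standard PSO terms plus a virtual-force term) can be sketched as follows. The force law, threshold distance, and coefficients are illustrative assumptions of a common VF variant, not the paper's exact values:

```python
import random

def virtual_force(pos, others, d_th=10.0, ka=1.0, kr=100.0):
    """Net virtual force on one node: attractive toward neighbors farther
    than the threshold distance d_th, repulsive from closer ones (a common
    VF formulation; gains ka, kr are placeholders)."""
    fx = fy = 0.0
    for ox, oy in others:
        dx, dy = ox - pos[0], oy - pos[1]
        d = (dx * dx + dy * dy) ** 0.5
        if d < 1e-9:
            continue
        mag = ka * (d - d_th) if d > d_th else -kr / d
        fx += mag * dx / d
        fy += mag * dy / d
    return fx, fy

def vfcpso_velocity(v, x, pbest, gbest, force, w=0.7, c1=1.5, c2=1.5, c3=0.3, rng=random):
    """VFCPSO-style update: inertia + cognitive + social terms as in PSO,
    plus a third term driven by the node's virtual force."""
    r1, r2, r3 = rng.random(), rng.random(), rng.random()
    return tuple(
        w * v[i] + c1 * r1 * (pbest[i] - x[i]) + c2 * r2 * (gbest[i] - x[i]) + c3 * r3 * force[i]
        for i in range(len(v))
    )
```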
An Online Scheduling Algorithm with Advance Reservation for Large-Scale Data Transfers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Balman, Mehmet; Kosar, Tevfik
Scientific applications and experimental facilities generate massive data sets that need to be transferred to remote collaborating sites for sharing, processing, and long-term storage. In order to support increasingly data-intensive science, next-generation research networks have been deployed to provide high-speed on-demand data access between collaborating institutions. In this paper, we present a practical model for online data scheduling in which data movement operations are scheduled in advance for end-to-end high-performance transfers. In our model, the data scheduler interacts with reservation managers and data transfer nodes in order to reserve available bandwidth to guarantee completion of jobs that are accepted and confirmed to satisfy the preferred time constraint given by the user. Our methodology improves current systems by allowing researchers and higher-level meta-schedulers to use data placement as a service, where they can plan ahead and reserve scheduler time in advance for their data movement operations. We have implemented our algorithm and examined possible techniques for incorporation into current reservation frameworks. Performance measurements confirm that the proposed algorithm is efficient and scalable.
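The advance-reservation admission test the abstract describes, reserving bandwidth so a transfer completes within a user's time constraint, might look like this minimal sketch. The (start, end, bandwidth) reservation tuples and the conservative worst-case-bandwidth rule are assumptions for illustration:

```python
def available_bandwidth(reservations, t0, t1, capacity):
    """Minimum unreserved bandwidth over [t0, t1) on a link of the given
    capacity, given existing (start, end, bw) reservations."""
    points = sorted({t0, t1} | {t for r in reservations for t in (r[0], r[1]) if t0 < t < t1})
    avail = capacity
    for a, b in zip(points, points[1:]):
        used = sum(bw for s, e, bw in reservations if s < b and e > a)
        avail = min(avail, capacity - used)
    return avail

def can_schedule(reservations, start, deadline, size, capacity):
    """Admission test: can `size` data units be moved in [start, deadline)
    using the worst-case available bandwidth? (Conservative: a real
    scheduler could also exploit per-interval slack.)"""
    bw = available_bandwidth(reservations, start, deadline, capacity)
    return bw > 0 and size <= bw * (deadline - start)
```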
Han, Guangjie; Li, Shanshan; Zhu, Chunsheng; Jiang, Jinfang; Zhang, Wenbo
2017-02-08
Marine environmental monitoring provides crucial information and support for the exploitation, utilization, and protection of marine resources. With the rapid development of information technology, three-dimensional underwater acoustic sensor networks (3D UASNs) provide a novel strategy to acquire marine environment information conveniently, efficiently, and accurately. However, the propagation characteristics of the acoustic communication channel cause the probability of successful information delivery to decrease with distance. Therefore, we investigate two probabilistic neighborhood-based data collection algorithms for 3D UASNs which are based on a probabilistic acoustic communication model instead of the traditional deterministic model. An autonomous underwater vehicle (AUV) traverses a designed path to collect data from neighborhoods. For 3D UASNs without prior deployment knowledge, partitioning the network into grids allows the AUV to visit the central location of each grid for data collection. For 3D UASNs in which the deployment knowledge is known in advance, the AUV only needs to visit several selected locations, chosen by constructing a minimum probabilistic neighborhood covering set, to reduce data latency. In addition, by increasing the number of transmission rounds, our proposed algorithms can trade data collection latency against information gain. These algorithms are compared with a basic nearest-neighbor heuristic algorithm via simulations. Simulation analyses show that our proposed algorithms can efficiently reduce the average data collection completion time, corresponding to a decrease in data latency. PMID:28208735
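For the no-prior-knowledge case, the grid-partitioning traversal can be sketched as follows: cut the 3D region into cells and have the AUV visit each cell center. The box dimensions, cell size, and lawnmower visiting order are illustrative assumptions, not the paper's parameters:

```python
import math

def grid_centers(dim, cell):
    """Centers of the grid cells partitioning a box of size dim = (X, Y, Z)
    into cubes of side `cell` — the AUV visiting points when no deployment
    knowledge is available. Rows alternate direction (lawnmower sweep) to
    shorten the traversal."""
    nx, ny, nz = (max(1, math.ceil(d / cell)) for d in dim)
    centers = []
    for k in range(nz):
        for j in range(ny):
            xs = range(nx) if j % 2 == 0 else range(nx - 1, -1, -1)
            for i in xs:
                centers.append(((i + 0.5) * cell, (j + 0.5) * cell, (k + 0.5) * cell))
    return centers
```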
NASA Astrophysics Data System (ADS)
Zou, Zhen-Zhen; Yu, Xu-Tao; Zhang, Zai-Chen
2018-04-01
First, the entanglement source deployment problem is studied in a quantum multi-hop network, where it has a significant influence on quantum connectivity. Two optimization algorithms with limited entanglement sources are introduced in this paper. A deployment algorithm based on node position (DNP) improves connectivity by guaranteeing that all overlapping areas of the distribution ranges of the entanglement sources contain nodes. In addition, a deployment algorithm based on an improved genetic algorithm (DIGA) is implemented by dividing the region into grids. In the simulations, DNP and DIGA improve quantum connectivity by 213.73% and 248.83%, respectively, compared to random deployment, and the latter performs better in terms of connectivity. However, DNP is more flexible and adaptive to change, as it stops running as soon as all nodes are covered.
NASA Astrophysics Data System (ADS)
Thoreson, E. J.; Stievater, T. H.; Rabinovich, W. S.; Ferraro, M. S.; Papanicolaou, N. A.; Bass, R.; Boos, J. B.; Stepnowski, J. L.; McGill, R. A.
2008-10-01
Low-cost passive detection of Chemical Warfare Agents (CWAs), and the ability to distinguish them from interferents, is of great interest in the protection of human capital. If CWA sensors could be made cheaply enough, they could be deployed densely throughout the environment intended for protection. NRL (Naval Research Laboratory) has demonstrated a small sensor, with potentially very low unit cost and compatible with high-volume production, that can distinguish between H2O, DMMP, and toluene. Additionally, they have measured concentrations as low as 17 ppb passively in a package the size of a quarter. Using the latest MEMS technology coupled with advanced chemical identification algorithms, we propose a development path for a low-cost, highly integrated chemical sensor capable of detecting CWAs, explosives, VOCs (Volatile Organic Chemicals), and TICs (Toxic Industrial Chemicals). ITT AES (Advanced Engineering & Sciences) has partnered with NRL to develop this "microharp" technology into a field-deployable sensor capable of remote communication with a central server.
Operational algorithm development and refinement approaches
NASA Astrophysics Data System (ADS)
Ardanuy, Philip E.
2003-11-01
Next-generation polar and geostationary systems, such as the National Polar-orbiting Operational Environmental Satellite System (NPOESS) and the Geostationary Operational Environmental Satellite (GOES)-R, will deploy new generations of electro-optical reflective and emissive capabilities. These will include low-radiometric-noise, improved-spatial-resolution multispectral and hyperspectral imagers and sounders. To achieve specified performances (e.g., measurement accuracy, precision, uncertainty, and stability), and to best utilize the advanced space-borne sensing capabilities, a new generation of retrieval algorithms will be implemented. In most cases, these advanced algorithms benefit from ongoing testing and validation using heritage research mission algorithms and data [e.g., the Earth Observing System (EOS) Moderate-resolution Imaging Spectroradiometer (MODIS) and the Shuttle Ozone Limb Scattering Experiment (SOLSE)/Limb Ozone Retrieval Experiment (LORE)]. In these instances, an algorithm's theoretical basis is not static, but rather improves with time. Once frozen, an operational algorithm can "lose ground" relative to research analogs. Cost/benefit analyses provide a basis for change management. The challenge is in reconciling and balancing the stability, and "comfort," that today's generation of operational platforms provide (well-characterized, known sensors and algorithms) with the greatly improved quality, opportunities, and risks that the next generation of operational sensors and algorithms offer. By using the best practices and lessons learned from heritage/groundbreaking activities, it is possible to implement an agile process that enables change while managing change. This approach combines a "known-risk" frozen baseline, with preset completion schedules, with insertion opportunities for algorithm advances as ongoing validation activities identify and repair areas of weak performance.
This paper describes an objective, adaptive implementation roadmap that takes into account the specific maturities of each system's (sensor and algorithm) technology to provide for a program that contains continuous improvement while retaining its manageability.
A Novel Deployment Method for Communication-Intensive Applications in Service Clouds
Liu, Chuanchang; Yang, Jingqi
2014-01-01
Service platforms are migrating to clouds to address the long construction periods, low resource utilization, and isolated construction of traditional service platforms. However, previous deployment methods have paid little attention to deploying communication-intensive applications in service clouds. To address this problem, this paper proposed combining online deployment and offline deployment for deploying communication-intensive applications in service clouds. Firstly, a system architecture was designed for implementing the communication-aware deployment method for communication-intensive applications in service clouds. Secondly, in the online-deployment algorithm and the offline-deployment algorithm, service instances were deployed on an optimal cloud node based on the communication overhead, which is determined by the communication traffic between services as well as the communication performance between cloud nodes. Finally, the experimental results demonstrated that the proposed methods deployed communication-intensive applications effectively, with lower latency and lower load compared with existing algorithms. PMID:25140331
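A minimal sketch in the spirit of communication-aware placement: greedily assign each service to the node that adds the least traffic-weighted latency to already-placed services. The greedy rule and the dict-based data layout are assumptions for illustration, not the paper's algorithm:

```python
def communication_cost(assign, traffic, latency):
    """Total overhead: traffic between each service pair times the latency
    between the nodes they are placed on."""
    return sum(t * latency[assign[s1]][assign[s2]] for (s1, s2), t in traffic.items())

def greedy_deploy(services, nodes, traffic, latency):
    """Place services one by one on the node minimizing the communication
    cost added against services already placed (offline-deployment flavor)."""
    assign = {}
    for s in services:
        def added_cost(n):
            cost = 0.0
            for (a, b), t in traffic.items():
                if a == s and b in assign:
                    cost += t * latency[n][assign[b]]
                elif b == s and a in assign:
                    cost += t * latency[n][assign[a]]
            return cost
        assign[s] = min(nodes, key=added_cost)
    return assign
```

With two nodes 5 ms apart and heavy traffic between two services, the greedy rule co-locates them on one node, driving the pairwise cost to zero.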
Explosive Detection in Aviation Applications Using CT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Martz, H E; Crawford, C R
2011-02-15
CT scanners are deployed world-wide to detect explosives in checked and carry-on baggage. Though very similar to single- and dual-energy multi-slice CT scanners used today in medical imaging, some recently developed explosives detection scanners employ multiple sources and detector arrays to eliminate mechanical rotation of a gantry, photon counting detectors for spectral imaging, and a limited number of views to reduce cost. For each bag scanned, the resulting reconstructed images are first processed by automated threat recognition algorithms to screen for explosives and other threats. Human operators review the images only when these automated algorithms report the presence of possible threats. The US Department of Homeland Security (DHS) has requirements for future scanners that include dealing with a larger number of threats, higher probability of detection, lower false alarm rates, and lower operating costs. One tactic that DHS is pursuing to achieve these requirements is to augment the capabilities of the established security vendors with third-party algorithm developers. A third party in this context refers to academics and companies other than the established vendors. DHS is particularly interested in exploring the model that has been used very successfully by the medical imaging industry, in which university researchers develop algorithms that are eventually deployed in commercial medical imaging equipment. The purpose of this paper is to discuss opportunities for third parties to develop advanced reconstruction and threat detection algorithms.
Weighted Global Artificial Bee Colony Algorithm Makes Gas Sensor Deployment Efficient
Jiang, Ye; He, Ziqing; Li, Yanhai; Xu, Zhengyi; Wei, Jianming
2016-01-01
This paper proposes an improved artificial bee colony algorithm named Weighted Global ABC (WGABC), which is designed to improve the convergence speed of the solution search equation in the search stage. The new method not only considers the effect of global factors on convergence speed in the search phase, but also provides an expression for the global factor weights. Experiments on benchmark functions showed that the algorithm can greatly improve the convergence speed. We obtain the gas diffusion concentration based on CFD theory and then simulate the gas diffusion model, including the influence of buildings, with the algorithm. Simulation verified the effectiveness of the WGABC algorithm in improving the convergence speed of the optimal gas sensor deployment scheme. Finally, it is verified that the optimal deployment method based on WGABC can greatly improve the monitoring efficiency of sensors compared with conventional deployment methods. PMID:27322262
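A global-best-guided ABC search step augments the classic neighbor term with a weighted pull toward the global best; WGABC weights that global factor. A sketch under assumed forms, where the weight schedule and coefficient ranges are illustrative, not the paper's exact expression:

```python
import random

def wgabc_candidate(x, neighbor, gbest, j, iteration, max_iter, rng=random):
    """One candidate solution in a weighted-global ABC search step:
    v_j = x_j + phi*(x_j - neighbor_j) + w*psi*(gbest_j - x_j),
    where the global-factor weight w grows with the iteration count
    (an assumed schedule for illustration)."""
    phi = rng.uniform(-1.0, 1.0)            # classic ABC perturbation factor
    w = 0.5 + 0.5 * iteration / max_iter    # assumed global-factor weight
    psi = rng.uniform(0.0, 1.5)             # global-best attraction factor
    v = list(x)                             # only dimension j is perturbed
    v[j] = x[j] + phi * (x[j] - neighbor[j]) + w * psi * (gbest[j] - x[j])
    return v
```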
Systematic Benchmarking of Diagnostic Technologies for an Electrical Power System
NASA Technical Reports Server (NTRS)
Kurtoglu, Tolga; Jensen, David; Poll, Scott
2009-01-01
Automated health management is a critical functionality for complex aerospace systems. A wide variety of diagnostic algorithms have been developed to address this technical challenge. Unfortunately, the lack of support for large-scale V&V (verification and validation) of diagnostic technologies continues to create barriers to effective development and deployment of such algorithms for aerospace vehicles. In this paper, we describe a formal framework developed for benchmarking of diagnostic technologies. The diagnosed system is the Advanced Diagnostics and Prognostics Testbed (ADAPT), a real-world electrical power system (EPS) developed and maintained at the NASA Ames Research Center. The benchmarking approach provides a systematic, empirical basis for the testing of diagnostic software and is used to assess the performance of different diagnostic algorithms.
Baseline Assessment and Prioritization Framework for IVHM Integrity Assurance Enabling Capabilities
NASA Technical Reports Server (NTRS)
Cooper, Eric G.; DiVito, Benedetto L.; Jacklin, Stephen A.; Miner, Paul S.
2009-01-01
Fundamental to vehicle health management is the deployment of systems incorporating advanced technologies for predicting and detecting anomalous conditions in highly complex and integrated environments. Integrated structural integrity health monitoring, statistical algorithms for detection, estimation, prediction, and fusion, and diagnosis supporting adaptive control are examples of advanced technologies that present considerable verification and validation challenges. These systems necessitate interactions between physical and software-based systems that are highly networked with sensing and actuation subsystems, and incorporate technologies that are, in many respects, different from those employed in civil aviation today. A formidable barrier to deploying these advanced technologies in civil aviation is the lack of enabling verification and validation tools, methods, and technologies. The development of new verification and validation capabilities will not only enable the fielding of advanced vehicle health management systems, but will also provide new assurance capabilities for verification and validation of current-generation aviation software, which has been implicated in anomalous in-flight behavior. This paper describes the research on enabling capabilities for verification and validation underway within NASA's Integrated Vehicle Health Management project, discusses the state of the art of these capabilities, and presents a framework for prioritizing activities.
Earth sensing: from ice to the Internet of Things
NASA Astrophysics Data System (ADS)
Martinez, K.
2017-12-01
The evolution of technology has led to improvements in our ability to use sensors for earth science research. Radio communications have improved in terms of range and power use. Miniaturisation means we now use 32-bit processors with embedded memory, storage and interfaces. Sensor technology makes it simpler to integrate devices such as accelerometers, compasses, gas sensors and biosensors. Programming languages have developed so that it has become easier to create software for these systems. This, combined with the power of the processors, has made research into advanced algorithms and communications feasible. The term environmental sensor networks describes these advanced systems, which are designed specifically to take sensor measurements in the natural environment. Through a decade of research into sensor networks, deployed mainly on glaciers, many areas of this still-emerging technology have been explored. From deploying the first subglacial sensor probes with custom electronics and protocols, we learned lessons in tuning to harsh environments and energy management. More recently, installing sensor systems in the mountains of Scotland has shown that standards have allowed complete internet and web integration. This talk will discuss the technologies used in a range of recent deployments in Scotland and Iceland focussed on creating new data streams for cryospheric and climate change research.
Jiang, Peng; Liu, Shuai; Liu, Jun; Wu, Feng; Zhang, Le
2016-07-14
Most of the existing node depth-adjustment deployment algorithms for underwater wireless sensor networks (UWSNs) just consider how to optimize network coverage and connectivity rate. However, these studies do not address full network connectivity, even though optimization of network energy efficiency and network reliability are vital topics for UWSN deployment. Therefore, in this study, a depth-adjustment deployment algorithm based on a two-dimensional (2D) convex hull and spanning tree (NDACS) for UWSNs is proposed. First, the proposed algorithm uses the geometric characteristics of a 2D convex hull and empty circle to find the optimal location of a sleep node and activate it, minimizes the network coverage overlaps of the 2D plane, and then increases the coverage rate until the first layer coverage threshold is reached. Second, the sink node acts as a root node of all active nodes on the 2D convex hull and then forms a small spanning tree gradually. Finally, the depth-adjustment strategy based on time marker is used to achieve the three-dimensional overall network deployment. Compared with existing depth-adjustment deployment algorithms, the simulation results show that the NDACS algorithm can maintain full network connectivity with high network coverage rate, as well as improved network average node degree, thus increasing network reliability.
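The NDACS construction above starts from the 2D convex hull of active node positions. As a hedged sketch of that geometric building block only (Andrew's monotone chain, a textbook algorithm, not the paper's implementation):

```python
def convex_hull(points):
    # Andrew's monotone chain: returns hull vertices in counter-clockwise
    # order, starting from the lexicographically smallest point.
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        # z-component of (a - o) x (b - o); > 0 means a left turn.
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    # Last point of each chain is the first of the other; drop duplicates.
    return lower[:-1] + upper[:-1]
```

Interior nodes (candidates for sleep/activation in NDACS) are exactly those not returned by such a hull computation.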
Unaldi, Numan; Temel, Samil; Asari, Vijayan K.
2012-01-01
One of the most critical issues of Wireless Sensor Networks (WSNs) is the deployment of a limited number of sensors in order to achieve maximum coverage on a terrain. The optimal sensor deployment which enables one to minimize the consumed energy, communication time and manpower for the maintenance of the network has attracted interest with the increased number of studies conducted on the subject in the last decade. Most of the studies in the literature today are proposed for two dimensional (2D) surfaces; however, real world sensor deployments often arise on three dimensional (3D) environments. In this paper, a guided wavelet transform (WT) based deployment strategy (WTDS) for 3D terrains, in which the sensor movements are carried out within the mutation phase of the genetic algorithms (GAs) is proposed. The proposed algorithm aims to maximize the Quality of Coverage (QoC) of a WSN via deploying a limited number of sensors on a 3D surface by utilizing a probabilistic sensing model and the Bresenham's line of sight (LOS) algorithm. In addition, the method followed in this paper is novel to the literature and the performance of the proposed algorithm is compared with the Delaunay Triangulation (DT) method as well as a standard genetic algorithm based method and the results reveal that the proposed method is a more powerful and more successful method for sensor deployment on 3D terrains. PMID:22666078
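The WTDS abstract above relies on Bresenham's line algorithm for line-of-sight (LOS) checks over a 3D terrain grid. A minimal sketch of how such a check might look; the heightmap layout and the blocking rule are assumptions for illustration, not the paper's code:

```python
def bresenham(x0, y0, x1, y1):
    # Grid cells visited by Bresenham's line algorithm between two points.
    cells = []
    dx, dy = abs(x1 - x0), abs(y1 - y0)
    sx = 1 if x0 < x1 else -1
    sy = 1 if y0 < y1 else -1
    err = dx - dy
    while True:
        cells.append((x0, y0))
        if (x0, y0) == (x1, y1):
            break
        e2 = 2 * err
        if e2 > -dy:
            err -= dy
            x0 += sx
        if e2 < dx:
            err += dx
            y0 += sy
    return cells

def has_line_of_sight(terrain, a, b):
    # a, b: (x, y) cells; terrain[y][x] is elevation (assumed layout).
    # Any intermediate cell rising above the straight sight line between
    # the endpoint elevations blocks visibility.
    path = bresenham(a[0], a[1], b[0], b[1])
    h0, h1 = terrain[a[1]][a[0]], terrain[b[1]][b[0]]
    n = len(path) - 1
    for i, (x, y) in enumerate(path[1:-1], start=1):
        line_h = h0 + (h1 - h0) * i / n
        if terrain[y][x] > line_h:
            return False
    return True
```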
Global Deployment Analysis System Algorithm Description (With Updates)
1998-09-01
This Algorithm Description for the Global Deployment Analysis System (GDAS) was prepared by Noetics, Inc. for the U.S. Army Concepts Analysis Agency (CAA) under contract. Support for Paradox Runtime will be provided by the GDAS developers, CAA and Noetics, Inc., and not by Borland International. GDAS for Windows has
An efficient genetic algorithm for maximum coverage deployment in wireless sensor networks.
Yoon, Yourim; Kim, Yong-Hyuk
2013-10-01
Sensor networks have many applications, such as battlefield surveillance, environmental monitoring, and industrial diagnostics. Coverage is one of the most important performance metrics for sensor networks since it reflects how well a sensor field is monitored. In this paper, we introduce the maximum coverage deployment problem in wireless sensor networks and analyze the properties of the problem and its solution space. Random deployment is the simplest way to deploy sensor nodes but may cause unbalanced deployment; therefore, we need a more intelligent way to deploy sensors. We found that the phenotype space of the problem is a quotient space of the genotype space from a mathematical viewpoint. Based on this property, we propose an efficient genetic algorithm using a novel normalization method. A Monte Carlo method is adopted to design an efficient evaluation function, and its computation time is decreased without loss of solution quality using a method that starts from a small number of random samples and gradually increases the number for subsequent generations. The proposed genetic algorithm can be further improved by combining it with a well-designed local search. The performance of the proposed genetic algorithm is shown by a comparative experimental study. When compared with random deployment and existing methods, our genetic algorithm was not only about twice as fast, but also showed significant performance improvement in quality.
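The evaluation function described above estimates area coverage by Monte Carlo sampling. A hedged sketch under a binary disk sensing model; the model, field shape, and parameters are assumptions for illustration:

```python
import random

def coverage_ratio(sensors, radius, width, height, n_samples=10000, seed=0):
    # Monte Carlo estimate of the fraction of a width x height field
    # covered by at least one sensor (binary disk sensing model).
    rng = random.Random(seed)
    r2 = radius * radius
    covered = 0
    for _ in range(n_samples):
        px, py = rng.uniform(0, width), rng.uniform(0, height)
        if any((px - sx) ** 2 + (py - sy) ** 2 <= r2 for sx, sy in sensors):
            covered += 1
    return covered / n_samples
```

In the scheme the abstract describes, `n_samples` would start small and grow across GA generations so that early, rough fitness estimates stay cheap.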
Unified Framework for Development, Deployment and Robust Testing of Neuroimaging Algorithms
Joshi, Alark; Scheinost, Dustin; Okuda, Hirohito; Belhachemi, Dominique; Murphy, Isabella; Staib, Lawrence H.; Papademetris, Xenophon
2011-01-01
Developing both graphical and command-line user interfaces for neuroimaging algorithms requires considerable effort. Neuroimaging algorithms can meet their potential only if they can be easily and frequently used by their intended users. Deployment of a large suite of such algorithms on multiple platforms requires consistency of user interface controls, consistent results across various platforms and thorough testing. We present the design and implementation of a novel object-oriented framework that allows for rapid development of complex image analysis algorithms with many reusable components and the ability to easily add graphical user interface controls. Our framework also allows for simplified yet robust nightly testing of the algorithms to ensure stability and cross platform interoperability. All of the functionality is encapsulated into a software object requiring no separate source code for user interfaces, testing or deployment. This formulation makes our framework ideal for developing novel, stable and easy-to-use algorithms for medical image analysis and computer assisted interventions. The framework has been both deployed at Yale and released for public use in the open source multi-platform image analysis software—BioImage Suite (bioimagesuite.org). PMID:21249532
Competitive Swarm Optimizer Based Gateway Deployment Algorithm in Cyber-Physical Systems.
Huang, Shuqiang; Tao, Ming
2017-01-22
Wireless sensor network topology optimization is a highly important issue, and topology control through node selection can improve the efficiency of data forwarding, while saving energy and prolonging the lifetime of the network. To address the problem of connecting a wireless sensor network to the Internet in cyber-physical systems, here we propose a geometric gateway deployment based on a competitive swarm optimizer algorithm. The particle swarm optimization (PSO) algorithm has a continuous search feature in the solution space, which makes it suitable for finding the geometric center of gateway deployment; however, its search mechanism is limited to the individual optimum (pbest) and the population optimum (gbest); thus, it easily falls into local optima. In order to improve the particle search mechanism and enhance the search efficiency of the algorithm, we introduce a new competitive swarm optimizer (CSO) algorithm. The CSO search algorithm is based on an inter-particle competition mechanism and can effectively avoid trapping the population in a local optimum. With the addition of an adaptive opposition-based search and dynamic parameter adjustment, this algorithm can maintain the diversity of the entire swarm to solve geometric K-center gateway deployment problems. The simulation results show that this CSO algorithm has a good global explorative ability as well as convergence speed and can improve the network quality of service (QoS) level of cyber-physical systems by obtaining a minimum network coverage radius. We also find that the CSO algorithm is more stable, robust and effective in solving the problem of geometric gateway deployment as compared to the PSO or K-medoids algorithms.
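The pairwise-competition update at the heart of a CSO can be sketched as follows. In each generation, particles are paired at random; the loser of each comparison learns from the winner (and mildly from the swarm mean), while the winner survives unchanged. Constants, bounds handling, and the objective are illustrative assumptions, not the authors' tuning:

```python
import random

def cso_minimize(f, dim, n, bounds, iters=200, phi=0.1, seed=0):
    # Competitive swarm optimizer sketch for minimizing f over [lo, hi]^dim.
    rng = random.Random(seed)
    lo, hi = bounds
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    V = [[0.0] * dim for _ in range(n)]
    for _ in range(iters):
        mean = [sum(x[d] for x in X) / n for d in range(dim)]
        order = list(range(n))
        rng.shuffle(order)
        for a, b in zip(order[::2], order[1::2]):
            # Winner w keeps its position; loser l is updated.
            w, l = (a, b) if f(X[a]) <= f(X[b]) else (b, a)
            for d in range(dim):
                r1, r2, r3 = rng.random(), rng.random(), rng.random()
                V[l][d] = (r1 * V[l][d]
                           + r2 * (X[w][d] - X[l][d])
                           + phi * r3 * (mean[d] - X[l][d]))
                X[l][d] = min(hi, max(lo, X[l][d] + V[l][d]))
    return min(X, key=f)
```

For the gateway problem in the abstract, `f` would score a candidate gateway position by the induced coverage radius; here any objective function works.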
A Polygon Model for Wireless Sensor Network Deployment with Directional Sensing Areas
Wu, Chun-Hsien; Chung, Yeh-Ching
2009-01-01
The modeling of the sensing area of a sensor node is essential for the deployment algorithm of wireless sensor networks (WSNs). In this paper, a polygon model is proposed for the sensor node with directional sensing area. In addition, a WSN deployment algorithm is presented with topology control and scoring mechanisms to maintain network connectivity and improve sensing coverage rate. To evaluate the proposed polygon model and WSN deployment algorithm, a simulation is conducted. The simulation results show that the proposed polygon model outperforms the existed disk model and circular sector model in terms of the maximum sensing coverage rate. PMID:22303159
NASA Technical Reports Server (NTRS)
Garbeff, Theodore J., II; Baerny, Jennifer K.
2017-01-01
The following details recent efforts undertaken at the NASA Ames Unitary Plan wind tunnels to design and deploy an advanced, production-level infrared (IR) flow visualization data system. Highly sensitive IR cameras, coupled with in-line image processing, have enabled the visualization of wind tunnel model surface flow features as they develop in real-time. Boundary layer transition, shock impingement, junction flow, vortex dynamics, and buffet are routinely observed in both transonic and supersonic flow regimes all without the need of dedicated ramps in test section total temperature. Successful measurements have been performed on wing-body sting mounted test articles, semi-span floor mounted aircraft models, and sting mounted launch vehicle configurations. The unique requirements of imaging in production wind tunnel testing has led to advancements in the deployment of advanced IR cameras in a harsh test environment, robust data acquisition storage and workflow, real-time image processing algorithms, and evaluation of optimal surface treatments. The addition of a multi-camera IR flow visualization data system to the Ames UPWT has demonstrated itself to be a valuable analyses tool in the study of new and old aircraft/launch vehicle aerodynamics and has provided new insight for the evaluation of computational techniques.
NASA Astrophysics Data System (ADS)
Various papers on communications for the information age are presented. Among the general topics considered are: telematic services and terminals, satellite communications, telecommunications management network, control of integrated broadband networks, advances in digital radio systems, the intelligent network, broadband networks and services deployment, future switch architectures, performance analysis of computer networks, advances in spread spectrum, optical high-speed LANs, and broadband switching and networks. Also addressed are: multiple access protocols, video coding techniques, modulation and coding, photonic switching, SONET terminals and applications, standards for video coding, digital switching, progress in MANs, mobile and portable radio, software design for improved maintainability, multipath propagation and advanced countermeasures, data communication, network control and management, fiber in the loop, network algorithms and protocols, and advances in computer communications.
Yu, Shanen; Xu, Yiming; Jiang, Peng; Wu, Feng; Xu, Huan
2017-01-01
At present, free-to-move node self-deployment algorithms aim at event coverage and cannot improve network coverage under the premise of considering network connectivity, network reliability and network deployment energy consumption. Thus, this study proposes pigeon-based self-deployment algorithm (PSA) for underwater wireless sensor networks to overcome the limitations of these existing algorithms. In PSA, the sink node first finds its one-hop nodes and maximizes the network coverage in its one-hop region. The one-hop nodes subsequently divide the network into layers and cluster in each layer. Each cluster head node constructs a connected path to the sink node to guarantee network connectivity. Finally, the cluster head node regards the ratio of the movement distance of the node to the change in the coverage redundancy ratio as the target function and employs pigeon swarm optimization to determine the positions of the nodes. Simulation results show that PSA improves both network connectivity and network reliability, decreases network deployment energy consumption, and increases network coverage. PMID:28338615
On Efficient Deployment of Wireless Sensors for Coverage and Connectivity in Constrained 3D Space.
Wu, Chase Q; Wang, Li
2017-10-10
Sensor networks have been used in a rapidly increasing number of applications in many fields. This work generalizes a sensor deployment problem to place a minimum set of wireless sensors at candidate locations in constrained 3D space to k-cover a given set of target objects. By exhausting the combinations of discreteness/continuousness constraints on either sensor locations or target objects, we formulate four classes of sensor deployment problems in 3D space: deploy sensors at Discrete/Continuous Locations (D/CL) to cover Discrete/Continuous Targets (D/CT). We begin with the design of an approximate algorithm for DLDT and then reduce DLCT, CLDT, and CLCT to DLDT by discretizing continuous sensor locations or target objects into a set of divisions without sacrificing sensing precision. Furthermore, we consider a connected version of each problem where the deployed sensors must form a connected network, and design an approximation algorithm to minimize the number of deployed sensors with connectivity guarantee. For performance comparison, we design and implement an optimal solution and a genetic algorithm (GA)-based approach. Extensive simulation results show that the proposed deployment algorithms consistently outperform the GA-based heuristic and achieve a close-to-optimal performance in small-scale problem instances and a significantly superior overall performance than the theoretical upper bound.
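The DLDT variant above (discrete locations, discrete targets, k-coverage) resembles a set multicover problem, for which greedy selection is the classic approximation baseline. A hedged sketch of that baseline only; `greedy_k_cover` and its inputs are hypothetical names, not the paper's algorithm:

```python
def greedy_k_cover(candidates, targets, k):
    # candidates: dict mapping a location to the set of targets it covers.
    # Greedily pick locations until every target is covered >= k times;
    # returns the chosen locations, or None if infeasible.
    need = {t: k for t in targets}
    chosen = []
    remaining = dict(candidates)
    while any(v > 0 for v in need.values()):
        # Pick the location that reduces the most residual demand.
        best = max(remaining,
                   key=lambda c: sum(1 for t in remaining[c] if need[t] > 0),
                   default=None)
        if best is None or sum(1 for t in remaining[best] if need[t] > 0) == 0:
            return None  # no candidate helps: infeasible
        for t in remaining[best]:
            if need[t] > 0:
                need[t] -= 1
        chosen.append(best)
        del remaining[best]
    return chosen
```

The greedy rule gives the well-known logarithmic approximation guarantee for set cover; the paper's connected variant would additionally constrain the chosen locations to form a connected network.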
Benchmarking Diagnostic Algorithms on an Electrical Power System Testbed
NASA Technical Reports Server (NTRS)
Kurtoglu, Tolga; Narasimhan, Sriram; Poll, Scott; Garcia, David; Wright, Stephanie
2009-01-01
Diagnostic algorithms (DAs) are key to enabling automated health management. These algorithms are designed to detect and isolate anomalies of either a component or the whole system based on observations received from sensors. In recent years a wide range of algorithms, both model-based and data-driven, have been developed to increase autonomy and improve system reliability and affordability. However, the lack of support to perform systematic benchmarking of these algorithms continues to create barriers for effective development and deployment of diagnostic technologies. In this paper, we present our efforts to benchmark a set of DAs on a common platform using a framework that was developed to evaluate and compare various performance metrics for diagnostic technologies. The diagnosed system is an electrical power system, namely the Advanced Diagnostics and Prognostics Testbed (ADAPT) developed and located at the NASA Ames Research Center. The paper presents the fundamentals of the benchmarking framework, the ADAPT system, description of faults and data sets, the metrics used for evaluation, and an in-depth analysis of benchmarking results obtained from testing ten diagnostic algorithms on the ADAPT electrical power system testbed.
González-Parada, Eva; Cano-García, Jose; Aguilera, Francisco; Sandoval, Francisco; Urdiales, Cristina
2017-01-01
Autonomous mobile nodes in mobile wireless sensor networks (MWSN) allow self-deployment and self-healing. In both cases, the goals are: (i) to achieve adequate coverage; and (ii) to extend network life. In dynamic environments, nodes may use reactive algorithms so that each node locally decides when and where to move. This paper presents a behavior-based deployment and self-healing algorithm based on the social potential fields algorithm. In the proposed algorithm, nodes are attached to low cost robots to autonomously navigate in the coverage area. The proposed algorithm has been tested in environments with and without obstacles. Our study also analyzes the differences between non-hierarchical and hierarchical routing configurations in terms of network life and coverage. PMID:28075364
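The social potential fields rule referenced above combines short-range repulsion (for dispersion) with long-range attraction (for connectivity). A minimal per-node force sketch; the force law and constants are illustrative assumptions, not the paper's tuning:

```python
import math

def spf_force(node, neighbors, c_rep=1.0, c_att=0.05):
    # Social potential fields sketch: each neighbor repels at short range
    # (~1/d^2) and attracts at long range (~d). Returns the net (fx, fy)
    # acting on `node`; positive magnitude pushes away from a neighbor,
    # negative pulls toward it.
    fx = fy = 0.0
    for nx, ny in neighbors:
        dx, dy = node[0] - nx, node[1] - ny
        d = math.hypot(dx, dy)
        if d < 1e-9:
            continue  # coincident nodes: skip rather than divide by zero
        mag = c_rep / d ** 2 - c_att * d
        fx += mag * dx / d
        fy += mag * dy / d
    return fx, fy
```

In a reactive deployment loop, each node would repeatedly move a small step along its net force until forces balance, which is what spreads the network while keeping neighbors in range.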
47 CFR 51.230 - Presumption of acceptability for deployment of an advanced services loop technology.
Code of Federal Regulations, 2010 CFR
2010-10-01
... an advanced services loop technology. 51.230 Section 51.230 Telecommunication FEDERAL COMMUNICATIONS... Carriers § 51.230 Presumption of acceptability for deployment of an advanced services loop technology. (a) An advanced services loop technology is presumed acceptable for deployment under any one of the...
Evaluation of Anomaly Detection Capability for Ground-Based Pre-Launch Shuttle Operations. Chapter 8
NASA Technical Reports Server (NTRS)
Martin, Rodney Alexander
2010-01-01
This chapter will provide a thorough end-to-end description of the process for evaluation of three different data-driven algorithms for anomaly detection to select the best candidate for deployment as part of a suite of IVHM (Integrated Vehicle Health Management) technologies. These algorithms were deemed to be sufficiently mature to be considered viable candidates for deployment in support of the maiden launch of Ares I-X, the successor to the Space Shuttle for NASA's Constellation program. Data-driven algorithms are just one of three different types being deployed. The other two types of algorithms being deployed include a "rule-based" expert system and a "model-based" system. Within these two categories, the deployable candidates have already been selected based upon qualitative factors such as flight heritage. For the rule-based system, SHINE (Spacecraft High-speed Inference Engine) has been selected for deployment, which is a component of BEAM (Beacon-based Exception Analysis for Multimissions), a patented technology developed at NASA's JPL (Jet Propulsion Laboratory), and serves to aid in the management and identification of operational modes. For the "model-based" system, a commercially available package developed by QSI (Qualtech Systems, Inc.), TEAMS (Testability Engineering and Maintenance System), has been selected for deployment to aid in diagnosis. In the context of this particular deployment, distinctions among the use of the terms "data-driven," "rule-based," and "model-based," can be found in. Although there are three different categories of algorithms that have been selected for deployment, our main focus in this chapter will be on the evaluation of three candidates for data-driven anomaly detection. These algorithms will be evaluated upon their capability for robustly detecting incipient faults or failures in the ground-based phase of pre-launch space shuttle operations, rather than based on heritage as performed in previous studies.
Robust detection will allow for the achievement of pre-specified minimum false alarm and/or missed detection rates in the selection of alert thresholds. All algorithms will also be optimized with respect to an aggregation of these same criteria. Our study relies upon the use of Shuttle data to act as a proxy for, and in preparation for, application to Ares I-X data, which uses a very similar hardware platform for the subsystems that are being targeted (the TVC, or Thrust Vector Control, subsystem for the SRB (Solid Rocket Booster)).
Rule-Based vs. Behavior-Based Self-Deployment for Mobile Wireless Sensor Networks
Urdiales, Cristina; Aguilera, Francisco; González-Parada, Eva; Cano-García, Jose; Sandoval, Francisco
2016-01-01
In mobile wireless sensor networks (MWSN), nodes are allowed to move autonomously for deployment. This process is meant: (i) to achieve good coverage; and (ii) to distribute the communication load as homogeneously as possible. Rather than optimizing deployment, reactive algorithms are based on a set of rules or behaviors, so nodes can determine when to move. This paper presents an experimental evaluation of both reactive deployment approaches: rule-based and behavior-based ones. Specifically, we compare a backbone dispersion algorithm with a social potential fields algorithm. Most tests are done under simulation for a large number of nodes in environments with and without obstacles. Results are validated using a small robot network in the real world. Our results show that behavior-based deployment tends to provide better coverage and communication balance, especially for a large number of nodes in areas with obstacles. PMID:27399709
Distributed sensor coordination for advanced energy systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tumer, Kagan
Motivation: The ability to collect key system level information is critical to the safe, efficient and reliable operation of advanced power systems. Recent advances in sensor technology have enabled some level of decision making directly at the sensor level. However, coordinating large numbers of sensors, particularly heterogeneous sensors, to achieve system level objectives such as predicting plant efficiency, reducing downtime or predicting outages requires sophisticated coordination algorithms. Indeed, a critical issue in such systems is how to ensure that the interactions of a large number of heterogeneous system components do not interfere with one another and lead to undesirable behavior. Objectives and Contributions: The long-term objective of this work is to provide sensor deployment, coordination and networking algorithms for large numbers of sensors to ensure the safe, reliable, and robust operation of advanced energy systems. Our two specific objectives are to: 1. Derive sensor performance metrics for heterogeneous sensor networks. 2. Demonstrate effectiveness, scalability and reconfigurability of heterogeneous sensor networks in advanced power systems. The key technical contribution of this work is to push the coordination step to the design of the objective functions of the sensors, allowing networks of heterogeneous sensors to be controlled. By ensuring that the control and coordination is not specific to particular sensor hardware, this approach enables the design and operation of large heterogeneous sensor networks. In addition to the coordination mechanism, this approach allows the system to be reconfigured in response to changing needs (e.g., sudden external events requiring new responses) or changing sensor network characteristics (e.g., sudden changes to plant condition).
Impact: The impact of this work extends to a large class of problems relevant to the National Energy Technology Laboratory, including sensor placement, heterogeneous sensor coordination, and sensor network control in advanced power systems. Each application has specific needs, but they all share one crucial underlying problem: how to ensure that the interactions of a large number of heterogeneous agents lead to coordinated system behavior. This proposal describes a new paradigm that addresses that very issue in a systematic way. Key Results and Findings: All milestones have been completed. Our results demonstrate that by properly shaping agent objective functions, we can develop large (up to 10,000 devices) heterogeneous sensor networks with key desirable properties. The first milestone shows that properly choosing agent-specific objective functions increases system performance by up to 99.9% compared to global evaluations. The second milestone shows evolutionary algorithms learn excellent sensor network coordination policies prior to network deployment, and these policies can be refined online once the network is deployed. The third milestone shows the resulting sensor networks are extremely robust to sensor noise, where networks with up to 25% sensor noise are capable of providing measurements with errors on the order of 10⁻³. The fourth milestone shows the resulting sensor networks are extremely robust to sensor failure, with 25% of the sensors in the system failing without significant performance loss after system reconfiguration.
Machine learning for micro-tomography
NASA Astrophysics Data System (ADS)
Parkinson, Dilworth Y.; Pelt, Daniël. M.; Perciano, Talita; Ushizima, Daniela; Krishnan, Harinarayan; Barnard, Harold S.; MacDowell, Alastair A.; Sethian, James
2017-09-01
Machine learning has revolutionized a number of fields, but many micro-tomography users have never used it for their work. The micro-tomography beamline at the Advanced Light Source (ALS), in collaboration with the Center for Applied Mathematics for Energy Research Applications (CAMERA) at Lawrence Berkeley National Laboratory, has now deployed a series of tools to automate data processing for ALS users using machine learning. This includes new reconstruction algorithms, feature extraction tools, and image classification and recommendation systems for scientific images. Some of these tools are in automated pipelines that operate on data as it is collected; others are stand-alone software. Others are deployed on computing resources at Berkeley Lab, from workstations to supercomputers, and made accessible to users through either scripting or easy-to-use graphical interfaces. This paper presents a progress report on this work.
Optimal Deployment of Sensor Nodes Based on Performance Surface of Underwater Acoustic Communication
Choi, Jee Woong
2017-01-01
The underwater acoustic sensor network (UWASN) is a system that exchanges data between numerous sensor nodes deployed in the sea. The UWASN uses an underwater acoustic communication technique to exchange data. Therefore, it is important to design a robust system that will function even in severely fluctuating underwater communication conditions, along with variations in the ocean environment. In this paper, a new algorithm to find the optimal deployment positions of underwater sensor nodes is proposed. The algorithm uses the communication performance surface, which is a map showing the underwater acoustic communication performance of a targeted area. A virtual force-particle swarm optimization algorithm is then used as an optimization technique to find the optimal deployment positions of the sensor nodes, using the performance surface information to estimate the communication radii of the sensor nodes in each generation. The algorithm is evaluated by comparing simulation results between two different seasons (summer and winter) for an area located off the eastern coast of Korea as the selected targeted area. PMID:29053569
Rao, Akshay; Elara, Mohan Rajesh; Elangovan, Karthikeyan
This paper aims to develop a local path planning algorithm for a bio-inspired, reconfigurable crawling robot. A detailed description of the robotic platform is first provided, and the suitability for deployment of each of the current state-of-the-art local path planners is analyzed after an extensive literature review. The Enhanced Vector Polar Histogram algorithm is described and reformulated to better fit the requirements of the platform. The algorithm is deployed on the robotic platform in crawling configuration and favorably compared with other state-of-the-art local path planning algorithms.
Deploying advanced public transportation systems in Birmingham
DOT National Transportation Integrated Search
2003-08-01
Advanced Public Transportation Systems (APTS) technologies have been deployed by many urban transit systems in order to improve efficiency, reduce operating costs, and improve service quality. The majority of : these deployments, however, have been i...
Almas, Muhammad Shoaib; Vanfretti, Luigi
2017-01-01
Synchrophasor measurements from Phasor Measurement Units (PMUs) are the primary sensors used to deploy Wide-Area Monitoring, Protection and Control (WAMPAC) systems. PMUs stream out synchrophasor measurements through the IEEE C37.118.2 protocol using TCP/IP or UDP/IP. The proposed method establishes a direct communication between two PMUs, thus eliminating the requirement for an intermediate phasor data concentrator, data mediator and/or protocol parser, thereby ensuring minimum communication latency apart from communication link delays. This method allows synchrophasor measurements to be utilized internally in a PMU to deploy custom protection and control algorithms. These algorithms are deployed using protection logic equations, which are supported by all PMU vendors. Moreover, this method reduces overall equipment cost because the algorithms execute internally in a PMU and therefore do not require any additional controller for their deployment. The proposed method can be utilized for fast prototyping of wide-area measurement-based protection and control applications. It is tested by coupling commercial PMUs as Hardware-in-the-Loop (HIL) with Opal-RT's eMEGAsim Real-Time Simulator (RTS). As an illustrative example, an anti-islanding protection application is deployed using the proposed method and its performance is assessed. The essential points of the method are:
• Intermediate phasor data concentrators or protocol parsers are bypassed, as the synchrophasors are communicated directly between the PMUs (minimizing communication delays).
• The wide-area protection and control algorithm is deployed using logic equations in the client PMU, eliminating the requirement for an external hardware controller (cost curtailment).
• PMU measurements can be exploited effortlessly in an environment familiar to protection engineers.
Jiang, Peng; Xu, Yiming; Wu, Feng
2016-01-01
Existing move-restricted node self-deployment algorithms are based on a fixed node communication radius and evaluate performance by network coverage or the connectivity rate; they do not consider the number of nodes near the sink node or the energy consumption distribution of the network topology, thereby degrading network reliability and the energy consumption balance. Therefore, we propose a distributed underwater node self-deployment algorithm. First, each node begins uneven clustering based on the distance on the water surface. Each cluster head node selects its next-hop node to synchronously construct a connected path to the sink node. Second, the cluster head node adjusts its depth while maintaining the layout formed by the uneven clustering and then adjusts the positions of in-cluster nodes. The algorithm is original in considering network reliability and energy consumption balance during node deployment, and it considers the coverage redundancy rate of all positions that a node may reach during the node position adjustment. Simulation results show, compared to the connected dominating set (CDS)-based depth computation algorithm, that the proposed algorithm can increase the number of nodes near the sink node and improve network reliability while guaranteeing the network connectivity rate. Moreover, it can balance energy consumption during network operation, further improve the network coverage rate and reduce energy consumption. PMID:26784193
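The connected-path construction step described above ("each cluster head node selects its next-hop node ... to the sink node") can be sketched with a simple greedy rule. This is an assumption-laden illustration, not the paper's algorithm: here each head picks, as next hop, the closest node among those strictly nearer to the sink, which guarantees progress toward the sink and hence a connected tree.

```python
def build_paths(sink, heads):
    """Each cluster head picks as next hop the closest node (another head or
    the sink) that lies nearer to the sink than itself, so every head ends
    up on a loop-free path to the sink."""
    def dist(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b)) ** 0.5

    next_hop = {}
    for h in heads:
        candidates = [sink] + [o for o in heads
                               if o != h and dist(o, sink) < dist(h, sink)]
        next_hop[h] = min(candidates, key=lambda c: dist(h, c))
    return next_hop
```

Because every hop strictly decreases the distance to the sink, the resulting parent map can contain no cycles.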
Certification Considerations for Adaptive Systems
NASA Technical Reports Server (NTRS)
Bhattacharyya, Siddhartha; Cofer, Darren; Musliner, David J.; Mueller, Joseph; Engstrom, Eric
2015-01-01
Advanced capabilities planned for the next generation of aircraft, including those that will operate within the Next Generation Air Transportation System (NextGen), will necessarily include complex new algorithms and non-traditional software elements. These aircraft will likely incorporate adaptive control algorithms that will provide enhanced safety, autonomy, and robustness during adverse conditions. Unmanned aircraft will operate alongside manned aircraft in the National Airspace (NAS), with intelligent software performing the high-level decision-making functions normally performed by human pilots. Even human-piloted aircraft will necessarily include more autonomy. However, there are serious barriers to the deployment of new capabilities, especially for those based upon software including adaptive control (AC) and artificial intelligence (AI) algorithms. Current civil aviation certification processes are based on the idea that the correct behavior of a system must be completely specified and verified prior to operation. This report by Rockwell Collins and SIFT documents our comprehensive study of the state of the art in intelligent and adaptive algorithms for the civil aviation domain, categorizing the approaches used and identifying gaps and challenges associated with certification of each approach.
The ADVANCE project : formal evaluation of the targeted deployment. Volume 2
DOT National Transportation Integrated Search
1997-01-01
This document reports on the formal evaluation of the targeted (limited but highly focused) deployment of the Advanced Driver and Vehicle Advisory Navigation ConcEpt (ADVANCE), an in-vehicle advanced traveler information system designed to provide sh...
Jevtić, Aleksandar; Gutiérrez, Álvaro
2011-01-01
Swarms of robots can use their sensing abilities to explore unknown environments and deploy on sites of interest. In this task, a large number of robots is more effective than a single unit because of their ability to quickly cover the area. However, the coordination of large teams of robots is not an easy problem, especially when the resources for the deployment are limited. In this paper, the Distributed Bees Algorithm (DBA), previously proposed by the authors, is optimized and applied to distributed target allocation in swarms of robots. Improved target allocation in terms of deployment cost efficiency is achieved through optimization of the DBA’s control parameters by means of a Genetic Algorithm. Experimental results show that with the optimized set of parameters, the deployment cost measured as the average distance traveled by the robots is reduced. The cost-efficient deployment is in some cases achieved at the expense of increased robots’ distribution error. Nevertheless, the proposed approach allows the swarm to adapt to the operating conditions when available resources are scarce. PMID:22346677
NASA Astrophysics Data System (ADS)
Huang, Yu
Solar energy has become one of the major renewable energy options because of its huge abundance and accessibility. Due to the intermittent nature of sunlight, there is high demand for Maximum Power Point Tracking (MPPT) techniques when a Photovoltaic (PV) system is used to extract energy from it. This thesis proposes an advanced Perturbation and Observation (P&O) algorithm aimed at practical operating circumstances. First, a practical PV system model is studied, determining the series and shunt resistances that are neglected in some research. Moreover, in the proposed algorithm, the duty ratio of a boost DC-DC converter is the object of the perturbation, deploying input impedance conversion to adjust the operating voltage. Based on this control strategy, an adaptive duty-ratio step size P&O algorithm is proposed, with major modifications for sharp insolation changes as well as low-insolation scenarios. Matlab/Simulink simulation of the PV model, the boost converter control strategy and the various MPPT processes is conducted step by step. The proposed adaptive P&O algorithm is validated by the simulation results and detailed analysis of sharp insolation changes, low-insolation conditions and continuous insolation variation.
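The core of an adaptive-step P&O loop can be sketched in a few lines. This is a generic illustration of the technique named in the abstract, not the thesis's algorithm: the gain `k`, the step clamp (a crude stand-in for the sharp-insolation modification), and the minimum probe step are all assumed values.

```python
def adaptive_po(p, p_prev, duty, last_step, k=0.5, max_step=0.05,
                d_min=0.05, d_max=0.95):
    """One perturb-and-observe iteration on the boost-converter duty ratio.
    The step magnitude adapts to the observed power change |dP|; the
    direction is kept if the last perturbation raised power and reversed
    otherwise.  Returns the new duty ratio and the step just applied."""
    dp = p - p_prev
    mag = min(max_step, max(0.001, k * abs(dp)))  # adaptive step, clamped
    sign = 1 if last_step >= 0 else -1
    if dp < 0:
        sign = -sign          # last move reduced power: reverse direction
    step = sign * mag
    duty = min(d_max, max(d_min, duty + step))
    return duty, step
```

Iterating this on any single-peak power-versus-duty curve drives the duty ratio toward the maximum power point, taking large steps far from the peak and small steps near it.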
A Genetic Algorithm Approach to Motion Sensor Placement in Smart Environments.
Thomas, Brian L; Crandall, Aaron S; Cook, Diane J
2016-04-01
Smart environments and ubiquitous computing technologies hold great promise for a wide range of real world applications. The medical community is particularly interested in high quality measurement of activities of daily living. With accurate computer modeling of older adults, decision support tools may be built to assist care providers. One aspect of effectively deploying these technologies is determining where the sensors should be placed in the home to effectively support these end goals. This work introduces and evaluates a set of approaches for generating sensor layouts in the home. These approaches range from the gold standard of human intuition-based placement to more advanced search algorithms, including Hill Climbing and Genetic Algorithms. The generated layouts are evaluated based on their ability to detect activities while minimizing the number of needed sensors. Sensor-rich environments can provide valuable insights about adults as they go about their lives. These sensors, once in place, provide information on daily behavior that can facilitate an aging-in-place approach to health care.
A Genetic Algorithm Approach to Motion Sensor Placement in Smart Environments
Thomas, Brian L.; Crandall, Aaron S.; Cook, Diane J.
2016-01-01
Smart environments and ubiquitous computing technologies hold great promise for a wide range of real world applications. The medical community is particularly interested in high quality measurement of activities of daily living. With accurate computer modeling of older adults, decision support tools may be built to assist care providers. One aspect of effectively deploying these technologies is determining where the sensors should be placed in the home to effectively support these end goals. This work introduces and evaluates a set of approaches for generating sensor layouts in the home. These approaches range from the gold standard of human intuition-based placement to more advanced search algorithms, including Hill Climbing and Genetic Algorithms. The generated layouts are evaluated based on their ability to detect activities while minimizing the number of needed sensors. Sensor-rich environments can provide valuable insights about adults as they go about their lives. These sensors, once in place, provide information on daily behavior that can facilitate an aging-in-place approach to health care. PMID:27453810
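A genetic-algorithm search over sensor layouts of the kind described above can be sketched as follows. This is a minimal sketch under invented assumptions, not the authors' system: a layout is a bit list over candidate slots, each activity is abstracted to the set of slots that can detect it, and the fitness trades detected activities against sensor count with an assumed penalty `alpha`.

```python
import random

def ga_place(activities, n_slots, pop=30, gens=60, alpha=0.2, pm=0.05):
    """Genetic algorithm over sensor layouts.  activities is a list of sets
    of slot indices able to detect each activity; fitness rewards coverage
    and penalizes the number of sensors used."""
    def fit(layout):
        on = {i for i, b in enumerate(layout) if b}
        covered = sum(1 for a in activities if a & on)
        return covered - alpha * len(on)

    popn = [[random.randint(0, 1) for _ in range(n_slots)] for _ in range(pop)]
    for _ in range(gens):
        popn.sort(key=fit, reverse=True)
        elite = popn[:pop // 2]            # truncation selection with elitism
        children = []
        while len(children) < pop - len(elite):
            a, b = random.sample(elite, 2)
            cut = random.randrange(1, n_slots)         # one-point crossover
            child = a[:cut] + b[cut:]
            children.append([1 - g if random.random() < pm else g
                             for g in child])          # bit-flip mutation
        popn = elite + children
    return max(popn, key=fit)
```

In the paper the fitness would be computed from activity-detection performance on real smart-home data rather than from abstract detection sets.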
Peer-to-peer model for the area coverage and cooperative control of mobile sensor networks
NASA Astrophysics Data System (ADS)
Tan, Jindong; Xi, Ning
2004-09-01
This paper presents a novel model and distributed algorithms for the cooperation and redeployment of mobile sensor networks. A mobile sensor network is composed of a collection of wirelessly connected mobile robots equipped with a variety of sensors. In such a sensor network, each mobile node has sensing, computation, communication, and locomotion capabilities. The locomotion ability enhances the autonomous deployment of the system, which can be rapidly deployed in hostile environments, inaccessible terrains or disaster relief operations. The mobile sensor network is essentially a cooperative multiple-robot system. This paper first presents a peer-to-peer model to define the relationship between neighboring communicating robots. Delaunay triangulation and Voronoi diagrams are used to define the geometrical relationship between sensor nodes. This distributed model allows formal analysis of the fusion of spatio-temporal sensory information in the network. Based on the distributed model, this paper discusses a fault-tolerant algorithm for autonomous self-deployment of the mobile robots. The algorithm considers the environment constraints, the presence of obstacles and the nonholonomic constraints of the robots. The distributed algorithm enables the system to reconfigure itself such that the area covered by the system can be enlarged. Simulation results have shown the effectiveness of the distributed model and deployment algorithms.
An investigation of interference coordination in heterogeneous network for LTE-Advanced systems
NASA Astrophysics Data System (ADS)
Hasan, M. K.; Ismail, A. F.; H, Aisha-Hassan A.; Abdullah, Khaizuran; Ramli, H. A. M.
2013-12-01
The novel "femtocell" in a Heterogeneous Network (HetNet) for LTE-Advanced (LTE-A) set-ups will allow Malaysian wireless telecommunication operators (Maxis, Celcom, Digi, U-Mobile, P1, YTL, etc.) to extend connectivity coverage where access would otherwise be limited or unavailable, particularly indoors in large building complexes. A femtocell is a small cellular base station that encompasses all the functionality of a typical station. It therefore allows a simpler, self-contained deployment, including in private residences. For the Malaysian service providers, the main attractions of femtocell usage are the improvements to both coverage and capacity. The operators can provide a better service to end-users, in turn reducing much of the agitation and complaints, and there will be opportunity for new services at reduced cost. In addition, the operator not only benefits from the improved capacity and coverage but can also reduce both capital expenditure and operating expense, as an alternative to installing a brand new base station or macrocell. Interference is a key issue associated with femtocell development. There are a large number of interference issues, all of which need to be investigated, identified, quantified and solved to ensure that the deployment of any femtocells takes place successfully. Among the most critical challenges in femtocell deployment is the interference between femtocell and macrocell and between femtocells in HetNets. In this paper, the proposed methods and algorithms are investigated in an OFDMA femtocell system considering HetNet scenarios for LTE-A.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wood, E.; Neubauer, J.; Burton, E.
The disparate characteristics between conventional (CVs) and battery electric vehicles (BEVs) in terms of driving range, refill/recharge time, and availability of refuel/recharge infrastructure inherently limit the relative utility of BEVs when benchmarked against traditional driver travel patterns. However, given a high penetration of high-power public charging combined with driver tolerance for rerouting travel to facilitate charging on long-distance trips, the difference in utility between CVs and BEVs could be marginalized. We quantify the relationships between BEV utility, the deployment of fast chargers, and driver tolerance for rerouting travel and extending travel durations by simulating BEVs operated over real-world travel patterns using the National Renewable Energy Laboratory's Battery Lifetime Analysis and Simulation Tool for Vehicles (BLAST-V). With support from the U.S. Department of Energy's Vehicle Technologies Office, BLAST-V has been developed to include algorithms for estimating the available range of BEVs prior to the start of trips, for rerouting baseline travel to utilize public charging infrastructure when necessary, and for making driver travel decisions for those trips in the presence of available public charging infrastructure, all while conducting advanced vehicle simulations that account for battery electrical, thermal, and degradation response. Results from BLAST-V simulations on vehicle utility, frequency of inserted stops, duration of charging events, and additional time and distance necessary for rerouting travel are presented to illustrate how BEV utility and travel patterns can be affected by various fast charge deployments.
NASA Technical Reports Server (NTRS)
Mehr, Ali Farhang; Sauvageon, Julien; Agogino, Alice M.; Tumer, Irem Y.
2006-01-01
Recent advances in micro electromechanical systems technology, digital electronics, and wireless communications have enabled development of low-cost, low-power, multifunctional miniature smart sensors. These sensors can be deployed throughout a region in an aerospace vehicle to build a network for measurement, detection and surveillance applications. Event detection using such centralized sensor networks is often regarded as one of the most promising health management technologies in aerospace applications where timely detection of local anomalies has a great impact on the safety of the mission. In this paper, we propose to conduct a qualitative comparison of several local event detection algorithms for centralized redundant sensor networks. The algorithms are compared with respect to their ability to locate and evaluate an event in the presence of noise and sensor failures for various node geometries and densities.
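One simple local event detection scheme for redundant sensor networks of the kind compared above is quorum voting over a neighborhood. This is a generic sketch, not one of the paper's specific algorithms: the threshold, the quorum fraction, and the use of `None` to mark a failed sensor are all assumptions.

```python
def detect_event(readings, threshold, quorum=0.5):
    """Flag a local event when at least a quorum of the live redundant
    sensors in a neighborhood exceed the threshold.  Failed sensors are
    reported as None, so isolated sensor failures do not mask or fake an
    event."""
    votes = sum(1 for r in readings if r is not None and r > threshold)
    alive = sum(1 for r in readings if r is not None)
    return alive > 0 and votes / alive >= quorum
```

Denser node geometries give more redundant readings per neighborhood, which is what makes the vote robust to noise and failures.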
Intelligent cognitive radio jamming - a game-theoretical approach
NASA Astrophysics Data System (ADS)
Dabcevic, Kresimir; Betancourt, Alejandro; Marcenaro, Lucio; Regazzoni, Carlo S.
2014-12-01
Cognitive radio (CR) promises to be a solution for the spectrum underutilization problems. However, security issues pertaining to cognitive radio technology are still an understudied topic. One of the prevailing such issues is intelligent radio frequency (RF) jamming, where adversaries are able to exploit on-the-fly reconfigurability potentials and learning mechanisms of cognitive radios in order to devise and deploy advanced jamming tactics. In this paper, we use a game-theoretical approach to analyze jamming/anti-jamming behavior between cognitive radio systems. A non-zero-sum game with incomplete information on an opponent's strategy and payoff is modelled as an extension of a Markov decision process (MDP). Learning algorithms based on adaptive payoff play and fictitious play are considered. A combination of frequency hopping and power alteration is deployed as an anti-jamming scheme. A real-life software-defined radio (SDR) platform is used in order to perform measurements useful for quantifying the jamming impacts, as well as to infer relevant hardware-related properties. Results of these measurements are then used as parameters for the modelled jamming/anti-jamming game and are compared to the Nash equilibrium of the game. Simulation results indicate, among other things, the benefit provided to the jammer when it is employed with the spectrum sensing algorithm in proactive frequency hopping and power alteration schemes.
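Fictitious play, one of the learning algorithms named above, can be sketched for the simplest case. Note the paper models a non-zero-sum game with incomplete information; the sketch below is the textbook zero-sum variant, with the transmitter/jammer interpretation and all payoffs invented for illustration.

```python
def fictitious_play(payoff, rounds=3000):
    """Fictitious play for a two-player zero-sum game: payoff[i][j] is the
    row player's (e.g. transmitter's) payoff, and the column player (the
    jammer) receives its negative.  Each round, both players best-respond
    to the opponent's empirical action frequencies.  Returns the two
    empirical mixed strategies."""
    n, m = len(payoff), len(payoff[0])
    row_counts, col_counts = [0] * n, [0] * m
    row_counts[0] = col_counts[0] = 1          # arbitrary initial beliefs
    for _ in range(rounds):
        i = max(range(n), key=lambda a: sum(payoff[a][b] * col_counts[b]
                                            for b in range(m)))
        j = min(range(m), key=lambda b: sum(payoff[a][b] * row_counts[a]
                                            for a in range(n)))
        row_counts[i] += 1
        col_counts[j] += 1
    tr, tc = sum(row_counts), sum(col_counts)
    return [c / tr for c in row_counts], [c / tc for c in col_counts]
```

On a matching-pennies channel game (jammer wins by guessing the transmitter's channel), the empirical frequencies approach the mixed Nash equilibrium of uniform channel hopping.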
Lin, Yunyue; Wu, Qishi; Cai, Xiaoshan; ...
2010-01-01
Data transmission from sensor nodes to a base station or a sink node often incurs significant energy consumption, which critically affects network lifetime. We generalize and solve the problem of deploying multiple base stations to maximize network lifetime in terms of two different metrics under one-hop and multihop communication models. In the one-hop communication model, the sensors far away from base stations always deplete their energy much faster than others. We propose an optimal solution and a heuristic approach based on the minimal enclosing circle algorithm to deploy a base station at the geometric center of each cluster. In the multihop communication model, both base station location and data routing mechanism need to be considered in maximizing network lifetime. We propose an iterative algorithm based on rigorous mathematical derivations and use linear programming to compute the optimal routing paths for data transmission. Simulation results show the distinguished performance of the proposed deployment algorithms in maximizing network lifetime.
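The minimal enclosing circle that the one-hop placement above relies on can be computed exactly by checking the circles defined by every pair and triple of points. This naive O(n^3) sketch is for illustration only; the authors presumably use an efficient variant (Welzl's algorithm runs in expected linear time).

```python
def min_enclosing_circle(pts):
    """Smallest circle covering all 2-D points; placing a base station at
    its center minimizes the maximum transmission distance in a cluster.
    Returns (cx, cy, r).  Naive O(n^3) search over candidate circles."""
    def covers(c, r):
        return all((x - c[0]) ** 2 + (y - c[1]) ** 2 <= r * r + 1e-9
                   for x, y in pts)

    def circum(a, b, c):
        # Circumcenter of three points, or None if they are collinear.
        (ax, ay), (bx, by), (cx, cy) = a, b, c
        d = 2 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
        if abs(d) < 1e-12:
            return None
        ux = ((ax * ax + ay * ay) * (by - cy) + (bx * bx + by * by) * (cy - ay)
              + (cx * cx + cy * cy) * (ay - by)) / d
        uy = ((ax * ax + ay * ay) * (cx - bx) + (bx * bx + by * by) * (ax - cx)
              + (cx * cx + cy * cy) * (bx - ax)) / d
        return (ux, uy)

    best, n = None, len(pts)
    for i in range(n):
        for j in range(i + 1, n):
            # Circle with the pair as diameter.
            c = ((pts[i][0] + pts[j][0]) / 2, (pts[i][1] + pts[j][1]) / 2)
            r = ((pts[i][0] - c[0]) ** 2 + (pts[i][1] - c[1]) ** 2) ** 0.5
            if covers(c, r) and (best is None or r < best[2]):
                best = (c[0], c[1], r)
            for k in range(j + 1, n):
                cc = circum(pts[i], pts[j], pts[k])
                if cc:
                    r = ((pts[i][0] - cc[0]) ** 2
                         + (pts[i][1] - cc[1]) ** 2) ** 0.5
                    if covers(cc, r) and (best is None or r < best[2]):
                        best = (cc[0], cc[1], r)
    return best
```

The optimal circle is always determined by two or three of the input points, which is why this enumeration is exact.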
47 CFR 51.233 - Significant degradation of services caused by deployment of advanced services.
Code of Federal Regulations, 2010 CFR
2010-10-01
... deployment of advanced services. 51.233 Section 51.233 Telecommunication FEDERAL COMMUNICATIONS COMMISSION... relevant state commission that a particular technology deployment is causing the significant degradation..., the relevant state commission, must be supported with specific and verifiable information. (d) Where a...
Fault-Tolerant Algorithms for Connectivity Restoration in Wireless Sensor Networks.
Zeng, Yali; Xu, Li; Chen, Zhide
2015-12-22
As a wireless sensor network (WSN) is often deployed in a hostile environment, nodes in the network are prone to large-scale failures that stop the network from working normally. In this case, an effective restoration scheme is needed to restore the faulty network in a timely manner. Most existing restoration schemes consider only the number of deployed nodes or fault tolerance alone, failing to take into account the fact that network coverage and topology quality are also important to a network. To address this issue, we present two algorithms named Full 2-Connectivity Restoration Algorithm (F2CRA) and Partial 3-Connectivity Restoration Algorithm (P3CRA), which restore a faulty WSN in different aspects. F2CRA constructs a fan-shaped topology structure to reduce the number of deployed nodes, while P3CRA constructs a dual-ring topology structure to improve the fault tolerance of the network. F2CRA is suitable when the restoration cost is given priority, and P3CRA is suitable when the network quality is considered first. Compared with other algorithms, these two algorithms ensure that the network has stronger fault tolerance, a larger coverage area and better balanced load after the restoration.
Constrained Low-Interference Relay Node Deployment for Underwater Acoustic Wireless Sensor Networks
NASA Astrophysics Data System (ADS)
Li, Deying; Li, Zheng; Ma, Wenkai; Chen, Wenping
An Underwater Acoustic Wireless Sensor Network (UA-WSN) consists of many resource-constrained Underwater Sensor Nodes (USNs), which are deployed to perform collaborative monitoring tasks over a given region. One way to preserve network connectivity while guaranteeing other network QoS is to deploy some Relay Nodes (RNs) in the network, where RNs are more capable, but also more expensive, than USNs. This paper addresses the Constrained Low-interference Relay Node Deployment (C-LRND) problem for 3-D UA-WSNs, in which RNs are placed at a subset of candidate locations to ensure connectivity between the USNs, under constraints on both the number of RNs deployed and the total incremental interference. We first prove that the problem is NP-hard, then present a general approximation algorithm framework and derive two polynomial-time O(1)-approximation algorithms.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aziz, H. M. Abdul; Wang, Hong; Young, Stan
Documenting existing state of practice is an initial step in developing future control infrastructure to be co-deployed for a heterogeneous mix of connected and automated vehicles with human drivers while leveraging benefits to safety, congestion, and energy. With advances in information technology and extensive deployment of connected and automated vehicle technology anticipated over the coming decades, cities globally are making efforts to plan and prepare for these transitions. CAVs offer opportunities to improve transportation systems through enhanced safety and efficient operation of vehicles. There are also significant needs in terms of exploring how best to leverage vehicle-to-vehicle (V2V), vehicle-to-infrastructure (V2I) and vehicle-to-everything (V2X) technology. Both the Connected Vehicle (CV) and Connected and Automated Vehicle (CAV) paradigms feature bi-directional connectivity and share similar applications in terms of signal control algorithms and infrastructure implementation. The discussion in our synthesis study assumes the CAV/CV context, where connectivity exists with or without automated vehicles. Our synthesis study explores the current state of signal control algorithms and infrastructure, reports the completed and newly proposed CV/CAV deployment studies regarding signal control schemes, reviews the deployment costs for CAV/AV signal infrastructure, and concludes with a discussion on opportunities, such as detector-free signal control schemes and dynamic performance management for intersections, and challenges, such as dependency on market adaptation and the need to build a fault-tolerant signal system deployment in a CAV/CV environment. The study will serve as an initial critical assessment of existing signal control infrastructure (devices, control instruments, and firmware) and control schemes (actuated, adaptive, and coordinated-green wave).
Also, the report will help to identify the future needs for the signal infrastructure to act as the nervous system for urban transportation networks, providing not only signaling, but also observability, surveillance, and measurement capacity. The discussion of the opportunities space includes network optimization and control theory perspectives, and the current states of observability for key system parameters (what can be detected, how frequently can it be reported) as well as controllability of dynamic parameters (this includes adjusting not only the signal phase and timing, but also the ability to alter vehicle trajectories through information or direct control). The perspective of observability and controllability of the dynamic systems provides an appropriate lens to discuss future directions as CAV/CV become more prevalent in the future.
Node Deployment Algorithm Based on Connected Tree for Underwater Sensor Networks
Jiang, Peng; Wang, Xingmin; Jiang, Lurong
2015-01-01
Designing an efficient deployment method to guarantee optimal monitoring quality is one of the key topics in underwater sensor networks. At present, a realistic approach to deployment involves adjusting the depths of nodes in water. One of the typical algorithms used in such a process is the self-deployment depth adjustment algorithm (SDDA). This algorithm mainly focuses on maximizing network coverage by constantly adjusting node depths to reduce coverage overlaps between two neighboring nodes, and thus achieves good performance. However, the connectivity performance of SDDA is not assured. In this paper, we propose a depth adjustment algorithm based on a connected tree (CTDA). In CTDA, the sink node is used as the first root node to start building a connected tree, and the network is ultimately organized as a forest to maintain network connectivity. Coverage overlaps between the parent node and the child node are then reduced within each sub-tree to optimize coverage. A hierarchical strategy is used to adjust the distance between the parent node and the child node to reduce node movement. Furthermore, a silent mode is adopted to reduce communication cost. Simulations show that, compared with SDDA, CTDA can achieve high connectivity with various communication ranges and different numbers of nodes. Moreover, it can realize coverage as high as that of SDDA with various sensing ranges and numbers of nodes but with less energy consumption. Simulations under sparse environments show that the connectivity and energy consumption performances of CTDA are considerably better than those of SDDA. Meanwhile, the connectivity and coverage performances of CTDA are close to those of the depth adjustment algorithm based on a connected dominating set (CDA), which is similar to CTDA. However, the energy consumption of CTDA is less than that of CDA, particularly in sparse underwater environments. PMID:26184209
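The connected-tree construction that CTDA starts from (sink as first root, then growing the tree over reachable nodes) can be sketched as a breadth-first search. This is a simplified illustration under assumed inputs, not the paper's algorithm: nodes are reduced to 2-D positions and a single fixed communication range, and the depth adjustment and silent-mode steps are omitted.

```python
from collections import deque

def connected_tree(sink, nodes, comm_r):
    """Grow a tree rooted at the sink by breadth-first search over nodes
    within communication range; returns a parent map.  Nodes left out of
    the map are unreachable and would seed the roots of further sub-trees,
    so the whole network forms a forest."""
    def near(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5 <= comm_r

    parent, frontier = {sink: None}, deque([sink])
    while frontier:
        u = frontier.popleft()
        for v in nodes:
            if v not in parent and near(u, v):
                parent[v] = u
                frontier.append(v)
    return parent
```

In CTDA proper, coverage overlap between each parent-child pair would then be reduced by adjusting child depths while keeping every link within range.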
Optimal Control of a Surge-Mode WEC in Random Waves
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chertok, Allan; Ceberio, Olivier; Staby, Bill
2016-08-30
The objective of this project was to develop one or more real-time feedback and feed-forward (MPC) control algorithms for an Oscillating Surge Wave Converter (OSWC) developed by RME called SurgeWEC™ that leverages recent innovations in wave energy converter (WEC) control theory to maximize power production in random wave environments. The control algorithms synthesized innovations in dynamic programming and nonlinear wave dynamics using anticipatory wave sensors and localized sensor measurements; e.g. position and velocity of the WEC Power Take Off (PTO), with predictive wave forecasting data. The result was an advanced control system that uses feedback or feed-forward data from an array of sensor channels comprised of both localized and deployed sensors fused into a single decision process that optimally compensates for uncertainties in the system dynamics, wave forecasts, and sensor measurement errors.
Performance characterization of image and video analysis systems at Siemens Corporate Research
NASA Astrophysics Data System (ADS)
Ramesh, Visvanathan; Jolly, Marie-Pierre; Greiffenhagen, Michael
2000-06-01
There has been a significant increase in commercial products using imaging analysis techniques to solve real-world problems in diverse fields such as manufacturing, medical imaging, document analysis, transportation and public security, etc. This has been accelerated by various factors: more advanced algorithms, the availability of cheaper sensors, and faster processors. While algorithms continue to improve in performance, a major stumbling block in translating improvements in algorithms to faster deployment of image analysis systems is the lack of characterization of the limits of algorithms and how they affect total system performance. The research community has realized the need for performance analysis and there have been significant efforts in the last few years to remedy the situation. Our efforts at SCR have been on statistical modeling and characterization of modules and systems. The emphasis is on both white-box and black-box methodologies to evaluate and optimize vision systems. In the first part of this paper we review the literature on performance characterization and then provide an overview of the status of research in performance characterization of image and video understanding systems. The second part of the paper is on performance evaluation of medical image segmentation algorithms. Finally, we highlight some research issues in performance analysis in medical imaging systems.
Context-Aided Sensor Fusion for Enhanced Urban Navigation
Martí, Enrique David; Martín, David; García, Jesús; de la Escalera, Arturo; Molina, José Manuel; Armingol, José María
2012-01-01
The deployment of Intelligent Vehicles in urban environments requires reliable estimation of positioning for urban navigation. The inherent complexity of this kind of environments fosters the development of novel systems which should provide reliable and precise solutions to the vehicle. This article details an advanced GNSS/IMU fusion system based on a context-aided Unscented Kalman filter for navigation in urban conditions. The constrained non-linear filter is here conditioned by a contextual knowledge module which reasons about sensor quality and driving context in order to adapt it to the situation, while at the same time it carries out a continuous estimation and correction of INS drift errors. An exhaustive analysis has been carried out with available data in order to characterize the behavior of available sensors and take it into account in the developed solution. The performance is then analyzed with an extensive dataset containing representative situations. The proposed solution suits the use of fusion algorithms for deploying Intelligent Transport Systems in urban environments. PMID:23223080
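The effect of the contextual knowledge module above, adapting how much the filter trusts a measurement given the driving context, can be illustrated on the simplest possible filter. This is not the paper's constrained Unscented Kalman filter: the sketch below is a scalar linear Kalman update with an invented `context_scale` factor inflating the measurement variance.

```python
def kf_update(x, p, z, r_base, context_scale, q=0.01):
    """One predict/update step of a scalar Kalman filter where the
    measurement variance is inflated by a context factor (e.g. poor GNSS
    visibility in an urban canyon).  x, p: state estimate and variance;
    z: measurement; r_base: nominal measurement variance."""
    p = p + q                        # predict (constant-position model)
    r = r_base * context_scale       # context-aided measurement trust
    k = p / (p + r)                  # Kalman gain
    x = x + k * (z - x)
    p = (1 - k) * p
    return x, p
```

With a large context scale the gain collapses and the filter coasts on the inertial prediction, which is the qualitative behavior the contextual module induces when GNSS quality degrades.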
Underwater Sensor Network Redeployment Algorithm Based on Wolf Search
Jiang, Peng; Feng, Yang; Wu, Feng
2016-01-01
This study addresses the optimization of node redeployment coverage in underwater wireless sensor networks. Because nodes can easily fail in harsh underwater environments and these networks operate at large scale, an underwater sensor network redeployment algorithm was developed based on wolf search. The approach applies the wolf search algorithm, combined with crowding-degree control, to the deployment of underwater wireless sensor networks. The proposed algorithm uses nodes to ensure coverage of the events while avoiding premature convergence, and it achieves good coverage. In addition, because obstacles exist in the underwater environment, nodes are kept from becoming invalid by imitating the mechanism of avoiding predators, which also reduces the energy consumption of the network. Comparative analysis shows that the algorithm is simple and effective in wireless sensor network deployment. Compared with the optimized artificial fish swarm algorithm, the proposed algorithm exhibits advantages in network coverage, energy conservation, and obstacle avoidance. PMID:27775659
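A minimal sketch of the prey-seeking flavour of such redeployment: a node that already covers an event stays put to guard it, while an idle node stalks the nearest event left uncovered by the other nodes. The geometry, radii, and step rule are invented for illustration and are not the paper's wolf search formulation (which adds visual ranges, crowding control, and obstacle avoidance).

```python
import math

def redeploy_step(nodes, events, r=1.0, step=0.5):
    """One redeployment step: guards stay put; idle nodes move a fixed
    step toward the nearest event uncovered by the other nodes."""
    new = []
    for i, p in enumerate(nodes):
        if any(math.dist(p, e) <= r for e in events):
            new.append(p)                          # already covering: stay
            continue
        others = nodes[:i] + nodes[i + 1:]
        free = [e for e in events
                if all(math.dist(o, e) > r for o in others)]
        if free:
            prey = min(free, key=lambda e: math.dist(p, e))
            d = math.dist(p, prey)
            p = tuple(a + step * (b - a) / d for a, b in zip(p, prey))
        new.append(p)
    return new

events = [(0.0, 0.0), (9.0, 9.0)]          # targets to monitor
nodes = [(5.0, 5.0), (6.0, 6.0)]           # initial (random-ish) drop
for _ in range(40):
    nodes = redeploy_step(nodes, events)
covered = all(any(math.dist(n, e) <= 1.0 for n in nodes) for e in events)
```

After the loop both events are covered: one node claims the near event, freeing the other to cross to the far one, which is the qualitative behaviour the coverage-optimizing redeployment is after.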
Bio-Optical Measurement and Modeling of the California Current and Polar Oceans. Chapter 13
NASA Technical Reports Server (NTRS)
Mitchell, B. Greg
2001-01-01
This Sensor Intercomparison and Merger for Biological and Interdisciplinary Oceanic Studies (SIMBIOS) project contract supports in situ ocean optical observations in the California Current, Southern Ocean, and Indian Ocean, as well as the merger of other in situ data sets we have collected on various global cruises supported by separate grants or contracts. The principal goals of our research are to validate standard or experimental products through detailed bio-optical and biogeochemical measurements, and to combine ocean optical observations with advanced radiative transfer modeling to contribute to satellite vicarious radiometric calibration and advanced algorithm development. In collaboration with major oceanographic ship-based observation programs funded by various agencies (CalCOFI, US JGOFS, NOAA AMLR, INDOEX and Japan/East Sea) our SIMBIOS effort has resulted in data from diverse bio-optical provinces. For these global deployments we generate a high-quality, methodologically consistent data set encompassing a wide range of oceanic conditions. Global data collected in recent years have been integrated with our on-going CalCOFI database and have been used to evaluate Sea-Viewing Wide Field-of-view Sensor (SeaWiFS) algorithms and to carry out validation studies. The combined database we have assembled now comprises more than 700 stations and includes observations for the clearest oligotrophic waters, highly eutrophic blooms, red tides and coastal Case 2 conditions. The data have been used to validate water-leaving radiance estimated with SeaWiFS as well as bio-optical algorithms for chlorophyll pigments. The comprehensive data set is utilized for development of experimental algorithms (e.g., high-low latitude pigment transition, phytoplankton absorption, and CDOM).
Dong, Yu-Shuang; Xu, Gao-Chao; Fu, Xiao-Dong
2014-01-01
The cloud platform provides various services to users. More and more cloud centers provide infrastructure as the main way of operating. To improve the utilization rate of the cloud center and to decrease the operating cost, the cloud center provides services according to requirements of users by sharding the resources with virtualization. Considering both QoS for users and cost saving for cloud computing providers, we try to maximize performance and minimize energy cost as well. In this paper, we propose a distributed parallel genetic algorithm (DPGA) of placement strategy for virtual machines deployment on cloud platform. It executes the genetic algorithm parallelly and distributedly on several selected physical hosts in the first stage. Then it continues to execute the genetic algorithm of the second stage with solutions obtained from the first stage as the initial population. The solution calculated by the genetic algorithm of the second stage is the optimal one of the proposed approach. The experimental results show that the proposed placement strategy of VM deployment can ensure QoS for users and it is more effective and more energy efficient than other placement strategies on the cloud platform. PMID:25097872
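The two-stage structure of the DPGA can be sketched compactly: several independent GA runs (stage one, as if distributed across selected physical hosts) feed their survivors into the initial population of a final GA (stage two). The VM demands, host capacity, fitness weighting, and GA parameters below are invented for illustration; only the two-stage seeding pattern follows the abstract.

```python
import random

random.seed(7)

VM_DEMAND = [2, 3, 4, 2, 3, 4, 1, 1]     # CPU units per VM (invented)
HOST_CAP = 8                              # identical physical hosts
N_HOSTS = len(VM_DEMAND)                  # worst case: one VM per host

def fitness(assign):
    """Lower is better: active hosts (energy proxy) plus a heavy
    penalty for any host whose capacity is exceeded (QoS proxy)."""
    load = [0] * N_HOSTS
    for vm, host in enumerate(assign):
        load[host] += VM_DEMAND[vm]
    active = sum(1 for l in load if l > 0)
    overload = sum(max(0, l - HOST_CAP) for l in load)
    return active + 100 * overload

def ga(pop, gens=60, mut=0.2):
    """Plain generational GA with elitism over assignment vectors."""
    for _ in range(gens):
        pop.sort(key=fitness)
        nxt = pop[:2]                               # elitism
        while len(nxt) < len(pop):
            a, b = random.sample(pop[:10], 2)       # pick parents from the best
            cut = random.randrange(1, len(VM_DEMAND))
            child = a[:cut] + b[cut:]               # one-point crossover
            if random.random() < mut:               # mutate: move one VM
                child[random.randrange(len(child))] = random.randrange(N_HOSTS)
            nxt.append(child)
        pop = nxt
    return sorted(pop, key=fitness)

def random_pop(n):
    return [[random.randrange(N_HOSTS) for _ in VM_DEMAND] for _ in range(n)]

# Stage 1: independent GAs, as if run in parallel on selected hosts
survivors = [c for _ in range(3) for c in ga(random_pop(20))[:5]]
# Stage 2: survivors seed the initial population of the final GA
best = ga(survivors + random_pop(5))[0]
```

With total demand 20 and capacity 8, at least three hosts must stay active; the two-stage run reliably finds a feasible packing near that bound, consolidating the rest for energy savings.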
NASA Astrophysics Data System (ADS)
Chalmers, Alex
2007-10-01
A simple model is presented of a possible inspection regimen applied to each leg of a cargo container's journey between its point of origin and destination. Several candidate modalities are proposed to be used at multiple remote locations to act as a pre-screen inspection as the target approaches a perimeter and as the primary inspection modality at the portal. Information from multiple data sets is fused to optimize the costs and performance of a network of such inspection systems. A series of image processing algorithms is presented that automatically process X-ray images of containerized cargo. The goal of this processing is to locate the container in a real-time stream of traffic traversing a portal without impeding the flow of commerce. Such processing may facilitate the inclusion of unmanned/unattended inspection systems in such a network. Several samples of the processing applied to data collected from deployed systems are included. Simulated data from a notional cargo inspection system with multiple sensor modalities and advanced data fusion algorithms are also included to show the potential increased detection and throughput performance of such a configuration.
DOT National Transportation Integrated Search
2016-07-01
The Tampa Hillsborough Expressway Authority (THEA) Connected Vehicle (CV) Pilot Deployment Program is part of a national effort to advance CV technologies by deploying, demonstrating, testing and offering lessons learned for future deployers. The THE...
An earth imaging camera simulation using wide-scale construction of reflectance surfaces
NASA Astrophysics Data System (ADS)
Murthy, Kiran; Chau, Alexandra H.; Amin, Minesh B.; Robinson, M. Dirk
2013-10-01
Developing and testing advanced ground-based image processing systems for earth-observing remote sensing applications presents a unique challenge that requires advanced imagery simulation capabilities. This paper presents an earth-imaging multispectral framing camera simulation system called PayloadSim (PaySim) capable of generating terabytes of photorealistic simulated imagery. PaySim leverages previous work in 3-D scene-based image simulation, adding a novel method for automatically and efficiently constructing 3-D reflectance scenes by draping tiled orthorectified imagery over a geo-registered Digital Elevation Map (DEM). PaySim's modeling chain is presented in detail, with emphasis given to the techniques used to achieve computational efficiency. These techniques as well as cluster deployment of the simulator have enabled tuning and robust testing of image processing algorithms, and production of realistic sample data for customer-driven image product development. Examples of simulated imagery of Skybox's first imaging satellite are shown.
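The core geometric step of draping an orthorectified texture over a DEM is sampling terrain height at arbitrary ground coordinates, most simply by bilinear interpolation. This sketch illustrates only that resampling idea; PaySim's actual scene-construction and radiometric chain are, per the paper, far larger, and the grid values here are invented.

```python
def dem_height(dem, x, y):
    """Bilinear interpolation on a row-major height grid (unit cell
    size; x, y must lie strictly inside the grid)."""
    x0, y0 = int(x), int(y)
    fx, fy = x - x0, y - y0
    h00, h10 = dem[y0][x0], dem[y0][x0 + 1]
    h01, h11 = dem[y0 + 1][x0], dem[y0 + 1][x0 + 1]
    top = h00 * (1 - fx) + h10 * fx
    bot = h01 * (1 - fx) + h11 * fx
    return top * (1 - fy) + bot * fy

# A tiny 2x2 DEM tile (heights in metres, invented)
dem = [[0.0, 10.0],
       [20.0, 30.0]]
assert dem_height(dem, 0.5, 0.5) == 15.0   # centre of the cell
```

Every simulated camera ray that intersects the terrain needs such a lookup, which is why tiling and efficient interpolation dominate the computational cost of scene-based simulators.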
Bio-Optical Measurement and Modeling of the California Current and Polar Oceans
NASA Technical Reports Server (NTRS)
Mitchell, B. Greg; Fargion, Giulietta S. (Technical Monitor)
2001-01-01
The principal goals of our research are to validate standard or experimental products through detailed bio-optical and biogeochemical measurements, and to combine ocean optical observations with advanced radiative transfer modeling to contribute to satellite vicarious radiometric calibration and advanced algorithm development. Achieving our goals requires continued efforts to execute complex field programs globally, as well as development of advanced ocean optical measurement protocols. We completed a comprehensive set of ocean optical observations in the California Current, Southern Ocean, and Indian Ocean, requiring a large commitment to instrument calibration, measurement protocols, data processing and data merger. We augmented separately funded projects of our own, as well as others, to acquire in situ data sets we have collected on various global cruises supported by separate grants or contracts. In collaboration with major oceanographic ship-based observation programs funded by various agencies (CalCOFI, US JGOFS, NOAA AMLR, INDOEX and Japan/East Sea) our SIMBIOS effort has resulted in data from diverse bio-optical provinces. For these global deployments we generate a high-quality, methodologically consistent data set encompassing a wide range of oceanic conditions. Global data collected in recent years have been integrated with our on-going CalCOFI database and have been used to evaluate SeaWiFS algorithms and to carry out validation studies. The combined database we have assembled now comprises more than 700 stations and includes observations for the clearest oligotrophic waters, highly eutrophic blooms, red tides and coastal Case 2 conditions. The data have been used to validate water-leaving radiance estimated with SeaWiFS as well as bio-optical algorithms for chlorophyll pigments. The comprehensive data set is utilized for development of experimental algorithms (e.g., high-low latitude pigment transition, phytoplankton absorption, and CDOM).
During this period we completed 9 peer-reviewed publications in high quality journals, and presented aspects of our work at more than 10 scientific conferences.
A revolute joint with linear load-displacement response for a deployable lidar telescope
NASA Technical Reports Server (NTRS)
Lake, Mark S.; Warren, Peter A.; Peterson, Lee D.
1996-01-01
NASA Langley Research Center is developing concepts for an advanced spacecraft, called LidarTechSat, to demonstrate key structures and mechanisms technologies necessary to deploy a segmented telescope reflector. Achieving micron-accuracy deployment requires significant advancements in deployment mechanism design, such as the revolute joint presented herein. The joint exhibits load-cycling response that is essentially linear with less than 2% hysteresis, and the joint rotates with less than 7 mN-m (1 in-oz) of resistance. A prototype reflector metering truss incorporating the joint exhibits only a few microns of kinematic error under repeated deployment and impulse loading. No other mechanically deployed structure found in the literature has been demonstrated to be this kinematically accurate.
A flexible architecture for advanced process control solutions
NASA Astrophysics Data System (ADS)
Faron, Kamyar; Iourovitski, Ilia
2005-05-01
Advanced Process Control (APC) is now mainstream practice in the semiconductor manufacturing industry. Over the past decade and a half APC has evolved from a "good idea" and "wouldn't it be great" concept to mandatory manufacturing practice. APC developments have primarily dealt with two major thrusts, algorithms and infrastructure, and often the line between them has been blurred. The algorithms have evolved from very simple single-variable solutions to sophisticated and cutting-edge adaptive multivariable (input and output) solutions. Spending patterns in recent times have demanded that the economics of a comprehensive APC infrastructure be completely justified for any and all cost-conscious manufacturers. There are studies suggesting integration costs as high as 60% of the total APC solution costs. Such cost-prohibitive figures clearly diminish the return on APC investments. This has limited the acceptance and development of pure APC infrastructure solutions for many fabs. Modern APC solution architectures must satisfy a wide array of requirements, from very manual R&D environments to very advanced and automated "lights out" manufacturing facilities. A majority of commercially available control solutions, and most in-house developed solutions, lack important attributes of scalability, flexibility, and adaptability, and hence require significant resources for integration, deployment, and maintenance. Many APC improvement efforts have been abandoned or delayed due to legacy systems and inadequate architectural design. Recent advancements (Service Oriented Architectures) in the software industry have delivered ideal technologies for delivering scalable, flexible, and reliable solutions that can seamlessly integrate into any fab's existing systems and business practices. In this publication we evaluate the various attributes of the architectures required by fabs and illustrate the benefits of a Service Oriented Architecture in satisfying these requirements.
Blue Control Technologies has developed an advanced service-oriented architecture Run-to-Run Control System that addresses these requirements.
Self-deployable mobile sensor networks for on-demand surveillance
NASA Astrophysics Data System (ADS)
Miao, Lidan; Qi, Hairong; Wang, Feiyi
2005-05-01
This paper studies two interconnected problems in mobile sensor network deployment, the optimal placement of heterogeneous mobile sensor platforms for cost-efficient and reliable coverage purposes, and the self-organizable deployment. We first develop an optimal placement algorithm based on a "mosaicked technology" such that different types of mobile sensors form a mosaicked pattern uniquely determined by the popularity of different types of sensor nodes. The initial state is assumed to be random. In order to converge to the optimal state, we investigate the swarm intelligence (SI)-based sensor movement strategy, through which the randomly deployed sensors can self-organize themselves to reach the optimal placement state. The proposed algorithm is compared with the random movement and the centralized method using performance metrics such as network coverage, convergence time, and energy consumption. Simulation results are presented to demonstrate the effectiveness of the mosaic placement and the SI-based movement.
NASA Technical Reports Server (NTRS)
Gorospe, George E., Jr.; Daigle, Matthew J.; Sankararaman, Shankar; Kulkarni, Chetan S.; Ng, Eley
2017-01-01
Prognostic methods enable operators and maintainers to predict the future performance for critical systems. However, these methods can be computationally expensive and may need to be performed each time new information about the system becomes available. In light of these computational requirements, we have investigated the application of graphics processing units (GPUs) as a computational platform for real-time prognostics. Recent advances in GPU technology have reduced cost and increased the computational capability of these highly parallel processing units, making them more attractive for the deployment of prognostic software. We present a survey of model-based prognostic algorithms with considerations for leveraging the parallel architecture of the GPU and a case study of GPU-accelerated battery prognostics with computational performance results.
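What makes Monte Carlo prognostics attractive for GPUs is that the samples are independent, so each trajectory can run in its own thread. The sketch below uses a toy linear battery-fade model in plain Python to stand in for the per-thread kernel; the capacity numbers, fade statistics, and end-of-life threshold are invented for illustration, not taken from the case study.

```python
import random

random.seed(0)

C0, EOL = 2.0, 1.4                      # initial capacity, end-of-life (Ah)
FADE_MEAN, FADE_SD = 0.002, 0.0005      # capacity loss per cycle (invented)

def rul_sample():
    """One Monte Carlo trajectory: cycles until capacity drops below
    the end-of-life threshold, under a sampled fade rate."""
    fade = max(1e-4, random.gauss(FADE_MEAN, FADE_SD))
    c, cycles = C0, 0
    while c >= EOL:
        c -= fade
        cycles += 1
    return cycles

# Embarrassingly parallel: on a GPU each sample would be one thread.
samples = sorted(rul_sample() for _ in range(2000))
median_rul = samples[len(samples) // 2]   # remaining-useful-life estimate
```

The median lands near the analytic value 0.6/0.002 = 300 cycles; the spread of `samples` is the uncertainty that prognostic methods must propagate, and recomputing it whenever new data arrives is what motivates the GPU acceleration.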
Stent deployment protocol for optimized real-time visualization during endovascular neurosurgery.
Silva, Michael A; See, Alfred P; Dasenbrock, Hormuzdiyar H; Ashour, Ramsey; Khandelwal, Priyank; Patel, Nirav J; Frerichs, Kai U; Aziz-Sultan, Mohammad A
2017-05-01
Successful application of endovascular neurosurgery depends on high-quality imaging to define the pathology and the devices as they are being deployed. This is especially challenging in the treatment of complex cases, particularly in proximity to the skull base or in patients who have undergone prior endovascular treatment. The authors sought to optimize real-time image guidance using a simple algorithm that can be applied to any existing fluoroscopy system. Exposure management (exposure level, pulse management) and image post-processing parameters (edge enhancement) were modified from traditional fluoroscopy to improve visualization of device position and material density during deployment. Examples include the deployment of coils in small aneurysms, coils in giant aneurysms, the Pipeline embolization device (PED), the Woven EndoBridge (WEB) device, and carotid artery stents. The authors report on the development of the protocol and their experience using representative cases. The stent deployment protocol is an image capture and post-processing algorithm that can be applied to existing fluoroscopy systems to improve real-time visualization of device deployment without hardware modifications. Improved image guidance facilitates aneurysm coil packing and proper positioning and deployment of carotid artery stents, flow diverters, and the WEB device, especially in the context of complex anatomy and an obscured field of view.
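Edge enhancement of the kind such a protocol tunes is commonly implemented as unsharp masking: subtract a blurred copy and add back a scaled difference. The sketch below shows this on a tiny synthetic frame; the authors' actual fluoroscopy post-processing parameters are not given in the abstract, so the kernel and amount here are illustrative assumptions.

```python
def box_blur(img):
    """3x3 box blur with edge-clipped neighbourhoods."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [img[y + dy][x + dx]
                    for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                    if 0 <= y + dy < h and 0 <= x + dx < w]
            out[y][x] = sum(vals) / len(vals)
    return out

def unsharp(img, amount=1.5):
    """Unsharp masking: original + amount * (original - blurred)."""
    blur = box_blur(img)
    return [[img[y][x] + amount * (img[y][x] - blur[y][x])
             for x in range(len(img[0]))] for y in range(len(img))]

# A faint vertical edge, e.g. a low-contrast device strut against tissue
img = [[10.0, 10.0, 12.0, 12.0] for _ in range(4)]
sharp = unsharp(img)
orig_contrast = sharp_contrast = None
orig_contrast = img[1][2] - img[1][1]      # 2.0 across the edge
new_contrast = sharp[1][2] - sharp[1][1]   # grows after enhancement
assert new_contrast > orig_contrast
```

Raising local contrast this way is what makes a radiolucent device boundary visible in real time; the trade-off, which exposure management addresses, is that it amplifies noise as well.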
Analytical investigation of the dynamics of tethered constellations in Earth orbit (phase 2)
NASA Technical Reports Server (NTRS)
Lorenzini, E.; Arnold, D. A.; Grossi, M. D.; Gullahorn, G. E.
1985-01-01
The deployment maneuver of three-axis vertical constellations with elastic tethers is analyzed. The deployment strategy devised previously was improved, and dampers were added to the system. Effective algorithms for damping out the fundamental vibrational modes of the system were implemented. A simulation of a complete deployment and a subsequent stationkeeping phase of a three-mass constellation is shown.
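The effect of adding a damper to an elastic tether can be shown on a single longitudinal mode: a one-degree-of-freedom mass-spring-damper, integrated with semi-implicit Euler. The parameters are invented and the model is a deliberate simplification of the paper's three-mass constellation dynamics.

```python
def simulate(k=1.0, m=1.0, c=0.0, x0=1.0, dt=0.01, steps=3000):
    """Semi-implicit Euler for m*x'' = -k*x - c*x'. Returns the peak
    |x| over the last third of the run (residual vibration)."""
    x, v = x0, 0.0
    peak = 0.0
    for i in range(steps):
        v += dt * (-k * x - c * v) / m
        x += dt * v
        if i > 2 * steps // 3:
            peak = max(peak, abs(x))
    return peak

undamped = simulate(c=0.0)   # elastic tether alone: oscillation persists
damped = simulate(c=0.5)     # with a damper: the mode decays away
assert damped < undamped
```

The undamped run keeps nearly its full amplitude while the damped run has decayed by orders of magnitude by the end, which is the qualitative payoff of the dampers added to the deployment strategy.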
NASA Technical Reports Server (NTRS)
Powell, Richard W.
1998-01-01
This paper describes the development and evaluation of a numerical roll reversal predictor-corrector guidance algorithm for the atmospheric flight portion of the Mars Surveyor Program 2001 Orbiter and Lander missions. The Lander mission utilizes direct entry and has a demanding requirement to deploy its parachute within 10 km of the target deployment point. The Orbiter mission utilizes aerocapture to achieve a precise captured orbit with a single atmospheric pass. Detailed descriptions of these predictor-corrector algorithms are given. Also, results of three and six degree-of-freedom Monte Carlo simulations which include navigation, aerodynamics, mass properties and atmospheric density uncertainties are presented.
Spagnolo, Daniel M; Al-Kofahi, Yousef; Zhu, Peihong; Lezon, Timothy R; Gough, Albert; Stern, Andrew M; Lee, Adrian V; Ginty, Fiona; Sarachan, Brion; Taylor, D Lansing; Chennubhotla, S Chakra
2017-11-01
We introduce THRIVE (Tumor Heterogeneity Research Interactive Visualization Environment), an open-source tool developed to assist cancer researchers in interactive hypothesis testing. The focus of this tool is to quantify spatial intratumoral heterogeneity (ITH), and the interactions between different cell phenotypes and noncellular constituents. Specifically, we foresee applications in phenotyping cells within tumor microenvironments, recognizing tumor boundaries, identifying degrees of immune infiltration and epithelial/stromal separation, and identification of heterotypic signaling networks underlying microdomains. The THRIVE platform provides an integrated workflow for analyzing whole-slide immunofluorescence images and tissue microarrays, including algorithms for segmentation, quantification, and heterogeneity analysis. THRIVE promotes flexible deployment, a maintainable code base using open-source libraries, and an extensible framework for customizing algorithms with ease. THRIVE was designed with highly multiplexed immunofluorescence images in mind, and, by providing a platform to efficiently analyze high-dimensional immunofluorescence signals, we hope to advance these data toward mainstream adoption in cancer research. Cancer Res; 77(21); e71-e74. ©2017 American Association for Cancer Research.
PROGRESS REPORT OF FY 2004 ACTIVITIES: IMPROVED WATER VAPOR AND CLOUD RETRIEVALS AT THE NSA/AAO
DOE Office of Scientific and Technical Information (OSTI.GOV)
E. R. Westwater; V. V. Leuskiy; M. Klein
2004-11-01
The basic goals of the research are to develop and test algorithms and deploy instruments that improve measurements of water vapor, cloud liquid, and cloud coverage, with a focus on the Arctic conditions of cold temperatures and low concentrations of water vapor. The importance of accurate measurements of column amounts of water vapor and cloud liquid has been well documented by scientists within the Atmospheric Radiation Measurement Program. Although several technologies have been investigated to measure these column amounts, microwave radiometers (MWR) have been used operationally by the ARM program for passive retrievals of these quantities: precipitable water vapor (PWV) and integrated water liquid (IWL). The technology of PWV and IWL retrievals has advanced steadily since the basic 2-channel MWR was first deployed at ARM CART sites. Important advances are the development and refinement of the tipcal calibration method [1,2], and improvement of forward model radiative transfer algorithms [3,4]. However, the concern still remains that current instruments deployed by ARM may be inadequate to measure low amounts of PWV and IWL. In the case of water vapor, this is especially important because of the possibility of scaling and/or quality control of radiosondes by the water amount. Extremely dry conditions, with PWV less than 3 mm, commonly occur in Polar Regions during the winter months. Accurate measurements of the PWV during such dry conditions are needed to improve our understanding of the regional radiation energy budgets. The results of a 1999 experiment conducted at the ARM North Slope of Alaska/Adjacent Arctic Ocean (NSA/AAO) site during March of 1999 [5] have shown that the strength associated with the 183 GHz water vapor absorption line makes radiometry in this frequency regime suitable for measuring low amounts of PWV. As a portion of our research, we conducted another millimeter wave radiometric experiment at the NSA/AAO in March-April 2004.
This experiment relied heavily on our experience from the 1999 experiment. Particular attention was paid to issues of radiometric calibration and radiosonde intercomparisons. Our theoretical and experimental work also supplements efforts by industry (F. Solheim, private communication) to develop sub-millimeter radiometers for ARM deployment. In addition to quantitative improvement of water vapor measurements at cold temperatures, the impact of adding millimeter-wave window channels to improve the sensitivity to arctic clouds was studied. We also deployed an Infrared Cloud Imager (ICI) during this experiment, both to measure continuous day-night cloud-coverage statistics and to identify conditions suitable for tipcal analysis. This system provided the first capability of determining spatial cloud statistics continuously in both day and night at the NSA site and has been used to demonstrate that biases exist in inferring cloud statistics from either zenith-pointing active sensors (lidars or radars) or sky imagers that rely on scattered sunlight in daytime and star maps at night [6].
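The classic two-channel MWR retrieval mentioned above inverts two brightness temperatures (a vapour-sensitive channel near 23.8 GHz and a liquid-sensitive one near 31.4 GHz) linearly for PWV and liquid water path. The sketch below shows that 2x2 inversion; the forward-model coefficients are made up for illustration, whereas operational ARM retrievals fit them to radiosonde climatology and a radiative transfer model.

```python
def retrieve(tb23, tb31):
    """Invert two brightness temperatures (K) for PWV (cm) and LWP (mm)
    under an assumed linear forward model (coefficients invented)."""
    a0, a1, a2 = 2.0, 6.0, 50.0      # 23.8 GHz: offset, K/cm, K/mm
    b0, b1, b2 = 2.5, 3.0, 90.0      # 31.4 GHz: offset, K/cm, K/mm
    det = a1 * b2 - a2 * b1
    pwv = ((tb23 - a0) * b2 - (tb31 - b0) * a2) / det
    lwp = (a1 * (tb31 - b0) - b1 * (tb23 - a0)) / det
    return pwv, lwp

# Synthetic dry-Arctic sky: PWV = 0.3 cm, LWP = 0.05 mm
tb23 = 2.0 + 6.0 * 0.3 + 50.0 * 0.05
tb31 = 2.5 + 3.0 * 0.3 + 90.0 * 0.05
pwv, lwp = retrieve(tb23, tb31)
```

The weakness the report is addressing is visible in the structure: at very low PWV the vapour signal at 23.8 GHz is only a few kelvin above the offset, so calibration errors dominate, motivating the much stronger 183 GHz line.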
Sensing Home: A Cost-Effective Design for Smart Home via Heterogeneous Wireless Networks
Fan, Xiaohu; Huang, Hao; Qi, Shipeng; Luo, Xincheng; Zeng, Jing; Xie, Qubo; Xie, Changsheng
2015-01-01
The aging population has inspired a market for advanced real-time home health care devices, and more and more wearable devices and mobile applications have emerged in this field. However, properly collecting behavior information, accurately recognizing human activities, and deploying the whole system in a real living environment is a challenging task. In this paper, we propose a feasible wireless-based solution to deploy a data collection scheme, activity recognition model, feedback control and mobile integration via heterogeneous networks. We compared candidate algorithms and found one suitable to run on cost-efficient embedded devices. Specifically, we use the Super Set Transformation method to map the raw data into a sparse binary matrix. Furthermore, low-power front-end devices gather the living data of the inhabitant via ZigBee to reduce the burden of wiring work. Finally, we evaluated our approach and show it can achieve a theoretical time-slice accuracy of 98%. The mapping solution we propose is compatible with more wearable devices and mobile apps. PMID:26633424
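One plausible reading of mapping raw sensor events into a sparse binary matrix is one-hot coding each (sensor, state) pair per time slice, which is easy for an embedded classifier to consume. "Super Set Transformation" is the paper's name; the encoding, sensor names, and states below are our illustrative stand-in, not the paper's definition.

```python
SENSORS = ["door", "motion_kitchen", "motion_bed", "light"]   # invented
STATES = ["off", "on"]

def encode_slice(readings):
    """readings: dict sensor -> state for one time slice.
    Returns one sparse binary row of the activity matrix."""
    row = [0] * (len(SENSORS) * len(STATES))
    for sensor, state in readings.items():
        idx = SENSORS.index(sensor) * len(STATES) + STATES.index(state)
        row[idx] = 1
    return row

# Time slice: front door opened while the kitchen motion sensor fires
slice1 = encode_slice({"door": "on", "motion_kitchen": "on"})
```

Stacking such rows over consecutive slices yields the sparse binary matrix; its sparsity is what keeps both storage and the per-slice recognition step cheap on low-power hardware.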
An Advanced Sensor Network Design For Subglacial Sensing
NASA Astrophysics Data System (ADS)
Martinez, K.; Hart, J. K.; Elsaify, A.; Zou, G.; Padhy, P.; Riddoch, A.
2006-12-01
In the Glacsweb project a sensor network has been designed to take sensor measurements inside glaciers and send the data back to a web server autonomously. A wide range of experience was gained in the deployment of the earlier systems and this has been used to develop new hardware and software to better meet the needs of glaciologists using the data from the system. The system was reduced in size, new sensors (compass, light sensor) were added and the radio communications system completely changed. The new 173MHz radio system was designed with an antenna tuned to work in ice and a new network algorithm written to provide better data security. Probes can communicate data through each other (ad-hoc network) and store many months of data in a large buffer to cope with long term communications failures. New sensors include a light reflection measurement in order to provide data on the surrounding material. This paper will discuss the design decisions, the effectiveness of the final system and generic outcomes of use to sensor network designers deploying in difficult environments.
Control algorithms of SONET integrated self-healing networks
NASA Astrophysics Data System (ADS)
Hasegawa, Satoshi; Okaoue, Yasuyo; Egawa, Takashi; Sakauchi, Hideki
1994-01-01
As the deployment of high-speed fiber transmission systems has accelerated, these systems have become widely recognized as a firm infrastructure of the information society. Under this circumstance, the importance of network survivability has been increasing rapidly. In SONET, self-healing networks have been highlighted as one of the most advanced mechanisms for realizing survivable SONET networks. Several schemes have been proposed and studied actively, owing to rapid progress in the development of highly intelligent network elements (NEs). Among them, this paper discusses a DCS-based distributed self-healing network from the viewpoint of its control algorithms. Specifically, our self-healing algorithm called TRANS is explained in detail; it provides fast and flexible restoration, with line- and path-level restoration applied to individual STS-1 channels, and can handle multiple and even node failures. Both software simulation and hardware experiments verify that TRANS works properly in a real distributed environment, and the results are shown in the paper. In addition, the combined use of TRANS and ring restoration control is proposed, taking into account use in a practical SONET.
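The kernel of line-level restoration is finding an alternate route over spare capacity when a working span fails. The sketch below does this with a breadth-first search on a toy mesh, in the spirit of distributed flooding-based restoration; TRANS's actual sender/chooser message protocol and capacity reservation are richer, and the topology here is invented.

```python
from collections import deque

# link -> spare STS-1 capacity on a small illustrative mesh
spare = {("A", "B"): 0, ("A", "C"): 1, ("C", "D"): 1,
         ("B", "D"): 1, ("C", "B"): 0}

def neighbors(n):
    """Nodes reachable from n over links with spare capacity."""
    for (u, v), cap in spare.items():
        if cap > 0:
            if u == n:
                yield v
            if v == n:
                yield u

def restore(src, dst):
    """BFS for a spare-capacity restoration path, None if isolated."""
    prev, q = {src: None}, deque([src])
    while q:
        n = q.popleft()
        if n == dst:
            path = []
            while n is not None:
                path.append(n)
                n = prev[n]
            return path[::-1]
        for m in neighbors(n):
            if m not in prev:
                prev[m] = n
                q.append(m)
    return None

# Working span A-B fails (no spare on A-B itself): reroute the channel
assert restore("A", "B") == ["A", "C", "D", "B"]
```

In a distributed setting each NE runs only the local part of this search, flooding restoration requests hop by hop, which is what allows recovery from multiple and node failures without a central controller.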
DOT National Transportation Integrated Search
2001-06-01
One hundred seventy five fatalities - primarily children and small women - have been attributed to the deployment of an air bag in relatively low-speed crashes as of April 2001. Advanced air bag systems tailor the deployment of the bags to the charac...
Fuel Reforming Technologies (BRIEFING SLIDES)
2009-09-01
Topics include advanced heat and mass transfer and catalysis technologies for fuel reforming. Goals include reducing the gallons of fuel per day required per 1100 deployed personnel, and reducing noise/thermal signature and environmental emissions. Objective: identify and develop new technologies to enhance heat and mass transfer in deployed energy systems.
Planning and Execution: The Spirit of Opportunity for Robust Autonomous Systems
NASA Technical Reports Server (NTRS)
Muscettola, Nicola
2004-01-01
One of the most exciting endeavors pursued by humankind is the search for life in the Solar System and the Universe at large. NASA is leading this effort by designing, deploying and operating robotic systems that will reach planets, planetary moons, asteroids and comets searching for water, organic building blocks and signs of past or present microbial life. None of these missions will be achievable without substantial advances in the design, implementation and validation of autonomous control agents. These agents must be capable of robustly controlling a robotic explorer in a hostile environment with very limited or no communication with Earth. The talk focuses on work pursued at the NASA Ames Research Center ranging from basic research on algorithms to deployed mission support systems. We start by discussing how planning and scheduling technology derived from the Remote Agent experiment is being used daily in the operations of the Spirit and Opportunity rovers. Planning and scheduling is also the fundamental paradigm at the core of our research in real-time autonomous agents. In particular, we describe our efforts on the Intelligent Distributed Execution Architecture (IDEA), a multi-agent real-time architecture that exploits artificial intelligence planning as the core reasoning engine of an autonomous agent. We also describe how the issue of plan robustness at execution can be addressed by novel constraint propagation algorithms capable of giving the tightest exact bounds on resource consumption over all possible executions of a flexible plan.
DOT National Transportation Integrated Search
2008-03-14
This report contains the results, findings and conclusions generated from the evaluation and field testing of a specific subset of ITS Standards applicable to the center-to-center exchange of advanced traveler information as deployed by the Nebraska ...
DOT National Transportation Integrated Search
This report demonstrates the benefits of deploying and operating an integrated highway/rail system, along with the potential barriers to implementation. In particular, it discusses the lessons learned associated with the Advanced Warning to Avoid Rai...
Evolutionary Multiobjective Query Workload Optimization of Cloud Data Warehouses
Dokeroglu, Tansel; Sert, Seyyit Alper; Cinar, Muhammet Serkan
2014-01-01
With the advent of Cloud databases, query optimizers need to find Pareto-optimal solutions in terms of response time and monetary cost. Our novel approach minimizes both objectives by deploying alternative virtual resources and query plans, making use of the virtual resource elasticity of the Cloud. We propose an exact multiobjective branch-and-bound and a robust multiobjective genetic algorithm for the optimization of distributed data warehouse query workloads on the Cloud. In order to investigate the effectiveness of our approach, we incorporate the devised algorithms into a prototype system. Finally, through several experiments conducted with different workloads and virtual resource configurations, we report notable findings on alternative deployments as well as the advantages and disadvantages of the multiobjective algorithms we propose. PMID:24892048
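At the core of any multiobjective optimizer like the ones described above is a Pareto dominance test over (response time, monetary cost) pairs. A minimal illustrative filter is sketched below; it is not the paper's branch-and-bound or genetic algorithm, only the dominance relation both rely on:

```python
# Pareto-front filter for minimization over two objectives,
# e.g. (response_time, monetary_cost) of candidate query plans.

def dominates(a, b):
    """a dominates b if it is no worse in every objective and
    strictly better in at least one."""
    return (all(x <= y for x, y in zip(a, b)) and
            any(x < y for x, y in zip(a, b)))

def pareto_front(plans):
    """Keep only the non-dominated plans."""
    return [p for p in plans
            if not any(dominates(q, p) for q in plans if q != p)]
```

A genetic algorithm would apply this same test when ranking a population, while a branch-and-bound would use it to prune subproblems whose bound is dominated by an incumbent.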
Sao Paulo Lightning Mapping Array (SP-LMA): Deployment and Plans
NASA Technical Reports Server (NTRS)
Bailey, J. C.; Carey, L. D.; Blakeslee, R. J.; Albrecht, R.; Morales, C. A.; Pinto, O., Jr.
2011-01-01
An 8-10 station Lightning Mapping Array (LMA) network is being deployed in the vicinity of Sao Paulo to create the SP-LMA for total lightning measurements in association with the international CHUVA [Cloud processes of tHe main precipitation systems in Brazil: A contribUtion to cloud resolVing modeling and to the GPM (GlobAl Precipitation Measurement)] field campaign. Besides supporting CHUVA science/mission objectives and the Sao Luiz do Paraitinga intensive operation period (IOP) in December 2011-January 2012, the SP-LMA will support the generation of unique proxy data for the Geostationary Lightning Mapper (GLM) and Advanced Baseline Imager (ABI), both sensors on the NOAA Geostationary Operational Environmental Satellite-R (GOES-R), presently under development and scheduled for a 2015 launch. The proxy data will be used to develop and validate operational algorithms so that they will be ready for use on "day 1" following the launch of GOES-R. A preliminary survey of potential sites in the vicinity of Sao Paulo was conducted in December 2009 and January 2010, followed up by a detailed survey in July 2010, with initial network deployment scheduled for October 2010. However, due to a delay in the Sao Luiz do Paraitinga IOP, the SP-LMA will now be installed in July 2011 and operated for one year. Spacing between stations is on the order of 15-30 km, with the network "diameter" being on the order of 30-40 km, which provides good 3-D lightning mapping 150 km from the network center. Optionally, 1-3 additional stations may be deployed in the vicinity of Sao Jose dos Campos.
Sao Paulo Lightning Mapping Array (SP-LMA): Deployment, Operation and Initial Data Analysis
NASA Technical Reports Server (NTRS)
Blakeslee, R.; Bailey, J. C.; Carey, L. D.; Rudlosky, S.; Goodman, S. J.; Albrecht, R.; Morales, C. A.; Anseimo, E. M.; Pinto, O.
2012-01-01
An 8-10 station Lightning Mapping Array (LMA) network is being deployed in the vicinity of Sao Paulo to create the SP-LMA for total lightning measurements in association with the international CHUVA [Cloud processes of the main precipitation systems in Brazil: A contribution to cloud resolving modeling and to the GPM (Global Precipitation Measurement)] field campaign. Besides supporting CHUVA science/mission objectives and the Sao Luiz do Paraitinga intensive operation period (IOP) in November-December 2011, the SP-LMA will support the generation of unique proxy data for the Geostationary Lightning Mapper (GLM) and Advanced Baseline Imager (ABI), both sensors on the NOAA Geostationary Operational Environmental Satellite-R (GOES-R), presently under development and scheduled for a 2015 launch. The proxy data will be used to develop and validate operational algorithms so that they will be ready for use on "day 1" following the launch of GOES-R. A preliminary survey of potential sites in the vicinity of Sao Paulo was conducted in December 2009 and January 2010, followed up by a detailed survey in July 2010, with initial network deployment scheduled for October 2010. However, due to a delay in the Sao Luiz do Paraitinga IOP, the SP-LMA will now be installed in July 2011 and operated for one year. Spacing between stations is on the order of 15-30 km, with the network "diameter" being on the order of 30-40 km, which provides good 3-D lightning mapping 150 km from the network center. Optionally, 1-3 additional stations may be deployed in the vicinity of Sao Jose dos Campos.
Recent advances and remaining challenges for the spectroscopic detection of explosive threats.
Fountain, Augustus W; Christesen, Steven D; Moon, Raphael P; Guicheteau, Jason A; Emmons, Erik D
2014-01-01
In 2010, the U.S. Army initiated a program through the Edgewood Chemical Biological Center to identify viable spectroscopic signatures of explosives and initiate environmental persistence, fate, and transport studies for trace residues. These studies were ultimately designed to integrate these signatures into algorithms and experimentally evaluate sensor performance for explosives and precursor materials in existing chemical point and standoff detection systems. Accurate and validated optical cross sections and signatures are critical in benchmarking spectroscopic-based sensors. This program has provided important information for the scientists and engineers currently developing trace-detection solutions to the homemade explosive problem. With this information, the sensitivity of spectroscopic methods for explosives detection can now be quantitatively evaluated before the sensor is deployed and tested.
Deployment Optimization for Embedded Flight Avionics Systems
2011-11-01
the iterations, the best solution(s) that evolved out from the group is output as the result. Although metaheuristic algorithms are powerful, they...that other design constraints are met—ScatterD uses metaheuristic algorithms to seed the bin-packing algorithm. In particular, metaheuristic... metaheuristic algorithms to search the design space—and then using bin-packing to allocate software tasks to processors—ScatterD can generate
Smartphone-Based Indoor Localization with Bluetooth Low Energy Beacons
Zhuang, Yuan; Yang, Jun; Li, You; Qi, Longning; El-Sheimy, Naser
2016-01-01
Indoor wireless localization using Bluetooth Low Energy (BLE) beacons has attracted considerable attention after the release of the BLE protocol. In this paper, we propose an algorithm that uses the combination of channel-separate polynomial regression model (PRM), channel-separate fingerprinting (FP), outlier detection and extended Kalman filtering (EKF) for smartphone-based indoor localization with BLE beacons. The proposed algorithm uses FP and PRM to estimate the target's location and the distances between the target and BLE beacons, respectively. We compare the performance of distance estimation using a separate PRM for each of the three advertisement channels (i.e., the separate strategy) with that using an aggregate PRM generated through the combination of information from all channels (i.e., the aggregate strategy). The FP-based location estimation results of the separate strategy and the aggregate strategy are also compared. It was found that the separate strategy can provide higher accuracy; thus, it is preferred to adopt PRM and FP for each BLE advertisement channel separately. Furthermore, to enhance the robustness of the algorithm, a two-level outlier detection mechanism is designed. Distance and location estimates obtained from PRM and FP are passed to the first outlier detection to generate improved distance estimates for the EKF. After the EKF process, a second outlier detection algorithm based on statistical testing is further performed to remove outliers. The proposed algorithm was evaluated by various field experiments. Results show that the proposed algorithm achieved an accuracy of <2.56 m at 90% of the time with dense deployment of BLE beacons (1 beacon per 9 m), which is 35.82% better than the <3.99 m of the Propagation Model (PM) + EKF algorithm and 15.77% more accurate than the <3.04 m of the FP + EKF algorithm. With sparse deployment (1 beacon per 18 m), the proposed algorithm achieves an accuracy of <3.88 m at 90% of the time, which is 49.58% more accurate than the <8.00 m of the PM + EKF algorithm and 21.41% better than the <4.94 m of the FP + EKF algorithm. Therefore, the proposed algorithm is especially useful for improving localization accuracy in environments with sparse beacon deployment. PMID:27128917
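The channel-separate PRM idea above (one RSSI-to-distance regression per BLE advertisement channel, then a fused estimate) can be sketched as follows. A degree-1 fit is used for brevity and the simple averaging of per-channel estimates is an assumption for illustration; the paper's PRM may use higher polynomial degrees and a different fusion rule:

```python
# Channel-separate regression sketch: fit one RSSI -> distance
# model per BLE advertisement channel (37, 38, 39), then combine
# the per-channel distance estimates by averaging.

def fit_linear(xs, ys):
    """Least-squares fit distance ~= a + b * rssi."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys)) /
         sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

def channel_models(training):
    """training: {channel: [(rssi, distance), ...]}"""
    return {ch: fit_linear([r for r, _ in pts], [d for _, d in pts])
            for ch, pts in training.items()}

def estimate_distance(models, readings):
    """readings: {channel: rssi}; average per-channel estimates."""
    ests = [a + b * readings[ch] for ch, (a, b) in models.items()
            if ch in readings]
    return sum(ests) / len(ests)
```

In the full pipeline described by the paper, such distance estimates would then feed the first outlier-detection stage before entering the EKF.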
Connected Vehicle Pilot Deployment Program Phase 1, Outreach Plan – Tampa (THEA).
DOT National Transportation Integrated Search
2016-07-06
This document presents the Outreach Plan for the Tampa Hillsborough Expressway Authority (THEA) Connected Vehicle (CV) Pilot Deployment. The goal of the pilot deployment is to advance and enable safe, interoperable, networked wireless communications ...
47 CFR 51.231 - Provision of information on advanced services deployment.
Code of Federal Regulations, 2010 CFR
2010-10-01
... rejection; and (3) Information with respect to the number of loops using advanced services technology within... incumbent LEC information on the type of technology that the requesting carrier seeks to deploy. (1) Where... spectral density (PSD) mask, it also must provide Spectrum Class information for the technology. (2) Where...
47 CFR 51.231 - Provision of information on advanced services deployment.
Code of Federal Regulations, 2011 CFR
2011-10-01
... the requesting carrier asserts that the technology it seeks to deploy fits within a generic power... technology, it must provide the incumbent LEC with information on the speed and power at which the signal... rejection; and (3) Information with respect to the number of loops using advanced services technology within...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Melin, Alexander M.; Kisner, Roger A.; Drira, Anis
Embedded instrumentation and control systems that can operate in extreme environments are challenging due to restrictions on sensors and materials. As a part of the Department of Energy's Nuclear Energy Enabling Technology cross-cutting technology development program's Advanced Sensors and Instrumentation topic, this report details the design of a bench-scale embedded instrumentation and control testbed. The design goal of the bench-scale testbed is to build a re-configurable system that can rapidly deploy and test advanced control algorithms in a hardware-in-the-loop setup. The bench-scale testbed will be designed as a fluid pump analog that uses active magnetic bearings to support the shaft. The testbed represents an application that would improve the efficiency and performance of high temperature (700 C) pumps for liquid salt reactors that operate in an extreme environment and provide many engineering challenges that can be overcome with embedded instrumentation and control. This report will give details of the mechanical design, electromagnetic design, geometry optimization, power electronics design, and initial control system design.
Connected vehicle pilot deployment program phase 1, safety management plan – Tampa (THEA).
DOT National Transportation Integrated Search
2016-04-01
This document presents the Safety Management Plan for the THEA Connected Vehicle (CV) Pilot Deployment. The THEA CV Pilot Deployment goal is to advance and enable safe, interoperable, networked wireless communications among vehicles, the infrastructu...
NASA Astrophysics Data System (ADS)
Nightingale, James; Wang, Qi; Grecos, Christos
2011-03-01
Users of the next generation wireless paradigm known as multihomed mobile networks expect satisfactory quality of service (QoS) when accessing streamed multimedia content. The recent H.264 Scalable Video Coding (SVC) extension to the Advanced Video Coding standard (AVC) offers the facility to adapt real-time video streams in response to the dynamic conditions of multiple network paths encountered in multihomed wireless mobile networks. Nevertheless, preexisting streaming algorithms were mainly proposed for AVC delivery over multipath wired networks and were evaluated by software simulation. This paper introduces a practical, hardware-based testbed upon which we implement and evaluate real-time H.264 SVC streaming algorithms in a realistic multihomed wireless mobile networks environment. We propose an optimised streaming algorithm with several technical contributions. Firstly, we extended the AVC packet prioritisation schemes to reflect the three-dimensional granularity of SVC. Secondly, we designed a mechanism for evaluating the effects of different streamer 'read ahead window' sizes on real-time performance. Thirdly, we took account of the previously unconsidered path switching and mobile networks tunnelling overheads encountered in real-world deployments. Finally, we implemented a path condition monitoring and reporting scheme to facilitate intelligent path switching. The proposed system has been experimentally shown to offer a significant improvement in PSNR of the received stream compared with representative existing algorithms.
NASA Technical Reports Server (NTRS)
Blakeslee, R. J.; Bailey, J. C.; Carey, L. D.; Goodman, S. J.; Rudlosky, S. D.; Albrecht, R.; Morales, C. A.; Anselmo, E. M.; Neves, J. R.
2013-01-01
A 12 station Lightning Mapping Array (LMA) network was deployed during October 2011 in the vicinity of São Paulo, Brazil (SP-LMA) to contribute total lightning measurements to an international field campaign [CHUVA - Cloud processes of tHe main precipitation systems in Brazil: A contribUtion to cloud resolVing modeling and to the GPM (GlobAl Precipitation Measurement)]. The SP-LMA was operational from November 2011 through March 2012. Sensor spacing was on the order of 15-30 km, with a network diameter on the order of 40-50 km. The SP-LMA provides good 3-D lightning mapping out to 150 km from the network center, with 2-D coverage considerably farther. In addition to supporting CHUVA science/mission objectives, the SP-LMA is supporting the generation of unique proxy data for the Geostationary Lightning Mapper (GLM) and Advanced Baseline Imager (ABI), on NOAA's Geostationary Operational Environmental Satellite-R (GOES-R: scheduled for a 2015 launch). These proxy data will be used to develop and validate operational algorithms so that they will be ready to use on "day 1" following the GOES-R launch. The SP-LMA data also will be intercompared with lightning observations from other deployed lightning networks to advance our understanding of the capabilities/contributions of each of these networks toward GLM proxy and validation activities. This paper addresses the network assessment and analyses for intercomparison studies and GOES-R proxy activities.
DOT National Transportation Integrated Search
2010-03-17
The attempted bombing of Northwest flight 253 highlighted the importance of detecting improvised explosive devices on passengers. This testimony focuses on (1) the Transportation Security Administrations (TSA) efforts to procure and deploy advance...
Customization of UWB 3D-RTLS Based on the New Uncertainty Model of the AoA Ranging Technique
Jachimczyk, Bartosz; Dziak, Damian; Kulesza, Wlodek J.
2017-01-01
The increased potential and effectiveness of Real-time Locating Systems (RTLSs) substantially influence their application spectrum. They are widely used, inter alia, in the industrial sector, healthcare, home care, and in logistic and security applications. The research aims to develop an analytical method to customize UWB-based RTLS, in order to improve their localization performance in terms of accuracy and precision. The analytical uncertainty model of Angle of Arrival (AoA) localization in a 3D indoor space, which is the foundation of the customization concept, is established in a working environment. Additionally, a suitable angular-based 3D localization algorithm is introduced. The paper investigates the following issues: the influence of the proposed correction vector on the localization accuracy; the impact of the system's configuration and the location sensors' relative deployment on the localization precision distribution map. The advantages of the method are verified by comparison with a reference commercial RTLS localization engine. The results of simulations and physical experiments prove the value of the proposed customization method. The research confirms that the analytical uncertainty model is a valid representation of the RTLS's localization uncertainty in terms of accuracy and precision and can be useful for its performance improvement. The research shows that Angle of Arrival localization in a 3D indoor space, applying the simple angular-based localization algorithm and correction vector, improves localization accuracy and precision in such a way that the system challenges the reference hardware advanced localization engine. Moreover, the research guides the deployment of location sensors to enhance the localization precision. PMID:28125056
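The angular-based 3D localization the abstract refers to can be illustrated with the textbook least-squares intersection of AoA bearing rays: each location sensor at a known position measures azimuth and elevation toward the target, and the target estimate is the point minimizing total squared distance to all rays. This is a generic sketch, not the paper's specific algorithm or correction vector:

```python
# Least-squares intersection of 3D bearing rays. Each sensor at
# position s_i measures (azimuth, elevation), giving a unit
# direction d_i; we solve sum_i (I - d_i d_i^T) p = sum_i
# (I - d_i d_i^T) s_i for the target position p.
import math

def direction(az, el):  # angles in radians
    return (math.cos(el) * math.cos(az),
            math.cos(el) * math.sin(az),
            math.sin(el))

def solve3(A, b):  # naive 3x3 Gauss-Jordan with partial pivoting
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for c in range(3):
        p = max(range(c, 3), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(3):
            if r != c:
                f = M[r][c] / M[c][c]
                M[r] = [x - f * y for x, y in zip(M[r], M[c])]
    return [M[i][3] / M[i][i] for i in range(3)]

def aoa_locate(sensors, bearings):
    A = [[0.0] * 3 for _ in range(3)]
    b = [0.0] * 3
    for s, (az, el) in zip(sensors, bearings):
        d = direction(az, el)
        for i in range(3):
            for j in range(3):
                m = (1.0 if i == j else 0.0) - d[i] * d[j]
                A[i][j] += m
                b[i] += m * s[j]
    return solve3(A, b)
```

With noisy bearings the same formulation returns the least-squares compromise point, which is where an uncertainty model and correction vector of the kind the paper proposes become useful.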
Reisner, A. T.; Khitrov, M. Y.; Chen, L.; Blood, A.; Wilkins, K.; Doyle, W.; Wilcox, S.; Denison, T.; Reifman, J.
2013-01-01
Summary Background Advanced decision-support capabilities for prehospital trauma care may prove effective at improving patient care. Such functionality would be possible if an analysis platform were connected to a transport vital-signs monitor. In practice, there are technical challenges to implementing such a system. Not only must each individual component be reliable, but, in addition, the connectivity between components must be reliable. Objective We describe the development, validation, and deployment of the Automated Processing of Physiologic Registry for Assessment of Injury Severity (APPRAISE) platform, intended to serve as a test bed to help evaluate the performance of decision-support algorithms in a prehospital environment. Methods We describe the hardware selected and the software implemented, and the procedures used for laboratory and field testing. Results The APPRAISE platform met performance goals in both laboratory testing (using a vital-sign data simulator) and initial field testing. After its field testing, the platform has been in use on Boston MedFlight air ambulances since February of 2010. Conclusion These experiences may prove informative to other technology developers and to healthcare stakeholders seeking to invest in connected electronic systems for prehospital as well as in-hospital use. Our experiences illustrate two sets of important questions: are the individual components reliable (e.g., physical integrity, power, core functionality, and end-user interaction) and is the connectivity between components reliable (e.g., communication protocols and the metadata necessary for data interpretation)? While all potential operational issues cannot be fully anticipated and eliminated during development, thoughtful design and phased testing steps can reduce, if not eliminate, technical surprises. PMID:24155791
Enhanced In-Pile Instrumentation at the Advanced Test Reactor
NASA Astrophysics Data System (ADS)
Rempe, Joy L.; Knudson, Darrell L.; Daw, Joshua E.; Unruh, Troy; Chase, Benjamin M.; Palmer, Joe; Condie, Keith G.; Davis, Kurt L.
2012-08-01
Many of the sensors deployed at materials and test reactors cannot withstand the high flux/high temperature test conditions often requested by users at U.S. test reactors, such as the Advanced Test Reactor (ATR) at the Idaho National Laboratory. To address this issue, an instrumentation development effort was initiated as part of the ATR National Scientific User Facility in 2007 to support the development and deployment of enhanced in-pile sensors. This paper provides an update on this effort. Specifically, this paper identifies the types of sensors currently available to support in-pile irradiations and those sensors currently available to ATR users. Accomplishments from new sensor technology deployment efforts are highlighted by describing new temperature and thermal conductivity sensors now available to ATR users. Efforts to deploy enhanced in-pile sensors for detecting elongation and real-time flux detectors are also reported, and recently-initiated research to evaluate the viability of advanced technologies to provide enhanced accuracy for measuring key parameters during irradiation testing are noted.
Surveillance of a 2D Plane Area with 3D Deployed Cameras
Fu, Yi-Ge; Zhou, Jie; Deng, Lei
2014-01-01
As the use of camera networks has expanded, camera placement to satisfy quality assurance parameters (such as a good coverage ratio, acceptable resolution constraints, a cost as low as possible, etc.) has become an important problem. The discrete camera deployment problem is NP-hard and many heuristic methods have been proposed to solve it, most of which make very simple assumptions. In this paper, we propose a probability inspired binary Particle Swarm Optimization (PI-BPSO) algorithm to solve a homogeneous camera network placement problem. We model the problem under some more realistic assumptions: (1) deploy the cameras in the 3D space while the surveillance area is restricted to a 2D ground plane; (2) deploy the minimal number of cameras to get maximum visual coverage under additional constraints, such as the field of view (FOV) of the cameras and minimum resolution. We can simultaneously optimize the number and the configuration of the cameras through the introduction of a regulation term in the cost function. The simulation results showed the effectiveness of the proposed PI-BPSO algorithm. PMID:24469353
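The cost function structure described above (coverage reward plus a regulation term penalizing camera count) can be sketched for the simplified 2D case. The geometry test and the weight are illustrative assumptions; the paper's 3D model and PI-BPSO search are not reproduced here:

```python
# Coverage fitness sketch for camera placement: a ground-plane
# grid point is covered if it lies within a camera's range and
# horizontal field of view. Fitness = coverage ratio minus a
# regulation term penalizing the number of cameras, so the
# optimizer can trade off coverage against camera count.
import math

def covered(point, cam):
    (px, py), (cx, cy, heading, fov, rng) = point, cam
    dx, dy = px - cx, py - cy
    if math.hypot(dx, dy) > rng:
        return False
    # absolute angular offset from the camera heading, wrapped to [0, pi]
    ang = abs((math.atan2(dy, dx) - heading + math.pi) % (2 * math.pi)
              - math.pi)
    return ang <= fov / 2

def fitness(cams, grid, weight=0.05):
    cov = sum(any(covered(p, c) for c in cams) for p in grid) / len(grid)
    return cov - weight * len(cams)  # regulation term
```

A binary PSO would then search over candidate camera subsets and configurations, scoring each particle with a fitness of this shape.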
2014-02-11
ISS038-E-044916 (11 Feb. 2014) --- A set of NanoRacks CubeSats is photographed by an Expedition 38 crew member after the deployment by the Small Satellite Orbital Deployer (SSOD). The CubeSats program contains a variety of experiments such as Earth observations and advanced electronics testing.
Automatic Multi-sensor Data Quality Checking and Event Detection for Environmental Sensing
NASA Astrophysics Data System (ADS)
LIU, Q.; Zhang, Y.; Zhao, Y.; Gao, D.; Gallaher, D. W.; Lv, Q.; Shang, L.
2017-12-01
With the advances in sensing technologies, large-scale environmental sensing infrastructures are pervasively deployed to continuously collect data for various research and application fields, such as air quality study and weather condition monitoring. In such infrastructures, many sensor nodes are distributed in a specific area and each individual sensor node is capable of measuring several parameters (e.g., humidity, temperature, and pressure), providing massive data for natural event detection and analysis. However, due to the dynamics of the ambient environment, sensor data can be contaminated by errors or noise. Thus, data quality is still a primary concern for scientists before drawing any reliable scientific conclusions. To help researchers identify potential data quality issues and detect meaningful natural events, this work proposes a novel algorithm to automatically identify and rank anomalous time windows from multiple sensor data streams. More specifically, the algorithm (1) adaptively learns the characteristics of normal evolving time series and (2) models the spatial-temporal relationship among multiple sensor nodes to infer the anomaly likelihood of a time series window for a particular parameter in a sensor node. Case studies using different data sets are presented and the experimental results demonstrate that the proposed algorithm can effectively identify anomalous time windows, which may have resulted from data quality issues and natural events.
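The two ingredients named above (a temporal model of the node's own stream and a spatial comparison against co-located nodes) can be combined in a much simpler stand-in to make the idea concrete. The z-score formulation below is an illustrative assumption, not the paper's learned likelihood model:

```python
# Sketch of ranking anomalous time windows for one sensor node:
# score each fixed-size window by (a) how far its mean deviates
# from the rest of that node's stream (temporal term) and (b) how
# far it deviates from peer nodes over the same window (spatial
# term). Higher score = more anomalous.
import statistics

def window_scores(streams, node, size):
    x = streams[node]
    peers = [s for k, s in streams.items() if k != node]
    scores = []
    for start in range(0, len(x) - size + 1, size):
        w = x[start:start + size]
        rest = x[:start] + x[start + size:]
        mu, sd = statistics.mean(rest), statistics.pstdev(rest) or 1.0
        temporal = abs(statistics.mean(w) - mu) / sd
        spatial = statistics.mean(
            abs(statistics.mean(w) - statistics.mean(p[start:start + size]))
            for p in peers) if peers else 0.0
        scores.append((start, temporal + spatial))
    return sorted(scores, key=lambda t: -t[1])
```

A window that is unusual for its own node but consistent with its neighbors (a shared weather event, say) scores lower on the spatial term than an isolated sensor fault, which is the distinction the paper exploits.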
NASA Technical Reports Server (NTRS)
Abbott, Mark R.
1998-01-01
The objectives of the last six months were: (1) Revise the algorithms for the Fluorescence Line Height (FLH) and Chlorophyll Fluorescence Efficiency (CFE) products, especially the data quality flags; (2) Revise the MOCEAN validation plan; (3) Deploy and recover bio-optical instrumentation at the Hawaii Ocean Time-series (HOT) site as part of the Joint Global Ocean Flux Study (JGOFS); (4) Prepare for field work in the Antarctic Polar Frontal Zone as part of JGOFS; (5) Submit manuscript on bio-optical time scales as estimated from Lagrangian drifters; (6) Conduct chemostat experiments on fluorescence; (7) Interface with the Global Imager (GLI) science team; and (8) Continue development of advanced data system browser. We are responsible for the delivery of two at-launch products for AM-1: Fluorescence line height (FLH) and chlorophyll fluorescence efficiency (CFE). We also considered revising the input chlorophyll, which is used to determine the degree of binning. We have refined the quality flags for the Version 2 algorithms. We have acquired and installed a Silicon Graphics Origin 200. We are working with the University of Miami team to develop documentation that will describe how the MODIS ocean components are linked together.
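The FLH product named above is conventionally computed as the radiance in the chlorophyll fluorescence band minus a linear baseline interpolated between two flanking bands. Here is a minimal sketch using nominal MODIS band centers (667, 678, and 748 nm), which are assumed for illustration; the operational algorithm's exact bands, coefficients, and quality flags are not reproduced:

```python
# Fluorescence Line Height (FLH) sketch: radiance at the
# fluorescence band minus a straight-line baseline interpolated
# between the two flanking bands.

def flh(L667, L678, L748, w1=667.0, wf=678.0, w2=748.0):
    baseline = L667 + (L748 - L667) * (wf - w1) / (w2 - w1)
    return L678 - baseline
```

Chlorophyll fluorescence efficiency (CFE) would then relate this line height to the chlorophyll estimate, which is why the abstract also discusses revising the input chlorophyll used for binning.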
A robustness test of the braided device foreshortening algorithm
NASA Astrophysics Data System (ADS)
Moyano, Raquel Kale; Fernandez, Hector; Macho, Juan M.; Blasco, Jordi; San Roman, Luis; Narata, Ana Paula; Larrabide, Ignacio
2017-11-01
Different computational methods have recently been proposed to simulate the virtual deployment of a braided stent inside a patient's vasculature. Those methods are primarily based on the segmentation of the region of interest to obtain the local vessel morphology descriptors. The goal of this work is to evaluate the influence of the segmentation quality on the method named "Braided Device Foreshortening" (BDF). METHODS: We used the 3DRA images of 10 aneurysmatic patients (cases). The cases were segmented by applying a marching cubes algorithm with a broad range of thresholds in order to generate 10 surface models each. We selected a braided device and applied the BDF algorithm to each surface model. The range of the computed flow diverter lengths for each case was obtained to calculate the variability of the method against the segmentation threshold values. RESULTS: An evaluation study over 10 clinical cases indicates that the final length of the deployed flow diverter in each vessel model is stable, yielding a maximum difference of 11.19% in vessel diameter and a maximum of 9.14% in the simulated stent length across the threshold values. The average coefficient of variation was found to be 4.08%. CONCLUSION: A study evaluating how the segmentation threshold affects the simulated length of the deployed FD was presented. The segmentation algorithm used to segment intracranial aneurysm 3D angiography images introduces only small variation in the resulting stent simulation.
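The robustness metrics reported above (maximum relative difference and coefficient of variation across threshold segmentations) are straightforward to compute once the per-threshold simulated lengths are available. A minimal sketch, with the population standard deviation chosen here as an assumption since the abstract does not state which estimator was used:

```python
# Variability metrics for a set of simulated deployed lengths,
# one per segmentation threshold of the same case.
import statistics

def coefficient_of_variation(lengths):
    """CV = standard deviation / mean (population std assumed)."""
    return statistics.pstdev(lengths) / statistics.mean(lengths)

def max_relative_difference(values):
    """(max - min) / min, e.g. the 9.14% stent-length spread."""
    return (max(values) - min(values)) / min(values)
```

Averaging the per-case CV over all 10 cases would yield the study's summary figure of 4.08%.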
Deploy Nalu/Kokkos algorithmic infrastructure with performance benchmarking.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Domino, Stefan P.; Ananthan, Shreyas; Knaus, Robert C.
The former Nalu interior heterogeneous algorithm design, which was originally designed to manage matrix assembly operations over all elemental topology types, has been modified to operate over homogeneous collections of mesh entities. This newly templated kernel design allows for removal of workset variable resize operations that were formerly required at each loop over a Sierra ToolKit (STK) bucket (nominally, 512 entities in size). Extensive usage of the Standard Template Library (STL) std::vector has been removed in favor of intrinsic Kokkos memory views. In this milestone effort, the transition to Kokkos as the underlying infrastructure to support performance and portability on many-core architectures has been deployed for key matrix algorithmic kernels. A unit-test driven design effort has developed a homogeneous entity algorithm that employs a team-based thread parallelism construct. The STK Single Instruction Multiple Data (SIMD) infrastructure is used to interleave data for improved vectorization. The collective algorithm design, which allows for concurrent threading and SIMD management, has been deployed for the core low-Mach element-based algorithm. Several tests to ascertain SIMD performance on Intel KNL and Haswell architectures have been carried out. The performance test matrix includes evaluation of both low- and higher-order methods. The higher-order low-Mach methodology builds on polynomial promotion of the core low-order control volume finite element method (CVFEM). Performance testing of the Kokkos-view/SIMD design indicates low-order matrix assembly kernel speed-up ranging between two and four times depending on mesh loading and node count. Better speedups are observed for higher-order meshes (currently only P=2 has been tested), especially on KNL. The increased workload per element on higher-order meshes benefits from the wide SIMD width on KNL machines. Combining multiple threads with SIMD on KNL achieves a 4.6x speedup over the baseline, with assembly timings faster than those observed on the Haswell architecture. The computational workload of higher-order meshes, therefore, seems ideally suited for the many-core architecture and justifies further exploration of higher-order on NGP platforms. A Trilinos/Tpetra-based multi-threaded GMRES preconditioned by symmetric Gauss-Seidel (SGS) represents the core solver infrastructure for the low-Mach advection/diffusion implicit solves. The threaded solver stack has been tested on small problems on NREL's Peregrine system using the newly developed and deployed Kokkos-view/SIMD kernels. Efforts are underway to deploy the Tpetra-based solver stack on the NERSC Cori system to benchmark its performance at scale on KNL machines.
NASA Technical Reports Server (NTRS)
Bailey, J. C.; Blakeslee, R. J.; Carey, L. D.; Goodman, S. J.; Rudlosky, S. D.; Albrecht, R.; Morales, C. A.; Anselmo, E. M.; Neves, J. R.; Buechler, D. E.
2014-01-01
A 12-station Lightning Mapping Array (LMA) network was deployed during October 2011 in the vicinity of Sao Paulo, Brazil (SP-LMA) to contribute total lightning measurements to an international field campaign [CHUVA - Cloud processes of tHe main precipitation systems in Brazil: A contribUtion to cloud resolVing modeling and to the GPM (GlobAl Precipitation Measurement)]. The SP-LMA was operational from November 2011 through March 2012 during the Vale do Paraiba campaign. Sensor spacing was on the order of 15-30 km, with a network diameter on the order of 40-50 km. The SP-LMA provides good 3-D lightning mapping out to 150 km from the network center, with 2-D coverage considerably farther. In addition to supporting CHUVA science/mission objectives, the SP-LMA is supporting the generation of unique proxy data for the Geostationary Lightning Mapper (GLM) and Advanced Baseline Imager (ABI), on NOAA's Geostationary Operational Environmental Satellite-R (GOES-R: scheduled for a 2015 launch). These proxy data will be used to develop and validate operational algorithms so that they will be ready to use on "day 1" following the GOES-R launch. As the CHUVA Vale do Paraiba campaign opportunity was formulated, a broad community-based interest developed for a comprehensive Lightning Location System (LLS) intercomparison and assessment study, leading to the participation and/or deployment of eight other ground-based networks and the space-based Lightning Imaging Sensor (LIS). The SP-LMA data is being intercompared with lightning observations from the other deployed lightning networks to advance our understanding of the capabilities/contributions of each of these networks toward GLM proxy and validation activities. This paper addresses the network assessment, including noise reduction criteria, detection efficiency estimates, and statistical and climatological (both temporally and spatially) analyses for intercomparison studies and GOES-R proxy activities.
Cloud-enabled large-scale land surface model simulations with the NASA Land Information System
NASA Astrophysics Data System (ADS)
Duffy, D.; Vaughan, G.; Clark, M. P.; Peters-Lidard, C. D.; Nijssen, B.; Nearing, G. S.; Rheingrover, S.; Kumar, S.; Geiger, J. V.
2017-12-01
Developed by the Hydrological Sciences Laboratory at NASA Goddard Space Flight Center (GSFC), the Land Information System (LIS) is a high-performance software framework for terrestrial hydrology modeling and data assimilation. LIS provides the ability to integrate satellite and ground-based observational products and advanced modeling algorithms to extract land surface states and fluxes. Through a partnership with the National Center for Atmospheric Research (NCAR) and the University of Washington, the LIS model is currently being extended to include the Structure for Unifying Multiple Modeling Alternatives (SUMMA). With the addition of SUMMA in LIS, meaningful simulations containing a large multi-model ensemble will be enabled and can provide advanced probabilistic continental-domain modeling capabilities at spatial scales relevant for water managers. The resulting LIS/SUMMA application framework is difficult for non-experts to install due to the large number of dependencies on specific versions of operating systems, libraries, and compilers. This has created a significant barrier to entry for domain scientists that are interested in using the software on their own systems or in the cloud. In addition, the requirement to support multiple run time environments across the LIS community has created a significant burden on the NASA team. To overcome these challenges, LIS/SUMMA has been deployed using Linux containers, which allows an entire software package along with all dependencies to be installed within a working runtime environment, and Kubernetes, which orchestrates the deployment of a cluster of containers. Within a cloud environment, users can now easily create a cluster of virtual machines and run large-scale LIS/SUMMA simulations. Installations that have taken weeks and months can now be performed in minutes.
This presentation will discuss the steps required to create a cloud-enabled large-scale simulation, present examples of its use, and describe the potential deployment of this information technology with other NASA applications.
Requirements Definition for ORNL Trusted Corridors Project
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walker, Randy M; Hill, David E; Smith, Cyrus M
2008-02-01
The ORNL Trusted Corridors Project has several other names: SensorNet Transportation Pilot; Identification and Monitoring of Radiation (in commerce) Shipments (IMR(ic)S); and Southeastern Transportation Corridor Pilot (SETCP). The project involves acquisition and analysis of transportation data at two mobile and three fixed inspection stations in five states (Kentucky, Mississippi, South Carolina, Tennessee, and Washington DC). Collaborators include the State Police organizations that are responsible for highway safety, law enforcement, and incident response. The three states with fixed weigh-station deployments (KY, SC, TN) are interested in coordination of this effort for highway safety, law enforcement, and sorting/targeting/interdiction of potentially non-compliant vehicles/persons/cargo. The Domestic Nuclear Detection Office (DNDO) in the U.S. Department of Homeland Security (DHS) is interested in these deployments, as a Pilot test (SETCP) to identify Improvised Nuclear Devices (INDs) in highway transport. However, the level of DNDO integration among these state deployments is presently uncertain. Moreover, DHS issues are considered secondary by the states, which perceive this work as an opportunity to leverage these (new) dual-use technologies for state needs. In addition, present experience shows that radiation detectors alone cannot detect DHS-identified IND threats. Continued SETCP success depends on the level of integration of current state/local police operations with the new DHS task of detecting IND threats, in addition to emergency preparedness and homeland security.
This document describes the enabling components for continued SETCP development and success, including: sensors and their use at existing deployments (Section 1); personnel training (Section 2); concept of operations (Section 3); knowledge discovery from the copious data (Section 4); smart data collection, integration and database development, advanced algorithms for multiple sensors, and network communications (Section 5); and harmonization of local, state, and Federal procedures and protocols (Section 6).
Novel Visual Sensor Coverage and Deployment in Time Aware PTZ Wireless Visual Sensor Networks.
Yap, Florence G H; Yen, Hong-Hsu
2016-12-30
In this paper, we consider the visual sensor deployment algorithm in Pan-Tilt-Zoom (PTZ) Wireless Visual Sensor Networks (WVSNs). With PTZ capability, a sensor's visual coverage can be extended to reduce the number of visual sensors that need to be deployed. The coverage zone of a visual sensor in a PTZ WVSN is composed of two regions, a Direct Coverage Region (DCR) and a PTZ Coverage Region (PTZCR). In the PTZCR, a visual sensor needs a mechanical pan-tilt-zoom operation to cover an object. This mechanical operation can take seconds, so the sensor might not be able to adjust the camera in time to capture the visual data. In this paper, for the first time, we study this PTZ time-aware WVSN deployment problem. We formulate it as an optimization problem whose objective is to minimize the total visual sensor deployment cost so that each area is covered either in the DCR or in the PTZCR while respecting the PTZ time constraint. The proposed Time Aware Coverage Zone (TACZ) model successfully captures PTZ visual sensor coverage in terms of camera focal range, angle span zone coverage and camera PTZ time. A novel heuristic, called the Time Aware Deployment with PTZ camera (TADPTZ) algorithm, is then proposed to solve the problem. From our computational experiments, we found that the TACZ model outperforms the existing M coverage model under all network scenarios. In addition, as compared to the optimal solutions, the TACZ model is scalable and adaptable to different PTZ time requirements when deploying large PTZ WVSNs.
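The time-aware coverage idea above can be sketched with a plain greedy cost-effectiveness heuristic; the abstract does not spell out the TADPTZ algorithm, so this is a stand-in that only keeps its central constraint (PTZCR coverage counts only when the pan-tilt-zoom finishes in time), and all site data are hypothetical.

```python
def greedy_deploy(candidates, areas, ptz_deadline_s):
    """Pick a low-cost subset of candidate camera sites covering all areas.

    candidates: list of (cost, dcr, ptzcr, ptz_time_s), where dcr/ptzcr are
    sets of area ids.  An area in the PTZCR counts as covered only if the
    mechanical pan-tilt-zoom completes within ptz_deadline_s.
    """
    uncovered, picked, total_cost = set(areas), [], 0.0
    while uncovered:
        best, best_gain, best_cover = None, 0.0, set()
        for i, (cost, dcr, ptzcr, ptz_t) in enumerate(candidates):
            if i in picked:
                continue
            cover = set(dcr)
            if ptz_t <= ptz_deadline_s:      # PTZCR usable only in time
                cover |= ptzcr
            gain = len(cover & uncovered) / cost
            if gain > best_gain:
                best, best_gain, best_cover = i, gain, cover
        if best is None:                      # no candidate helps: infeasible
            return None
        picked.append(best)
        total_cost += candidates[best][0]
        uncovered -= best_cover
    return picked, total_cost

areas = [0, 1, 2, 3]
candidates = [
    (1.0, {0, 1}, {2}, 1.5),   # PTZ fast enough: PTZCR counts
    (1.0, {2}, {3}, 4.0),      # PTZ too slow: PTZCR ignored
    (1.0, {3}, set(), 0.0),
]
result = greedy_deploy(candidates, areas, ptz_deadline_s=2.0)
```

Note how the second site's PTZCR area is discounted because its PTZ time exceeds the deadline, forcing a third camera into the deployment.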
Novel Visual Sensor Coverage and Deployment in Time Aware PTZ Wireless Visual Sensor Networks
Yap, Florence G. H.; Yen, Hong-Hsu
2016-01-01
In this paper, we consider the visual sensor deployment algorithm in Pan-Tilt-Zoom (PTZ) Wireless Visual Sensor Networks (WVSNs). With PTZ capability, a sensor’s visual coverage can be extended to reduce the number of visual sensors that need to be deployed. The coverage zone of a visual sensor in a PTZ WVSN is composed of two regions, a Direct Coverage Region (DCR) and a PTZ Coverage Region (PTZCR). In the PTZCR, a visual sensor needs a mechanical pan-tilt-zoom operation to cover an object. This mechanical operation can take seconds, so the sensor might not be able to adjust the camera in time to capture the visual data. In this paper, for the first time, we study this PTZ time-aware WVSN deployment problem. We formulate it as an optimization problem whose objective is to minimize the total visual sensor deployment cost so that each area is covered either in the DCR or in the PTZCR while respecting the PTZ time constraint. The proposed Time Aware Coverage Zone (TACZ) model successfully captures PTZ visual sensor coverage in terms of camera focal range, angle span zone coverage and camera PTZ time. A novel heuristic, called the Time Aware Deployment with PTZ camera (TADPTZ) algorithm, is then proposed to solve the problem. From our computational experiments, we found that the TACZ model outperforms the existing M coverage model under all network scenarios. In addition, as compared to the optimal solutions, the TACZ model is scalable and adaptable to different PTZ time requirements when deploying large PTZ WVSNs. PMID:28042829
Simulating Operation of a Complex Sensor Network
NASA Technical Reports Server (NTRS)
Jennings, Esther; Clare, Loren; Woo, Simon
2008-01-01
Simulation Tool for ASCTA Microsensor Network Architecture (STAMiNA) ["ASCTA" denotes the Advanced Sensors Collaborative Technology Alliance.] is a computer program for evaluating conceptual sensor networks deployed over terrain to provide military situational awareness. This or a similar program is needed because of the complexity of interactions among such diverse phenomena as sensing and communication portions of a network, deployment of sensor nodes, effects of terrain, data-fusion algorithms, and threat characteristics. STAMiNA is built upon a commercial network-simulator engine, with extensions to include both sensing and communication models in a discrete-event simulation environment. Users can define (1) a mission environment, including terrain features; (2) objects to be sensed; (3) placements and modalities of sensors, abilities of sensors to sense objects of various types, and sensor false alarm rates; (4) trajectories of threatening objects; (5) means of dissemination and fusion of data; and (6) various network configurations. By use of STAMiNA, one can simulate detection of targets through sensing, dissemination of information by various wireless communication subsystems under various scenarios, and fusion of information, incorporating such metrics as target-detection probabilities, false-alarm rates, and communication loads, and capturing effects of terrain and threat.
Deployable wavelength optimizer for multi-laser sensing and communication undersea
NASA Astrophysics Data System (ADS)
Neuner, Burton; Hening, Alexandru; Pascoguin, B. Melvin; Dick, Brian; Miller, Martin; Tran, Nghia; Pfetsch, Michael
2017-05-01
This effort develops and tests algorithms and a user-portable optical system designed to autonomously optimize the laser communication wavelength in open and coastal oceans. In situ optical meteorology and oceanography (METOC) data gathered and analyzed as part of the auto-selection process can be stored and forwarded. The system performs closed-loop optimization of three visible-band lasers within one minute by probing the water column via passive retroreflector and polarization optics, selecting the ideal wavelength, and enabling high-speed communication. Backscattered and stray light is selectively blocked by employing polarizers and wave plates, thus increasing the signal-to-noise ratio. As an advancement in instrumentation, we present autonomy software and portable hardware, and demonstrate this new system in two environments: ocean bay seawater and outdoor test pool freshwater. The next generation design is also presented. Once fully miniaturized, the optical payload and software will be ready for deployment on manned and unmanned platforms such as buoys and vehicles. Gathering timely and accurate ocean sensing data in situ will dramatically increase the knowledge base and capabilities for environmental sensing, defense, and industrial applications. Furthermore, communicating on the optimal channel increases transfer rates, propagation range, and mission length, all while reducing power consumption in undersea platforms.
SkinScan©: A PORTABLE LIBRARY FOR MELANOMA DETECTION ON HANDHELD DEVICES
Wadhawan, Tarun; Situ, Ning; Lancaster, Keith; Yuan, Xiaojing; Zouridakis, George
2011-01-01
We have developed a portable library for automated detection of melanoma termed SkinScan© that can be used on smartphones and other handheld devices. Compared to desktop computers, embedded processors have limited processing speed, memory, and power, but they have the advantage of portability and low cost. In this study we explored the feasibility of running a sophisticated application for automated skin cancer detection on an Apple iPhone 4. Our results demonstrate that the proposed library with the advanced image processing and analysis algorithms has excellent performance on handheld and desktop computers. Therefore, deployment of smartphones as screening devices for skin cancer and other skin diseases can have a significant impact on health care delivery in underserved and remote areas. PMID:21892382
NASA Astrophysics Data System (ADS)
Harris, A. T.; Ramachandran, R.; Maskey, M.
2013-12-01
The Exelis-developed IDL and ENVI software are ubiquitous tools in Earth science research environments. The IDL Workbench is used by the Earth science community for programming custom data analysis and visualization modules. ENVI is a software solution for processing and analyzing geospatial imagery that combines support for multiple Earth observation scientific data types (optical, thermal, multi-spectral, hyperspectral, SAR, LiDAR) with advanced image processing and analysis algorithms. The ENVI & IDL Services Engine (ESE) is an Earth science data processing engine that allows researchers to use open standards to rapidly create, publish and deploy advanced Earth science data analytics within any existing enterprise infrastructure. Although powerful in many ways, the tools lack collaborative features out of the box. Thus, as part of the NASA funded project, Collaborative Workbench to Accelerate Science Algorithm Development, researchers at the University of Alabama in Huntsville and Exelis have developed plugins that allow seamless research collaboration from within the IDL Workbench. Such additional features are possible because the IDL Workbench is built using the Eclipse Rich Client Platform (RCP). RCP applications allow custom plugins to be dropped in for extended functionality. Specific functionalities of the plugins include creating complex workflows based on IDL application source code, submitting workflows to be executed by ESE in the cloud, and sharing and cloning of workflows among collaborators. All these functionalities are available to scientists without leaving their IDL Workbench. Because ESE can interoperate with any middleware, scientific programmers can readily string together IDL processing tasks (or tasks written in other languages like C++, Java or Python) to create complex workflows for deployment within their current enterprise architecture (e.g. ArcGIS Server, GeoServer, Apache ODE or SciFlo from JPL).
Using the collaborative IDL Workbench, coupled with ESE for execution in the cloud, asynchronous workflows could be executed in batch mode on large data in the cloud. We envision that a scientist will initially develop a scientific workflow locally on a small set of data. Once tested, the scientist will deploy the workflow to the cloud for execution. Depending on the results, the scientist may share the workflow and results, allowing them to be stored in a community catalog and instantly loaded into the IDL Workbench of other scientists. Thereupon, scientists can clone and modify or execute the workflow with different input parameters. The Collaborative Workbench will provide a platform for collaboration in the cloud, helping Earth scientists solve big-data problems in the Earth and planetary sciences.
A 13 and 35 GHz Interferometer for Hydrologic, Cryospheric and Vegetation Applications
NASA Astrophysics Data System (ADS)
Siqueira, P. R.; Swochak, A.
2011-12-01
As part of the NASA-sponsored technology development for the Surface Water and Ocean Topography (SWOT) Mission, we have developed a high performance two-channel RF downconverter with high bandwidth, interchannel isolation, and phase accuracy for use as a core component in the mission's interferometer. To advance the technology readiness level of the system, the downconverters at Ku- and Ka-band have been incorporated into a ground-based interferometric system and deployed from local mountain ranges. Imaged targets include river systems, open fields, and forest stands. Because the system is ground-based, full-day measurements have been made of regions where 120 degree sector scan times can be as short as 20 minutes, sufficient for characterizing soil moisture and changes in the atmospheric water vapor. Because both interferometric systems can be deployed at the same time, penetration depth differences between the two frequencies can be explored. Such measurements will provide a new way to estimate the structural and grain size characteristics of surface snow and ice. In this paper we present details about the interferometric systems and results derived from their deployment in the Connecticut River valley of Western Massachusetts. A description of the hardware, observing strategy and processing algorithms will be given. It is shown how the two systems have been used to measure the local topography at high resolution and to observe the diurnal behaviour of moisture and water vapor over fields and forests.
Advanced Deployable Structural Systems for Small Satellites
NASA Technical Reports Server (NTRS)
Belvin, W. Keith; Straubel, Marco; Wilkie, W. Keats; Zander, Martin E.; Fernandez, Juan M.; Hillebrandt, Martin F.
2016-01-01
One of the key challenges for small satellites is the packaging and reliable deployment of structural booms and arrays used for power, communication, and scientific instruments. The lack of reliable and efficient boom and membrane deployment concepts for small satellites is addressed in this work through a collaborative project between NASA and DLR. The paper provides a state-of-the-art overview of existing spacecraft deployable appendages, the special requirements for small satellites, and initial concepts for deployable booms and arrays needed for various small satellite applications. The goal is to enhance deployable boom predictability and ground testability, develop designs that are tolerant of manufacturing imperfections, and incorporate simple and reliable deployment systems.
Wang, Chang; Qi, Fei; Shi, Guangming; Wang, Xiaotian
2013-01-01
Deployment is a critical issue affecting the quality of service of camera networks. The deployment aims at adopting the least number of cameras to cover the whole scene, which may have obstacles to occlude the line of sight, with expected observation quality. This is generally formulated as a non-convex optimization problem, which is hard to solve in polynomial time. In this paper, we propose an efficient convex solution for deployment optimizing the observation quality based on a novel anisotropic sensing model of cameras, which provides a reliable measurement of the observation quality. The deployment is formulated as the selection of a subset of nodes from a redundant initial deployment with numerous cameras, which is an ℓ0 minimization problem. Then, we relax this non-convex optimization to a convex ℓ1 minimization employing the sparse representation. Therefore, the high quality deployment is efficiently obtained via convex optimization. Simulation results confirm the effectiveness of the proposed camera deployment algorithms. PMID:23989826
A new real-time tsunami detection algorithm
NASA Astrophysics Data System (ADS)
Chierici, F.; Embriaco, D.; Pignagnoli, L.
2016-12-01
Real-time tsunami detection algorithms play a key role in any Tsunami Early Warning System. We have developed a new algorithm for tsunami detection based on real-time tide removal and real-time band-pass filtering of sea-bed pressure recordings. The algorithm greatly increases the tsunami detection probability, shortens the detection delay and enhances detection reliability, at low computational cost. The algorithm is designed to be used also in autonomous early warning systems, with a set of input parameters and procedures which can be reconfigured in real time. We have also developed a methodology based on Monte Carlo simulations to test tsunami detection algorithms. The algorithm performance is estimated by defining and evaluating statistical parameters, namely the detection probability and the detection delay, which are functions of the tsunami amplitude and wavelength, and the rate of false alarms. Pressure data sets acquired by Bottom Pressure Recorders in different locations and environmental conditions have been used in order to consider real working scenarios in the test. We also present an application of the algorithm to the tsunami event which occurred at Haida Gwaii on October 28th, 2012, using data recorded by the Bullseye underwater node of Ocean Networks Canada. The algorithm successfully ran for test purposes in year-long missions onboard the GEOSTAR stand-alone multidisciplinary abyssal observatory, deployed in the Gulf of Cadiz during the EC project NEAREST, and on the NEMO-SN1 cabled observatory deployed in the Western Ionian Sea, an operational node of the European research infrastructure EMSO.
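A crude causal sketch of the tide-removal-plus-threshold idea: the slowly varying tide is predicted by linear extrapolation from two trailing moving averages, and the residual is thresholded. The window widths, sampling rate, synthetic signal, and threshold below are illustrative assumptions, not the authors' filter design.

```python
import numpy as np

def detect(pressure, w, thresh):
    """Return index of first sample whose de-tided residual exceeds thresh.

    The tide at sample i is predicted by linear extrapolation from two
    trailing moving averages of width w; a tsunami rises faster than the
    tide and leaves a residual the prediction cannot follow.
    """
    c = np.concatenate(([0.0], np.cumsum(pressure)))
    i = np.arange(2 * w, len(pressure))
    avg1 = (c[i] - c[i - w]) / w            # most recent w samples
    avg0 = (c[i - w] - c[i - 2 * w]) / w    # the w samples before those
    resid = pressure[i] - (1.5 * avg1 - 0.5 * avg0)
    hits = np.flatnonzero(np.abs(resid) > thresh)
    return int(i[hits[0]]) if hits.size else None

# Synthetic 5 h record at 1 Hz: an M2-like tide plus a 10 cm tsunami
# pulse (sigma 300 s) arriving 4 h in.  All numbers are illustrative.
t = np.arange(18000.0)
tide = 1.0 * np.sin(2 * np.pi * t / 44712.0)
pulse = 0.10 * np.exp(-0.5 * ((t - 14400.0) / 300.0) ** 2)
idx = detect(tide + pulse, w=600, thresh=0.02)
quiet = detect(tide, w=600, thresh=0.02)
```

With these settings the tide-only record never crosses the threshold (`quiet` is None), while the pulse is flagged shortly after its leading edge; the real algorithm's reconfigurable parameters play the role of `w` and `thresh` here.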
Planning Flight Paths of Autonomous Aerobots
NASA Technical Reports Server (NTRS)
Kulczycki, Eric; Elfes, Alberto; Sharma, Shivanjli
2009-01-01
Algorithms for planning flight paths of autonomous aerobots (robotic blimps) to be deployed in scientific exploration of remote planets are undergoing development. These algorithms are also adaptable to terrestrial applications involving robotic submarines as well as aerobots and other autonomous aircraft used to acquire scientific data or to perform surveying or monitoring functions.
Intelligent video storage of visual evidences on site in fast deployment
NASA Astrophysics Data System (ADS)
Desurmont, Xavier; Bastide, Arnaud; Delaigle, Jean-Francois
2004-07-01
In this article we present a generic, flexible, scalable and robust approach for an intelligent real-time forensic visual system. The proposed implementation can be rapidly deployed and requires minimal logistic support, as it embeds low complexity devices (PCs and cameras) that communicate through a wireless network. The goal of these advanced tools is to provide intelligent video storage of potential video evidence for fast intervention during deployment around a hazardous sector after a terrorist attack, a disaster, or an air crash, or before an attempted one. Advanced video analysis tools, such as segmentation and tracking, are provided to support intelligent storage and annotation.
Emergency EDAPTS retainer support.
DOT National Transportation Integrated Search
2007-06-01
The Efficient Deployment of Advanced Transportation Systems (EDAPTS) Smart Transit System Project : required various quick-response deployment support activities over the 26-month period from April 18, 2005 : to June 30, 2007. These activities requir...
An advanced technique for the prediction of decelerator system dynamics.
NASA Technical Reports Server (NTRS)
Talay, T. A.; Morris, W. D.; Whitlock, C. H.
1973-01-01
An advanced two-body six-degree-of-freedom computer model employing an indeterminate structures approach has been developed for the parachute deployment process. The program determines both vehicular and decelerator responses to aerodynamic and physical property inputs. A better insight into the dynamic processes that occur during parachute deployment has been developed. The model is of value in sensitivity studies to isolate important parameters that affect the vehicular response.
NASA Astrophysics Data System (ADS)
Krawczyk, Rafał D.; Czarski, Tomasz; Kolasiński, Piotr; Linczuk, Paweł; Poźniak, Krzysztof T.; Chernyshova, Maryna; Kasprowicz, Grzegorz; Wojeński, Andrzej; Zabolotny, Wojciech; Zienkiewicz, Paweł
2016-09-01
This article is an overview of what has been implemented in the process of developing and testing the GEM-detector-based acquisition system in terms of post-processing algorithms. Information is given on MEX functions for extended statistics collection, unified hex topology and the optimized S-DAQ algorithm for splitting overlapped signals. An additional discussion of bottlenecks and major factors concerning optimization is presented.
QUEST: Eliminating Online Supervised Learning for Efficient Classification Algorithms.
Zwartjes, Ardjan; Havinga, Paul J M; Smit, Gerard J M; Hurink, Johann L
2016-10-01
In this work, we introduce QUEST (QUantile Estimation after Supervised Training), an adaptive classification algorithm for Wireless Sensor Networks (WSNs) that eliminates the necessity for online supervised learning. Online processing is important for many sensor network applications. Transmitting raw sensor data puts high demands on the battery, reducing network lifetime. By merely transmitting partial results or classifications based on the sampled data, the amount of traffic on the network can be significantly reduced. Such classifications can be made by learning based algorithms using sampled data. An important issue, however, is the training phase of these learning based algorithms. Training a deployed sensor network requires a lot of communication and an impractical amount of human involvement. QUEST is a hybrid algorithm that combines supervised learning in a controlled environment with unsupervised learning at the location of deployment. Using the SITEX02 dataset, we demonstrate that the presented solution works with a performance penalty of less than 10% in 90% of the tests. Under some circumstances, it even outperforms a network of classifiers completely trained with supervised learning. As a result, the need for on-site supervised learning and communication for training is completely eliminated by our solution.
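The QUEST idea, learning a quantile level offline with labels and then re-estimating that quantile unsupervised at the deployment site, can be sketched for a two-class, one-feature case. This is a minimal reading of the abstract, not the published algorithm, and all data are synthetic.

```python
def train_quantile(features, labels):
    """Offline, supervised: pick the threshold that best separates the two
    classes, then store it as a quantile LEVEL of the feature distribution
    (the part assumed to transfer to a new environment)."""
    best_t, best_acc = features[0], -1.0
    for t in sorted(features):
        acc = sum((f > t) == bool(y) for f, y in zip(features, labels)) / len(labels)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return sum(f <= best_t for f in features) / len(features)

def deploy_threshold(unlabeled, q):
    """On site, unsupervised: the q-th quantile of unlabeled local data
    becomes the decision threshold -- no on-site labels required."""
    xs = sorted(unlabeled)
    return xs[min(len(xs) - 1, int(q * len(xs)))]

# Controlled training environment: two well-separated clusters.
train_x = [0.5 + 0.04 * i for i in range(25)] + [2.5 + 0.04 * i for i in range(25)]
train_y = [0] * 25 + [1] * 25
q = train_quantile(train_x, train_y)

# Deployment site: same class structure, but every reading shifted by +10.
field_x = [x + 10.0 for x in train_x]
thr = deploy_threshold(field_x, q)
classify = lambda f: int(f > thr)
```

The learned quantile level survives the environmental shift even though the raw threshold does not, which is what removes the need for on-site labeled training.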
Cui, Yong; Wang, Qiusheng; Yuan, Haiwen; Song, Xiao; Hu, Xuemin; Zhao, Luxing
2015-01-01
In the wireless sensor networks (WSNs) for electric field measurement system under the High-Voltage Direct Current (HVDC) transmission lines, it is necessary to obtain the electric field distribution with multiple sensors. The location information of each sensor is essential to the correct analysis of measurement results. Compared with the existing approach which gathers the location information by manually labelling sensors during deployment, the automatic localization can reduce the workload and improve the measurement efficiency. A novel and practical range-free localization algorithm for the localization of one-dimensional linear topology wireless networks in the electric field measurement system is presented. The algorithm utilizes unknown nodes' neighbor lists based on the Received Signal Strength Indicator (RSSI) values to determine the relative locations of nodes. The algorithm is able to handle the exceptional situation of the output permutation which can effectively improve the accuracy of localization. The performance of this algorithm under real circumstances has been evaluated through several experiments with different numbers of nodes and different node deployments in the China State Grid HVDC test base. Results show that the proposed algorithm achieves an accuracy of over 96% under different conditions. PMID:25658390
Cui, Yong; Wang, Qiusheng; Yuan, Haiwen; Song, Xiao; Hu, Xuemin; Zhao, Luxing
2015-02-04
In the wireless sensor networks (WSNs) for electric field measurement system under the High-Voltage Direct Current (HVDC) transmission lines, it is necessary to obtain the electric field distribution with multiple sensors. The location information of each sensor is essential to the correct analysis of measurement results. Compared with the existing approach which gathers the location information by manually labelling sensors during deployment, the automatic localization can reduce the workload and improve the measurement efficiency. A novel and practical range-free localization algorithm for the localization of one-dimensional linear topology wireless networks in the electric field measurement system is presented. The algorithm utilizes unknown nodes' neighbor lists based on the Received Signal Strength Indicator (RSSI) values to determine the relative locations of nodes. The algorithm is able to handle the exceptional situation of the output permutation which can effectively improve the accuracy of localization. The performance of this algorithm under real circumstances has been evaluated through several experiments with different numbers of nodes and different node deployments in the China State Grid HVDC test base. Results show that the proposed algorithm achieves an accuracy of over 96% under different conditions.
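A noiseless toy sketch of relative ordering from RSSI on a 1-D line: the weakest pairwise link identifies the two end nodes, and a greedy strongest-signal walk from one end recovers the order. The published algorithm works from neighbor lists and handles permutation exceptions; this simplification assumes RSSI decreases monotonically with distance, and the node positions are hypothetical.

```python
def order_line(rssi):
    """Recover relative node order on a 1-D line from pairwise RSSI.

    rssi[i][j] is the signal strength of node j heard at node i (higher
    means closer).  The weakest link in the network joins the two end
    nodes; a greedy strongest-neighbor walk from one end yields the order.
    """
    n = len(rssi)
    _, start, _ = min((rssi[i][j], i, j)
                      for i in range(n) for j in range(n) if i != j)
    order, remaining = [start], set(range(n)) - {start}
    while remaining:
        nxt = max(remaining, key=lambda j: rssi[order[-1]][j])
        order.append(nxt)
        remaining.remove(nxt)
    return order

# Hypothetical line positions for node ids 0..4 (unknown to the algorithm);
# ideal noiseless RSSI falls off linearly with distance.
pos = {0: 12.0, 1: 0.0, 2: 20.0, 3: 3.0, 4: 7.0}
rssi = [[-abs(pos[i] - pos[j]) for j in range(5)] for i in range(5)]
order = order_line(rssi)
```

The recovered sequence (or its reverse) gives each sensor's relative location along the transmission-line corridor without any manual labelling at deployment time.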
Space station structures development
NASA Technical Reports Server (NTRS)
Teller, V. B.
1986-01-01
A study of three interrelated tasks focusing on deployable Space Station truss structures is discussed. Task 1, the development of an alternate deployment system for linear truss, resulted in the preliminary design of an in-space reloadable linear motor deployer. Task 2, advanced composites deployable truss development, resulted in the testing and evaluation of composite materials for struts used in a deployable linear truss. Task 3, assembly of structures in space/erectable structures, resulted in the preliminary design of Space Station pressurized module support structures. An independent, redundant support system was developed for the common United States modules.
2014-02-11
ISS038-E-044887 (11 Feb. 2014) --- The Small Satellite Orbital Deployer (SSOD), in the grasp of the Kibo laboratory robotic arm, is photographed by an Expedition 38 crew member on the International Space Station as it deploys a set of NanoRacks CubeSats. The CubeSats program contains a variety of experiments such as Earth observations and advanced electronics testing.
2014-02-11
ISS038-E-044889 (11 Feb. 2014) --- The Small Satellite Orbital Deployer (SSOD), in the grasp of the Kibo laboratory robotic arm, is photographed by an Expedition 38 crew member on the International Space Station as it deploys a set of NanoRacks CubeSats. The CubeSats program contains a variety of experiments such as Earth observations and advanced electronics testing.
2014-02-11
ISS038-E-044890 (11 Feb. 2014) --- The Small Satellite Orbital Deployer (SSOD), in the grasp of the Kibo laboratory robotic arm, is photographed by an Expedition 38 crew member on the International Space Station as it deploys a set of NanoRacks CubeSats. The CubeSats program contains a variety of experiments such as Earth observations and advanced electronics testing.
Advanced public transportation system deployment in the United States
DOT National Transportation Integrated Search
1999-01-01
This report documents work performed under FTA's Advanced Public Transportation Systems (APTS) Program, a program structured to undertake research and development of innovative applications of advanced navigation, information, and communication techn...
Advanced Opto-Electronics (LIDAR and Microsensor Development)
NASA Technical Reports Server (NTRS)
Vanderbilt, Vern C. (Technical Monitor); Spangler, Lee H.
2005-01-01
Our overall intent in this aspect of the project was to establish a collaborative effort between several departments at Montana State University to develop advanced optoelectronic technology that advances the state of the art in optical remote sensing of the environment. Our particular focus was on the development of small systems that can eventually be used in a wide variety of applications, including ground-, air-, and space-based deployments, possibly in sensor networks. Specific objectives were to: 1) Build a field-deployable direct-detection lidar system for use in measurements of clouds, aerosols, fish, and vegetation; 2) Develop a breadboard prototype water vapor differential absorption lidar (DIAL) system based on highly stable, tunable diode laser technology developed previously at MSU. We accomplished both primary objectives of this project, developing a field-deployable direct-detection lidar and a breadboard prototype of a water vapor DIAL system. This paper summarizes each of these accomplishments.
The USAID-NREL Partnership: Delivering Clean, Reliable, and Affordable Power in the Developing World
DOE Office of Scientific and Technical Information (OSTI.GOV)
Watson, Andrea C; Leisch, Jennifer E
The U.S. Agency for International Development (USAID) and the National Renewable Energy Laboratory (NREL) are partnering to support clean, reliable, and affordable power in the developing world. The USAID-NREL Partnership helps countries with policy, planning, and deployment support for advanced energy technologies. Through this collaboration, USAID is accessing advanced energy expertise and analysis pioneered by the U.S. National Laboratory system. The Partnership addresses critical aspects of advanced energy systems including renewable energy deployment, grid modernization, distributed energy resources and storage, power sector resilience, and the data and analytical tools needed to support them.
The ADVANCE project : formal evaluation of the targeted deployment. Volume 1
DOT National Transportation Integrated Search
1997-01-01
The Advanced Driver and Vehicle Advisory Navigation ConcEpt (ADVANCE) was an in-vehicle advanced traveler information system (ATIS) that operated in the northwest suburbs of Chicago, Illinois. It was designed to provide origin-destination shortest-ti...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kurt Derr
Mobile Ad hoc NETworks (MANETs) are distributed self-organizing networks that can change locations and configure themselves on the fly. This paper focuses on an algorithmic approach for the deployment of a MANET within an enclosed area, such as a building in a disaster scenario, which can provide a robust communication infrastructure for search and rescue operations. While a virtual spring mesh (VSM) algorithm provides the scalable, self-organizing, and fault-tolerant capabilities required by a MANET, the VSM lacks deployment mechanisms for blanket coverage of an area and does not provide an obstacle avoidance mechanism. This paper presents a new technique, an extended VSM (EVSM) algorithm, that provides the following novelties: (1) new control laws for exploration and expansion to provide blanket coverage, (2) virtual adaptive springs enabling the mesh to expand as necessary, (3) adaptation to communications disturbances by varying the density and movement of mobile nodes, and (4) new metrics to assess the performance of the EVSM algorithm. Simulation results show that EVSM provides up to 16% more coverage and is 3.5 times faster than VSM in environments with eight obstacles.
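EVSM's control laws are not given in the abstract; the sketch below shows only the basic virtual-spring idea in one dimension, with hypothetical rest length and spring constant: each node moves under Hooke forces from springs to its chain neighbors, so a clustered deployment relaxes toward uniform spacing.

```python
# Hedged 1-D sketch of the virtual-spring-mesh idea (illustrative, not the
# paper's EVSM control laws): each node takes a synchronous relaxation step
# along the net Hooke force from springs to its immediate chain neighbors.

def spring_step(positions, rest_len=10.0, k=0.2):
    """One synchronous relaxation step; rest_len and k are hypothetical."""
    new = list(positions)
    for i in range(len(positions)):
        force = 0.0
        if i > 0:                                  # spring to left neighbor
            force -= k * ((positions[i] - positions[i - 1]) - rest_len)
        if i < len(positions) - 1:                 # spring to right neighbor
            force += k * ((positions[i + 1] - positions[i]) - rest_len)
        new[i] = positions[i] + force
    return new

nodes = [0.0, 1.0, 2.0, 3.0]                       # nodes start clustered
for _ in range(300):
    nodes = spring_step(nodes)
spacings = [round(b - a, 1) for a, b in zip(nodes, nodes[1:])]
print(spacings)  # → [10.0, 10.0, 10.0]
```

The mesh expands until every spring is at its rest length; EVSM additionally adapts the rest length and adds exploration forces for blanket coverage and obstacle avoidance.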
al-Rifaie, Mohammad Majid; Aber, Ahmed; Hemanth, Duraiswamy Jude
2015-12-01
This study proposes an umbrella deployment of swarm intelligence algorithm, such as stochastic diffusion search for medical imaging applications. After summarising the results of some previous works which shows how the algorithm assists in the identification of metastasis in bone scans and microcalcifications on mammographs, for the first time, the use of the algorithm in assessing the CT images of the aorta is demonstrated along with its performance in detecting the nasogastric tube in chest X-ray. The swarm intelligence algorithm presented in this study is adapted to address these particular tasks and its functionality is investigated by running the swarms on sample CT images and X-rays whose status have been determined by senior radiologists. In addition, a hybrid swarm intelligence-learning vector quantisation (LVQ) approach is proposed in the context of magnetic resonance (MR) brain image segmentation. The particle swarm optimisation is used to train the LVQ which eliminates the iteration-dependent nature of LVQ. The proposed methodology is used to detect the tumour regions in the abnormal MR brain images.
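The abstract mentions particle swarm optimisation as the trainer for the LVQ network. As a self-contained illustration of plain PSO only (not the paper's hybrid; inertia and pull coefficients are hypothetical), the sketch below minimises a toy one-dimensional function.

```python
# Minimal particle swarm optimisation sketch (illustrative; the study uses
# PSO to train an LVQ network, here it just minimises a toy 1-D objective).
import random

def pso(objective, lo, hi, n_particles=20, iters=100, seed=1):
    rng = random.Random(seed)
    pos = [rng.uniform(lo, hi) for _ in range(n_particles)]
    vel = [0.0] * n_particles
    pbest = list(pos)                      # each particle's best position
    gbest = min(pos, key=objective)        # swarm's best position
    for _ in range(iters):
        for i in range(n_particles):
            r1, r2 = rng.random(), rng.random()
            vel[i] = (0.7 * vel[i]                      # inertia
                      + 1.5 * r1 * (pbest[i] - pos[i])  # cognitive pull
                      + 1.5 * r2 * (gbest - pos[i]))    # social pull
            pos[i] += vel[i]
            if objective(pos[i]) < objective(pbest[i]):
                pbest[i] = pos[i]
                if objective(pos[i]) < objective(gbest):
                    gbest = pos[i]
    return gbest

best = pso(lambda x: (x - 3.0) ** 2, -10, 10)
print(round(best, 2))  # ≈ 3.0
```

In the PSO-LVQ hybrid, each particle would instead encode a full set of LVQ codebook vectors, removing the iteration-dependent learning rate of standard LVQ training.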
Next Generation Seismic Imaging; High Fidelity Algorithms and High-End Computing
NASA Astrophysics Data System (ADS)
Bevc, D.; Ortigosa, F.; Guitton, A.; Kaelin, B.
2007-05-01
The rich oil reserves of the Gulf of Mexico are buried in deep and ultra-deep waters up to 30,000 feet from the surface. Minerals Management Service (MMS), the federal agency in the U.S. Department of the Interior that manages the nation's oil, natural gas and other mineral resources on the outer continental shelf in federal offshore waters, estimates that the Gulf of Mexico holds 37 billion barrels of "undiscovered, conventionally recoverable" oil, which, at $50/barrel, would be worth approximately $1.85 trillion. These reserves are very difficult to find and reach due to the extreme depths. Technological advances in seismic imaging represent an opportunity to overcome this obstacle by providing more accurate models of the subsurface. Among these technological advances, Reverse Time Migration (RTM) yields the best possible images. RTM is based on the solution of the two-way acoustic wave-equation. This technique relies on the velocity model to image turning waves. These turning waves are particularly important to unravel subsalt reservoirs and delineate salt-flanks, a natural trap for oil and gas. Because it relies on an accurate velocity model, RTM opens new frontier in designing better velocity estimation algorithms. RTM has been widely recognized as the next chapter in seismic exploration, as it can overcome the limitations of current migration methods in imaging complex geologic structures that exist in the Gulf of Mexico. The chief impediment to the large-scale, routine deployment of RTM has been a lack of sufficient computer power. RTM needs thirty times the computing power used in exploration today to be commercially viable and widely usable. Therefore, advancing seismic imaging to the next level of precision poses a multi-disciplinary challenge.
To overcome these challenges, the Kaleidoscope project, a partnership between Repsol YPF, Barcelona Supercomputing Center, 3DGeo Inc., and IBM brings together the necessary components of modeling, algorithms and the uniquely powerful computing power of the MareNostrum supercomputer in Barcelona to realize the promise of RTM, incorporate it into daily processing flows, and to help solve exploration problems in a highly cost-effective way. Uniquely, the Kaleidoscope Project is simultaneously integrating software (algorithms) and hardware (Cell BE), steps that are traditionally taken sequentially. This unique integration of software and hardware will accelerate seismic imaging by several orders of magnitude compared to conventional solutions running on standard Linux Clusters.
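RTM propagates a source wavefield forward and a receiver wavefield backward in time through the two-way acoustic wave equation and cross-correlates them. The sketch below shows only the propagator kernel at its heart, a 1-D finite-difference solution of u_tt = c^2 u_xx with hypothetical grid parameters, not the Kaleidoscope implementation.

```python
# Hedged sketch of the kernel at the heart of RTM: a second-order
# finite-difference propagator for the two-way acoustic wave equation
# u_tt = c^2 u_xx in 1-D. Grid sizes and velocity are illustrative.

def propagate(nx=200, nt=300, c=1500.0, dx=5.0, dt=0.001):
    u_prev = [0.0] * nx
    u_curr = [0.0] * nx
    u_curr[nx // 2] = 1.0                      # impulsive source mid-grid
    r2 = (c * dt / dx) ** 2                    # squared Courant number (here 0.09, stable)
    for _ in range(nt):
        u_next = [0.0] * nx
        for i in range(1, nx - 1):
            u_next[i] = (2 * u_curr[i] - u_prev[i]
                         + r2 * (u_curr[i + 1] - 2 * u_curr[i] + u_curr[i - 1]))
        u_prev, u_curr = u_curr, u_next
    return u_curr

u = propagate()   # two pulses travel symmetrically away from the source
```

Production RTM runs this in 3-D over billions of cells per shot, which is why the abstract cites a thirty-fold computing-power gap.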
Development of a verification program for deployable truss advanced technology
NASA Technical Reports Server (NTRS)
Dyer, Jack E.
1988-01-01
Use of large deployable space structures to satisfy the growth demands of space systems is contingent upon reducing the associated risks that pervade many related technical disciplines. The overall objective of this program was to develop a detailed plan to verify deployable truss advanced technology applicable to future large space structures and to develop a preliminary design of a deployable truss reflector/beam structure for use as a technology demonstration test article. The planning is based on a Shuttle flight experiment program using deployable 5 and 15 meter aperture tetrahedral truss reflectors and a 20 m long deployable truss beam structure. The plan addresses validation of analytical methods, the degree to which ground testing adequately simulates flight, and in-space testing requirements for large precision antenna designs. Based on an assessment of future NASA and DOD space system requirements, the program was developed to verify four critical technology areas: deployment, shape accuracy and control, pointing and alignment, and articulation and maneuvers. The flight experiment technology verification objectives can be met using two Shuttle flights with the total experiment integrated on a single Shuttle Test Experiment Platform (STEP) and a Mission Peculiar Experiment Support Structure (MPESS). First flight of the experiment can be achieved 60 months after go-ahead, with a total program duration of 90 months.
Technology readiness levels for advanced nuclear fuels and materials development
Carmack, W. J.; Braase, L. A.; Wigeland, R. A.; ...
2016-12-23
The Technology Readiness Level (TRL) process is used to quantitatively assess the maturity of a given technology. It was pioneered by the National Aeronautics and Space Administration (NASA) in the 1980s to develop and deploy new systems for space applications. The process was subsequently adopted by the Department of Defense (DoD) to develop and deploy new technology and systems for defense applications as well as the Department of Energy (DOE) to evaluate the maturity of new technologies in major construction projects. Advanced nuclear fuels and materials development is a critical technology needed for improving the performance and safety of current and advanced reactors, and ultimately closing the nuclear fuel cycle. Because deployment of new nuclear fuel forms requires a lengthy and expensive research, development, and demonstration program, applying the TRL concept to the advanced fuel development program is very useful as a management, communication and tracking tool. Furthermore, this article provides examples regarding the methods by which TRLs are currently used to assess the maturity of nuclear fuels and materials under development in the DOE Fuel Cycle Research and Development (FCRD) Program within the Advanced Fuels Campaign (AFC).
Demonstrations of Deployable Systems for Robotic Precursor Missions
NASA Technical Reports Server (NTRS)
Dervan, J.; Johnson, L.; Lockett, T.; Carr, J.; Boyd, D.
2017-01-01
NASA is developing thin-film based, deployable propulsion, power, and communication systems for small spacecraft that serve as enabling technologies for exploration of the solar system. By leveraging recent advancements in thin films, photovoltaics, deployment systems, and miniaturized electronics, new mission-level capabilities will be demonstrated aboard small spacecraft enabling a new generation of frequent, inexpensive, and highly capable robotic precursor missions with goals extensible to future human exploration. Specifically, thin-film technologies are allowing the development and use of solar sails for propulsion, small, lightweight photovoltaics for power, and omnidirectional antennas for communication as demonstrated by recent advances on the Near Earth Asteroid (NEA) Scout and Lightweight Integrated Solar Array and anTenna (LISA-T) projects.
Finite-Time Performance of Local Search Algorithms: Theory and Application
2010-06-10
security devices deployed at airport security checkpoints are used to detect prohibited items (e.g., guns, knives, explosives). Each security device...security devices are deployed, the practical issue of determining how to optimally use them can be difficult. For an airport security system design...checked baggage), explosive detection systems (designed to detect explosives in checked baggage), and detailed hand search by an airport security official
NASA Technical Reports Server (NTRS)
Stokes, J. W.; Pruett, E. C.
1980-01-01
A cost algorithm for predicting assembly costs for large space structures is given. Assembly scenarios are summarized which describe the erection, deployment, and fabrication tasks for five large space structures. The major activities that impact total costs for structure assembly from launch through deployment and assembly to scientific instrument installation and checkout are described. Individual cost elements such as assembly fixtures, handrails, or remote manipulators are also presented.
2014-02-11
ISS038-E-044883 (11 Feb. 2014) --- The Small Satellite Orbital Deployer (SSOD), in the grasp of the Kibo laboratory robotic arm, is photographed by an Expedition 38 crew member on the International Space Station as it begins the deployment of a set of NanoRacks CubeSats. The CubeSats program contains a variety of experiments such as Earth observations and advanced electronics testing.
2014-02-11
ISS038-E-044994 (11 Feb. 2014) --- The Small Satellite Orbital Deployer (SSOD), in the grasp of the Kibo laboratory robotic arm, is photographed by an Expedition 38 crew member on the International Space Station prior to the deployment of a set of NanoRacks CubeSats. The CubeSats program contains a variety of experiments such as Earth observations and advanced electronics testing.
Advanced public transportation systems deployment in the United States : year 2002 update
DOT National Transportation Integrated Search
2003-06-01
This report documents work performed under the Federal Transit Administration's Advanced Public Transportation Systems (APTS) Program, a program structured to undertake research and development of innovative applications of advanced navigation, infor...
Advanced public transportation systems deployment in the United States : year 2000 update
DOT National Transportation Integrated Search
2002-05-01
This report documents work performed under the Federal Transit Administration's Advanced Public Transportation Systems (APTS) Program, a program structured to undertake research and development of innovative applications of advanced navigation, infor...
Advanced public transportation systems deployment in the United States : year 2004 update
DOT National Transportation Integrated Search
2005-06-01
This report documents work performed under the Federal Transit Administration's Advanced Public Transportation Systems (APTS) Program, a program structured to undertake research and development of innovative applications of advanced navigation, infor...
Advanced Public Transportation Systems Deployment in the United States. Update, January 1999
DOT National Transportation Integrated Search
1999-01-01
This report documents work performed under FTA's Advanced Public Transportation Systems (APTS) Program, a program structured to undertake research and development of innovative applications of advanced navigation, information, and communication techn...
Shape accuracy optimization for cable-rib tension deployable antenna structure with tensioned cables
NASA Astrophysics Data System (ADS)
Liu, Ruiwei; Guo, Hongwei; Liu, Rongqiang; Wang, Hongxiang; Tang, Dewei; Song, Xiaoke
2017-11-01
Shape accuracy is of substantial importance in deployable structures as the demand for large-scale deployable structures grows in various fields, especially aerospace engineering. The main purpose of this paper is to present a shape accuracy optimization method to find the optimal pretensions for the desired shape of a cable-rib tension deployable antenna structure with tensioned cables. First, an analysis model of the deployable structure is established using the finite element method. In this model, geometrical nonlinearity is considered for the cable element and beam element. Flexible deformations of the deployable structure under the action of the cable network and tensioned cables are subsequently analyzed separately. Moreover, the influence of the pretension of tensioned cables on natural frequencies is studied. Based on the results, a genetic algorithm is used to find a set of reasonable pretensions and thus minimize structural deformation under the first natural frequency constraint. Finally, numerical simulations are presented to analyze the deployable structure under two kinds of constraints. Results show that the shape accuracy and natural frequencies of the deployable structure can be effectively improved by pretension optimization.
NASA Astrophysics Data System (ADS)
Baránek, M.; Běhal, J.; Bouchal, Z.
2018-01-01
In the phase retrieval applications, the Gerchberg-Saxton (GS) algorithm is widely used for the simplicity of implementation. This iterative process can advantageously be deployed in the combination with a spatial light modulator (SLM) enabling simultaneous correction of optical aberrations. As recently demonstrated, the accuracy and efficiency of the aberration correction using the GS algorithm can be significantly enhanced by a vortex image spot used as the target intensity pattern in the iterative process. Here we present an optimization of the spiral phase modulation incorporated into the GS algorithm.
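The Gerchberg-Saxton iteration itself is standard: alternate between the SLM plane and the image plane, impose the known/target amplitude in each, and keep only the phase from each transform. A minimal sketch (not the paper's vortex-spot variant; grid size and target are illustrative):

```python
# Minimal Gerchberg-Saxton sketch: iterate between SLM and image planes,
# imposing the known source amplitude and the target image amplitude while
# retaining only the phase from each Fourier transform.
import numpy as np

def gerchberg_saxton(source_amp, target_amp, iterations=50):
    phase = np.zeros_like(source_amp)
    for _ in range(iterations):
        field = source_amp * np.exp(1j * phase)
        image = np.fft.fft2(field)
        image = target_amp * np.exp(1j * np.angle(image))  # impose target amplitude
        back = np.fft.ifft2(image)
        phase = np.angle(back)                             # keep phase only
    return phase

# Uniform illumination shaped into a single off-center far-field spot.
n = 64
src = np.ones((n, n))
tgt = np.zeros((n, n))
tgt[10, 20] = n                                            # target: one bright pixel
phase = gerchberg_saxton(src, tgt)
far = np.abs(np.fft.fft2(src * np.exp(1j * phase)))
print(divmod(int(far.argmax()), n))  # → (10, 20)
```

The paper's refinement replaces the plain target spot with a vortex image spot, which reportedly improves the accuracy and efficiency of the aberration correction.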
Biomorphic Explorers Leading Towards a Robotic Ecology
NASA Technical Reports Server (NTRS)
Thakoor, Sarita; Miralles, Carlos; Chao, Tien-Hsin
1999-01-01
This paper presents viewgraphs on biomorphic explorers and how they can provide extended survival and useful life for robots in a robotic ecology. The topics include: 1) Biomorphic Explorers; 2) Advanced Mobility for Biomorphic Explorers; 3) Biomorphic Explorers: Size Based Classification; 4) Biomorphic Explorers: Classification (Based on Mobility and Ambient Environment); 5) Biomorphic Flight Systems: Vision; 6) Biomorphic Glider Deployment Concept: Larger Glider Deploy/Local Relay; 7) Biomorphic Glider Deployment Concept: Balloon Deploy/Dual Relay; 8) Biomorphic Explorer: Conceptual Design; 9) Biomorphic Gliders; and 10) Applications.
Global Combat Support System Army Increment 1 (GCSS-A Inc 1)
2016-03-01
Acquisition Executive DoD - Department of Defense DoDAF - DoD Architecture Framework FD - Full Deployment FDD - Full Deployment Decision FY - Fiscal Year...another economic analysis was completed on November 14, 2012, in advance of a successful FDD . The program is now in the O&S Phase. GCSS-A Inc 1 2016...Increment I Feb 2011 Aug 2011 Full Deployment Decision ( FDD )1 Feb 2012 Dec 2012 Full Deployment (FD)2 Sep 2017 Mar 2018 Memo 1/ GCSS-A Increment 1
High-Capacity Communications from Martian Distances Part 2: Spacecraft Antennas and Power Systems
NASA Technical Reports Server (NTRS)
Hodges, Richard E.; Kodis, Mary Anne; Epp, Larry W.; Orr, Richard; Schuchman, Leonard; Collins, Michael; Sands, O. Scott; Vyas, Hemali; Williams, W. Dan
2006-01-01
This paper summarizes recent advances in antenna and power systems technology to enable a high data rate Ka-band Mars-to-Earth telecommunications system. Promising antenna technologies are lightweight, deployable space qualified structures at least 12-m in diameter (potentially up to 25-m). These technologies include deployable mesh reflectors, inflatable reflectarray and folded thermosetting composite. Advances in 1kW-class RF power amplifiers include both TWTA and SSPA technologies.
DOT National Transportation Integrated Search
1997-01-01
Intelligent transportation systems (ITS) are systems that utilize advanced technologies, including computer, communications and process control technologies, to improve the efficiency and safety of the transportation system. These systems encompass a...
Recommendations of the National Mayday Readiness Initiative
DOT National Transportation Integrated Search
2000-10-23
Automobile companies are rapidly deploying millions of vehicles with increasingly advanced : abilities to detect, collect and wirelessly transmit crisis-related voice and crash data at the push of a button or the deployment of an airbag. The next gen...
Intelligent transportation systems for work zones : deployment benefits and lessons learned
DOT National Transportation Integrated Search
2000-12-01
This paper presents what has been learned in four principal areas of arterial management: 1) adaptive control strategies; 2) advanced traveler information systems; 3) automated enforcement; and 4) integration. The levels of deployment, benefits, depl...
Very low cost real time histogram-based contrast enhancer utilizing fixed-point DSP processing
NASA Astrophysics Data System (ADS)
McCaffrey, Nathaniel J.; Pantuso, Francis P.
1998-03-01
A real time contrast enhancement system utilizing histogram-based algorithms has been developed to operate on standard composite video signals. This low-cost DSP based system is designed with fixed-point algorithms and an off-chip look up table (LUT) to reduce the cost considerably over other contemporary approaches. This paper describes several real-time contrast enhancing systems advanced at the Sarnoff Corporation for high-speed visible and infrared cameras. The fixed-point enhancer was derived from these high performance cameras. The enhancer digitizes analog video and spatially subsamples the stream to qualify the scene's luminance. Simultaneously, the video is streamed through a LUT that has been programmed with the previous calculation. Reducing division operations by subsampling reduces calculation cycles and also allows the processor to be used with cameras of nominal resolutions. All values are written to the LUT during blanking so no frames are lost. The enhancer measures 13 cm X 6.4 cm X 3.2 cm, operates off 9 VAC and consumes 12 W. This processor is small and inexpensive enough to be mounted with field deployed security cameras and can be used for surveillance, video forensics and real-time medical imaging.
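The abstract does not give the exact histogram algorithm; as a hedged illustration of the general LUT-based approach, the sketch below builds a histogram-equalization look-up table from a frame's luminance values using integer-only arithmetic (fixed-point friendly) and maps pixels through it. In the hardware, the video stream passes through the LUT while the next table is computed from a subsampled frame.

```python
# Illustrative LUT-based contrast enhancement (plain histogram equalization,
# integer arithmetic only; not necessarily the Sarnoff system's algorithm).

def build_equalization_lut(pixels, levels=256):
    """Map input luminance levels to output levels via the cumulative histogram."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    lut = [0] * levels
    cdf, total = 0, len(pixels)
    for v in range(levels):
        cdf += hist[v]
        lut[v] = (cdf * (levels - 1)) // total   # integer (fixed-point) scaling
    return lut

frame = [100, 100, 101, 102, 103, 103, 104, 105]   # low-contrast 8-bit frame
lut = build_equalization_lut(frame)
print([lut[p] for p in frame])                     # levels spread toward 0..255
```

Applying the table is a single memory lookup per pixel, which is what lets the hardware enhance video in real time with no division in the pixel path.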
NASA Astrophysics Data System (ADS)
Strippoli, L. S.; Gonzalez-Arjona, D. G.
2018-04-01
GMV has worked extensively, under different ESA contracts, on developing, validating, and verifying up to TRL-6 advanced GNC and IP algorithms for Mars Sample Return rendezvous, including the development of advanced algorithms for the VBN sensor.
Energy storage deployment and innovation for the clean energy transition
NASA Astrophysics Data System (ADS)
Kittner, Noah; Lill, Felix; Kammen, Daniel M.
2017-09-01
The clean energy transition requires a co-evolution of innovation, investment, and deployment strategies for emerging energy storage technologies. A deeply decarbonized energy system research platform needs materials science advances in battery technology to overcome the intermittency challenges of wind and solar electricity. Simultaneously, policies designed to build market growth and innovation in battery storage may complement cost reductions across a suite of clean energy technologies. Further integration of R&D and deployment of new storage technologies paves a clear route toward cost-effective low-carbon electricity. Here we analyse deployment and innovation using a two-factor model that integrates the value of investment in materials innovation and technology deployment over time from an empirical dataset covering battery storage technology. Complementary advances in battery storage are of utmost importance to decarbonization alongside improvements in renewable electricity sources. We find and chart a viable path to dispatchable US$1/W solar with US$100/kWh battery storage that enables combinations of solar, wind, and storage to compete directly with fossil-based electricity options.
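The two-factor model referenced above treats unit cost as falling with both cumulative deployment (learning-by-doing) and cumulative R&D knowledge stock (learning-by-researching). The sketch below shows the standard functional form with hypothetical learning rates, not the paper's fitted parameters.

```python
# Illustrative two-factor learning-curve model (symbols and learning rates
# are hypothetical, not the paper's empirical fit): unit cost falls with
# cumulative deployment D and cumulative R&D knowledge stock K.
import math

def unit_cost(c0, deployment, knowledge, lbd_rate=0.18, lbr_rate=0.10):
    """c0: initial cost; rates are the cost reduction per doubling of each factor."""
    a = -math.log2(1 - lbd_rate)      # learning-by-doing elasticity
    b = -math.log2(1 - lbr_rate)      # learning-by-researching elasticity
    return c0 * deployment ** -a * knowledge ** -b

# Battery pack cost after 8x deployment growth (3 doublings) and 4x
# knowledge growth (2 doublings), from a notional US$300/kWh:
print(round(unit_cost(300, 8, 4), 1))
```

With these illustrative rates the cost is simply 300 × 0.82³ × 0.9², i.e. each deployment doubling cuts cost 18% and each knowledge doubling cuts it 10%.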
A Revolute Joint With Linear Load-Displacement Response for Precision Deployable Structures
NASA Technical Reports Server (NTRS)
Lake, Mark S.; Warren, Peter A.; Peterson, Lee D.
1996-01-01
NASA Langley Research Center is developing key structures and mechanisms technologies for micron-accuracy, in-space deployment of future space instruments. Achieving micron-accuracy deployment requires significant advancements in deployment mechanism design, such as the revolute joint presented herein. The joint exhibits a load-cycling response that is essentially linear with less than two percent hysteresis, and it rotates with less than one in.-oz. of resistance. A prototype reflector metering truss incorporating the joint exhibits only a few microns of kinematic error under repeated deployment and impulse loading. No other mechanically deployable structure found in the literature has been demonstrated to be this kinematically accurate.
Expanding the role of the nurse practitioner in the deployed setting.
Dargis, Julie; Horne, Theresa; Tillman-Ortiz, Sophie; Scherr, Diane; Yackel, Edward E
2006-08-01
Today's military is experiencing rapid advances in technology and in manpower utilization. The Army Medical Department is redesigning the structure and function of deployable hospital systems as part of this effort. The transformation of deployable hospital systems requires that a critical analysis of manpower utilization be undertaken to optimize the employment of soldier-medics. The objective of this article was to describe the use of nurse practitioners as primary care providers during deployment. The lived experiences of five nurse practitioners deployed to Operation Iraqi Freedom are presented. Data gathered during the deployment and an analysis of the literature clearly support expanded and legitimized roles for these health care professionals in future conflicts and peacekeeping operations.
Navy Pre-Deployment Training at Eglin AFB, Florida Final Environmental Assessment
2004-02-10
Only-Radar ROW Rest of the World RUR Range Utilization Report SACEX Supporting Arms Coordination Exercise LIST OF ACRONYMS AND ABBREVIATIONS...Wildlife Service USGS U.S. Geological Survey UXO Unexploded Ordnance VOC Volatile Organic Compounds WHO World Health Organization Purpose and...Assessment While most deployments are scheduled long in advance, short-notice deployments often occur in response to world crises. The Atlantic Fleet’s
Mechanism Design and Testing of a Self-Deploying Structure Using Flexible Composite Tape Springs
NASA Technical Reports Server (NTRS)
Footdale, Joseph N.; Murphey, Thomas W.
2014-01-01
The detailed mechanical design of a novel deployable support structure that positions and tensions a membrane optic for space imaging applications is presented. This is a complex three-dimensional deployment using freely deploying rollable composite tape spring booms that become load bearing structural members at full deployment. The deployment tests successfully demonstrate a new architecture based on rolled and freely deployed composite tape spring members that achieve simultaneous deployment without mechanical synchronization. Proper design of the flexible component mounting interface and constraint systems, which were critical in achieving a functioning unit, are described. These flexible composite components have much potential for advancing the state of the art in deployable structures, but have yet to be widely adopted. This paper demonstrates the feasibility and advantages of implementing flexible composite components, including the design details on how to integrate with required traditional mechanisms.
Fully Automated Detection of Cloud and Aerosol Layers in the CALIPSO Lidar Measurements
NASA Technical Reports Server (NTRS)
Vaughan, Mark A.; Powell, Kathleen A.; Kuehn, Ralph E.; Young, Stuart A.; Winker, David M.; Hostetler, Chris A.; Hunt, William H.; Liu, Zhaoyan; McGill, Matthew J.; Getzewich, Brian J.
2009-01-01
Accurate knowledge of the vertical and horizontal extent of clouds and aerosols in the earth's atmosphere is critical in assessing the planet's radiation budget and for advancing human understanding of climate change issues. To retrieve this fundamental information from the elastic backscatter lidar data acquired during the Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO) mission, a selective, iterated boundary location (SIBYL) algorithm has been developed and deployed. SIBYL accomplishes its goals by integrating an adaptive context-sensitive profile scanner into an iterated multiresolution spatial averaging scheme. This paper provides an in-depth overview of the architecture and performance of the SIBYL algorithm. It begins with a brief review of the theory of target detection in noise-contaminated signals, and an enumeration of the practical constraints levied on the retrieval scheme by the design of the lidar hardware, the geometry of a space-based remote sensing platform, and the spatial variability of the measurement targets. Detailed descriptions are then provided for both the adaptive threshold algorithm used to detect features of interest within individual lidar profiles and the fully automated multiresolution averaging engine within which this profile scanner functions. The resulting fusion of profile scanner and averaging engine is specifically designed to optimize the trade-offs between the widely varying signal-to-noise ratio of the measurements and the disparate spatial resolutions of the detection targets. Throughout the paper, specific algorithm performance details are illustrated using examples drawn from the existing CALIPSO dataset. Overall performance is established by comparisons to existing layer height distributions obtained by other airborne and space-based lidars.
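The adaptive threshold idea described above can be sketched in a few lines. This is an illustrative toy, not the SIBYL implementation: the noise model, the robust scale estimate, and all values are assumptions, and the real scanner iterates such a test across multiple horizontal averaging resolutions.

```python
import numpy as np

def detect_layers(profile, background, k=5.0):
    """Return [start, end) bin ranges whose backscatter exceeds an adaptive
    threshold: the background estimate plus k times a robust noise scale.
    Illustrative sketch only, not the operational SIBYL scanner."""
    residual = profile - background
    # Median absolute deviation: a noise estimate robust to embedded layers.
    noise = 1.4826 * np.median(np.abs(residual - np.median(residual)))
    mask = residual > k * noise
    layers, start = [], None
    for i, hit in enumerate(mask):
        if hit and start is None:
            start = i
        elif not hit and start is not None:
            layers.append((start, i))
            start = None
    if start is not None:
        layers.append((start, len(mask)))
    return layers

# Synthetic profile: flat background plus noise, with one embedded layer.
rng = np.random.default_rng(0)
background = np.full(100, 1.0)
profile = background + rng.normal(0.0, 0.05, 100)
profile[40:55] += 2.0
print(detect_layers(profile, background))  # → [(40, 55)]
```

Contiguous above-threshold bins are merged into layers, mirroring the feature-finding step that precedes the multiresolution averaging.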
Real-Time Detection of Infusion Site Failures in a Closed-Loop Artificial Pancreas.
Howsmon, Daniel P; Baysal, Nihat; Buckingham, Bruce A; Forlenza, Gregory P; Ly, Trang T; Maahs, David M; Marcal, Tatiana; Towers, Lindsey; Mauritzen, Eric; Deshpande, Sunil; Huyett, Lauren M; Pinsker, Jordan E; Gondhalekar, Ravi; Doyle, Francis J; Dassau, Eyal; Hahn, Juergen; Bequette, B Wayne
2018-05-01
As evidence emerges that artificial pancreas systems improve clinical outcomes for patients with type 1 diabetes, the burden of this disease will hopefully begin to be alleviated for many patients and caregivers. However, reliance on automated insulin delivery potentially means patients will be slower to act when devices stop functioning appropriately. One such scenario involves an insulin infusion site failure, where the insulin that is recorded as delivered fails to affect the patient's glucose as expected. Alerting patients to these events in real time would potentially reduce hyperglycemia and ketosis associated with infusion site failures. An infusion site failure detection algorithm was deployed in a randomized crossover study with artificial pancreas and sensor-augmented pump (SAP) arms in an outpatient setting. Each arm lasted two weeks. Nineteen participants wore infusion sets for up to 7 days. Clinicians contacted patients to confirm infusion site failures detected by the algorithm and instructed on set replacement if failure was confirmed. In real time and under zone model predictive control, the infusion site failure detection algorithm achieved a sensitivity of 88.0% (n = 25) while issuing only 0.22 false positives per day, compared with a sensitivity of 73.3% (n = 15) and 0.27 false positives per day in the SAP arm (as indicated by retrospective analysis). No association between intervention strategy and duration of infusion sets was observed (P = .58). As patient burden is reduced by each generation of advanced diabetes technology, fault detection algorithms will help ensure that patients are alerted when they need to manually intervene. Clinical Trial Identifier: www.clinicaltrials.gov, NCT02773875.
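The abstract does not specify the detector's internals, so the following is only a hypothetical sketch of the general idea: flag a possible infusion site failure when measured glucose persistently exceeds an insulin-aware model prediction. The margin, persistence count, and function names are all invented for illustration.

```python
def infusion_site_alarm(observed, predicted, margin=40, persist=3):
    """Flag a possible infusion site failure when observed glucose (mg/dL)
    exceeds the model's insulin-aware prediction by `margin` for `persist`
    consecutive samples. Purely illustrative: the study's actual detector
    and thresholds are not given in the abstract."""
    run = 0
    for obs, pred in zip(observed, predicted):
        run = run + 1 if obs - pred > margin else 0
        if run >= persist:
            return True
    return False

predicted = [150, 150, 150, 150, 150]
print(infusion_site_alarm([160, 200, 200, 200, 150], predicted))  # → True
print(infusion_site_alarm([160, 200, 150, 200, 150], predicted))  # → False
```

Requiring persistence is one common way to trade sensitivity against the false-positive rate reported in the study.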
Sniffer Channel Selection for Monitoring Wireless LANs
NASA Astrophysics Data System (ADS)
Song, Yuan; Chen, Xian; Kim, Yoo-Ah; Wang, Bing; Chen, Guanling
Wireless sniffers are often used to monitor APs in wireless LANs (WLANs) for network management, fault detection, traffic characterization, and optimizing deployment. It is cost effective to deploy single-radio sniffers that can monitor multiple nearby APs. However, since nearby APs often operate on orthogonal channels, a sniffer needs to switch among multiple channels to monitor its nearby APs. In this paper, we formulate and solve two optimization problems on sniffer channel selection. Both problems require that each AP be monitored by at least one sniffer. In addition, one optimization problem requires minimizing the maximum number of channels that a sniffer listens to, and the other requires minimizing the total number of channels that the sniffers listen to. We propose a novel LP-relaxation based algorithm, and two simple greedy heuristics for the above two optimization problems. Through simulation, we demonstrate that all the algorithms are effective in achieving their optimization goals, and the LP-based algorithm outperforms the greedy heuristics.
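The greedy heuristic for the total-channels objective can be sketched as follows; the data structures and tie-breaking are assumptions, not the paper's exact formulation.

```python
def greedy_channel_selection(sniffer_aps, ap_channel):
    """Greedy heuristic: repeatedly assign the (sniffer, channel) pair that
    monitors the most still-uncovered APs, until every AP is covered.
    Returns {sniffer: set of channels it listens to}. Sketch only."""
    uncovered = set(ap_channel)
    assignment = {s: set() for s in sniffer_aps}
    while uncovered:
        best = None
        for s, aps in sniffer_aps.items():
            for ch in {ap_channel[a] for a in aps}:
                gain = sum(1 for a in aps if a in uncovered and ap_channel[a] == ch)
                if best is None or gain > best[0]:
                    best = (gain, s, ch)
        gain, s, ch = best
        if gain == 0:
            break  # remaining APs are out of range of every sniffer
        assignment[s].add(ch)
        uncovered -= {a for a in sniffer_aps[s] if ap_channel[a] == ch}
    return assignment

# Two sniffers, four APs on orthogonal channels 1/6/11.
ap_channel = {"A": 1, "B": 6, "C": 6, "D": 11}
sniffer_aps = {"s1": ["A", "B", "C"], "s2": ["C", "D"]}
print(greedy_channel_selection(sniffer_aps, ap_channel))
```

Every AP ends up monitored by at least one sniffer listening on its channel, while the total number of (sniffer, channel) assignments stays small.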
P2MP MPLS-Based Hierarchical Service Management System
NASA Astrophysics Data System (ADS)
Kumaki, Kenji; Nakagawa, Ikuo; Nagami, Kenichi; Ogishi, Tomohiko; Ano, Shigehiro
This paper proposes a point-to-multipoint (P2MP) Multi-Protocol Label Switching (MPLS) based hierarchical service management system. Traditionally, general management systems deployed in some service providers control MPLS Label Switched Paths (LSPs) (e.g., RSVP-TE and LDP) and services (e.g., L2VPN, L3VPN and IP) separately. In order for dedicated management systems for MPLS LSPs and services to cooperate with each other automatically, a hierarchical service management system has been proposed with the main focus on point-to-point (P2P) TE LSPs in MPLS path management. In the case where P2MP TE LSPs and services are deployed in MPLS networks, the dedicated management systems for P2MP TE LSPs and services must work together automatically. Therefore, this paper proposes a new algorithm that uses a correlation between P2MP TE LSPs and multicast VPN services based on a P2MP MPLS-based hierarchical service management architecture. Also, the capacity and performance of the proposed algorithm are evaluated by simulations, which are actually based on certain real MPLS production networks, and are compared to that of the algorithm for P2P TE LSPs. Results show this system is very scalable within real MPLS production networks. This system, with the automatic correlation, appears to be deployable in real MPLS production networks.
on how to understand and plan for transportation advancements, covering topics including: transportation electrification and the infrastructure necessary to support the increasing deployment of these technologies; and the impact of on-demand transit and mobility services on public
Flexible and evolutionary optical access networks
NASA Astrophysics Data System (ADS)
Hsueh, Yu-Li
Passive optical networks (PONs) are promising solutions that will open the first-mile bottleneck. Current PONs employ time division multiplexing (TDM) to share bandwidth among users, leading to low cost but limited capacity. In the future, wavelength division multiplexing (WDM) technologies will be deployed to achieve high performance. This dissertation describes several advanced technologies to enhance PON systems. A spectral shaping line coding scheme is developed to allow a simple and cost-effective overlay of high data-rate services in existing PONs, leaving field-deployed fibers and existing services untouched. Spectral shapes of coded signals can be manipulated to adapt to different systems. For a specific tolerable interference level, the optimal line code can be found which maximizes the data throughput. Experiments are conducted to demonstrate and compare several optimized line codes. A novel PON employing dynamic wavelength allocation to provide bandwidth sharing across multiple physical PONs is designed and experimentally demonstrated. Tunable lasers, arrayed waveguide gratings, and coarse/fine filtering combine to create a flexible optical access solution. The network's excellent scalability can bridge the gap between conventional TDM PONs and WDM PONs. Scheduling algorithms with quality of service support are also investigated. Simulation results show that the proposed architecture exhibits significant performance gain over conventional PON systems. Streaming video transmission is demonstrated on the prototype experimental testbed. The powerful architecture is a promising candidate for next-generation optical access networks. A new CDR circuit for receiving the bursty traffic in PONs is designed and analyzed. It detects data transition edges upon arrival of the data burst and quickly selects the best clock phase by a control logic circuit. 
Then, an analog delay-locked loop (DLL) keeps track of data transitions and removes phase errors throughout the burst. The combination of the fast phase detection mechanism and a feedback loop based on DLL allows both fast response and manageable jitter performance in the burst-mode application. A new efficient numerical algorithm is developed to analyze holey optical fibers. The algorithm has been verified against experimental data, and is exploited to design holey optical fibers optimized for the discrete Raman amplification.
Scalable Parallel Methods for Analyzing Metagenomics Data at Extreme Scale
DOE Office of Scientific and Technical Information (OSTI.GOV)
Daily, Jeffrey A.
2015-05-01
The field of bioinformatics and computational biology is currently experiencing a data revolution. The exciting prospect of making fundamental biological discoveries is fueling the rapid development and deployment of numerous cost-effective, high-throughput next-generation sequencing technologies. The result is that the DNA and protein sequence repositories are being bombarded with new sequence information. Databases are continuing to report a Moore’s law-like growth trajectory in their database sizes, roughly doubling every 18 months. In what seems to be a paradigm shift, individual projects are now capable of generating billions of raw sequence data that need to be analyzed in the presence of already annotated sequence information. While it is clear that data-driven methods, such as sequence homology detection, are becoming the mainstay in the field of computational life sciences, the algorithmic advancements essential for implementing complex data analytics at scale have mostly lagged behind. Sequence homology detection is central to a number of bioinformatics applications including genome sequencing and protein family characterization. Given millions of sequences, the goal is to identify all pairs of sequences that are highly similar (or “homologous”) on the basis of alignment criteria. While there are optimal alignment algorithms to compute pairwise homology, their deployment at large scale is currently not feasible; instead, heuristic methods are used at the expense of quality. In this dissertation, we present the design and evaluation of a parallel implementation for conducting optimal homology detection on distributed memory supercomputers. Our approach uses a combination of techniques from asynchronous load balancing (viz. work stealing, dynamic task counters), data replication, and exact-matching filters to achieve homology detection at scale.
Results for a collection of 2.56M sequences show parallel efficiencies of ~75-100% on up to 8K cores, representing a time-to-solution of 33 seconds. We extend this work with a detailed analysis of single-node sequence alignment performance using the latest CPU vector instruction set extensions. Preliminary results reveal that current sequence alignment algorithms are unable to fully utilize widening vector registers.
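The exact-matching filter idea, reduced to a single-node toy: index every length-k substring, and only pairs of sequences sharing at least one k-mer are forwarded to the expensive optimal alignment. The dissertation's filter is distributed and far more elaborate; the names and k value here are illustrative.

```python
from collections import defaultdict
from itertools import combinations

def candidate_pairs(sequences, k=4):
    """Exact-matching prefilter: return only the sequence pairs that share
    at least one length-k substring, as candidates for optimal alignment.
    Toy sketch of the filtering idea, not the dissertation's implementation."""
    index = defaultdict(set)
    for name, seq in sequences.items():
        for i in range(len(seq) - k + 1):
            index[seq[i:i + k]].add(name)
    pairs = set()
    for names in index.values():
        for a, b in combinations(sorted(names), 2):
            pairs.add((a, b))
    return pairs

seqs = {
    "s1": "ACGTACGGA",
    "s2": "TTACGTACC",  # shares 4-mers such as "ACGT" with s1
    "s3": "GGGGCCCC",   # shares no 4-mer with the others
}
print(candidate_pairs(seqs))  # → {('s1', 's2')}
```

Pairs that never co-occur in the index are skipped entirely, which is where the large-scale savings come from.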
Stochastic Optimization for Nuclear Facility Deployment Scenarios
NASA Astrophysics Data System (ADS)
Hays, Ross Daniel
Single-use, low-enriched uranium oxide fuel, consumed through several cycles in a light-water reactor (LWR) before being disposed, has become the dominant source of commercial-scale nuclear electric generation in the United States and throughout the world. However, it is not without its drawbacks and is not the only potential nuclear fuel cycle available. Numerous alternative fuel cycles have been proposed at various times which, through the use of different reactor and recycling technologies, offer to counteract many of the perceived shortcomings with regards to waste management, resource utilization, and proliferation resistance. However, due to the varying maturity levels of these technologies, the complicated material flow feedback interactions their use would require, and the large capital investments in the current technology, one should not deploy these advanced designs without first investigating the potential costs and benefits of so doing. As the interactions among these systems can be complicated, and the ways in which they may be deployed are many, the application of automated numerical optimization to the simulation of the fuel cycle could potentially be of great benefit to researchers and interested policy planners. To investigate the potential of these methods, a computational program has been developed that applies a parallel, multi-objective simulated annealing algorithm to a computational optimization problem defined by a library of relevant objective functions applied to the Verifiable Fuel Cycle Simulation Model (VISION, developed at the Idaho National Laboratory). The VISION model, when given a specified fuel cycle deployment scenario, computes the numbers and types of, and construction, operation, and utilization schedules for, the nuclear facilities required to meet a predetermined electric power demand function.
Additionally, it calculates the location and composition of the nuclear fuels within the fuel cycle, from initial mining through to eventual disposal. By varying the specifications of the deployment scenario, the simulated annealing algorithm will seek to either minimize the value of a single objective function, or enumerate the trade-off surface between multiple competing objective functions. The available objective functions represent key stakeholder values, minimizing such important factors as high-level waste disposal burden, required uranium ore supply, relative proliferation potential, and economic cost and uncertainty. The optimization program itself is designed to be modular, allowing for continued expansion and exploration as research needs and curiosity indicate. The utility and functionality of this optimization program are demonstrated through its application to one potential fuel cycle scenario of interest. In this scenario, an existing legacy LWR fleet is assumed at the year 2000. The electric power demand grows exponentially at a rate of 1.8% per year through the year 2100. Initially, new demand is met by the construction of 1-GW(e) LWRs. However, beginning in the year 2040, 600-MW(e) sodium-cooled, fast-spectrum reactors operating in a transuranic burning regime with full recycling of spent fuel become available to meet demand. By varying the fraction of new capacity allocated to each reactor type, the optimization program is able to explicitly show the relationships that exist between uranium utilization, long-term heat for geologic disposal, and cost-of-electricity objective functions. The trends associated with these trade-off surfaces tend to confirm many common expectations about the use of nuclear power, namely that while overall it is quite insensitive to variations in the cost of uranium ore, it is quite sensitive to changes in the capital costs of facilities. 
The optimization algorithm has shown itself to be robust and extensible, with possible extensions to many further fuel cycle optimization problems of interest.
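A minimal single-objective simulated annealing loop over one decision variable (e.g. the fraction of new capacity allocated to fast reactors) illustrates the optimization driver. The toy cost function and all parameters are assumptions; the dissertation's algorithm is parallel and multi-objective, enumerating trade-off surfaces rather than a single optimum.

```python
import math
import random

def simulated_anneal(objective, x0, t0=1.0, cooling=0.995, steps=2000, seed=1):
    """Minimize objective(x) for x in [0, 1] by simulated annealing.
    Sketch of the single-objective case only."""
    rng = random.Random(seed)
    x, fx, t = x0, objective(x0), t0
    for _ in range(steps):
        cand = min(1.0, max(0.0, x + rng.gauss(0, 0.1)))
        fc = objective(cand)
        # Accept improvements always; accept worse moves with Boltzmann probability.
        if fc < fx or rng.random() < math.exp((fx - fc) / t):
            x, fx = cand, fc
        t *= cooling
    return x, fx

# Toy cost: quadratic with a minimum at a 30% fast-reactor share (assumption).
cost = lambda f: (f - 0.3) ** 2 + 1.0
x, fx = simulated_anneal(cost, x0=0.9)
print(round(x, 2), round(fx, 3))
```

As the temperature decays, the walk stops accepting uphill moves and settles near the minimum of the toy cost.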
Cell-tower deployment of counter-sniper sensors
NASA Astrophysics Data System (ADS)
Storch, Michael T.
2004-09-01
Cellular telephone antenna towers are evaluated as sites for rapid, effective & efficient deployment of counter-sniper sensors, especially in urban environments. They are expected to offer a suitable density, excellent LOS, and a generally limited variety of known or readily-characterized mechanical interfaces. Their precise locations are easily mapped in advance of deployment, are easily accessible by ground and air, and are easily spotted by deployment teams in real-time. We survey issues of EMI & RFI, susceptibility to denial & ambush in military scenarios, and the impact of trends in cell tower design & construction.
CARA Status and Upcoming Enhancements
NASA Technical Reports Server (NTRS)
Johnson, Megan
2017-01-01
CAS 8.4.3 was deployed to operations on 13 June 2017. Discrepancies between 3D Pc estimates and advanced Monte Carlo equinoctial-sampling Pc estimates were discovered and discussed at the 23 May 2017 Users' Forum. The patch created the Reporting Pc, which is the greater of the calculated 2D and 3D Pc values. This changed the Pc reported in the CDMs, on the Summary Report, and on the Maneuver Screening Analysis (MSA) Report to the Reporting Pc. Both the 2D and 3D Pc were added to the Summary Report details section. The patch also updated the 3D Pc algorithm to eliminate velocity covariance from the Pc calculation; this will bring 2D and 3D Pc into close alignment for the vast majority of events, particularly those in which the 2D/3D discrepancy was found.
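The Reporting Pc rule described above reduces to taking the greater of the two estimates; a one-line sketch (function name is illustrative):

```python
def reporting_pc(pc_2d, pc_3d):
    """Reporting Pc as described: the greater of the 2D and 3D collision
    probability estimates, carried into the CDMs and reports."""
    return max(pc_2d, pc_3d)

print(reporting_pc(1.2e-5, 4.7e-6))  # → 1.2e-05
```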
NASA Technical Reports Server (NTRS)
Komendera, Erik E.; Adhikari, Shaurav; Glassner, Samantha; Kishen, Ashwin; Quartaro, Amy
2017-01-01
Autonomous robotic assembly by mobile field robots has seen significant advances in recent decades, yet practicality remains elusive. Identified challenges include better use of state estimation and reasoning with uncertainty, spreading out tasks to specialized robots, and implementing representative joining methods. This paper proposes replacing 1) self-correcting mechanical linkages with generalized joints for improved applicability, 2) assembly serial manipulators with parallel manipulators for higher precision and stability, and 3) all-in-one robots with a heterogeneous team of specialized robots for agent simplicity. This paper then describes a general assembly algorithm utilizing state estimation. Finally, these concepts are tested in the context of solar array assembly, requiring a team of robots to assemble, bond, and deploy a set of solar panel mockups to a backbone truss to an accuracy not built into the parts. This paper presents the results of these tests.
NASA Technical Reports Server (NTRS)
Shahidi, Anoosh K.; Schlegelmilch, Richard F.; Petrik, Edward J.; Walters, Jerry L.
1991-01-01
A software application to assist end-users of the link evaluation terminal (LET) for satellite communications is being developed. This software application incorporates artificial intelligence (AI) techniques and will be deployed as an interface to LET. The high burst rate (HBR) LET provides 30 GHz transmitting/20 GHz receiving (220/110 Mbps) capability for wideband communications technology experiments with the Advanced Communications Technology Satellite (ACTS). The HBR LET can monitor and evaluate the integrity of the HBR communications uplink and downlink to the ACTS satellite. The uplink HBR transmission is performed by bursting the bit-pattern as a modulated signal to the satellite. The HBR LET can determine the bit error rate (BER) under various atmospheric conditions by comparing the transmitted bit pattern with the received bit pattern. An algorithm for power augmentation will be applied to enhance the system's BER performance at reduced signal strength caused by adverse conditions.
Node Scheduling Strategies for Achieving Full-View Area Coverage in Camera Sensor Networks.
Wu, Peng-Fei; Xiao, Fu; Sha, Chao; Huang, Hai-Ping; Wang, Ru-Chuan; Xiong, Nai-Xue
2017-06-06
Unlike conventional scalar sensors, camera sensors at different positions can capture a variety of views of an object. Based on this intrinsic property, a novel model called full-view coverage was proposed. We study the problem of how to select the minimum number of sensors to guarantee the full-view coverage for the given region of interest (ROI). To tackle this issue, we derive the constraint condition of the sensor positions for full-view neighborhood coverage with the minimum number of nodes around the point. Next, we prove that the full-view area coverage can be approximately guaranteed, as long as the regular hexagons decided by the virtual grid are seamlessly stitched. Then we present two solutions for camera sensor networks in two different deployment strategies. By computing the theoretically optimal length of the virtual grids, we put forward the deployment pattern algorithm (DPA) in the deterministic implementation. To reduce the redundancy in random deployment, we come up with a local neighboring-optimal selection algorithm (LNSA) for achieving the full-view coverage. Finally, extensive simulation results show the feasibility of our proposed solutions.
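The virtual hexagonal grid underlying the deterministic deployment pattern can be sketched by generating hexagon centers over a rectangular ROI. The circumradius r would come from the paper's full-view coverage condition; here it is just a parameter, and camera placement around each center is omitted.

```python
import math

def hex_grid_centers(width, height, r):
    """Centers of regular hexagons (circumradius r) tiling a width x height
    region in columns, with alternate columns offset vertically. Sketch of
    the virtual grid only; r is an assumed input."""
    dx = 1.5 * r              # horizontal step between hexagon columns
    dy = math.sqrt(3) * r     # vertical step between hexagon rows
    centers = []
    col, x = 0, 0.0
    while x <= width:
        y = (dy / 2) if col % 2 else 0.0  # stagger odd columns
        while y <= height:
            centers.append((round(x, 3), round(y, 3)))
            y += dy
        x += dx
        col += 1
    return centers

print(len(hex_grid_centers(10, 10, 2)))  # → 12
```

Seamlessly stitching the hexagons decided by this grid is what lets the paper argue approximate full-view area coverage.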
Node Scheduling Strategies for Achieving Full-View Area Coverage in Camera Sensor Networks
Wu, Peng-Fei; Xiao, Fu; Sha, Chao; Huang, Hai-Ping; Wang, Ru-Chuan; Xiong, Nai-Xue
2017-01-01
Unlike conventional scalar sensors, camera sensors at different positions can capture a variety of views of an object. Based on this intrinsic property, a novel model called full-view coverage was proposed. We study the problem of how to select the minimum number of sensors to guarantee the full-view coverage for the given region of interest (ROI). To tackle this issue, we derive the constraint condition of the sensor positions for full-view neighborhood coverage with the minimum number of nodes around the point. Next, we prove that the full-view area coverage can be approximately guaranteed, as long as the regular hexagons decided by the virtual grid are seamlessly stitched. Then we present two solutions for camera sensor networks in two different deployment strategies. By computing the theoretically optimal length of the virtual grids, we put forward the deployment pattern algorithm (DPA) in the deterministic implementation. To reduce the redundancy in random deployment, we come up with a local neighboring-optimal selection algorithm (LNSA) for achieving the full-view coverage. Finally, extensive simulation results show the feasibility of our proposed solutions. PMID:28587304
Sampling Approaches for Multi-Domain Internet Performance Measurement Infrastructures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Calyam, Prasad
2014-09-15
The next-generation of high-performance networks being developed in DOE communities are critical for supporting current and emerging data-intensive science applications. The goal of this project is to investigate multi-domain network status sampling techniques and tools to measure/analyze performance, and thereby provide “network awareness” to end-users and network operators in DOE communities. We leverage the infrastructure and datasets available through perfSONAR, which is a multi-domain measurement framework that has been widely deployed in high-performance computing and networking communities; the DOE community is a core developer and the largest adopter of perfSONAR. Our investigations include development of semantic scheduling algorithms, measurement federation policies, and tools to sample multi-domain and multi-layer network status within perfSONAR deployments. We validate our algorithms and policies with end-to-end measurement analysis tools for various monitoring objectives such as network weather forecasting, anomaly detection, and fault-diagnosis. In addition, we develop a multi-domain architecture for an enterprise-specific perfSONAR deployment that can implement monitoring-objective based sampling and that adheres to any domain-specific measurement policies.
Design of an Autonomous Underwater Vehicle to Calibrate the Europa Clipper Ice-Penetrating Radar
NASA Astrophysics Data System (ADS)
Stone, W.; Siegel, V.; Kimball, P.; Richmond, K.; Flesher, C.; Hogan, B.; Lelievre, S.
2013-12-01
Jupiter's moon Europa has been prioritized as the target for the Europa Clipper flyby mission. A key science objective for the mission is to remotely characterize the ice shell and any subsurface water, including their heterogeneity, and the nature of surface-ice-ocean exchange. This objective is a critical component of the mission's overarching goal of assessing the habitability of Europa. The instrument targeted for addressing key aspects of this goal is an ice-penetrating radar (IPR). As a primary goal of our work, we will tightly couple airborne IPR studies of the Ross Ice Shelf by the Europa Clipper radar team with ground-truth data to be obtained from sub-glacial sonar and bio-geochemical mapping of the corresponding ice-water and water-rock interfaces using an advanced autonomous underwater vehicle (AUV). The ARTEMIS vehicle - a heavily morphed long-range, low drag variant of the highly successful 4-degree-of-freedom hovering sub-ice ENDURANCE bot -- will be deployed from a sea-ice drill hole adjacent the McMurdo Ice Shelf (MIS) and will perform three classes of missions. The first includes original exploration and high definition mapping of both the ice-water interface and the benthic interface on a length scale (approximately 10 kilometers under-ice penetration radius) that will definitively tie it to the synchronous airborne IPR over-flights. These exploration and mapping missions will be conducted at up to 10 different locations along the MIS in order to capture varying ice thickness and seawater intrusion into the ice shelf. Following initial mapping characterization, the vehicle will conduct astrobiology-relevant proximity operations using bio-assay sensors (custom-designed UV fluorescence and machine-vision-processed optical imagery) followed by point-targeted studies at regions of interest. Sample returns from the ice-water interface will be triggered autonomously using real-time-processed instrument data and onboard decision-to-collect algorithms. 
ARTEMIS will be capable of conducting precision hovering proximity science in an unexplored environment, followed by high speed (1.5 m/s) return to the melt hole. The navigation system will significantly advance upon the successes of the prior DEPTHX and ENDURANCE systems and several novel pose-drift correction technologies will be developed and tested under ice during the project. The method of down-hole deployment and auto-docking return will be extended to a vertically-deployed, horizontally-recovered concept that is depth independent and highly relevant to an ice-water deployment on an icy moon. The presentation will discuss the mission down-select architecture for the ARTEMIS vehicle and its implications for the design of a Europa 'fast mover' carrier AUV, the onboard instrument suite, and the Antarctic mission CONOPS. The vehicle and crew will deploy to Antarctica in the 2015/2016 season.
Advanced Passive Microwave Radiometer Technology for GPM Mission
NASA Technical Reports Server (NTRS)
Smith, Eric A.; Im, Eastwood; Kummerow, Christian; Principe, Caleb; Ruf, Christoper; Wilheit, Thomas; Starr, David (Technical Monitor)
2002-01-01
An interferometer-type passive microwave radiometer based on MMIC receiver technology and a thinned array antenna design is being developed under the Instrument Incubator Program (IIP) on a project entitled the Lightweight Rainfall Radiometer (LRR). The prototype single channel aircraft instrument will be ready for first testing in 2nd quarter 2003, for deployment on the NASA DC-8 aircraft and in a ground configuration manner; this version measures at 10.7 GHz in a crosstrack imaging mode. The design for a two (2) frequency preliminary space flight model at 19 and 35 GHz (also in crosstrack imaging mode) has also been completed, in which the design features would enable it to fly in a bore-sighted configuration with a new dual-frequency space radar (DPR) under development at the Communications Research Laboratory (CRL) in Tokyo, Japan. The DPR will be flown as one of two primary instruments on the Global Precipitation Measurement (GPM) mission's core satellite in the 2007 time frame. The dual-frequency space flight design of the LRR matches the DPR frequencies and will be proposed as an ancillary instrument on the GPM core satellite to advance space-based precipitation measurement by enabling better microphysical characterization and coincident volume data gathering for exercising combined algorithm techniques which make use of both radar backscatter and radiometer attenuation information to constrain rainrate solutions within a physical algorithm context. This talk will discuss the design features, performance capabilities, applications plans, and conical/polarimetric imaging possibilities for the LRR, as well as a brief summary of the project status and schedule.
DOT National Transportation Integrated Search
1995-12-01
This final report is a synthesis of the findings, conclusions, and recommendations of a series of task reports prepared under a major study that addresses how to overcome the institutional barriers to the deployment of Advanced Traffic Management Sys...
DOT National Transportation Integrated Search
1998-11-01
This document describes the strategy used to evaluate the Intelligent Transportation Systems (ITS) Joint Program Office's Metropolitan Model Deployment Initiative (MMDI). The MMDI is an aggressive deployment of ITS at four urban sites: New York/New...
I3Mote: An Open Development Platform for the Intelligent Industrial Internet
Martinez, Borja; Vilajosana, Xavier; Kim, Il Han; Zhou, Jianwei; Tuset-Peiró, Pere; Xhafa, Ariton; Poissonnier, Dominique; Lu, Xiaolin
2017-01-01
In this article we present the Intelligent Industrial Internet (I3) Mote, an open hardware platform targeting industrial connectivity and sensing deployments. The I3Mote features the most advanced low-power components to tackle sensing, on-board computing and wireless/wired connectivity for demanding industrial applications. The platform has been designed to fill the gap in the industrial prototyping and early deployment market with a compact form factor, low-cost and robust industrial design. I3Mote is an advanced and compact prototyping system integrating the required components to be deployed as a product, alleviating the need for adopting industries to build their own tailored solutions. This article describes the platform design, firmware and software ecosystem and characterizes its performance in terms of energy consumption. PMID:28452945
75 FR 12807 - Agency Information Collection Activity Under OMB Review
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-17
... and advanced propulsion technologies. The Federal Register notice with a 60-day comment period... program supports the development and deployment of clean fuel and advanced propulsion technologies for...
TU-H-206-01: An Automated Approach for Identifying Geometric Distortions in Gamma Cameras
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mann, S; Nelson, J; Samei, E
2016-06-15
Purpose: To develop a clinically-deployable, automated process for detecting artifacts in routine nuclear medicine (NM) quality assurance (QA) bar phantom images. Methods: An artifact detection algorithm was created to analyze bar phantom images as part of an ongoing QA program. A low noise, high resolution reference image was acquired from an x-ray of the bar phantom with a Philips Digital Diagnost system utilizing image stitching. NM bar images, acquired for 5 million counts over a 512×512 matrix, were registered to the template image by maximizing mutual information (MI). The MI index was used as an initial test for artifacts; low values indicate an overall presence of distortions regardless of their spatial location. Images with low MI scores were further analyzed for bar linearity, periodicity, alignment, and compression to locate differences with respect to the template. Findings from each test were spatially correlated and locations failing multiple tests were flagged as potential artifacts requiring additional visual analysis. The algorithm was initially deployed for GE Discovery 670 and Infinia Hawkeye gamma cameras. Results: The algorithm successfully identified clinically relevant artifacts from both systems previously unnoticed by technologists performing the QA. Average MI indices for artifact-free images are 0.55. Images with MI indices < 0.50 have shown 100% sensitivity and specificity for artifact detection when compared with a thorough visual analysis. Correlation of geometric tests confirms the ability to spatially locate the most likely image regions containing an artifact regardless of initial phantom orientation. Conclusion: The algorithm shows the potential to detect gamma camera artifacts that may be missed by routine technologist inspections. Detection and subsequent correction of artifacts ensures maximum image quality and may help to identify failing hardware before it impacts clinical workflow.
Going forward, the algorithm is being deployed to monitor data from all gamma cameras within our health system.« less
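The MI screening step described in the abstract can be sketched in a few lines. This is an illustrative reconstruction from a joint intensity histogram; the bin count and the demo images are assumptions, not the authors' deployed code:

```python
import numpy as np

def mutual_information(img_a, img_b, bins=32):
    """Mutual information between two equal-sized images, computed
    from their joint intensity histogram."""
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1)
    py = pxy.sum(axis=0)
    nz = pxy > 0  # only nonzero cells contribute to the sum
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px[:, None] * py[None, :])[nz])))

# A registered, artifact-free image scores high against the template;
# distortions push the index down (the abstract flags indices < 0.50).
rng = np.random.default_rng(0)
template = np.tile(np.linspace(0, 1, 64), (64, 1))  # crude stand-in for the bar template
distorted = rng.random((64, 64))                    # hypothetical badly distorted image
assert mutual_information(template, template) > mutual_information(template, distorted)
```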
Base Station Placement Algorithm for Large-Scale LTE Heterogeneous Networks.
Lee, Seungseob; Lee, SuKyoung; Kim, Kyungsoo; Kim, Yoon Hyuk
2015-01-01
Data traffic demands in cellular networks today are increasing at an exponential rate, giving rise to the development of heterogeneous networks (HetNets), in which small cells complement traditional macro cells by extending coverage to indoor areas. However, the deployment of small cells as part of HetNets creates a key challenge for operators: careful network planning. In particular, massive and unplanned deployment of base stations can cause high interference, severely degrading network performance. Although different mathematical modeling and optimization methods have been used to approach various problems related to this issue, most traditional network planning models are ill-equipped to deal with HetNet-specific characteristics because of their focus on classical cellular network designs. Furthermore, increased wireless data demands have driven mobile operators to roll out large-scale networks of small long term evolution (LTE) cells. Therefore, in this paper, we aim to derive an optimum network planning algorithm for large-scale LTE HetNets. Recently, attempts have been made to apply evolutionary algorithms (EAs) to the field of radio network planning, since they are characterized as global optimization methods. Yet, EA performance often deteriorates rapidly with the growth of search space dimensionality. To overcome this limitation when designing optimum network deployments for large-scale LTE HetNets, we attempt to decompose the problem and tackle its subcomponents individually. Noting in particular that some HetNet cells have strong correlations due to inter-cell interference, we propose a correlation grouping approach in which cells are grouped together according to their mutual interference.
Both the simulation and analytical results indicate that, in terms of system throughput, the proposed solution outperforms both a random-grouping-based EA and an EA that detects interacting variables by monitoring changes in the objective function.
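The correlation-grouping idea can be illustrated with a small sketch: cells whose pairwise interference meets a threshold are merged into the same EA subcomponent. The union-find formulation and the threshold value here are illustrative assumptions, not the paper's exact method:

```python
def correlation_groups(interference, threshold):
    """Union-find grouping: cells whose mutual interference meets the
    threshold end up in the same EA subcomponent."""
    n = len(interference)
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    for i in range(n):
        for j in range(i + 1, n):
            if interference[i][j] >= threshold:
                parent[find(i)] = find(j)

    groups = {}
    for i in range(n):
        groups.setdefault(find(i), []).append(i)
    return list(groups.values())

# Cells 0 and 1 interfere strongly; cell 2 is independent of both.
demo = correlation_groups([[0.0, 0.9, 0.0],
                           [0.9, 0.0, 0.0],
                           [0.0, 0.0, 0.0]], threshold=0.5)
assert sorted(map(sorted, demo)) == [[0, 1], [2]]
```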
Remote health monitoring system for detecting cardiac disorders.
Bansal, Ayush; Kumar, Sunil; Bajpai, Anurag; Tiwari, Vijay N; Nayak, Mithun; Venkatesan, Shankar; Narayanan, Rangavittal
2015-12-01
Remote health monitoring system with clinical decision support system as a key component could potentially quicken the response of medical specialists to critical health emergencies experienced by their patients. A monitoring system, specifically designed for cardiac care with electrocardiogram (ECG) signal analysis as the core diagnostic technique, could play a vital role in early detection of a wide range of cardiac ailments, from a simple arrhythmia to life threatening conditions such as myocardial infarction. The system that the authors have developed consists of three major components, namely, (a) mobile gateway, deployed on patient's mobile device, that receives 12-lead ECG signals from any ECG sensor, (b) remote server component that hosts algorithms for accurate annotation and analysis of the ECG signal and (c) point of care device of the doctor to receive a diagnostic report from the server based on the analysis of ECG signals. In the present study, the authors' focus has been on developing a system capable of detecting critical cardiac events well in advance using an advanced remote monitoring system. A system of this kind is expected to have applications ranging from tracking wellness/fitness to detection of symptoms leading to fatal cardiac events.
Monitoring Precipitation from Space: targeting Hydrology Community?
NASA Astrophysics Data System (ADS)
Hong, Y.; Turk, J.
2005-12-01
During the past decades, advances in space, sensor and computer technology have made it possible to estimate precipitation nearly globally from a variety of observations in a relatively direct manner. The success of the Tropical Rainfall Measuring Mission (TRMM) has enabled modern precipitation estimation algorithms to move toward daily quarter-degree measurements, while the need for precipitation data at temporal-spatial resolutions compatible with hydrologic modeling has been emphasized by the end user: the hydrology community. Can the future deployment of the Global Precipitation Measurement constellation of low-altitude orbiting satellites (covering 90% of the globe with a sampling interval of less than 3 hours), in conjunction with the existing suite of geostationary satellites, result in significant improvements in the scale and accuracy of precipitation estimates suitable for hydrology applications? This presentation will review the current state of satellite-derived precipitation estimation and demonstrate the early results and primary barriers to full global high-resolution precipitation coverage. An attempt to facilitate communication between data producers and users will be discussed by developing an 'end-to-end' uncertainty propagation analysis framework to quantify both the precipitation estimation error structure and the error influence on hydrological modeling.
DOT National Transportation Integrated Search
1997-01-01
This document reports on the formal evaluation of the targeted (limited but highly focused) deployment of the Advanced Driver and Vehicle Advisory Navigation ConcEpt (ADVANCE), an in-vehicle advanced traveler information system designed to provide sh...
NASA Technical Reports Server (NTRS)
Wang, Xu; Shi, Fang; Sigrist, Norbert; Seo, Byoung-Joon; Tang, Hong; Bikkannavar, Siddarayappa; Basinger, Scott; Lay, Oliver
2012-01-01
Large aperture telescopes commonly feature segmented mirrors, and a coarse phasing step is needed to bring these individual segments into the fine phasing capture range. Dispersed Fringe Sensing (DFS) is a powerful coarse phasing technique, and an adaptation of it is currently being used for JWST. An Advanced Dispersed Fringe Sensing (ADFS) algorithm was recently developed to improve the performance and robustness of previous DFS algorithms with better accuracy and a unique solution. The first part of the paper introduces the basic ideas and essential features of the ADFS algorithm and presents some algorithm sensitivity study results. The second part of the paper describes the full details of the algorithm validation process through the advanced wavefront sensing and correction testbed (AWCT): first, the optimization of the DFS hardware of AWCT to ensure data accuracy and reliability is illustrated. Then, a few carefully designed algorithm validation experiments are implemented, and the corresponding data analysis results are shown. Finally, the fiducial calibration using the Range-Gate-Metrology technique is carried out and a <10nm or <1% algorithm accuracy is demonstrated.
Nonlinear estimation for arrays of chemical sensors
NASA Astrophysics Data System (ADS)
Yosinski, Jason; Paffenroth, Randy
2010-04-01
Reliable detection of hazardous materials is a fundamental requirement of any national security program. Such materials can take a wide range of forms including metals, radioisotopes, volatile organic compounds, and biological contaminants. In particular, detection of hazardous materials in highly challenging conditions - such as in cluttered ambient environments, where complex collections of analytes are present, and with sensors lacking specificity for the analytes of interest - is an important part of a robust security infrastructure. Sophisticated single sensor systems provide good specificity for a limited set of analytes but often have cumbersome hardware and environmental requirements. On the other hand, simple, broadly responsive sensors are easily fabricated and efficiently deployed, but such sensors individually have neither the specificity nor the selectivity to address analyte differentiation in challenging environments. However, arrays of broadly responsive sensors can provide much of the sensitivity and selectivity of sophisticated sensors but without the substantial hardware overhead. Unfortunately, arrays of simple sensors are not without their challenges - the selectivity of such arrays can only be realized if the data is first distilled using highly advanced signal processing algorithms. In this paper we will demonstrate how the use of powerful estimation algorithms, based on those commonly used within the target tracking community, can be extended to the chemical detection arena. Herein our focus is on algorithms that not only provide accurate estimates of the mixture of analytes in a sample, but also provide robust measures of ambiguity, such as covariances.
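One minimal instance of estimation-with-ambiguity for a sensor array is ordinary least squares over a linear response model, which yields both a mixture estimate and its covariance. This is a deliberately simplified stand-in for the tracking-style estimators the paper applies; the sensitivity matrix below is hypothetical:

```python
import numpy as np

def estimate_mixture(S, r, sigma):
    """Least-squares analyte estimate for a linear response model
    r = S @ c + noise, with the covariance of the estimate returned
    as the ambiguity measure."""
    StS_inv = np.linalg.inv(S.T @ S)
    c_hat = StS_inv @ S.T @ r
    cov = sigma ** 2 * StS_inv
    return c_hat, cov

# Hypothetical 3-sensor, 2-analyte sensitivity matrix (not from the paper).
S = np.array([[1.0, 0.4],
              [0.3, 1.0],
              [0.8, 0.8]])
c_true = np.array([2.0, 1.0])
c_hat, cov = estimate_mixture(S, S @ c_true, sigma=0.1)  # noise-free demo
assert np.allclose(c_hat, c_true)
```

A larger covariance along one analyte direction signals exactly the kind of ambiguity the authors argue broadly responsive arrays must report.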
Simulation of Locking Space Truss Deployments for a Large Deployable Sparse Aperture Reflector
2015-03-01
Dr. Alan Jennings, for his unending patience with my struggles through this entire process. Without his expertise, guidance, and trust I would have...engineer since they are not automatically meshed. Fortunately, the mesh process is quite swift. Figure 13 shows both a linear hexahedral element as well...less than that of the serial process. Therefore, COMSOL’s partially parallelized algorithms will not be sped up as a function of cores added and is
Liu, Wen; Fu, Xiao; Deng, Zhongliang
2016-12-02
Indoor positioning technologies have boomed recently because of the growing commercial interest in indoor location-based service (ILBS). Due to the absence of satellite signals from the Global Navigation Satellite System (GNSS) indoors, various technologies have been proposed for indoor applications. Among them, Wi-Fi fingerprinting has been attracting much interest from researchers because of its pervasive deployment, flexibility and robustness to dense cluttered indoor environments. One challenge, however, is the deployment of Access Points (AP), which has a significant influence on system positioning accuracy. This paper concentrates on WLAN-based fingerprinting indoor location by analyzing the influence of AP deployment and studying the advantages of coordinate-based clustering compared to traditional RSS-based clustering. A coordinate-based clustering method for indoor fingerprinting location, named Smallest-Enclosing-Circle-based (SEC), is then proposed, aiming to reduce the positioning error arising from AP deployment and to improve robustness to dense cluttered environments. All measurements are conducted in indoor public areas, such as the National Center For the Performing Arts (as Test-bed 1) and the XiDan Joy City (Floors 1 and 2, as Test-bed 2), and results show that the SEC clustering algorithm can improve system positioning accuracy by about 32.7% for Test-bed 1, 71.7% for Test-bed 2 Floor 1 and 73.7% for Test-bed 2 Floor 2 compared with traditional RSS-based clustering algorithms such as K-means.
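A smallest-enclosing-circle computation is the geometric core of the SEC clustering idea. The sketch below uses a Ritter-style approximation rather than an exact algorithm, and is not the authors' implementation:

```python
import math

def enclosing_circle(points):
    """Ritter-style approximate smallest enclosing circle over a set
    of 2-D fingerprint coordinates."""
    p0 = points[0]
    p1 = max(points, key=lambda p: math.dist(p0, p))  # point far from p0
    p2 = max(points, key=lambda p: math.dist(p1, p))  # point far from p1
    cx, cy = (p1[0] + p2[0]) / 2, (p1[1] + p2[1]) / 2
    r = math.dist(p1, p2) / 2
    for x, y in points:
        d = math.dist((cx, cy), (x, y))
        if d > r:                      # grow the circle to cover the outlier
            r = (r + d) / 2
            cx += (d - r) / d * (x - cx)
            cy += (d - r) / d * (y - cy)
    return (cx, cy), r

center, radius = enclosing_circle([(0, 0), (2, 0), (1, 1), (0.5, 0.5)])
assert all(math.dist(center, p) <= radius + 1e-9
           for p in [(0, 0), (2, 0), (1, 1), (0.5, 0.5)])
```

Each growth step keeps every previously covered point inside the new circle, so a single pass suffices for a (slightly conservative) bound.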
DOT National Transportation Integrated Search
2016-11-01
The U.S. Department of Transportation Integrated Corridor Management (ICM) Initiative aims to advance the state of the practice in transportation corridor operations to manage congestion. Through the deployment of ICM at the two selected Demonstratio...
DOT National Transportation Integrated Search
2016-12-01
The U.S. Department of Transportation Integrated Corridor Management (ICM) Initiative aims to advance the state of the practice in transportation corridor operations to manage congestion. Through the deployment of ICM at the two selected Demonstratio...
2014-02-11
ISS038-E-045009 (11 Feb. 2014) --- The Small Satellite Orbital Deployer (SSOD), in the grasp of the Kibo laboratory robotic arm, is photographed by an Expedition 38 crew member on the International Space Station as it deploys a set of NanoRacks CubeSats. The CubeSats program contains a variety of experiments such as Earth observations and advanced electronics testing. Station solar array panels, Earth's horizon and the blackness of space provide the backdrop for the scene.
Advances in modeling aerodynamic decelerator dynamics.
NASA Technical Reports Server (NTRS)
Whitlock, C. H.
1973-01-01
The Viking entry vehicle uses a lines-first type of deployment in which the parachute, packed in a deployment bag, gets ejected rearward from the vehicle by a mortar. As the bag moves rearward, first the lines are unfurled and then the canopy. An analysis of the unfurling process is conducted, giving attention to longitudinal and rotational dynamics. It is shown that analytical modeling of aerodynamic systems provides significant information for a better understanding of the physics of the deployment process.
Energy Aware Clustering Algorithms for Wireless Sensor Networks
NASA Astrophysics Data System (ADS)
Rakhshan, Noushin; Rafsanjani, Marjan Kuchaki; Liu, Chenglian
2011-09-01
The sensor nodes deployed in wireless sensor networks (WSNs) are extremely power constrained, so maximizing the lifetime of the entire network is a primary design consideration. In wireless sensor networks, hierarchical network structures have the advantage of providing scalable and energy efficient solutions. In this paper, we investigate different clustering algorithms for WSNs and also compare these clustering algorithms based on metrics such as clustering distribution, cluster load balancing, Cluster Head (CH) selection strategy, CH role rotation, node mobility, cluster overlapping, intra-cluster communications, reliability, security and location awareness.
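As a concrete example of the CH-rotation strategies such surveys compare, the well-known LEACH election threshold can be sketched as follows (parameter names and the demo values are illustrative):

```python
import random

def leach_threshold(p, round_no):
    """LEACH-style election threshold T(n) = p / (1 - p*(r mod 1/p)),
    rotating the cluster-head role so each node serves about once
    every 1/p rounds."""
    return p / (1 - p * (round_no % round(1 / p)))

def elect_cluster_heads(node_ids, p, round_no, rng=None):
    """Each node independently becomes a CH with probability T(n)."""
    rng = rng or random.Random(42)
    t = leach_threshold(p, round_no)
    return [n for n in node_ids if rng.random() < t]

assert abs(leach_threshold(0.1, 0) - 0.1) < 1e-12
assert abs(leach_threshold(0.1, 9) - 1.0) < 1e-12  # last round of the cycle

heads = elect_cluster_heads(range(100), p=0.1, round_no=0)
assert set(heads) <= set(range(100))
```

The rising threshold toward the end of each 1/p-round cycle is what distributes the energy-hungry CH role evenly, one of the load-balancing metrics the survey compares.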
Robotic disaster recovery efforts with ad-hoc deployable cloud computing
NASA Astrophysics Data System (ADS)
Straub, Jeremy; Marsh, Ronald; Mohammad, Atif F.
2013-06-01
Autonomous operation of search and rescue (SaR) robots is an ill-posed problem, compounded by the dynamic disaster recovery environment. In a typical SaR response scenario, responder robots will require different levels of processing capability during various parts of the response effort and will need to utilize multiple algorithms. Placing these capabilities onboard the robot precludes algorithm-specific performance optimization and results in mediocre performance. An architecture for an ad hoc, deployable cloud environment suitable for use in a disaster response scenario is presented. Under this model, each service provider is optimized for its task and maintains a database of situation-relevant information. This service-oriented architecture (SOA 3.0) compliant framework also serves as an example of the efficient use of SOA 3.0 in an actual cloud application.
Model-Based Infrared Metrology for Advanced Technology Nodes and 300 mm Wafer Processing
NASA Astrophysics Data System (ADS)
Rosenthal, Peter A.; Duran, Carlos; Tower, Josh; Mazurenko, Alex; Mantz, Ulrich; Weidner, Peter; Kasic, Alexander
2005-09-01
The use of infrared spectroscopy for production semiconductor process monitoring has evolved recently from primarily unpatterned, i.e. blanket test wafer measurements in a limited historical application space of blanket epitaxial, BPSG, and FSG layers to new applications involving patterned product wafer measurements, and new measurement capabilities. Over the last several years, the semiconductor industry has adopted a new set of materials associated with copper/low-k interconnects, and new structures incorporating exotic materials including silicon germanium, SOI substrates and high aspect ratio trenches. The new device architectures and more chemically sophisticated materials have raised new process control and metrology challenges that are not addressed by current measurement technology. To address the challenges we have developed a new infrared metrology tool designed for emerging semiconductor production processes, in a package compatible with modern production and R&D environments. The tool incorporates recent advances in reflectance instrumentation including highly accurate signal processing, optimized reflectometry optics, and model-based calibration and analysis algorithms. To meet the production requirements of the modern automated fab, the measurement hardware has been integrated with a fully automated 300 mm platform incorporating front opening unified pod (FOUP) interfaces, automated pattern recognition and high throughput ultra clean robotics. The tool employs a suite of automated dispersion-model analysis algorithms capable of extracting a variety of layer properties from measured spectra. The new tool provides excellent measurement precision, tool matching, and a platform for deploying many new production and development applications. In this paper we will explore the use of model based infrared analysis as a tool for characterizing novel bottle capacitor structures employed in high density dynamic random access memory (DRAM) chips. 
We will explore the capability of the tool for characterizing multiple geometric parameters associated with the manufacturing process that are important to the yield and performance of advanced bottle DRAM devices.
Art concept of Magellan spacecraft and inertial upper stage (IUS) deployment
NASA Technical Reports Server (NTRS)
1988-01-01
Magellan spacecraft mounted on inertial upper stage drifts above Atlantis, Orbiter Vehicle (OV) 104, after its deployment during mission STS-30 in this artist concept. Solar panels are deployed and in OV-104's open payload bay (PLB) the airborne support equipment (ASE) is visible. Both spacecraft are orbiting the Earth. Magellan, named after the 16th century Portuguese explorer, will orbit Venus about once every three hours, acquiring radar data for 37 minutes of each orbit when it is closest to the surface. Using an advanced instrument called a synthetic aperture radar (SAR), it will map more than 90 per cent of the surface with resolution ten times better than the best from prior spacecraft. Magellan is managed by the Jet Propulsion Laboratory (JPL); Martin Marietta Aerospace is developing the spacecraft and Hughes Aircraft Company, the advanced imaging radar.
Collaborative Localization Algorithms for Wireless Sensor Networks with Reduced Localization Error
Sahoo, Prasan Kumar; Hwang, I-Shyan
2011-01-01
Localization is an important research issue in Wireless Sensor Networks (WSNs). Though the Global Positioning System (GPS) can be used to locate the position of the sensors, it is unfortunately limited to outdoor applications and is costly and power consuming. In order to find the location of sensor nodes without the help of GPS, collaboration among nodes is highly essential so that localization can be accomplished efficiently. In this paper, novel localization algorithms are proposed to find the possible location information of normal nodes in a collaborative manner for an outdoor environment with the help of a few beacon and anchor nodes. In our localization scheme, at most three beacon nodes need to collaborate to determine the accurate location information of any normal node. Besides, analytical methods are designed to calculate and reduce the localization error using a probability distribution function. Performance evaluation of our algorithm shows that there is a tradeoff between the number of deployed beacon nodes and the localization error, and that the average localization time of the network increases with the number of normal nodes deployed over a region. PMID:22163738
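The three-beacon collaboration can be illustrated by classical trilateration, in which subtracting pairs of range equations removes the quadratic terms and leaves a 2x2 linear system. This is a textbook sketch, not the paper's proposed algorithm:

```python
def trilaterate(b1, b2, b3, d1, d2, d3):
    """Position from ranges to three non-collinear beacons."""
    (x1, y1), (x2, y2), (x3, y3) = b1, b2, b3
    # Subtracting circle equations pairwise gives A x + B y = C, D x + E y = F.
    A = 2 * (x2 - x1); B = 2 * (y2 - y1)
    C = d1**2 - d2**2 - x1**2 + x2**2 - y1**2 + y2**2
    D = 2 * (x3 - x2); E = 2 * (y3 - y2)
    F = d2**2 - d3**2 - x2**2 + x3**2 - y2**2 + y3**2
    det = A * E - B * D          # zero iff the beacons are collinear
    return (C * E - B * F) / det, (A * F - C * D) / det

# Node at (1, 1), beacons at (0,0), (4,0), (0,4) with exact ranges.
x, y = trilaterate((0, 0), (4, 0), (0, 4), 2 ** 0.5, 10 ** 0.5, 10 ** 0.5)
assert abs(x - 1.0) < 1e-9 and abs(y - 1.0) < 1e-9
```

With noisy ranges the same linear system is typically solved in a least-squares sense, which connects naturally to the error analysis the paper performs.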
Impacts of an advanced public transportation system : demonstration project
DOT National Transportation Integrated Search
1999-10-01
In 1997, the Ann Arbor (Michigan) Transportation Authority began deploying a set of integrated : advanced public transportation system technologies in its vehicles, stations and control center. This paper summarizes selected findings of a multidimens...
Early deployment of ATMS/ATIS for metropolitan Detroit
DOT National Transportation Integrated Search
1994-09-26
The Michigan Department of Transportation (MDOT) is currently planning for the expansion of their current Advanced Traffic Management and Advanced Traveler Information Systems (ATMS and ATIS, respectively). Current ATMS and ATIS coverage include 3...
DOT National Transportation Integrated Search
1999-01-01
In 1997, the Ann Arbor (Michigan) Transportation Authority began deploying advanced public transportation systems (APTS) technologies in its fixed route and paratransit operations. The project's concept is the integration of a range of such technolog...
Salcedo-Sanz, S; Del Ser, J; Landa-Torres, I; Gil-López, S; Portilla-Figueras, J A
2014-01-01
This paper presents a novel bioinspired algorithm to tackle complex optimization problems: the coral reefs optimization (CRO) algorithm. The CRO algorithm artificially simulates a coral reef, where different corals (namely, solutions to the optimization problem considered) grow and reproduce in coral colonies, fighting by choking out other corals for space in the reef. This fight for space, along with the specific characteristics of the corals' reproduction, produces a robust metaheuristic algorithm shown to be powerful for solving hard optimization problems. In this research the CRO algorithm is tested in several continuous and discrete benchmark problems, as well as in practical application scenarios (i.e., optimum mobile network deployment and off-shore wind farm design). The obtained results confirm the excellent performance of the proposed algorithm and open a line of research for further application of the algorithm to real-world problems.
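A heavily simplified sketch of the CRO loop (a partly occupied reef, broadcast spawning, brooding, settlement by fitness competition, and depredation of the worst corals) might look like the following; all parameter values are illustrative and the real algorithm includes further operators:

```python
import random

def cro_minimize(f, dim, reef_size=20, rho0=0.6, generations=100,
                 broadcast_frac=0.9, depredation=0.1, seed=1):
    """Toy coral-reefs-optimization loop for continuous minimization."""
    rng = random.Random(seed)
    new_coral = lambda: [rng.uniform(-5, 5) for _ in range(dim)]
    reef = [new_coral() for _ in range(int(reef_size * rho0))]
    for _ in range(generations):
        larvae = []
        for coral in reef:
            if rng.random() < broadcast_frac:          # sexual reproduction (crossover)
                mate = rng.choice(reef)
                cut = rng.randrange(dim)
                larvae.append(coral[:cut] + mate[cut:])
            else:                                      # asexual brooding (mutation)
                larvae.append([g + rng.gauss(0, 0.3) for g in coral])
        for larva in larvae:                           # settlement: fight for space
            if len(reef) < reef_size:
                reef.append(larva)
            else:
                worst = max(range(len(reef)), key=lambda i: f(reef[i]))
                if f(larva) < f(reef[worst]):
                    reef[worst] = larva
        reef.sort(key=f)                               # depredation: drop the worst
        reef = reef[:max(2, int(len(reef) * (1 - depredation)))]
    return min(reef, key=f)

best = cro_minimize(lambda v: sum(g * g for g in v), dim=2)
assert sum(g * g for g in best) < 5.0
```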
Global Sensor Management: Military Asset Allocation
2009-10-06
solution (referred to as moves). A similar approach has been suggested by Zweben et al. (1993), who use a local-search-based metaheuristic, specifically...trapped in a local optimum. Hansen and Mladenovic (1998) describe the concept of variable neighborhood local search algorithms, and describe an...Mataric and G.S. Sukhatme (2002). “An incremental deployment algorithm for mobile robot teams,” Proceedings of the 2002 IEEE/RSJ Intl. Conference on
Lykiardopoulos, Byron; Hagström, Hannes; Fredrikson, Mats; Ignatova, Simone; Stål, Per; Hultcrantz, Rolf; Ekstedt, Mattias; Kechagias, Stergios
2016-01-01
Detection of advanced fibrosis (F3-F4) in nonalcoholic fatty liver disease (NAFLD) is important for ascertaining prognosis. Serum markers have been proposed as alternatives to biopsy. We attempted to develop a novel algorithm for the detection of advanced fibrosis based on a more efficient combination of serological markers and to compare this with established algorithms. We included 158 patients with biopsy-proven NAFLD. Of these, 38 had advanced fibrosis. The following fibrosis algorithms were calculated: NAFLD fibrosis score, BARD, NIKEI, NASH-CRN regression score, APRI, FIB-4, King's score, GUCI, Lok index, Forns score, and ELF. The study population was randomly divided into a training and a validation group. A multiple logistic regression analysis using bootstrapping methods was applied to the training group. Among the many variables analyzed, age, fasting glucose, hyaluronic acid and AST were included, and a model (LINKI-1) for predicting advanced fibrosis was created. Moreover, these variables were combined with platelet count in a mathematical form that exaggerates their opposing effects, and alternative models (LINKI-2) were also created. Models were compared using the area under the receiver operator characteristic curve (AUROC). Of the established algorithms, FIB-4 and King's score had the best diagnostic accuracy, with AUROCs of 0.84 and 0.83, respectively. Higher accuracy was achieved with the novel LINKI algorithms: the AUROC in the total cohort was 0.91 for LINKI-1 and 0.89 for the LINKI-2 models. The LINKI algorithms for the detection of advanced fibrosis in NAFLD showed better accuracy than established algorithms and should be validated in further studies including larger cohorts.
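As an example of the established serum algorithms compared above, the FIB-4 index is a simple closed-form score. The cut-offs below are commonly cited values for NAFLD, not thresholds from this study:

```python
import math

def fib4(age_years, ast, alt, platelets):
    """FIB-4 = (age [y] x AST [U/L]) / (platelets [10^9/L] x sqrt(ALT [U/L]))."""
    return (age_years * ast) / (platelets * math.sqrt(alt))

def fib4_category(score, low=1.30, high=2.67):
    """Commonly cited NAFLD cut-offs: advanced fibrosis is unlikely
    below `low` and likely above `high`; in between is indeterminate."""
    if score < low:
        return "advanced fibrosis unlikely"
    if score > high:
        return "advanced fibrosis likely"
    return "indeterminate"

# Hypothetical patients, for illustration only.
assert fib4_category(fib4(61, 80, 40, 150)) == "advanced fibrosis likely"
assert fib4_category(fib4(30, 20, 40, 250)) == "advanced fibrosis unlikely"
```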
Shuttle performance enhancements using an OMS payload bay kit
NASA Technical Reports Server (NTRS)
Templin, Kevin C.; Mallini, Charles J.
1991-01-01
The study focuses on the use of an orbital maneuvering system (OMS) payload bay kit (PBK) designed to utilize OMS tanks identical to those currently employed in the Orbiter OMS pods. Emphasis is placed on payload deployment capability and payload servicing/reboost capability augmentation from the point of view of payload mass, maximum deployment altitudes, and initial retrieval and final deployment altitudes. The deployment, servicing, and reboost requirements of the Hubble Space Telescope and the Advanced X-ray Astrophysics Facility are analyzed in order to show the benefits an OMS PBK can provide for these missions. It is shown that OMS PBKs can provide the required capability enhancement necessary to support deployment, reboost, and servicing of payloads requiring altitudes greater than 325 nautical miles.
Bio-Inspired Genetic Algorithms with Formalized Crossover Operators for Robotic Applications.
Zhang, Jie; Kang, Man; Li, Xiaojuan; Liu, Geng-Yang
2017-01-01
Genetic algorithms are widely adopted to solve optimization problems in robotic applications. In such safety-critical systems, it is vitally important to formally prove correctness when genetic algorithms are applied. This paper focuses on the formal modeling of crossover operations, which are among the most important operations in genetic algorithms. Specifically, we formalize, for the first time, crossover operations with higher-order logic based on HOL4, which is easy to deploy with its user-friendly programming environment. With correctness-guaranteed formalized crossover operations, we can safely apply them in robotic applications. We implement our technique to solve a path planning problem using a genetic algorithm with our formalized crossover operations, and the results show the effectiveness of our technique.
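A runnable sketch of the one-point crossover operation being formalized, with a gene-conservation property checked at runtime (the property shown is a plausible correctness condition, not necessarily the exact one verified in HOL4):

```python
import random

def one_point_crossover(parent_a, parent_b, rng=None):
    """One-point crossover: children swap tails after a random cut."""
    assert len(parent_a) == len(parent_b) >= 2
    rng = rng or random.Random(7)
    cut = rng.randrange(1, len(parent_a))       # cut strictly inside the chromosome
    child1 = parent_a[:cut] + parent_b[cut:]
    child2 = parent_b[:cut] + parent_a[cut:]
    # Conservation property: together the children hold exactly the parents' genes.
    assert sorted(child1 + child2) == sorted(parent_a + parent_b)
    return child1, child2

c1, c2 = one_point_crossover([1, 2, 3, 4], [5, 6, 7, 8])
assert len(c1) == len(c2) == 4
```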
AAFE large deployable antenna development program: Executive summary
NASA Technical Reports Server (NTRS)
1977-01-01
The large deployable antenna development program sponsored by the Advanced Applications Flight Experiments of the Langley Research Center is summarized. Projected user requirements for large diameter deployable reflector antennas were reviewed. Trade-off studies for the selection of a design concept for 10-meter diameter reflectors were made. A hoop/column concept was selected as the baseline concept. Parametric data are presented for 15-meter, 30-meter, and 100-meter diameters. A 1.82-meter diameter engineering model which demonstrated the feasibility of the concept is described.
Computing Generalized Matrix Inverse on Spiking Neural Substrate.
Shukla, Rohit; Khoram, Soroosh; Jorgensen, Erik; Li, Jing; Lipasti, Mikko; Wright, Stephen
2018-01-01
Emerging neural hardware substrates, such as IBM's TrueNorth Neurosynaptic System, can provide an appealing platform for deploying numerical algorithms. For example, a recurrent Hopfield neural network can be used to find the Moore-Penrose generalized inverse of a matrix, thus enabling a broad class of linear optimizations to be solved efficiently, at low energy cost. However, deploying numerical algorithms on hardware platforms that severely limit the range and precision of representation for numeric quantities can be quite challenging. This paper discusses these challenges and proposes a rigorous mathematical framework for reasoning about range and precision on such substrates. The paper derives techniques for normalizing inputs and properly quantizing synaptic weights originating from arbitrary systems of linear equations, so that solvers for those systems can be implemented in a provably correct manner on hardware-constrained neural substrates. The analytical model is empirically validated on the IBM TrueNorth platform, and results show that the guarantees provided by the framework for range and precision hold under experimental conditions. Experiments with optical flow demonstrate the energy benefits of deploying a reduced-precision and energy-efficient generalized matrix inverse engine on the IBM TrueNorth platform, reflecting 10× to 100× improvement over FPGA and ARM core baselines.
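For comparison with the Hopfield-network approach, a conventional iterative scheme for the Moore-Penrose inverse is the Ben-Israel-Cohen iteration; this sketch is a floating-point reference implementation, not the authors' spiking-substrate method:

```python
import numpy as np

def pinv_iterative(A, iters=60):
    """Ben-Israel-Cohen iteration X <- X (2I - A X), which converges
    quadratically to the Moore-Penrose inverse when started from
    X0 = a * A.T with a = 1 / (||A||_1 ||A||_inf)."""
    a = 1.0 / (np.linalg.norm(A, 1) * np.linalg.norm(A, np.inf))
    X = a * A.T
    eye = np.eye(A.shape[0])
    for _ in range(iters):
        X = X @ (2 * eye - A @ X)
    return X

A = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
assert np.allclose(pinv_iterative(A), np.linalg.pinv(A), atol=1e-8)
```

The range/precision concerns the paper raises become concrete here: the scaling constant `a` and the intermediate products are exactly the quantities that must fit the hardware's limited numeric representation.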
2014-02-13
ISS038-E-046586 (13 Feb. 2014) --- A set of NanoRacks CubeSats is photographed by an Expedition 38 crew member after the deployment by the NanoRacks Launcher attached to the end of the Japanese robotic arm. The CubeSats program contains a variety of experiments such as Earth observations and advanced electronics testing.
2014-02-13
ISS038-E-046579 (13 Feb. 2014) --- A set of NanoRacks CubeSats is photographed by an Expedition 38 crew member after the deployment by the NanoRacks Launcher attached to the end of the Japanese robotic arm. The CubeSats program contains a variety of experiments such as Earth observations and advanced electronics testing.
DOT National Transportation Integrated Search
1996-12-20
It is widely believed that barriers to an automated highway system (AHS) deployment are due more to institutional, economic, and legal issues than technology limitations. In order to sustain and accelerate the AHS deployment process, it is desirabl...
Evaluation of the advanced operating system of the Ann Arbor Transit Authority
DOT National Transportation Integrated Search
1999-10-01
These reports constitute an evaluation of the intelligent transportation system deployment efforts of the Ann Arbor Transportation Authority. These efforts, collectively termed "Advanced Operating System" (AOS), represent a vision of an integrated ad...
2017-01-01
The continuous technological advances in favor of mHealth represent a key factor in the improvement of medical emergency services. This systematic review presents the identification, study, and classification of the most up-to-date approaches surrounding the deployment of architectures for mHealth. Our review includes 25 articles obtained from databases such as IEEE Xplore, Scopus, SpringerLink, ScienceDirect, and SAGE. This review focused on studies addressing mHealth systems for outdoor emergency situations. In 60% of the articles, the deployment architecture relied on connective infrastructure associated with emergent technologies such as cloud services, distributed services, Internet-of-things, machine-to-machine, vehicular ad hoc network, and service-oriented architecture. In 40% of the articles, the deployment architecture for mHealth considered traditional connective infrastructure. Only 20% of the studies implemented an energy consumption protocol to extend system lifetime. We concluded that there is a need for more integrated solutions specifically for outdoor scenarios. Energy consumption protocols need to be implemented and evaluated. Emergent connective technologies are redefining information management and overcoming traditional technologies. PMID:29075430
Gonzalez, Enrique; Peña, Raul; Avila, Alfonso; Vargas-Rosales, Cesar; Munoz-Rodriguez, David
2017-01-01
The continuous technological advances in favor of mHealth represent a key factor in the improvement of medical emergency services. This systematic review presents the identification, study, and classification of the most up-to-date approaches surrounding the deployment of architectures for mHealth. Our review includes 25 articles obtained from databases such as IEEE Xplore, Scopus, SpringerLink, ScienceDirect, and SAGE. This review focused on studies addressing mHealth systems for outdoor emergency situations. In 60% of the articles, the deployment architecture relied on connective infrastructure associated with emergent technologies such as cloud services, distributed services, Internet-of-things, machine-to-machine, vehicular ad hoc network, and service-oriented architecture. In 40% of the articles, the deployment architecture for mHealth considered traditional connective infrastructure. Only 20% of the studies implemented an energy consumption protocol to extend system lifetime. We concluded that there is a need for more integrated solutions specifically for outdoor scenarios. Energy consumption protocols need to be implemented and evaluated. Emergent connective technologies are redefining information management and overcoming traditional technologies.
Advanced Commercial Buildings Initiative Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roberts, Sydney G.
The Southface Advanced Commercial Buildings Initiative has developed solutions to overcome market barriers to energy reductions in small commercial buildings by building on the success of four local and Southeast regional energy efficiency deployment programs. These programs address a variety of small commercial building types, efficiency levels, owners, facility manager skills and needs for financing. The deployment programs also reach critical private sector, utility, nonprofit and government submarkets, and have strong potential to be replicated at scale. During the grant period, 200 small commercial buildings participated in Southface-sponsored energy upgrade programs, saving 166,736,703 kBtu of source energy.
NASA Astrophysics Data System (ADS)
Tartakovsky, A.; Brown, A.; Brown, J.
The paper describes the development and evaluation of a suite of advanced algorithms which provide significantly-improved capabilities for finding, fixing, and tracking multiple ballistic and flying low observable objects in highly stressing cluttered environments. The algorithms have been developed for use in satellite-based staring and scanning optical surveillance suites for applications including theatre and intercontinental ballistic missile early warning, trajectory prediction, and multi-sensor track handoff for midcourse discrimination and intercept. The functions performed by the algorithms include electronic sensor motion compensation providing sub-pixel stabilization (to 1/100 of a pixel), as well as advanced temporal-spatial clutter estimation and suppression to below sensor noise levels, followed by statistical background modeling and Bayesian multiple-target track-before-detect filtering. The multiple-target tracking is performed in physical world coordinates to allow for multi-sensor fusion, trajectory prediction, and intercept. Output of detected object cues and data visualization are also provided. The algorithms are designed to handle a wide variety of real-world challenges. Imaged scenes may be highly complex and infinitely varied -- the scene background may contain significant celestial, earth limb, or terrestrial clutter. For example, when viewing combined earth limb and terrestrial scenes, a combination of stationary and non-stationary clutter may be present, including cloud formations, varying atmospheric transmittance and reflectance of sunlight and other celestial light sources, aurora, glint off sea surfaces, and varied natural and man-made terrain features. The targets of interest may also appear to be dim, relative to the scene background, rendering much of the existing deployed software useless for optical target detection and tracking. 
Additionally, it may be necessary to detect and track a large number of objects in the threat cloud, and these objects may not always be resolvable in individual data frames. In the present paper, the performance of the developed algorithms is demonstrated using real-world data containing resident space objects observed from the MSX platform, with backgrounds varying from celestial to combined celestial and earth limb, with instances of extremely bright aurora clutter. Simulation results are also presented for parameterized variations in signal-to-clutter levels (down to 1/1000) and signal-to-noise levels (down to 1/6) for simulated targets against real-world terrestrial clutter backgrounds. We also discuss algorithm processing requirements and C++ software processing capabilities from our on-going MDA- and AFRL-sponsored development of an image processing toolkit (iPTK). In the current effort, the iPTK is being developed to a Technology Readiness Level (TRL) of 6 by mid-2010, in preparation for possible integration with STSS-like, SBIRS high-like and SBSS-like surveillance suites.
Hipp, Jason D; Cheng, Jerome Y; Toner, Mehmet; Tompkins, Ronald G; Balis, Ulysses J
2011-02-26
Historically, effective clinical utilization of image analysis and pattern recognition algorithms in pathology has been hampered by two critical limitations: 1) the availability of digital whole slide imagery data sets and 2) a relative domain knowledge deficit in terms of application of such algorithms, on the part of practicing pathologists. With the advent of the recent and rapid adoption of whole slide imaging solutions, the former limitation has been largely resolved. However, with the expectation that it is unlikely for the general cohort of contemporary pathologists to gain advanced image analysis skills in the short term, the latter problem remains, thus underscoring the need for a class of algorithm that has the concurrent properties of image domain (or organ system) independence and extreme ease of use, without the need for specialized training or expertise. In this report, we present a novel, general case pattern recognition algorithm, Spatially Invariant Vector Quantization (SIVQ), that overcomes the aforementioned knowledge deficit. Fundamentally based on conventional Vector Quantization (VQ) pattern recognition approaches, SIVQ gains its superior performance and essentially zero-training workflow model from its use of ring vectors, which exhibit continuous symmetry, as opposed to square or rectangular vectors, which do not. By use of the stochastic matching properties inherent in continuous symmetry, a single ring vector can exhibit as much as a millionfold improvement in matching possibilities, as opposed to conventional VQ vectors. SIVQ was utilized to demonstrate rapid and highly precise pattern recognition capability in a broad range of gross and microscopic use-case settings.
With the performance of SIVQ observed thus far, we find evidence that indeed there exist classes of image analysis/pattern recognition algorithms suitable for deployment in settings where pathologists alone can effectively incorporate their use into clinical workflow, as a turnkey solution. We anticipate that SIVQ, and other related class-independent pattern recognition algorithms, will become part of the overall armamentarium of digital image analysis approaches that are immediately available to practicing pathologists, without the need for the immediate availability of an image analysis expert.
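The ring-vector idea can be illustrated with a small sketch: sample pixel values on a circle and score matches over all circular shifts, so a rotated copy of a pattern matches exactly. The sampling density and distance measure below are illustrative assumptions, not the published SIVQ implementation.

```python
import math

def ring_vector(image, cx, cy, radius, n_samples=32):
    """Sample gray values on a circle around (cx, cy) -- the ring-vector
    primitive the SIVQ abstract describes; nearest-neighbor sampling is
    an illustrative simplification."""
    vec = []
    for k in range(n_samples):
        theta = 2 * math.pi * k / n_samples
        x = int(round(cx + radius * math.cos(theta)))
        y = int(round(cy + radius * math.sin(theta)))
        vec.append(image[y][x])
    return vec

def ring_distance(a, b):
    # Best squared-error match over all circular shifts of b: a ring
    # matches a rotated copy of its pattern exactly, which is the
    # continuous-symmetry advantage over square vectors.
    n = len(b)
    return min(sum((x - b[(i + s) % n]) ** 2 for i, x in enumerate(a))
               for s in range(n))

print(ring_distance([1, 2, 3, 4], [3, 4, 1, 2]))  # 0: rotation-invariant
```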
NASA Astrophysics Data System (ADS)
Dehne, Hans J.
1991-05-01
NASA has initiated technology development programs to develop advanced solar dynamic power systems and components for space applications beyond 2000. Conceptual design work that was performed is described. The main efforts were: (1) conceptual design of a self-deploying, high-performance parabolic concentrator; and (2) materials selection for a lightweight, shape-stable concentrator. The deployment concept utilizes rigid gore-shaped reflective panels. The assembled concentrator takes an annular shape with a void in the center. This deployable concentrator concept is applicable to a range of solar dynamic power systems of 25 kWe to in excess of 75 kWe. The concept allows for a family of power system sizes all using the same packaging and deployment technique. The primary structural material selected for the concentrator is a polyether ether ketone (PEEK)/carbon fiber composite, also referred to as APC-2 or Victrex. This composite has a nearly neutral coefficient of thermal expansion, which leads to shape-stable characteristics under thermal gradient conditions. Substantial efforts were undertaken to produce a highly specular surface on the composite. The overall coefficient of thermal expansion of the composite laminate is near zero, but thermally induced stresses due to micro-movement of the fibers and matrix in relation to each other cause the surface to become nonspecular.
NASA Technical Reports Server (NTRS)
Dehne, Hans J.
1991-01-01
NASA has initiated technology development programs to develop advanced solar dynamic power systems and components for space applications beyond 2000. Conceptual design work that was performed is described. The main efforts were: (1) conceptual design of a self-deploying, high-performance parabolic concentrator; and (2) materials selection for a lightweight, shape-stable concentrator. The deployment concept utilizes rigid gore-shaped reflective panels. The assembled concentrator takes an annular shape with a void in the center. This deployable concentrator concept is applicable to a range of solar dynamic power systems of 25 kWe to in excess of 75 kWe. The concept allows for a family of power system sizes all using the same packaging and deployment technique. The primary structural material selected for the concentrator is a polyether ether ketone (PEEK)/carbon fiber composite, also referred to as APC-2 or Victrex. This composite has a nearly neutral coefficient of thermal expansion, which leads to shape-stable characteristics under thermal gradient conditions. Substantial efforts were undertaken to produce a highly specular surface on the composite. The overall coefficient of thermal expansion of the composite laminate is near zero, but thermally induced stresses due to micro-movement of the fibers and matrix in relation to each other cause the surface to become nonspecular.
Deployable System for Crash-Load Attenuation
NASA Technical Reports Server (NTRS)
Kellas, Sotiris; Jackson, Karen E.
2007-01-01
An externally deployable honeycomb structure is investigated with respect to crash energy management for light aircraft. The new concept utilizes an expandable honeycomb-like structure to absorb impact energy by crushing. Distinguished by flexible hinges between cell wall junctions that enable effortless deployment, the new energy absorber offers most of the desirable features of an external airbag system without the limitations of poor shear stability, system complexity, and timing sensitivity. Like conventional honeycomb, once expanded, the energy absorber is transformed into a crush efficient and stable cellular structure. Other advantages, afforded by the flexible hinge feature, include a variety of deployment options such as linear, radial, and/or hybrid deployment methods. Radial deployment is utilized when omnidirectional cushioning is required. Linear deployment offers better efficiency, which is preferred when the impact orientation is known in advance. Several energy absorbers utilizing different deployment modes could also be combined to optimize overall performance and/or improve system reliability as outlined in the paper. Results from a series of component and full scale demonstration tests are presented as well as typical deployment techniques and mechanisms. LS-DYNA analytical simulations of selected tests are also presented.
Communication of military couples during deployment predicting generalized anxiety upon reunion.
Knobloch, Leanne K; Knobloch-Fedders, Lynne M; Yorgason, Jeremy B
2018-02-01
This study draws on the emotional cycle of deployment model (Pincus, House, Christenson, & Adler, 2001) to consider how the valence of communication between military personnel and at-home partners during deployment predicts their generalized anxiety upon reunion. Online survey data were collected from 555 military couples (N = 1,110 individuals) once per month for 8 consecutive months beginning at homecoming. Dyadic growth curve modeling results indicated that people's anxiety declined across the transition. For at-home partners, constructive communication during deployment predicted a steeper decline in anxiety over time. For both returning service members and at-home partners, destructive communication during deployment predicted more anxiety upon reunion but a steeper decline in anxiety over time. Results were robust beyond the frequency of communication during deployment and a host of individual, relational, and military variables. These findings advance the emotional cycle of deployment model, highlight the importance of the valence of communication during deployment, and illuminate how the effects of communication during deployment can endure after military couples are reunited. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
An economic evaluation of alternative biofuel deployment scenarios in the USA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oladosu, Gbadebo
Energy market conditions have shifted dramatically since the USA renewable fuel standards (RFS1 in 2005; RFS2 in 2007) were enacted. The USA has transitioned from an increasing dependence on oil imports to abundant domestic oil production. In addition, increases in the use of ethanol, the main biofuel currently produced in the USA, are now limited by the blend wall constraint. Given this, the current study evaluates alternative biofuel deployment scenarios in the USA, accounting for changes in market conditions. The analysis is performed with a general equilibrium model that reflects the structure of the USA biofuel market as the transition to advanced biofuel begins. Results suggest that ethanol consumption would increase, albeit slowly, if current biofuel deployment rates of about 10% are maintained as persistently lower oil prices lead to a gradual increase in the consumption of liquid transportation fuels. Without the blend wall constraint, this study finds that the overall economic impact of a full implementation of the USA RFS2 policy is largely neutral before 2022. However, the economic impacts become slightly negative under the blend wall constraint since more expensive bio-hydrocarbons are needed to meet the RFS2 mandates. Results for a scenario with reduced advanced biofuel deployment based on current policy plans show near neutral economic impacts up to 2027. This scenario is also consistent with another scenario where the volume of bio-hydrocarbons deployed is reduced to adjust for its higher cost and energy content relative to deploying the mandated RFS2 advanced biofuel volumes as ethanol. The important role of technological change is demonstrated under pioneer and accelerated technology scenarios, with the latter leading to neutral or positive economic effects up to 2023 under most blend wall scenarios. Here, all scenarios evaluated in this study are found to have positive long-term economic benefits for the USA economy.
An economic evaluation of alternative biofuel deployment scenarios in the USA
Oladosu, Gbadebo
2017-05-03
Energy market conditions have shifted dramatically since the USA renewable fuel standards (RFS1 in 2005; RFS2 in 2007) were enacted. The USA has transitioned from an increasing dependence on oil imports to abundant domestic oil production. In addition, increases in the use of ethanol, the main biofuel currently produced in the USA, are now limited by the blend wall constraint. Given this, the current study evaluates alternative biofuel deployment scenarios in the USA, accounting for changes in market conditions. The analysis is performed with a general equilibrium model that reflects the structure of the USA biofuel market as the transition to advanced biofuel begins. Results suggest that ethanol consumption would increase, albeit slowly, if current biofuel deployment rates of about 10% are maintained as persistently lower oil prices lead to a gradual increase in the consumption of liquid transportation fuels. Without the blend wall constraint, this study finds that the overall economic impact of a full implementation of the USA RFS2 policy is largely neutral before 2022. However, the economic impacts become slightly negative under the blend wall constraint since more expensive bio-hydrocarbons are needed to meet the RFS2 mandates. Results for a scenario with reduced advanced biofuel deployment based on current policy plans show near neutral economic impacts up to 2027. This scenario is also consistent with another scenario where the volume of bio-hydrocarbons deployed is reduced to adjust for its higher cost and energy content relative to deploying the mandated RFS2 advanced biofuel volumes as ethanol. The important role of technological change is demonstrated under pioneer and accelerated technology scenarios, with the latter leading to neutral or positive economic effects up to 2023 under most blend wall scenarios. Here, all scenarios evaluated in this study are found to have positive long-term economic benefits for the USA economy.
Structural Design Considerations for a 50 kW-Class Solar Array for NASA's Asteroid Redirect Mission
NASA Technical Reports Server (NTRS)
Kerslake, Thomas W.; Kraft, Thomas G.; Yim, John T.; Le, Dzu K.
2016-01-01
NASA is planning an Asteroid Redirect Mission (ARM) to take place in the 2020s. To enable this multi-year mission, a 40 kW class solar electric propulsion (SEP) system powered by an advanced 50 kW class solar array will be required. Powered by the SEP module (SEPM), the ARM vehicle will travel to a large near-Earth asteroid, descend to its surface, capture a multi-metric ton (t) asteroid boulder, ascend from the surface and return to the Earth-moon system to ultimately place the ARM vehicle and its captured asteroid boulder into a stable distant orbit. During the years that follow, astronauts flying in the Orion multipurpose crew vehicle (MPCV) will dock with the ARM vehicle and conduct extra-vehicular activity (EVA) operations to explore and sample the asteroid boulder. This paper will review the top structural design considerations to successfully implement this 50 kW class solar array that must meet unprecedented performance levels. These considerations include beyond state-of-the-art metrics for specific mass, specific volume, deployed area, deployed solar array wing (SAW) keep in zone (KIZ), deployed strength and deployed frequency. Analytical and design results are presented that support definition of stowed KIZ and launch restraint interface definition. An offset boom is defined to meet the deployed SAW KIZ. The resulting parametric impact of the offset boom length on spacecraft moments of inertia and deployed SAW quasistatic and dynamic load cases are also presented. Load cases include ARM spacecraft thruster plume impingement, asteroid surface operations and Orion docking operations which drive the required SAW deployed strength and damping.
The authors conclude that to support NASA's ARM power needs, an advanced SAW is required with mass performance better than 125 W/kg, stowed volume better than 40 kW/cu m, a deployed area of 200 sq m (100 sq m for each of two SAWs), a deployed SAW offset distance of nominally 3-4 m, a deployed SAW quasistatic strength of nominally 0.1 g in any direction, a deployed loading displacement under 2 m, a deployed fundamental frequency above 0.1 Hz and deployed damping of at least 1%. These parameters must be met on top of challenging mission environments and ground testing requirements unique to the ARM project.
Code of Federal Regulations, 2013 CFR
2013-10-01
... Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL ACCESS TO ADVANCED COMMUNICATIONS SERVICES AND EQUIPMENT... question, including on the development and deployment of new communications technologies; (3) The type of... features, and offered at differing price points. (c) The term advanced communications services shall mean...
Code of Federal Regulations, 2012 CFR
2012-10-01
... Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL ACCESS TO ADVANCED COMMUNICATIONS SERVICES AND EQUIPMENT... question, including on the development and deployment of new communications technologies; (3) The type of... features, and offered at differing price points. (c) The term advanced communications services shall mean...
Code of Federal Regulations, 2014 CFR
2014-10-01
... Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL ACCESS TO ADVANCED COMMUNICATIONS SERVICES AND EQUIPMENT... question, including on the development and deployment of new communications technologies; (3) The type of... features, and offered at differing price points. (c) The term advanced communications services shall mean...
Freight Advanced Traveler Information System (FRATIS) impact assessment.
DOT National Transportation Integrated Search
2016-01-01
This report is an independent assessment of three prototype Freight Advanced Traveler Information System (FRATIS) tests at Los Angeles, Dallas/Fort Worth, and South Florida. The FRATIS technologies deployed at one or two drayage companies in each tes...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
Wind Power Today and Tomorrow is an annual publication that provides an overview of the wind research conducted under the U.S. Department of Energy's Wind and Hydropower Technologies Program. The purpose of Wind Power Today and Tomorrow is to show how DOE supports wind turbine research and deployment in hopes of furthering the advancement of wind technologies that produce clean, low-cost, reliable energy. Content objectives include: educate readers about the advantages and potential for widespread deployment of wind energy; explain the program's objectives and goals; describe the program's accomplishments in research and application; examine the barriers to widespread deployment; describe the benefits of continued research and development; facilitate technology transfer; and attract cooperative wind energy projects with industry. This 2003 edition of the program overview also includes discussions about wind industry growth in 2003, how DOE is taking advantage of low wind speed regions through advancing technology, and distributed applications for small wind turbines.
Rail integrity alert system (RIAS) feature discrimination : final report.
DOT National Transportation Integrated Search
2016-08-01
This report describes GE Global Research's research, in partnership with GE Transportation, into developing and deploying algorithms for a locomotive-based inductive sensing system that has a very high probability of detecting broken rails with v...
NASA Astrophysics Data System (ADS)
Kinzel, P. J.; Legleiter, C. J.; Nelson, J. M.
2009-12-01
Airborne bathymetric Light Detection and Ranging (LiDAR) systems designed for coastal and marine surveys are increasingly being deployed in fluvial environments. While the adaptation of this technology to rivers and streams would appear to be straightforward, technical challenges currently remain with regard to achieving high levels of vertical accuracy and precision when mapping bathymetry in shallow fluvial settings. Collectively, these mapping errors have a direct bearing on hydraulic model predictions made using these data. We compared channel surveys conducted along the Platte River, Nebraska, and the Trinity River, California, using conventional ground-based methods with those made with the hybrid topographic/bathymetric Experimental Advanced Airborne Research LiDAR (EAARL). In the turbid and braided Platte River, a bathymetric-waveform processing algorithm was shown to enhance the definition of thalweg channels over a more simplified, first-surface waveform processing algorithm. Consequently, flow simulations using data processed with the shallow bathymetric algorithm resulted in improved prediction of wetted area relative to the first-surface algorithm, when compared to the wetted area in concurrent aerial imagery. However, when compared to flow modeling using conventionally collected data, the inundation extent was over-predicted with the EAARL topography due to higher bed elevations measured by the LiDAR. In the relatively clear, meandering Trinity River, bathymetric processing algorithms were capable of defining a 3 meter deep pool. However, a similar bias in depth measurement was observed, with the LiDAR measuring the elevation of the river bottom above its actual position, resulting in a predicted water surface higher than that measured by field data.
This contribution addresses the challenge of making bathymetric measurements with the EAARL in different environmental conditions encountered in fluvial settings, explores technical issues related to reliably detecting the water surface and river bottom, and illustrates the impact of using LiDAR data and current processing techniques to produce above and below water topographic surfaces for hydraulic modeling and habitat applications.
Improvement of the SEP protocol based on community structure of node degree
NASA Astrophysics Data System (ADS)
Li, Donglin; Wei, Suyuan
2017-05-01
Analyzing the Stable Election Protocol (SEP) in wireless sensor networks, and aiming at SEP's problems of inhomogeneous cluster-head distribution, unreasonable cluster-head selection, and single-hop transmission, a SEP protocol based on community structure of node degree (SEP-CSND) is proposed. In this algorithm, network nodes are deployed using a grid deployment model, and connections between nodes are established by setting a communication threshold. A community structure is constructed from node degree, and cluster heads are then elected within the community structure. On the basis of SEP, the node's residual energy and node degree are added to cluster-head election. Information is transmitted between network nodes in multi-hop mode. Simulation experiments showed that, compared with the classical LEACH and SEP, this algorithm balances the energy consumption of the entire network and significantly prolongs the network lifetime.
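A minimal sketch of the kind of election rule described, assuming a simple multiplicative weighting of the classic SEP/LEACH rotation threshold by residual energy and normalized node degree (the abstract does not give SEP-CSND's exact formula, so the weighting here is an illustrative assumption):

```python
import random

def elect_cluster_heads(nodes, p=0.1, rnd=0, seed=1):
    """SEP-style probabilistic cluster-head election, extended with
    residual energy and node degree as weighting factors (illustrative,
    not the paper's exact formula)."""
    rng = random.Random(seed)
    max_deg = max(n["degree"] for n in nodes)
    # Classic LEACH/SEP rotation threshold for the current round
    threshold = p / (1 - p * (rnd % round(1 / p)))
    heads = []
    for n in nodes:
        # Favor nodes with more residual energy and higher degree
        weight = (n["energy"] / n["energy0"]) * (n["degree"] / max_deg)
        if rng.random() < threshold * weight:
            heads.append(n["id"])
    return heads

nodes = [{"id": i, "energy": 0.5 + 0.005 * i, "energy0": 1.0,
          "degree": 1 + i % 4} for i in range(100)]
heads = elect_cluster_heads(nodes, p=0.1, rnd=3)
```

Nodes with depleted batteries or few neighbors see a lower effective threshold, so cluster-head duty rotates toward well-connected, energy-rich nodes.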
Transactive Control of Commercial Building HVAC Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Corbin, Charles D.; Makhmalbaf, Atefe; Huang, Sen
This document details the development and testing of market-based transactive controls for building heating, ventilating and air conditioning (HVAC) systems. These controls are intended to serve the purposes of reducing electricity use through conservation, reducing peak building electric demand, and providing demand flexibility to assist with power system operations. This report is the summary of the first year of work conducted under Phase 1 of the Clean Energy and Transactive Campus Project. The methods and techniques described here were first investigated in simulation, and then subsequently deployed to a physical testbed on the Pacific Northwest National Laboratory (PNNL) campus for validation. In this report, we describe the models and control algorithms we have developed, testing of the control algorithms in simulation, and deployment to a physical testbed. Results from physical experiments support previous simulation findings, and provide insights for further improvement.
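A common building block of such market-based HVAC controls is a price-responsive setpoint curve: the cooling setpoint drifts upward (conserving) as the cleared price rises above its recent mean. The form and parameters below are illustrative assumptions, not the PNNL controllers described in the report.

```python
def transactive_cooling_setpoint(base, price, mean_price, std_price,
                                 k=2.0, band=3.0):
    # Shift the cooling setpoint up when the current price is above its
    # recent mean and down when below, clamped to a comfort band.
    # k sets price sensitivity; band is the maximum offset in degrees F.
    if std_price <= 0:
        return base
    shift = band * (price - mean_price) / (k * std_price)
    return base + max(-band, min(band, shift))

# At the mean price the setpoint is unchanged; at mean + 2*std (k=2)
# it is raised by the full 3-degree band.
print(transactive_cooling_setpoint(74.0, 0.05, 0.05, 0.01))  # 74.0
print(transactive_cooling_setpoint(74.0, 0.07, 0.05, 0.01))  # 77.0
```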
Research on virtual network load balancing based on OpenFlow
NASA Astrophysics Data System (ADS)
Peng, Rong; Ding, Lei
2017-08-01
A network based on OpenFlow technology separates the control module from the data forwarding module. Deploying a global load balancing strategy through the control plane's network view is fast and highly efficient. This paper proposes a weighted round-robin scheduling algorithm for virtual networks and a server load balancing scheme based on OpenFlow, taking into account the load of service nodes and the distribution of load balancing tasks.
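Weighted round-robin can be realized in several ways; a widely used variant is the smooth weighted round-robin popularized by nginx, sketched here as an illustration (the paper does not specify which variant it uses, and the server names and weights are hypothetical):

```python
class SmoothWeightedRR:
    """Smooth weighted round-robin scheduler (the nginx variant): each
    server accumulates credit equal to its weight every round, the server
    with the most credit is picked and charged the total weight."""
    def __init__(self, weights):
        self.weights = dict(weights)           # server -> static weight
        self.current = {s: 0 for s in weights}
    def next(self):
        total = sum(self.weights.values())
        for s, w in self.weights.items():      # grow each server's credit
            self.current[s] += w
        best = max(self.current, key=self.current.get)
        self.current[best] -= total            # charge the chosen server
        return best

wrr = SmoothWeightedRR({"a": 5, "b": 1, "c": 1})
picks = [wrr.next() for _ in range(7)]
print(picks)  # ['a', 'a', 'b', 'a', 'c', 'a', 'a'] -- 5:1:1 over one cycle
```

Unlike naive weighted round-robin, the smooth variant interleaves the heavy server's turns instead of emitting them in a burst, which spreads load more evenly.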
2016-09-01
... identification and tracking algorithm. Subject terms: unmanned ground vehicles, pure pursuit, vector field histogram, feature recognition. ... located within the various theaters of war. The pace for the development and deployment of unmanned ground vehicles (UGV) was, however, not keeping... The development and fielding of UGVs in an operational role are not a new concept on the battlefield. In
One Time Passwords in Everything (OPIE): Experiences with Building and Using Stringer Authentication
1995-01-01
opiepasswd(1). The name change brings it more in line with its UNIX counterpart passwd(1), which should make both programs easier to remember for users. ... char *passwd); int opiehash(char *x, unsigned algorithm). The one-time password schemes implemented in OPIE, as first described in [Hal94], compute a ... seed, passwd); while (sequence-- != 0) opiehash(result, algorithm); opiebtoe(result, words); Send words. ... 6 Deployment. Every machine that has
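The quoted C fragments outline OPIE's hash-chain computation. A simplified Python sketch of the same RFC 2289-style scheme follows; the folding and encoding details of the real opiehash/opiebtoe routines are only approximated (the SHA-1 variant, for example, also swaps byte order), so treat names and byte layouts as assumptions.

```python
import hashlib

def fold64(digest):
    # XOR-fold a message digest down to 64 bits, as RFC 2289 does for MD5
    # (16 bytes XOR-folded into 8). Simplified: no per-algorithm byte swaps.
    out = bytearray(8)
    for i, b in enumerate(digest):
        out[i % 8] ^= b
    return bytes(out)

def otp(seed, passphrase, sequence, algorithm="md5"):
    """Hash-chain one-time password in the spirit of OPIE/S/Key: hash
    seed+passphrase once, then fold and rehash `sequence` times --
    mirroring the quoted loop `while (sequence-- != 0) opiehash(...)`."""
    x = fold64(hashlib.new(algorithm, (seed + passphrase).encode()).digest())
    for _ in range(sequence):
        x = fold64(hashlib.new(algorithm, x).digest())
    return x

# The chain property lets a server verify otp(n-1) by hashing it once
# more and comparing against the stored otp(n).
prev = otp("ke1234", "secret pass", 98)
curr = otp("ke1234", "secret pass", 99)
print(fold64(hashlib.new("md5", prev).digest()) == curr)  # True
```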
Ontological Problem-Solving Framework for Dynamically Configuring Sensor Systems and Algorithms
Qualls, Joseph; Russomanno, David J.
2011-01-01
The deployment of ubiquitous sensor systems and algorithms has led to many challenges, such as matching sensor systems to compatible algorithms which are capable of satisfying a task. Compounding the challenges is the lack of the requisite knowledge models needed to discover sensors and algorithms and to subsequently integrate their capabilities to satisfy a specific task. A novel ontological problem-solving framework has been designed to match sensors to compatible algorithms to form synthesized systems, which are capable of satisfying a task and then assigning the synthesized systems to high-level missions. The approach designed for the ontological problem-solving framework has been instantiated in the context of a persistence surveillance prototype environment, which includes profiling sensor systems and algorithms to demonstrate proof-of-concept principles. Even though the problem-solving approach was instantiated with profiling sensor systems and algorithms, the ontological framework may be useful with other heterogeneous sensing-system environments. PMID:22163793
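The core matching step, pairing sensors with algorithms whose input requirements the sensors' outputs can satisfy, can be caricatured with set containment; the capability names below are hypothetical, and a real ontological framework would reason over richer class hierarchies than flat subset tests.

```python
def match(sensors, algorithms):
    """Toy capability matcher in the spirit of the ontological framework:
    a sensor is paired with any algorithm whose required inputs are a
    subset of the sensor's advertised outputs."""
    return [(s["name"], a["name"])
            for s in sensors for a in algorithms
            if a["requires"] <= s["provides"]]

sensors = [{"name": "profiler-1", "provides": {"silhouette", "timestamp"}}]
algorithms = [{"name": "classifier", "requires": {"silhouette"}},
              {"name": "tracker", "requires": {"position"}}]
print(match(sensors, algorithms))  # [('profiler-1', 'classifier')]
```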
CE-ACCE: The Cloud Enabled Advanced sCience Compute Environment
NASA Astrophysics Data System (ADS)
Cinquini, L.; Freeborn, D. J.; Hardman, S. H.; Wong, C.
2017-12-01
Traditionally, Earth Science data from NASA remote sensing instruments has been processed by building custom data processing pipelines (often based on a common workflow engine or framework) which are typically deployed and run on an internal cluster of computing resources. This approach has some intrinsic limitations: it requires each mission to develop and deploy a custom software package on top of the adopted framework; it makes use of dedicated hardware, network and storage resources, which must be specifically purchased, maintained and re-purposed at mission completion; and computing services cannot be scaled on demand beyond the capability of the available servers. More recently, the rise of Cloud computing, coupled with other advances in containerization technology (most prominently, Docker) and micro-services architecture, has enabled a new paradigm, whereby space mission data can be processed through standard system architectures, which can be seamlessly deployed and scaled on demand on either on-premises clusters or commercial Cloud providers. In this talk, we will present one such architecture named CE-ACCE ("Cloud Enabled Advanced sCience Compute Environment"), which we have been developing at the NASA Jet Propulsion Laboratory over the past year. CE-ACCE is based on the Apache OODT ("Object Oriented Data Technology") suite of services for full data lifecycle management, which are turned into a composable array of Docker images, and complemented by a plug-in model for mission-specific customization. We have applied this infrastructure to both flying and upcoming NASA missions, such as ECOSTRESS and SMAP, and demonstrated deployment on the Amazon Cloud, either using simple EC2 instances, or advanced AWS services such as Amazon Lambda and ECS (EC2 Container Services).
Arbitrated Quantum Signature with Hamiltonian Algorithm Based on Blind Quantum Computation
NASA Astrophysics Data System (ADS)
Shi, Ronghua; Ding, Wanting; Shi, Jinjing
2018-03-01
A novel arbitrated quantum signature (AQS) scheme is proposed, motivated by the Hamiltonian algorithm (HA) and blind quantum computation (BQC). The signature generation and verification algorithm is designed based on HA, which enables the scheme to rely less on computational complexity. It is unnecessary to recover the original messages when verifying signatures, since blind quantum computation is applied, which improves the simplicity and operability of our scheme. It is proved that the scheme can be deployed securely, and the extended AQS has extensive applications in E-payment systems, E-government, E-business, etc.
Department of Energy Recovery Act Investment in Biomass Technologies
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2010-11-01
The American Recovery and Reinvestment Act of 2009 (Recovery Act) provided more than $36 billion to the Department of Energy (DOE) to accelerate work on existing projects, undertake new and transformative research, and deploy clean energy technologies across the nation. Of this funding, $1029 million is supporting innovative work to advance biomass research, development, demonstration, and deployment.
MD PHEV/EV ARRA Project Data Collection and Reporting (Presentation)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walkowicz, K.; Ramroth, L.; Duran, A.
2012-01-01
This presentation describes a National Renewable Energy Laboratory project to collect and analyze commercial fleet deployment data from medium-duty plug-in hybrid electric and all-electric vehicles that were deployed using funds from the American Recovery and Reinvestment Act. This work supports the Department of Energy's Vehicle Technologies Program and its Advanced Vehicle Testing Activity.
Test Frame for Gravity Offload Systems
NASA Technical Reports Server (NTRS)
Murray, Alexander R.
2005-01-01
Advances in space telescope and aperture technology have created a need to launch larger structures into space. Traditional truss structures will be too heavy and bulky to be effectively used in the next generation of space-based structures. Large deployable structures are a possible solution: packaging deployable trusses greatly decreases the cargo volume of these large structures. The ultimate goal is to three-dimensionally measure a boom's deployment in simulated microgravity. This project outlines the construction of the test frame that supports a gravity offload system. The test frame is stable enough to hold the gravity offload system and does not interfere with deployment of, or vibrations in, the deployable test boom. The natural frequencies and stability of the frame were engineered in FEMAP. The test frame was developed to have natural frequencies that would not match the first two modes of the deployable beam. The frame was then modeled in Solidworks and constructed. The test frame constructed is a stable base to perform studies on deployable structures.
Development and Ground Testing of a Compactly Stowed Scalable Inflatably Deployed Solar Sail
NASA Technical Reports Server (NTRS)
Lichodziejewski, David; Derbes, Billy; Reinert, Rich; Belvin, Keith; Slade, Kara; Mann, Troy
2004-01-01
This paper discusses the solar sail design and outlines the interim accomplishments to advance the technology readiness level (TRL) of the subsystem from 3 toward a TRL of 6 in 2005. Under Phase II of the program, many component test articles have been fabricated and tested successfully. Most notably, an unprecedented section of the conically deployed rigidizable sail support beam, the heart of the inflatable rigidizable structure, has been deployed and tested in the NASA Goddard thermal vacuum chamber with good results. The development testing validated the beam packaging and deployment. The inflatable, conically deployed, sub-Tg rigidizable beam technology is now in the TRL 5-6 range. The fabricated masses and structural test results of our beam components have met predictions, and no changes to the mass estimates or design assumptions have been identified, adding great credibility to the design. Several quadrants of the Mylar sail have also been fabricated and successfully deployed, validating our design, manufacturing, and deployment techniques.
Accuracy of vaginal symptom self-diagnosis algorithms for deployed military women.
Ryan-Wenger, Nancy A; Neal, Jeremy L; Jones, Ashley S; Lowe, Nancy K
2010-01-01
Deployed military women have an increased risk for development of vaginitis due to extreme temperatures, primitive sanitation, hygiene and laundry facilities, and unavailable or unacceptable healthcare resources. The Women in the Military Self-Diagnosis (WMSD) and treatment kit was developed as a field-expedient solution to this problem. The primary study aims were to evaluate the accuracy of women's self-diagnosis of vaginal symptoms and eight diagnostic algorithms and to predict potential self-medication omission and commission error rates. Participants included 546 active duty, deployable Army (43.3%) and Navy (53.6%) women with vaginal symptoms who sought healthcare at troop medical clinics on base. In the clinic lavatory, women conducted a self-diagnosis using a sterile cotton swab to obtain vaginal fluid, a FemExam card to measure positive or negative pH and amines, and the investigator-developed WMSD Decision-Making Guide. Potential self-diagnoses were "bacterial infection" (bacterial vaginosis [BV] and/or trichomonas vaginitis [TV]), "yeast infection" (candida vaginitis [CV]), "no infection/normal," or "unclear." The Affirm VPIII laboratory reference standard was used to detect clinically significant amounts of vaginal fluid DNA for organisms associated with BV, TV, and CV. Women's self-diagnostic accuracy was 56% for BV/TV and 69.2% for CV. False positives would have led to a self-medication commission error rate of 20.3% for BV/TV and 8% for CV. Potential self-medication omission error rates due to false negatives were 23.7% for BV/TV and 24.8% for CV. The positive predictive value of the diagnostic algorithms ranged from 0% to 78.1% for BV/TV and 41.7% for CV. The algorithms were based on clinical diagnostic standards. The nonspecific nature of vaginal symptoms, mixed infections, and a faulty device intended to measure vaginal pH and amines explain why none of the algorithms reached the goal of 95% accuracy.
The next prototype of the WMSD kit will not include nonspecific vaginal signs and symptoms in favor of recently available point-of-care devices that identify antigens or enzymes of the causative BV, TV, and CV organisms.
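The accuracy, error-rate, and predictive-value figures reported above are standard confusion-matrix summaries. A minimal sketch of one plausible computation; the denominators here are assumed to be all tested cases, and the study's exact definitions may differ:

```python
def diagnostic_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
    # tp = self-test positive and reference-standard positive, etc.
    total = tp + fp + tn + fn
    return {
        # agreement between self-diagnosis and the reference standard
        "accuracy": (tp + tn) / total,
        # false positives: would self-medicate without an infection
        "commission_error_rate": fp / total,
        # false negatives: infections the self-test would miss
        "omission_error_rate": fn / total,
        # positive predictive value of a positive self-diagnosis
        "ppv": tp / (tp + fp) if (tp + fp) else 0.0,
    }

m = diagnostic_metrics(tp=40, fp=10, tn=40, fn=10)
assert m["accuracy"] == 0.8 and m["ppv"] == 0.8
```

The counts in the example are illustrative, not the study's data.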
NASA Astrophysics Data System (ADS)
Busanelli, Stefano; Martalò, Marco; Ferrari, Gianluigi; Spigoni, Giovanni; Iotti, Nicola
In this paper, we analyze the performance of vertical handover (VHO) algorithms for seamless mobility between WiFi and UMTS networks. We focus on a no-coupling scenario, characterized by the lack of any form of cooperation between the involved players (users and network operators). In this context, we first propose a low-complexity Received Signal Strength Indicator (RSSI)-based algorithm, and then an improved hybrid RSSI/goodput version. We present experimental results based on the implementation of a real testbed with commercial WiFi (Guglielmo) and UMTS (Telecom Italia) deployed networks. Despite the relatively long handover times experienced in our testbed, the proposed RSSI-based VHO algorithm guarantees an effective goodput increase at the mobile terminals (MTs). Moreover, this algorithm mitigates the ping-pong phenomenon.
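A hybrid RSSI/goodput handover decision with hysteresis, of the kind described above, can be sketched as follows. This is a toy sketch: the thresholds, the hysteresis margin, and the function name are illustrative assumptions, not the paper's parameters.

```python
def decide_handover(current: str, rssi_wifi_dbm: float,
                    goodput_mbps: float,
                    rssi_threshold: float = -85.0,
                    goodput_threshold: float = 1.0,
                    hysteresis_db: float = 5.0) -> str:
    """Return the network to use next ("WiFi" or "UMTS")."""
    if current == "UMTS":
        # Switch to WiFi only when its signal clears the threshold plus
        # a hysteresis margin, damping ping-pong oscillations near the
        # coverage boundary.
        if rssi_wifi_dbm > rssi_threshold + hysteresis_db:
            return "WiFi"
    else:  # currently on WiFi
        # Fall back to UMTS when the WiFi signal fades, or (hybrid
        # variant) when the measured goodput collapses even though the
        # signal still looks acceptable.
        if rssi_wifi_dbm < rssi_threshold or goodput_mbps < goodput_threshold:
            return "UMTS"
    return current

assert decide_handover("UMTS", -75.0, 5.0) == "WiFi"
assert decide_handover("WiFi", -95.0, 5.0) == "UMTS"
assert decide_handover("UMTS", -84.0, 5.0) == "UMTS"  # inside hysteresis band
```

The asymmetric thresholds are the essential design point: the condition for entering WiFi is stricter than the condition for staying on it.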
Developing an area-wide system for coordinated ramp meter control.
DOT National Transportation Integrated Search
2008-12-01
Ramp metering has been broadly accepted and deployed as an effective countermeasure : against both recurrent and non-recurrent congestion on freeways. However, many current ramp : metering algorithms tend to improve only freeway travels using local d...
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1993-09-15
This report contains an extensive evaluation of GE advanced boiling water reactor plants prepared for the United States Department of Energy. The general areas covered in this report are: core and system performance; fuel cycle; infrastructure and deployment; and safety and environmental approval.
A simple method for verifying the deployment of the TOMS-EP solar arrays
NASA Technical Reports Server (NTRS)
Koppersmith, James R.; Ketchum, Eleanor
1995-01-01
The Total Ozone Mapping Spectrometer-Earth Probe (TOMS-EP) mission relies upon a successful deployment of the spacecraft's solar arrays. Several methods of verification are being employed to ascertain the solar array deployment status, with each requiring differing amounts of data. This paper describes a robust attitude-independent verification method that utilizes telemetry from the coarse Sun sensors (CSS's) and the three-axis magnetometers (TAM's) to determine the solar array deployment status - and it can do so with only a few, not necessarily contiguous, points of data. The method developed assumes that the solar arrays are deployed. Telemetry data from the CSS and TAM are converted to the Sun and magnetic field vectors in spacecraft body coordinates, and the angle between them is calculated. Deployment is indicated if this angle is within a certain error tolerance of the angle between the reference Sun and magnetic field vectors. Although several other methods can indicate a non-deployed state, with this method there is a 70% confidence level in confirming deployment as well as a nearly 100% certainty in confirming a non-deployed state. In addition, the spacecraft attitude (which is not known during the first orbit after launch) is not needed for this algorithm because the angle between the Sun and magnetic field vectors is independent of the spacecraft attitude. This technique can be applied to any spacecraft with a TAM and with CSS's mounted on the solar array(s).
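The attitude-independence argument above can be stated directly: the angle between two physical vectors is invariant under the unknown body-to-reference rotation, so comparing the body-frame separation angle (from CSS and TAM telemetry) with the reference-frame angle requires no attitude solution. A minimal sketch; the vector arguments and the 5-degree tolerance are illustrative:

```python
import math

def angle_deg(u, v):
    # Angle between two 3-vectors, in degrees.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    c = max(-1.0, min(1.0, dot / (nu * nv)))  # clamp rounding error
    return math.degrees(math.acos(c))

def deployment_consistent(sun_body, mag_body, sun_ref, mag_ref, tol_deg=5.0):
    # If the CSS geometry assumed when converting telemetry (arrays
    # deployed) is correct, the body-frame Sun/field angle matches the
    # reference-frame angle regardless of spacecraft attitude.
    return abs(angle_deg(sun_body, mag_body) - angle_deg(sun_ref, mag_ref)) <= tol_deg

# Consistent case: both frames see a 90-degree separation.
assert deployment_consistent((1, 0, 0), (0, 1, 0), (0, 0, 1), (0, 1, 0))
# Inconsistent case: 0 degrees in the body frame vs 90 in reference.
assert not deployment_consistent((1, 0, 0), (1, 0, 0), (1, 0, 0), (0, 1, 0))
```

As the abstract notes, a match only confirms deployment to a confidence level, since a stowed-array geometry can occasionally produce a consistent angle by chance.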
NASA Astrophysics Data System (ADS)
Moses, J. F.; Jain, P.; Johnson, J.; Doiron, J. A.
2017-12-01
New Earth observation instruments are planned to enable advancements in Earth science research over the next decade. Diversity of Earth observing instruments and their observing platforms will continue to increase as new instrument technologies emerge and are deployed as part of National programs such as Joint Polar Satellite System (JPSS), Geostationary Operational Environmental Satellite system (GOES), Landsat as well as the potential for many CubeSat and aircraft missions. The practical use and value of these observational data often extends well beyond their original purpose. The practicing community needs intuitive and standardized tools to enable quick unfettered development of tailored products for specific applications and decision support systems. However, the associated data processing system can take years to develop and requires inherent knowledge and the ability to integrate increasingly diverse data types from multiple sources. This paper describes the adaptation of a large-scale data processing system built for supporting JPSS algorithm calibration and validation (Cal/Val) node to a simplified science data system for rapid application. The new configurable data system reuses scalable JAVA technologies built for the JPSS Government Resource for Algorithm Verification, Independent Test, and Evaluation (GRAVITE) system to run within a laptop environment and support product generation and data processing of AURA Ozone Monitoring Instrument (OMI) science products. Of particular interest are the root requirements necessary for integrating experimental algorithms and Hierarchical Data Format (HDF) data access libraries into a science data production system. This study demonstrates the ability to reuse existing Ground System technologies to support future missions with minimal changes.
Arsalan, Muhammad; Naqvi, Rizwan Ali; Kim, Dong Seop; Nguyen, Phong Ha; Owais, Muhammad; Park, Kang Ryoung
2018-01-01
The recent advancements in computer vision have opened new horizons for deploying biometric recognition algorithms in mobile and handheld devices. Similarly, accurate iris recognition is now much needed in unconstrained scenarios. These environments make the acquired iris image exhibit occlusion, low resolution, blur, unusual glint, ghost effects, and off-angle views. The prevailing segmentation algorithms cannot cope with these constraints. In addition, in the absence of near-infrared (NIR) illumination, iris segmentation in visible-light environments is further challenged by visible-light noise. Deep learning with convolutional neural networks (CNN) has brought a considerable breakthrough in various applications. To address the iris segmentation issues in challenging situations by visible light and near-infrared light camera sensors, this paper proposes a densely connected fully convolutional network (IrisDenseNet), which can determine the true iris boundary even with inferior-quality images by using better information gradient flow between the dense blocks. In the experiments conducted, five datasets of visible light and NIR environments were used. For visible light environment, noisy iris challenge evaluation part-II (NICE-II, selected from the UBIRIS.v2 database) and mobile iris challenge evaluation (MICHE-I) datasets were used. For NIR environment, the institute of automation, Chinese academy of sciences (CASIA) v4.0 interval, CASIA v4.0 distance, and IIT Delhi v1.0 iris datasets were used. Experimental results showed the optimal segmentation of the proposed IrisDenseNet and its excellent performance over existing algorithms for all five datasets. PMID:29748495
Jaïdi, Faouzi; Labbene-Ayachi, Faten; Bouhoula, Adel
2016-12-01
Nowadays, e-healthcare is a main advancement and upcoming technology in healthcare industry that contributes to setting up automated and efficient healthcare infrastructures. Unfortunately, several security aspects remain as main challenges towards secure and privacy-preserving e-healthcare systems. From the access control perspective, e-healthcare systems face several issues due to the necessity of defining (at the same time) rigorous and flexible access control solutions. This delicate and irregular balance between flexibility and robustness has an immediate impact on the compliance of the deployed access control policy. To address this issue, the paper defines a general framework to organize thinking about verifying, validating and monitoring the compliance of access control policies in the context of e-healthcare databases. We study the problem of the conformity of low level policies within relational databases and we particularly focus on the case of a medical-records management database defined in the context of a Medical Information System. We propose an advanced solution for deploying reliable and efficient access control policies. Our solution extends the traditional lifecycle of an access control policy and allows mainly managing the compliance of the policy. We refer to an example to illustrate the relevance of our proposal.
2014-12-01
Introduction. 1.1 Background: In today's world of high-tech warfare, we have developed the ability to deploy virtually any type of ordnance quickly and ... TEMPORALLY ADJUSTED COMPLEX AMBIGUITY ... at this time due to time constraints and the high computational complexity involved in the current implementation of the Moss algorithm. Full maps, with ...
Computer-automated evolution of an X-band antenna for NASA's Space Technology 5 mission.
Hornby, Gregory S; Lohn, Jason D; Linden, Derek S
2011-01-01
Whereas the current practice of designing antennas by hand is severely limited because it is both time and labor intensive and requires a significant amount of domain knowledge, evolutionary algorithms can be used to search the design space and automatically find novel antenna designs that are more effective than would otherwise be developed. Here we present our work in using evolutionary algorithms to automatically design an X-band antenna for NASA's Space Technology 5 (ST5) spacecraft. Two evolutionary algorithms were used: the first uses a vector of real-valued parameters and the second uses a tree-structured generative representation for constructing the antenna. The highest-performance antennas from both algorithms were fabricated and tested and both outperformed a hand-designed antenna produced by the antenna contractor for the mission. Subsequent changes to the spacecraft orbit resulted in a change in requirements for the spacecraft antenna. By adjusting our fitness function we were able to rapidly evolve a new set of antennas for this mission in less than a month. One of these new antenna designs was built, tested, and approved for deployment on the three ST5 spacecraft, which were successfully launched into space on March 22, 2006. This evolved antenna design is the first computer-evolved antenna to be deployed for any application and is the first computer-evolved hardware in space.
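As a flavor of the evolutionary search described above, a minimal (1+1) evolution strategy over a real-valued parameter vector looks like this. This is a toy sketch on a scalar test function, not the ST5 antenna encoding or its electromagnetic fitness simulator:

```python
import random

def evolve(fitness, init, sigma=0.1, generations=300):
    # (1+1)-ES: mutate the single parent with Gaussian noise and keep
    # the child only if it is no worse. Real antenna evolution replaces
    # this toy fitness with an electromagnetic simulation of gain.
    best = list(init)
    best_f = fitness(best)
    for _ in range(generations):
        child = [x + random.gauss(0.0, sigma) for x in best]
        f = fitness(child)
        if f >= best_f:
            best, best_f = child, f
    return best, best_f

random.seed(1)
# Toy objective: maximize -(x - 1)^2, optimum at x = 1.
best, best_f = evolve(lambda v: -(v[0] - 1.0) ** 2, [0.0])
assert best_f > -1.0           # strictly better than the start
assert abs(best[0] - 1.0) < 0.5
```

Re-running with a different fitness function is what "adjusting our fitness function" amounts to here: the search loop is unchanged, only the objective moves.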
Development of the HERMIES III mobile robot research testbed at Oak Ridge National Laboratory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Manges, W.W.; Hamel, W.R.; Weisbin, C.R.
1988-01-01
The latest robot in the Hostile Environment Robotic Machine Intelligence Experiment Series (HERMIES) is now under development at the Center for Engineering Systems Advanced Research (CESAR) in the Oak Ridge National Laboratory. The HERMIES III robot incorporates a larger-than-human-size 7-degree-of-freedom manipulator mounted on a 2-degree-of-freedom mobile platform including a variety of sensors and computers. The deployment of this robot represents a significant increase in research capabilities for the CESAR laboratory. The initial on-board computer capacity of the robot exceeds that of 20 Vax 11/780s. The navigation and vision algorithms under development make extensive use of the on-board NCUBE hypercube computer while the sensors are interfaced through five VME computers running the OS-9 real-time, multitasking operating system. This paper describes the motivation, key issues, and detailed design trade-offs of implementing the first phase (basic functionality) of the HERMIES III robot. 10 refs., 7 figs.
Turbulence profiling for adaptive optics tomographic reconstructors
NASA Astrophysics Data System (ADS)
Laidlaw, Douglas J.; Osborn, James; Wilson, Richard W.; Morris, Timothy J.; Butterley, Timothy; Reeves, Andrew P.; Townson, Matthew J.; Gendron, Éric; Vidal, Fabrice; Morel, Carine
2016-07-01
To approach optimal performance, advanced Adaptive Optics (AO) systems deployed on ground-based telescopes must have accurate knowledge of atmospheric turbulence as a function of altitude. Stereo-SCIDAR is a high-resolution stereoscopic instrument dedicated to this measurement. Here, its profiles are directly compared to internal AO telemetry atmospheric profiling techniques for CANARY (Vidal et al. 2014), a Multi-Object AO (MOAO) pathfinder on the William Herschel Telescope (WHT), La Palma. In total twenty datasets are analysed across July and October of 2014. Levenberg-Marquardt fitting algorithms dubbed Direct Fitting and Learn 2 Step (L2S; Martin 2014) are used in the recovery of profile information via covariance matrices, respectively attaining average Pearson product-moment correlation coefficients with stereo-SCIDAR of 0.2 and 0.74. By excluding the measure of covariance between orthogonal Wavefront Sensor (WFS) slopes these results have revised values of 0.65 and 0.2. A data analysis technique that combines L2S and SLODAR is subsequently introduced that achieves a correlation coefficient of 0.76.
The Development of Point Doppler Velocimeter Data Acquisition and Processing Software
NASA Technical Reports Server (NTRS)
Cavone, Angelo A.
2008-01-01
In order to develop efficient and quiet aircraft and validate Computational Fluid Dynamics predictions, aerodynamic researchers require flow parameter measurements to characterize flow fields about wind tunnel models and jet flows. A one-component Point Doppler Velocimeter (pDv), a non-intrusive, laser-based instrument, was constructed using a design/develop/test/validate/deploy approach. A primary component of the instrument is the software required for system control/management and data collection/reduction. This software, along with evaluation algorithms, advanced pDv from a laboratory curiosity to a production-level instrument. Simultaneous pDv and pitot probe velocity measurements obtained at the centerline of a flow exiting a two-inch jet matched within 0.4%. Flow turbulence spectra obtained with pDv and a hot-wire detected the primary and secondary harmonics, produced by the fan driving the flow, with equal dynamic range. Novel hardware and software methods were developed, tested and incorporated into the system to eliminate and/or minimize error sources and improve system reliability.
NASA Technical Reports Server (NTRS)
Withrow, Colleen A.; Reveley, Mary S.
2015-01-01
The Aviation Safety Program (AvSP) System-Wide Safety and Assurance Technologies (SSAT) Project asked the AvSP Systems and Portfolio Analysis Team to identify SSAT-related trends. SSAT had four technical challenges: advance safety assurance to enable deployment of NextGen systems; automated discovery of precursors to aviation safety incidents; increasing safety of human-automation interaction by incorporating human performance; and prognostic algorithm design for safety assurance. This report reviews incident data from the NASA Aviation Safety Reporting System (ASRS) for system-component-failure-or-malfunction- (SCFM-) related and human-factor-related incidents for commercial or cargo air carriers (Part 121), commuter airlines (Part 135), and general aviation (Part 91). The data was analyzed by Federal Aviation Regulations (FAR) part, phase of flight, SCFM category, human factor category, and a variety of anomalies and results. There were 38,894 SCFM-related incidents and 83,478 human-factor-related incidents analyzed between January 1993 and April 2011.
Advancing Partnerships Towards an Integrated Approach to Oil Spill Response
NASA Astrophysics Data System (ADS)
Green, D. S.; Stough, T.; Gallegos, S. C.; Leifer, I.; Murray, J. J.; Streett, D.
2015-12-01
Oil spills can cause enormous ecological and economic devastation, necessitating application of the best science and technology available, and remote sensing is playing a growing critical role in the detection and monitoring of oil spills, as well as facilitating validation of remote sensing oil spill products. The FOSTERRS (Federal Oil Science Team for Emergency Response Remote Sensing) interagency working group seeks to ensure that during an oil spill, remote sensing assets (satellite/aircraft/instruments) and analysis techniques are quickly, effectively, appropriately, and seamlessly available to oil spill responders. Yet significant challenges remain for addressing oils spanning a vast range of chemical properties that may be spilled from the Tropics to the Arctic, with algorithms and scientific understanding needing advances to keep up with technology. Thus, FOSTERRS promotes enabling scientific discovery to ensure robust utilization of available technology as well as identifying technologies moving up the TRL (Technology Readiness Level). A recent FOSTERRS-facilitated support activity involved deployment of the AVIRIS NG (Airborne Visual Infrared Imaging Spectrometer - Next Generation) during the Santa Barbara oil spill to validate the potential of airborne hyperspectral imaging to map beach tar coverage in real time, supported by surface validation data. Many developing airborne technologies have the potential to transition to space-based platforms, providing global readiness.
General advancing front packing algorithm for the discrete element method
NASA Astrophysics Data System (ADS)
Morfa, Carlos A. Recarey; Pérez Morales, Irvin Pablo; de Farias, Márcio Muniz; de Navarra, Eugenio Oñate Ibañez; Valera, Roberto Roselló; Casañas, Harold Díaz-Guzmán
2018-01-01
A generic formulation of a new method for packing particles is presented. It is based on a constructive advancing front method, and uses Monte Carlo techniques for the generation of particle dimensions. The method can be used to obtain virtual dense packings of particles with several geometrical shapes. It employs continuous, discrete, and empirical statistical distributions in order to generate the dimensions of particles. The packing algorithm is very flexible and allows alternatives for: 1—the direction of the advancing front (inwards or outwards), 2—the selection of the local advancing front, 3—the method for placing a mobile particle in contact with others, and 4—the overlap checks. The algorithm also allows obtaining highly porous media when it is slightly modified. The use of the algorithm to generate real particle packings from grain size distribution curves, in order to carry out engineering applications, is illustrated. Finally, basic applications of the algorithm, which prove its effectiveness in the generation of a large number of particles, are carried out.
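Alternative 3 above, placing a mobile particle in contact with others, reduces for discs in 2-D to intersecting two inflated circles. A minimal sketch; the function name and the external-tangency assumption are illustrative, and the paper's method is generic in particle shape and dimension:

```python
import math

def tangent_center(c1, r1, c2, r2, r):
    # Centre of a new disc of radius r placed externally tangent to
    # discs (c1, r1) and (c2, r2): intersect the circles of radii
    # r1 + r and r2 + r about c1 and c2.
    d = math.dist(c1, c2)
    a, b = r1 + r, r2 + r
    if d > a + b or d < abs(a - b):
        return None  # the two tangency circles do not intersect
    # Standard circle-circle intersection along the c1 -> c2 axis.
    x = (d * d + a * a - b * b) / (2.0 * d)
    h = math.sqrt(max(a * a - x * x, 0.0))
    ux, uy = (c2[0] - c1[0]) / d, (c2[1] - c1[1]) / d
    # One of the two symmetric candidates; the other mirrors across
    # the c1-c2 line, matching the inward/outward front alternatives.
    return (c1[0] + x * ux - h * uy, c1[1] + x * uy + h * ux)

# Unit discs at (0,0) and (4,0): a unit disc fits exactly between them.
assert tangent_center((0, 0), 1.0, (4, 0), 1.0, 1.0) == (2.0, 0.0)
```

An advancing-front packer repeats this step: draw a radius from the chosen statistical distribution, place the particle against the local front, run the overlap check, and advance the front.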
Advances in Mechanical Architectures of Large Precision Space Apertures
NASA Astrophysics Data System (ADS)
Datashvili, Leri; Maghaldadze, Nikoloz; Endler, Stephan; Pauw, Julian; He, Peng; Baier, Horst; Ihle, Alexander; Santiago Prowlad, Julian
2014-06-01
Recent advances in the development of mechanical architectures of large deployable reflectors (LDRs) through projects of the European Space Agency are addressed in this paper. Two different directions of LDR architectures are being investigated and developed at LSS and LLB: LDRs with knitted metal mesh, and LDRs with flexible shell-membrane reflecting surfaces. The first direction is mature, and work to advance the novel architecture of the supporting structure that provides deployment and final shape accuracy of the metal mesh is underway. The second direction is rather new, and its current development stage is focused on investigations of the dimensional stability of the flexible shell-membrane reflecting surface. In both directions, 5 m diameter functional models will be built to demonstrate the achieved performances, which shall prepare the basis for further improvement of their technology readiness levels.
Will crown ideotype help determine optimum varietal silviculture?
Timothy J. Albaugh; Thomas R. Fox; Marco A. Yanez; Rafael A. Rubilar; Barry Goldfarb
2016-01-01
Recent advances in somatic embryogenesis permit large numbers of clonal loblolly pine (Pinus taeda L.) to be produced and deployed. Clones may have greater growth (mean annual increment exceeding 30 cubic meters per hectare per year), greater stand uniformity and may be more susceptible to genotype by environment interactions when they are deployed in intensively...
A Summary of Proceedings for the Advanced Deployable Day/Night Simulation Symposium
2009-07-01
The Advanced Deployable Day/Night Simulation (ADDNS) Technology Demonstration Project was initiated to design, develop, and deliver transportable visual simulations that jointly provide night-vision and high-resolution daylight capability. ... was Dr. Richard Wildes (York University); Mr. Vitaly Zholudev (Department of Computer Science, York University), Mr. X. Zhu (Neptec Design Group), and ...
2015-02-01
"Anonymity": A Bitcoin Case Study ... been a case of such a non-state actor deployment; in this report [National Security Implications of Virtual Currency], we aim to highlight ... development of VCs may advance, including a general increased sophistication in cryptographic applications. More generally, we make the case that the main ...
Cross-modal work helps OMC improve the safety of commercial transportation
DOT National Transportation Integrated Search
1997-01-01
This article describes the Commercial Vehicle Information System (CVIS), designed to deploy a national safety program for the U.S. commercial trucking fleet. CVIS is built around a safety analysis algorithm called SafeStat which constructs a profile ...
Karstoft, Karen-Inge; Statnikov, Alexander; Andersen, Søren B; Madsen, Trine; Galatzer-Levy, Isaac R
2015-09-15
Pre-deployment identification of soldiers at risk for long-term posttraumatic stress psychopathology after homecoming is important to guide decisions about deployment. Early post-deployment identification can direct early interventions to those in need and thereby prevent the development of chronic psychopathology. Both hold significant public health benefits given the large numbers of deployed soldiers, but have so far not been achieved. Here, we aim to assess the potential for pre- and early post-deployment prediction of resilience or posttraumatic stress development in soldiers by application of machine learning (ML) methods. ML feature selection and prediction algorithms were applied to a prospective cohort of 561 Danish soldiers deployed to Afghanistan in 2009 to identify unique risk indicators and forecast long-term posttraumatic stress responses. Robust pre- and early post-deployment risk indicators were identified, and included individual PTSD symptoms as well as total level of PTSD symptoms, previous trauma and treatment, negative emotions, and thought suppression. The predictive performance of these risk indicators combined was assessed by cross-validation. Together, these indicators forecasted long-term posttraumatic stress responses with high accuracy (pre-deployment: AUC = 0.84 (95% CI = 0.81-0.87), post-deployment: AUC = 0.88 (95% CI = 0.85-0.91)). This study utilized a previously collected data set and was therefore not designed to exhaust the potential of ML methods. Further, the study relied solely on self-reported measures. Pre-deployment and early post-deployment identification of risk for long-term posttraumatic psychopathology are feasible and could greatly reduce the public health costs of war. Copyright © 2015 Elsevier B.V. All rights reserved.
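The AUC values reported above summarize how well a risk score ranks PTSD cases above resilient soldiers. As a minimal, self-contained sketch (with synthetic scores, not the study's data or its ML pipeline), the AUC can be computed directly as the normalized Mann-Whitney statistic:

```python
import random

def auc(scores_pos, scores_neg):
    """AUC = probability that a random positive outranks a random negative
    (ties count 1/2), i.e. the normalized Mann-Whitney U statistic."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

random.seed(0)
# Hypothetical risk scores: cases tend to score higher than resilient soldiers.
cases     = [random.gauss(1.0, 1.0) for _ in range(50)]
resilient = [random.gauss(0.0, 1.0) for _ in range(200)]
print(round(auc(cases, resilient), 2))
```

With one standard deviation of separation between the groups, the AUC lands in the high-0.7 range; the study's 0.84-0.88 implies somewhat stronger separation.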
Computing Generalized Matrix Inverse on Spiking Neural Substrate
Shukla, Rohit; Khoram, Soroosh; Jorgensen, Erik; Li, Jing; Lipasti, Mikko; Wright, Stephen
2018-01-01
Emerging neural hardware substrates, such as IBM's TrueNorth Neurosynaptic System, can provide an appealing platform for deploying numerical algorithms. For example, a recurrent Hopfield neural network can be used to find the Moore-Penrose generalized inverse of a matrix, thus enabling a broad class of linear optimizations to be solved efficiently, at low energy cost. However, deploying numerical algorithms on hardware platforms that severely limit the range and precision of representation for numeric quantities can be quite challenging. This paper discusses these challenges and proposes a rigorous mathematical framework for reasoning about range and precision on such substrates. The paper derives techniques for normalizing inputs and properly quantizing synaptic weights originating from arbitrary systems of linear equations, so that solvers for those systems can be implemented in a provably correct manner on hardware-constrained neural substrates. The analytical model is empirically validated on the IBM TrueNorth platform, and results show that the guarantees provided by the framework for range and precision hold under experimental conditions. Experiments with optical flow demonstrate the energy benefits of deploying a reduced-precision and energy-efficient generalized matrix inverse engine on the IBM TrueNorth platform, reflecting 10× to 100× improvement over FPGA and ARM core baselines. PMID:29593483
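The abstract above concerns realizing a generalized matrix inverse through recurrent neural dynamics. A classical fixed-point iteration of the same flavor, and a useful mental model for what such a recurrent substrate computes, is the Newton-Schulz scheme for the Moore-Penrose pseudoinverse; this sketch is the textbook iteration, not the paper's Hopfield/TrueNorth implementation:

```python
import numpy as np

def pinv_newton_schulz(A, iters=60):
    """Iteratively approximate the Moore-Penrose pseudoinverse A+ via
    X <- X (2I - A X). With X0 = A^T / (||A||_1 ||A||_inf) the iteration
    is guaranteed to contract, and A+ is its fixed point."""
    X = A.T / (np.linalg.norm(A, 1) * np.linalg.norm(A, np.inf))
    for _ in range(iters):
        X = X @ (2 * np.eye(A.shape[0]) - A @ X)
    return X

A = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])  # tall, full column rank
X = pinv_newton_schulz(A)
print(np.allclose(X, np.linalg.pinv(A), atol=1e-6))  # True
```

Each step uses only matrix-vector products, which is why range and precision of the weights, the paper's focus, dominate the hardware mapping.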
Outlooks for Wind Power in the United States: Drivers and Trends under a 2016 Policy Environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mai, Trieu; Lantz, Eric; Ho, Jonathan
Over the past decade, wind power has become one of the fastest growing electricity generation sources in the United States. Despite this growth, the U.S. wind industry continues to experience year-to-year fluctuations across the manufacturing and supply chain as a result of dynamic market conditions and changing policy landscapes. Moreover, with advancing wind technologies, ever-changing fossil fuel prices, and evolving energy policies, the long-term future for wind power is highly uncertain. In this report, we present multiple outlooks for wind power in the United States, to explore the possibilities of future wind deployment. The future wind power outlooks presented rely on high-resolution wind resource data and advanced electric sector modeling capabilities to evaluate an array of potential scenarios of the U.S. electricity system. Scenario analysis is used to explore drivers, trends, and implications for wind power deployment over multiple periods through 2050. Specifically, we model 16 scenarios of wind deployment in the contiguous United States. These scenarios span a wide range of wind technology costs, natural gas prices, and future transmission expansion. We identify conditions with more consistent wind deployment after the production tax credit expires as well as drivers for more robust wind growth in the long run. Conversely, we highlight challenges to future wind deployment. We find that the degree to which wind technology costs decline can play an important role in future wind deployment, electric sector CO2 emissions, and lowering allowance prices for the Clean Power Plan.
DOT National Transportation Integrated Search
1999-01-01
In 1997, the Ann Arbor (Michigan) Transportation Authority began deploying advanced public transportation systems (APTS) technologies in its fixed route and paratransit operations. The project's concept is the integration of a range of such technolog...
Distilling the Verification Process for Prognostics Algorithms
NASA Technical Reports Server (NTRS)
Roychoudhury, Indranil; Saxena, Abhinav; Celaya, Jose R.; Goebel, Kai
2013-01-01
The goal of prognostics and health management (PHM) systems is to ensure system safety, and reduce downtime and maintenance costs. It is important that a PHM system is verified and validated before it can be successfully deployed. Prognostics algorithms are integral parts of PHM systems. This paper investigates a systematic process of verification of such prognostics algorithms. To this end, first, this paper distinguishes between technology maturation and product development. Then, the paper describes the verification process for a prognostics algorithm as it moves up to higher maturity levels. This process is shown to be an iterative process where verification activities are interleaved with validation activities at each maturation level. In this work, we adopt the concept of technology readiness levels (TRLs) to represent the different maturity levels of a prognostics algorithm. It is shown that at each TRL, the verification of a prognostics algorithm depends on verifying the different components of the algorithm according to the requirements laid out by the PHM system that adopts this prognostics algorithm. Finally, using simplified examples, the systematic process for verifying a prognostics algorithm is demonstrated as the prognostics algorithm moves up TRLs.
ERIC Educational Resources Information Center
Jakubowski, Henry; Xie, Jianping; Kumar Mitra, Arup; Ghooi, Ravindra; Hosseinkhani, Saman; Alipour, Mohsen; Hajipour, Behnam; Obiero, George
2017-01-01
The profound advances in the biomolecular sciences over the last decades have enabled similar advances in biomedicine. These advances have increasingly challenged our abilities to deploy them in an equitable and ethically acceptable manner. As such, it has become necessary and important to teach biomedical and scientific ethics to our students who…
An advancing front Delaunay triangulation algorithm designed for robustness
NASA Technical Reports Server (NTRS)
Mavriplis, D. J.
1992-01-01
A new algorithm is described for generating an unstructured mesh about an arbitrary two-dimensional configuration. Mesh points are generated automatically by the algorithm in a manner which ensures a smooth variation of elements, and the resulting triangulation constitutes the Delaunay triangulation of these points. The algorithm combines the mathematical elegance and efficiency of Delaunay triangulation algorithms with the desirable point placement features, boundary integrity, and robustness traditionally associated with advancing-front-type mesh generation strategies. The method offers increased robustness over previous algorithms in that it cannot fail regardless of the initial boundary point distribution and the prescribed cell size distribution throughout the flow-field.
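At the core of any Delaunay method, including the advancing-front hybrid described above, is the empty-circumcircle property: no mesh point may lie inside the circumcircle of any triangle. A minimal sketch of that predicate (the standard determinant test, not Mavriplis's full algorithm) is:

```python
def in_circumcircle(a, b, c, d):
    """True if point d lies strictly inside the circumcircle of triangle
    (a, b, c), which must be given in counterclockwise order. This
    empty-circumcircle test is the core predicate of Delaunay methods."""
    adx, ady = a[0] - d[0], a[1] - d[1]
    bdx, bdy = b[0] - d[0], b[1] - d[1]
    cdx, cdy = c[0] - d[0], c[1] - d[1]
    # Sign of the 3x3 determinant with rows (pdx, pdy, pdx^2 + pdy^2)
    det = (
        (adx * adx + ady * ady) * (bdx * cdy - cdx * bdy)
        - (bdx * bdx + bdy * bdy) * (adx * cdy - cdx * ady)
        + (cdx * cdx + cdy * cdy) * (adx * bdy - bdx * ady)
    )
    return det > 0

tri = ((0.0, 0.0), (1.0, 0.0), (0.0, 1.0))   # CCW triangle
print(in_circumcircle(*tri, (0.5, 0.5)))     # circumcentre: True
print(in_circumcircle(*tri, (2.0, 2.0)))     # far outside: False
```

Robustness in practice hinges on evaluating this determinant exactly for near-degenerate inputs, which is one motivation for the paper's emphasis on robustness.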
Optimized passive sonar placement to allow improved interdiction
NASA Astrophysics Data System (ADS)
Johnson, Bruce A.; Matthews, Cameron
2016-05-01
The Art Gallery Problem (AGP) is the name given to a constrained optimization problem meant to determine the maximum amount of sensor coverage while utilizing the minimum number of resources. The AGP is significant because a common issue among surveillance and interdiction systems is obtaining an understanding of the optimal position of sensors and weapons in advance of enemy combatant maneuvers. The optimal position for a sensor to observe an event, or for a weapon to engage a target autonomously, is usually clear only after the target has passed; for autonomous systems, the solution must at least be conjectured in advance for deployment purposes. This abstract applies the AGP as a means to solve where best to place underwater sensor nodes such that the amount of information acquired about a covered area is maximized while the number of resources used to gain that information is minimized. By phrasing the ISR/interdiction problem this way, the issue is addressed as an instance of the AGP. The AGP is a member of a set of computational problems designated as nondeterministic polynomial-time (NP)-hard. As a member of this set, the AGP shares its members' defining feature: no deterministic polynomial-time algorithm is known that solves it exactly. At best, an algorithm meant to solve the AGP can asymptotically approach perfect coverage with minimal resource usage; providing perfect coverage would either break the minimal-resource-usage constraint or require an exponentially growing amount of time. No perfectly optimal solution to the AGP yet exists; however, approximately optimal solutions can approach complete area or barrier coverage while simultaneously minimizing the number of sensors and weapons utilized.
A minimal number of deployed underwater sensor nodes can greatly increase the Mean Time Between Operational Failure (MTBOF) and reduce the logistical footprint. The resulting coverage optimizes the likelihood of encounter given an arbitrary sensor profile and threat from a free field statistical model approach. The free field statistical model is particularly applicable to worst-case scenario modeling in open ocean operational profiles where targets do not follow a particular pattern in any of the modeled dimensions. We present an algorithmic testbed which shows how to achieve approximately optimal solutions to the AGP for a network of underwater sensor nodes with or without effector systems for engagement while operating under changing environmental circumstances. The means by which we accomplish this goal are three-fold: 1) Develop a 3D model for the sonar signal propagating through the underwater environment 2) Add rigorous physics-based modeling of environmental events which can affect sensor information acquisition 3) Provide innovative solutions to the AGP which account for the environmental circumstances affecting sensor performance.
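The standard way to get the "approximately optimal" coverage the abstract describes is the greedy set-cover heuristic, which carries a ln(n) approximation guarantee. A minimal sketch (circular detection footprints and made-up coordinates, not the paper's sonar model):

```python
def greedy_placement(candidates, targets, radius):
    """Greedy coverage in the spirit of art-gallery/set-cover problems:
    repeatedly place the candidate sensor that covers the most
    still-uncovered targets, until nothing reachable remains uncovered."""
    def covers(s, t):
        return (s[0] - t[0]) ** 2 + (s[1] - t[1]) ** 2 <= radius ** 2

    uncovered = set(range(len(targets)))
    chosen = []
    while uncovered:
        best, gain = None, 0
        for c in candidates:
            g = sum(1 for i in uncovered if covers(c, targets[i]))
            if g > gain:
                best, gain = c, g
        if best is None:          # remaining targets are unreachable
            break
        chosen.append(best)
        uncovered -= {i for i in uncovered if covers(best, targets[i])}
    return chosen

targets = [(0, 0), (1, 0), (5, 5), (6, 5)]
candidates = [(0.5, 0.0), (5.5, 5.0), (9.0, 9.0)]
print(greedy_placement(candidates, targets, radius=1.0))
# -> [(0.5, 0.0), (5.5, 5.0)]: two sensors cover all four targets
```

Replacing the circular footprint with a physics-based sonar propagation model changes `covers`, not the greedy loop.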
Aeolus End-To-End Simulator and Wind Retrieval Algorithms up to Level 1B
NASA Astrophysics Data System (ADS)
Reitebuch, Oliver; Marksteiner, Uwe; Rompel, Marc; Meringer, Markus; Schmidt, Karsten; Huber, Dorit; Nikolaus, Ines; Dabas, Alain; Marshall, Jonathan; de Bruin, Frank; Kanitz, Thomas; Straume, Anne-Grete
2018-04-01
The first wind lidar in space, ALADIN, will be deployed on ESA's Aeolus mission. In order to assess the performance of ALADIN and to optimize the wind retrieval and calibration algorithms, an end-to-end simulator was developed. This allows realistic simulations of data downlinked by Aeolus. Together with operational processors this setup is used to assess random and systematic error sources and perform sensitivity studies about the influence of atmospheric and instrument parameters.
NASA Technical Reports Server (NTRS)
Abbott, David; Batten, Adam; Carpenter, David; Dunlop, John; Edwards, Graeme; Farmer, Tony; Gaffney, Bruce; Hedley, Mark; Hoschke, Nigel; Isaacs, Peter;
2008-01-01
This report describes the first phase of the implementation of the Concept Demonstrator. The Concept Demonstrator system is a powerful and flexible experimental test-bed platform for developing sensors, communications systems, and multi-agent based algorithms for an intelligent vehicle health monitoring system for deployment in aerospace vehicles. The Concept Demonstrator contains sensors and processing hardware distributed throughout the structure, and uses multi-agent algorithms to characterize impacts and determine an appropriate response to these impacts.
An epidemiological approach to mass casualty incidents in the Principality of Asturias (Spain).
Castro Delgado, Rafael; Naves Gómez, Cecilia; Cuartas Álvarez, Tatiana; Arcos González, Pedro
2016-02-24
Mass Casualty Incidents (MCI) have rarely been studied from an epidemiological approach. The objective of this study is to establish the epidemiological profile of MCI in the autonomous region of the Principality of Asturias (Spain) and analyse ambulance deployment and severity of patients. This is a population-based prospective study run in 2014. The inclusion criterion for an MCI is "every incident with four or more people affected that requires ambulance mobilisation". Thirty-nine MCI were identified in Asturias in 2014. Thirty-one (79%) were road traffic accidents, three (7.5%) fires and five (12.8%) other types. Twenty-one incidents (56.7%) had four patients, and only three of them (8%) had seven or more patients. An average of 2.41 ambulances per incident was deployed (standard error = 0.18). Most of the patients per incident were minor-injury patients (mean = 4; standard error = 0.2), and 0.26 were severe patients (standard error = 0.08). There was a positive significant correlation (p < 0.01) between the total number of patients and the total number of ambulances deployed, and between the total number of patients and Advanced Life Support (ALS) ambulances deployed (p < 0.001). The total number of non-ALS ambulances was not related to the total number of patients. Population-based research in MCI is essential to define the MCI profile. A quantitative definition of MCI, adapted to resources, avoids selection bias and presents a more accurate profile of MCI. As expected, road traffic accidents are the most frequent MCI in our region. This aspect is essential to plan training and response to MCI. Analysis of the total response to an MCI shows that, for almost an hour, extra resources should be planned for daily emergencies. This is an important issue to bear in mind when planning MCI response.
The fact that most patients are classified as minor injuries and that more advanced life support units than needed are deployed shows that analysis of resource deployment and patient severity helps us better plan future MCI response. Road traffic accidents with minor-injury patients are the most frequent MCI in our region. More advanced life support units than needed were initially deployed, which might compromise the response to daily emergencies during an MCI.
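The patient-ambulance correlations reported above are simple bivariate tests. A minimal sketch of the computation, on hypothetical per-incident counts rather than the study's data:

```python
def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

# Hypothetical per-incident counts, not the study's data.
patients   = [4, 4, 5, 6, 7, 9, 12]
ambulances = [2, 2, 2, 3, 3, 4, 5]
print(round(pearson_r(patients, ambulances), 2))
```

For small MCI samples a rank-based (Spearman) test is often preferred, since incident sizes are heavily skewed toward the four-patient minimum.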
NASA Technical Reports Server (NTRS)
Merrill, W. C.; Delaat, J. C.
1986-01-01
An advanced sensor failure detection, isolation, and accommodation (ADIA) algorithm has been developed for use with an aircraft turbofan engine control system. In a previous paper the authors described the ADIA algorithm and its real-time implementation. Subsequent improvements made to the algorithm and implementation are discussed, and the results of an evaluation are presented. The evaluation used a real-time, hybrid computer simulation of an F100 turbofan engine.
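The detection step in schemes of this kind typically compares each sensor against a model-based estimate and flags persistent residuals. This sketch illustrates that generic residual-threshold idea with hypothetical numbers; it is not the ADIA algorithm itself:

```python
def detect_failure(measured, estimated, threshold, persist=3):
    """Flag a sensor as failed when the |measurement - model estimate|
    residual exceeds `threshold` for `persist` consecutive samples
    (persistence guards against flagging transient noise)."""
    run = 0
    for k, (m, e) in enumerate(zip(measured, estimated)):
        run = run + 1 if abs(m - e) > threshold else 0
        if run >= persist:
            return k  # sample index at which failure is declared
    return None

estimated = [100.0] * 10                        # model says: steady 100
healthy   = [100.2, 99.8, 100.1, 99.9] + [100.0] * 6
failed    = [100.1, 99.9] + [85.0] * 8          # hard-over failure at k=2
print(detect_failure(healthy, estimated, threshold=2.0))   # None
print(detect_failure(failed, estimated, threshold=2.0))    # 4
```

Accommodation then substitutes the model estimate for the failed sensor so the control loop keeps running.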
NASA Astrophysics Data System (ADS)
Biradar, Anandrao Shesherao
The work presented in this report concerns real-time estimation of wind and analysis of the current wind-correction algorithm in a commercial off-the-shelf autopilot board. The open-source ArduPilot Mega 2.5 (APM 2.5) board manufactured by 3D Robotics is used. There is currently a great deal of development in the field of Unmanned Aerial Systems (UAVs), covering various aerial platforms and the corresponding autonomous systems for them. This technology has advanced to such a stage that UAVs can be deployed reliably for specifically designed missions. But some areas, such as missions requiring high maneuverability with greater efficiency, are still under research; progress there would help increase reliability and significantly augment the range of UAVs. One of the problems addressed through this thesis work is that current autopilot systems handle wind by attitude correction with an appropriate crab angle, but the real-time wind vector (direction) and its calculated velocity are based on a geometrical and algebraic transformation between the ground-speed and air-speed vectors. This method of wind estimation and prediction often leads to inaccuracy in attitude correction; the same has been shown in the following report with simulation and actual field testing. In the later part, new ways to handle flying in windy conditions are proposed.
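The geometric wind estimate the report critiques is the classical wind triangle: the wind vector is the ground-velocity vector (GPS speed over ground along the track) minus the air-velocity vector (true airspeed along the heading). A minimal sketch with made-up numbers:

```python
import math

def wind_from_triangle(gs, track_deg, tas, heading_deg):
    """Wind vector = ground velocity - air velocity (the wind triangle).
    gs/tas in m/s, angles in degrees clockwise from north. Returns
    (wind speed, direction the wind blows FROM, in degrees)."""
    tr, hd = math.radians(track_deg), math.radians(heading_deg)
    # North and East components of the wind vector
    wn = gs * math.cos(tr) - tas * math.cos(hd)
    we = gs * math.sin(tr) - tas * math.sin(hd)
    speed = math.hypot(wn, we)
    blowing_from = (math.degrees(math.atan2(we, wn)) + 180.0) % 360.0
    return speed, blowing_from

# Hypothetical case: heading due north at 20 m/s TAS, GPS shows a
# 14-degree track at 20.6 m/s; a 5 m/s westerly wind explains the drift.
speed, from_dir = wind_from_triangle(20.6155, 14.0362, 20.0, 0.0)
print(round(speed, 1), round(from_dir))   # about 5 m/s from 270 (west)
```

As the thesis notes, this estimate inherits every error in airspeed and heading, which is why it can mislead the crab-angle correction.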
Advanced consequence management program: challenges and recent real-world implementations
NASA Astrophysics Data System (ADS)
Graser, Tom; Barber, K. S.; Williams, Bob; Saghir, Feras; Henry, Kurt A.
2002-08-01
The Enhanced Consequence Management, Planning and Support System (ENCOMPASS) was developed under DARPA's Advanced Consequence Management program to assist decision-makers operating in crisis situations such as terrorist attacks using conventional and unconventional weapons and natural disasters. ENCOMPASS provides the tools for first responders, incident commanders, and officials at all levels to share vital information and, consequently, plan and execute a coordinated response to incidents of varying complexity and size. ENCOMPASS offers custom configuration of components with capabilities ranging from map-based situation assessment, situation-based response checklists, casualty tracking, and epidemiological surveillance. Developing and deploying such a comprehensive system posed significant challenges for DARPA program management, due to an inherently complex domain, a broad spectrum of customer sites and skill sets, an often inhospitable runtime environment, demanding development-to-deployment transition requirements, and a technically diverse and geographically distributed development team. This paper introduces ENCOMPASS and explores these challenges, followed by an outline of selected ENCOMPASS deployments, demonstrating how ENCOMPASS can enhance consequence management in a variety of real-world contexts.
Design of Mechanisms for Deployable, Optical Instruments: Guidelines for Reducing Hysteresis
NASA Technical Reports Server (NTRS)
Lake, Mark S.; Hachkowski, M. Roman
2000-01-01
This paper is intended to facilitate the development of deployable, optical instruments by providing a rational approach for the design, testing, and qualification of high-precision (i.e., low-hysteresis) deployment mechanisms for these instruments. Many of the guidelines included herein come directly from the field of optomechanical engineering, and are, therefore, neither newly developed guidelines, nor are they uniquely applicable to the design of high-precision deployment mechanisms. This paper is to be regarded as a guide to design and not a set of NASA requirements, except as may be defined in formal project specifications. Furthermore, due to the rapid pace of advancement in the field of precision deployment, this paper should be regarded as a preliminary set of guidelines. However, it is expected that this paper, with revisions as experience may indicate to be desirable, might eventually form the basis for a set of uniform design requirements for high-precision deployment mechanisms on future NASA space-based science instruments.
The Value Proposition for Fractionated Space Architectures
2006-09-01
transmission relying on electrostatic forces has been proposed for use in GEO by Parker et al. [37] ... The Defense Advanced ... capability of the original monolithic system. [6] One can envision the fractionation trade space to be defined by three high-level metrics. First, the ... by deploying additional modules. Thus, for instance, one could envision deploying an initial communications capability in the form of a power ...
ERIC Educational Resources Information Center
National Telecommunications and Information Administration (DOC), Washington, DC.
This report, in response to a request by 10 U.S. Senators examines the status of broadband deployment in the United States. The rate of deployment of broadband services will be key to future economic growth, particularly in rural areas far from urban and world markets. This report finds that rural areas, especially remote areas outside of towns,…
Radiation Detection at Borders for Homeland Security
NASA Astrophysics Data System (ADS)
Kouzes, Richard
2004-05-01
Countries around the world are deploying radiation detection instrumentation to interdict the illegal shipment of radioactive material crossing international borders at land, rail, air, and sea ports of entry. These efforts include deployments in the US and a number of European and Asian countries by governments and international agencies. Items of concern include radiation dispersal devices (RDD), nuclear warheads, and special nuclear material (SNM). Radiation portal monitors (RPMs) are used as the main screening tool for vehicles and cargo at borders, supplemented by handheld detectors, personal radiation detectors, and x-ray imaging systems. Some cargo contains naturally occurring radioactive material (NORM) that triggers "nuisance" alarms in RPMs at these border crossings. Individuals treated with medical radiopharmaceuticals also produce nuisance alarms and can produce cross-talk between adjacent lanes of a multi-lane deployment. The operational impact of nuisance alarms can be significant at border crossings. Methods have been developed for reducing this impact without negatively affecting the requirements for interdiction of radioactive materials of interest. Plastic scintillator material is commonly used in RPMs for the detection of gamma rays from radioactive material, primarily due to the efficiency per unit cost compared to other detection materials. The resolution and lack of full-energy peaks in the plastic scintillator material prohibits detailed spectroscopy. However, the limited spectroscopic information from plastic scintillator can be exploited to provide some discrimination. Energy-based algorithms used in RPMs can effectively exploit the crude energy information available from a plastic scintillator to distinguish some NORM. Whenever NORM cargo limits the level of the alarm threshold, energy-based algorithms produce significantly better detection probabilities for small SNM sources than gross-count algorithms. 
This presentation discusses experience with RPMs for interdiction of radioactive materials at borders.
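The contrast drawn above between gross-count and energy-based screening can be sketched in a few lines. The energy windows, count rates, and the 5-sigma and ratio tolerances below are illustrative assumptions, not the deployed RPM algorithms:

```python
def gross_count_alarm(counts, background, k=5.0):
    """Classical gross-count test: alarm when total counts exceed the
    background mean by k standard deviations (Poisson: sigma = sqrt(mean))."""
    return sum(counts) > background + k * background ** 0.5

def energy_ratio_alarm(counts, low_window, high_window, norm_ratio, tol=0.5):
    """Crude energy-based test: NORM cargo has a characteristic low/high
    energy split, while SNM pushes counts into the low-energy window.
    Alarm when the observed ratio departs from the calibrated NORM ratio."""
    lo = sum(counts[i] for i in low_window)
    hi = sum(counts[i] for i in high_window) or 1
    return abs(lo / hi - norm_ratio) > tol

# Eight crude energy bins, low energies first; hypothetical numbers.
norm_cargo = [120, 110, 90, 80, 70, 60, 50, 40]   # NORM-like spectrum
snm_like   = [300, 260, 90, 80, 20, 10, 8, 5]     # counts pushed low in energy
low, high = range(0, 4), range(4, 8)
norm_ratio = sum(norm_cargo[:4]) / sum(norm_cargo[4:])   # calibrated on NORM

print(gross_count_alarm(norm_cargo, background=500.0))         # nuisance alarm
print(energy_ratio_alarm(norm_cargo, low, high, norm_ratio))   # suppressed
print(energy_ratio_alarm(snm_like, low, high, norm_ratio))     # flagged
```

The toy run shows the point of the abstract: the gross-count test alarms on NORM cargo while the energy-ratio test stays quiet yet still flags the low-energy-shifted spectrum.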
Experimental Characterization of Hysteresis in a Revolute Joint for Precision Deployable Structures
NASA Technical Reports Server (NTRS)
Lake, Mark S.; Fung, Jimmy; Gloss, Kevin; Liechty, Derek S.
1997-01-01
Recent studies of the micro-dynamic behavior of a deployable telescope metering truss have identified instabilities in the equilibrium shape of the truss in response to low-energy dynamic loading. Analyses indicate that these micro-dynamic instabilities arise from stick-slip friction within the truss joints (e.g., hinges and latches). The present study characterizes the low-magnitude quasi-static load cycle response of the precision revolute joints incorporated in the deployable telescope metering truss, and specifically, the hysteretic response of these joints caused by stick-slip friction within the joint. Detailed descriptions are presented of the test setup and data reduction algorithms, including discussions of data-error sources and data-filtering techniques. Test results are presented from thirteen specimens, and the effects of joint preload and manufacturing tolerances are investigated. Using a simplified model of stick-slip friction, a relationship is made between joint load-cycle behavior and micro-dynamic dimensional instabilities in the deployable telescope metering truss.
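A common simplified model for the joint behavior described above is a linear spring in series with a Coulomb slider: the joint is stiff while it sticks and slips at a fixed friction force, which opens a hysteresis loop under load cycling. This sketch is that generic elasto-slip model with hypothetical parameters, not the paper's data-reduction algorithm:

```python
def hysteresis_loop(displacements, k=100.0, f_slip=5.0):
    """Elastic / stick-slip joint model: a spring (stiffness k) in series
    with a Coulomb slider that slips when the spring force reaches f_slip.
    Returns the force at each imposed displacement; cycling the
    displacement traces a hysteresis loop whose width is set by f_slip."""
    s = 0.0          # slider (slip) position
    forces = []
    for x in displacements:
        f = k * (x - s)
        if f > f_slip:        # forward slip
            s = x - f_slip / k
            f = f_slip
        elif f < -f_slip:     # reverse slip
            s = x + f_slip / k
            f = -f_slip
        forces.append(f)
    return forces

# Quasi-static displacement cycle 0 -> +0.2 -> -0.2 -> 0
path = [0.002 * i for i in range(101)] \
     + [0.2 - 0.002 * i for i in range(201)] \
     + [-0.2 + 0.002 * i for i in range(101)]
f = hysteresis_loop(path)
# Same zero displacement, opposite residual forces: the loop is open.
print(round(f[201], 1), round(f[402], 1))
```

The width of the loop (here 2 x f_slip) is the hysteresis the paper measures, and reducing it is what the joint preload and tolerance studies target.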
Zheng, Wei; Yan, Xiaoyong; Zhao, Wei; Qian, Chengshan
2017-12-20
A novel large-scale multi-hop localization algorithm based on regularized extreme learning is proposed in this paper. The large-scale multi-hop localization problem is formulated as a learning problem. Unlike other similar localization algorithms, the proposed algorithm overcomes the shortcoming of traditional algorithms, which are only applicable to isotropic networks, and therefore adapts well to complex deployment environments. The proposed algorithm is composed of three stages: data acquisition, modeling and location estimation. In the data acquisition stage, the training information between nodes of the given network is collected. In the modeling stage, the model relating hop counts to the physical distances between nodes is constructed using regularized extreme learning. In the location estimation stage, each node finds its specific location in a distributed manner. Theoretical analysis and several experiments show that the proposed algorithm can adapt to different topological environments with low computational cost. Furthermore, high accuracy can be achieved by this method without setting complex parameters.
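A regularized extreme learning machine is a random fixed hidden layer followed by a ridge-regression solve for the output weights, beta = (H'H + lambda I)^-1 H'y. This sketch shows that core on a synthetic hop-count-to-distance mapping (toy data and scaling, not the paper's network or parameters):

```python
import numpy as np

rng = np.random.default_rng(0)

def elm_train(X, y, hidden=50, lam=1e-2):
    """Regularized extreme learning machine: random input weights, a fixed
    tanh hidden layer, and output weights fit by ridge regression;
    only the final linear solve is 'learned'."""
    W = rng.normal(size=(X.shape[1], hidden))
    b = rng.normal(size=hidden)
    H = np.tanh(X @ W + b)
    beta = np.linalg.solve(H.T @ H + lam * np.eye(hidden), H.T @ y)
    return W, b, beta

def elm_predict(model, X):
    W, b, beta = model
    return np.tanh(X @ W + b) @ beta

# Toy stand-in for hop-count -> distance calibration: distance grows
# nonlinearly with hop count, plus noise (synthetic, not the paper's data).
hops = np.linspace(1, 10, 200).reshape(-1, 1)
dist = 25.0 * np.sqrt(hops[:, 0]) + rng.normal(0, 1.0, 200)
model = elm_train(hops / 10.0, dist)          # scale inputs into tanh range
pred = elm_predict(model, np.array([[0.4]]))  # query: 4 hops
print(round(float(pred[0]), 1))
```

Because training is a single linear solve, the low computational cost claimed in the abstract follows directly; the regularization term lambda is what keeps the random-feature fit stable.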
Self-Deployable Membrane Structures
NASA Technical Reports Server (NTRS)
Sokolowski, Witold M.; Willis, Paul B.; Tan, Seng C.
2010-01-01
Currently existing approaches for deployment of large, ultra-lightweight gossamer structures in space rely typically upon electromechanical mechanisms and mechanically expandable or inflatable booms for deployment and to maintain them in a fully deployed, operational configuration. These support structures, with the associated deployment mechanisms, launch restraints, inflation systems, and controls, can comprise more than 90 percent of the total mass budget. In addition, they significantly increase the stowage volume, cost, and complexity. A CHEM (cold hibernated elastic memory) membrane structure without any deployable mechanism and support booms/structure is deployed by using shape memory and elastic recovery. The use of CHEM micro-foams reinforced with carbon nanotubes is considered for thin-membrane structure applications. In this advanced structural concept, the CHEM membrane structure is warmed up to allow packaging and stowing prior to launch, and then cooled to induce hibernation of the internal restoring forces. In space, the membrane remembers its original shape and size when warmed up. After the internal restoring forces deploy the structure, it is then cooled to achieve rigidization. For this type of structure, the solar radiation could be utilized as the heat energy used for deployment and space ambient temperature for rigidization. The overall simplicity of the CHEM self-deployable membrane is one of its greatest assets. In present approaches to space-deployable structures, the stowage and deployment are difficult and challenging, and introduce a significant risk, heavy mass, and high cost. Simple procedures provided by the CHEM membrane greatly simplify the overall end-to-end process for designing, fabricating, deploying, and rigidizing large structures.
The CHEM membrane avoids the complexities associated with other methods for deploying and rigidizing structures by eliminating deployable booms, deployment mechanisms, and inflation and control systems that can use up the majority of the mass budget.
Knebel, Ann R; Martinelli, Angela M; Orsega, Susan; Doss, Thomas L; Balingit-Wines, Ana Marie; Konchan, Carol L
2010-06-01
The events of September 11, 2001, set in motion the broadest emergency response ever conducted by the US Department of Health and Human Services. In this article, some of the nurses who deployed to New York City in the aftermath of that horrific attack on the United States offer their recollections of the events. Although Public Health Service Commissioned Corps (PHS CC) officers participated in deployments before 9/11, this particular deployment accelerated the transformation of the PHS CC, because people came to realize the tremendous potential of a uniformed service of 6,000 health care professionals. When not responding to emergencies, PHS CC nurses daily serve the mission of the PHS to protect, promote, and advance the health and safety of the nation. In times of crisis, the PHS CC nurses stand ready to deploy in support of those in need of medical assistance. Published by Elsevier Inc.
DOT National Transportation Integrated Search
2006-08-02
In 2000, the Treasure Valley area of the State of Idaho received a federal earmark of $390,000 to develop an Advanced Transportation Management System (ATMS) for the Treasure Valley region of Idaho. The Ada County Highway District (ACHD), located in ...
An Investigation of Pronunciation Learning Strategies of Advanced EFL Learners
ERIC Educational Resources Information Center
Hismanoglu, Murat
2012-01-01
This paper aims at investigating the kinds of strategies deployed by advanced EFL learners at English Language Teaching Department to learn or improve English pronunciation and revealing whether there are any significant differences between the strategies of successful pronunciation learners and those of unsuccessful pronunciation learners. After…
Evolutionary Design of an X-Band Antenna for NASA's Space Technology 5 Mission
NASA Technical Reports Server (NTRS)
Lohn, Jason D.; Hornby, Gregory S.; Rodriguez-Arroyo, Adan; Linden, Derek S.; Kraus, William F.; Seufert, Stephen E.
2003-01-01
We present an evolved X-band antenna design and flight prototype currently on schedule to be deployed on NASA's Space Technology 5 spacecraft in 2004. The mission consists of three small satellites that will take science measurements in Earth's magnetosphere. The antenna was evolved to meet a challenging set of mission requirements, most notably the combination of wide beamwidth for a circularly-polarized wave and wide bandwidth. Two genetic algorithms were used: one allowed branching on the antenna arms and the other did not. The highest-performance antennas from both algorithms were fabricated and tested. A hand-designed antenna was produced by the contractor responsible for the design and build of the mission antennas. The hand-designed antenna is a quadrifilar helix, and we present performance data for comparison to the evolved antennas. As of this writing, one of our evolved antenna prototypes is undergoing flight qualification testing. If successful, the resulting antenna would represent the first evolved hardware in space, and the first deployed evolved antenna.
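The genetic algorithms mentioned above follow the standard evolve-evaluate-select loop. This is a minimal generic GA sketch on a toy fitness function; the real antenna runs score each genome with an electromagnetic simulator, and all parameters here (population size, mutation rate, target vector) are invented for illustration:

```python
import random

random.seed(1)

def evolve(fitness, genome_len=8, pop_size=40, gens=80, mut=0.1):
    """Minimal generational GA: elitist selection, one-point crossover,
    per-gene Gaussian mutation. `fitness` is maximized."""
    pop = [[random.uniform(-1, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(gens):
        elite = sorted(pop, key=fitness, reverse=True)[: pop_size // 5]
        children = list(elite)                    # keep the best unchanged
        while len(children) < pop_size:
            a, b = random.sample(elite, 2)
            cut = random.randrange(1, genome_len)  # one-point crossover
            child = a[:cut] + b[cut:]
            child = [g + random.gauss(0, 0.2) if random.random() < mut else g
                     for g in child]
            children.append(child)
        pop = children
    return max(pop, key=fitness)

# Toy fitness: negative squared distance to an arbitrary target vector.
target = [0.5] * 8
best = evolve(lambda g: -sum((x - t) ** 2 for x, t in zip(g, target)))
err = sum((x - t) ** 2 for x, t in zip(best, target))
print(round(err, 2))
```

The branching-versus-non-branching distinction in the abstract lives in the genome encoding, which here is just a flat parameter vector.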
Advanced biologically plausible algorithms for low-level image processing
NASA Astrophysics Data System (ADS)
Gusakova, Valentina I.; Podladchikova, Lubov N.; Shaposhnikov, Dmitry G.; Markin, Sergey N.; Golovan, Alexander V.; Lee, Seong-Whan
1999-08-01
At present, the approach based on modeling biological vision mechanisms is being extensively developed in computer vision. However, up to now, real-world image processing has no effective solution within the frameworks of either biologically inspired or conventional approaches. Evidently, new algorithms and system architectures based on advanced biological motivation should be developed to solve the computational problems related to this visual task. The basic problem that must be solved to create an effective artificial visual system for processing real-world images is the search for new algorithms of low-level image processing that, to a great extent, determine system performance. In the present paper, the results of psychophysical experiments and several advanced biologically motivated algorithms for low-level processing are presented. These algorithms are based on local space-variant filtering, context encoding of the visual information presented in the center of the input window, and automatic detection of perceptually important image fragments. The core of the latter algorithm is the use of local feature conjunctions, such as non-collinear oriented segments, and composite feature map formation. The developed algorithms were integrated into the foveal active vision model MARR. It is supposed that the proposed algorithms may significantly improve model performance in real-world image processing during memorizing, search, and recognition.
Lifecycle Prognostics Architecture for Selected High-Cost Active Components
DOE Office of Scientific and Technical Information (OSTI.GOV)
N. Lybeck; B. Pham; M. Tawfik
There is an extensive body of knowledge, and some commercial products are available, for calculating prognostics, remaining useful life, and damage index parameters. The application of these technologies within the nuclear power community is still in its infancy. Online monitoring and condition-based maintenance are seeing increasing acceptance and deployment, and these activities provide the technological basis for expanding to add predictive/prognostic capabilities. In looking to deploy prognostics, three key aspects of systems are presented and discussed: (1) component/system/structure selection, (2) prognostic algorithms, and (3) prognostics architectures. Criteria are presented for component selection: feasibility, failure probability, consequences of failure, and benefits of the prognostics and health management (PHM) system. The bases and methods commonly used for prognostic algorithms are reviewed and summarized. Criteria for evaluating PHM architectures are presented: open, modular architecture; platform independence; graphical user interface for system development and/or results viewing; web-enabled tools; scalability; and standards compatibility. Thirteen software products were identified and discussed in the context of being potentially useful for deployment in a PHM program applied to systems in a nuclear power plant (NPP). These products were evaluated using information available from company websites, product brochures, fact sheets, scholarly publications, and direct communication with vendors. The thirteen products were classified into four groups of software: (1) research tools, (2) PHM system development tools, (3) deployable architectures, and (4) peripheral tools. Eight software tools fell into the deployable architectures category. Of those eight, only two employ all six modules of a full PHM system. Five systems did not offer prognostic estimates, and one system employed the full health monitoring suite but lacked operations and maintenance support. Each product is briefly described in Appendix A. Selection of the most appropriate software package for a particular application will depend on the chosen component, system, or structure. Ongoing research will determine the most appropriate choices for a successful demonstration of PHM systems in aging NPPs.
Potential of dynamic spectrum allocation in LTE macro networks
NASA Astrophysics Data System (ADS)
Hoffmann, H.; Ramachandra, P.; Kovács, I. Z.; Jorguseski, L.; Gunnarsson, F.; Kürner, T.
2015-11-01
In recent years Mobile Network Operators (MNOs) worldwide have been extensively deploying LTE networks in different spectrum bands and utilising different bandwidth configurations. Initially, the deployment is coverage oriented, with macro cells using the lower LTE spectrum bands. As the offered traffic (i.e. the requested traffic from the users) increases, the LTE deployment evolves, with macro cells expanded with additional capacity-boosting LTE carriers in higher frequency bands, complemented with micro or small cells in traffic hotspot areas. For MNOs it is crucial to use the LTE spectrum assets, as well as the installed network infrastructure, in the most cost-efficient way. Dynamic spectrum allocation (DSA) aims at (de)activating the available LTE frequency carriers according to the temporal and spatial traffic variations in order to increase the overall LTE system performance in terms of total network capacity by reducing interference. This paper evaluates the potential of DSA to achieve the envisaged performance improvement and identifies the system and traffic conditions in which DSA should be deployed. A self-optimised network (SON) DSA algorithm is also proposed and evaluated. The evaluations have been carried out in a hexagonal and a realistic site-specific urban macro layout, assuming a central traffic hotspot area surrounded by an area of lower traffic, with a total size of approximately 8 × 8 km². The results show that possible DSA gains of up to 47 % and up to 40 % are achievable with regard to the carried system load (i.e. used resources) for a homogeneous traffic distribution with the hexagonal layout and for the realistic site-specific urban macro layout, respectively. The SON DSA algorithm evaluation in a realistic site-specific urban macro cell deployment scenario, including a realistic non-uniform spatial traffic distribution, shows insignificant cell throughput (i.e. served traffic) performance gains.
Nevertheless, in the SON DSA investigations, a gain of up to 25 % has been observed when analysing the resource utilisation in the non-hotspot cells.
Machine Learning for Flood Prediction in Google Earth Engine
NASA Astrophysics Data System (ADS)
Kuhn, C.; Tellman, B.; Max, S. A.; Schwarz, B.
2015-12-01
With the increasing availability of high-resolution satellite imagery, dynamic flood mapping in near real time is becoming a reachable goal for decision-makers. This talk describes a newly developed framework for predicting biophysical flood vulnerability using public data, cloud computing and machine learning. Our objective is to define an approach to flood inundation modeling using statistical learning methods deployed in a cloud-based computing platform. Traditionally, static flood extent maps grounded in physically based hydrologic models can require hours of human expertise to construct at significant financial cost. In addition, desktop modeling software and limited local server storage can impose constraints on the size and resolution of input datasets. Data-driven, cloud-based processing holds promise for predictive watershed modeling at a wide range of spatio-temporal scales. However, these benefits come with constraints. In particular, parallel computing limits a modeler's ability to simulate the flow of water across a landscape, rendering traditional routing algorithms unusable in this platform. Our project pushes these limits by testing the performance of two machine learning algorithms, Support Vector Machine (SVM) and Random Forests, at predicting flood extent. Constructed in Google Earth Engine, the model mines a suite of publicly available satellite imagery layers to use as algorithm inputs. Results are cross-validated using MODIS-based flood maps created using the Dartmouth Flood Observatory detection algorithm. Model uncertainty highlights the difficulty of deploying unbalanced training data sets based on rare extreme events.
Automated detection of sperm whale sounds as a function of abrupt changes in sound intensity
NASA Astrophysics Data System (ADS)
Walker, Christopher D.; Rayborn, Grayson H.; Brack, Benjamin A.; Kuczaj, Stan A.; Paulos, Robin L.
2003-04-01
An algorithm designed to detect abrupt changes in sound intensity was developed and used to identify and count sperm whale vocalizations and to measure boat noise. The algorithm is a MATLAB routine that counts the number of occurrences for which the change in intensity level exceeds a threshold. The algorithm also permits the setting of a "dead time" interval to prevent the counting of multiple pulses within a single sperm whale click. This algorithm was used to analyze digitally sampled recordings of ambient noise obtained from the Gulf of Mexico using near-bottom-mounted EARS buoys deployed as part of the Littoral Acoustic Demonstration Center experiment. Because the background in these data varied slowly, the result of the application of the algorithm was automated detection of sperm whale clicks and creaks, with results that agreed well with those obtained by trained human listeners. [Research supported by ONR.]
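The intensity-jump rule with a dead-time interval described above can be sketched in a few lines of Python. The function name, the threshold convention (dB jump between neighbouring samples), and all parameter values are illustrative assumptions, not the authors' MATLAB routine:

```python
import math

def count_clicks(samples, rate, threshold_db, dead_time):
    """Count abrupt intensity jumps, ignoring re-triggers within dead_time.

    samples      : sequence of pressure samples
    rate         : sampling rate in Hz
    threshold_db : level jump (dB) between neighbouring samples that counts
                   as a detection
    dead_time    : seconds after a detection during which no new detection
                   is counted, so one multi-pulse click is counted once
    """
    eps = 1e-12                      # avoid log of zero on silent samples
    clicks = []
    last = -math.inf                 # time of the last counted detection
    for i in range(1, len(samples)):
        a = abs(samples[i - 1]) + eps
        b = abs(samples[i]) + eps
        jump_db = 20.0 * math.log10(b / a)
        t = i / rate
        if jump_db >= threshold_db and t - last >= dead_time:
            clicks.append(t)
            last = t
    return clicks
```

On a synthetic record with two well-separated bursts and one burst inside the dead-time window, the routine reports exactly two detections.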
NASA Astrophysics Data System (ADS)
Bolodurina, I. P.; Parfenov, D. I.
2017-10-01
The goal of our investigation is the optimization of network operation in a virtual data center. The advantage of modern infrastructure virtualization lies in the possibility to use software-defined networks. However, existing algorithmic optimization solutions do not take into account the specific features of working with multiple classes of virtual network functions. The current paper describes models characterizing the basic structures of objects in a virtual data center. These include: a level distribution model of the software-defined infrastructure of a virtual data center, a generalized model of a virtual network function, and a neural network model for the identification of virtual network functions. We also developed an efficient algorithm for optimizing the containerization of virtual network functions in a virtual data center, and we propose an efficient algorithm for placing virtual network functions. In our investigation we also generalize the well-known heuristic and deterministic Karmarkar-Karp algorithms.
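For context, the classical Karmarkar-Karp largest-differencing heuristic that the record refers to can be sketched as follows. This is the textbook two-way number-partitioning form only; the paper's generalization to virtual network function placement is not reproduced here:

```python
import heapq

def kk_difference(weights):
    """Karmarkar-Karp largest-differencing heuristic for two-way
    partitioning: repeatedly replace the two largest items by their
    difference; the last remaining value is the achieved partition gap."""
    heap = [-w for w in weights]       # max-heap via negation
    heapq.heapify(heap)
    while len(heap) > 1:
        a = -heapq.heappop(heap)       # largest
        b = -heapq.heappop(heap)       # second largest
        heapq.heappush(heap, -(a - b)) # commit to opposite sides
    return -heap[0]
```

For the multiset {8, 7, 6, 5, 4} the heuristic returns a gap of 2, even though a perfect split ({8, 7} vs {6, 5, 4}) exists, which illustrates why it is a heuristic rather than an exact solver.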
NASA Astrophysics Data System (ADS)
Walker, Joel W.
2014-08-01
The MT2, or "stransverse mass", statistic was developed to associate a parent mass scale to a missing transverse energy signature, given that escaping particles are generally expected in pairs, while collider experiments are sensitive to just a single transverse momentum vector sum. This document focuses on the generalized extension of that statistic to asymmetric one- and two-step decay chains, with arbitrary child particle masses and upstream missing transverse momentum. It provides a unified theoretical formulation, complete solution classification, taxonomy of critical points, and technical algorithmic prescription for treatment of the event scale. An implementation of the described algorithm is available for download, and is also a deployable component of the author's selection cut software package AEACuS (Algorithmic Event Arbiter and Cut Selector). Appendices address combinatoric event assembly, algorithm validation, and complete pseudocode.
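A minimal numeric illustration of the MT2 construction (minimising, over splittings of the missing transverse momentum between the two invisible children, the larger of the two branch transverse masses) might look like the following coarse grid scan. This is a pedagogical sketch under simplifying assumptions, not the analytic event-scale algorithm or the AEACuS implementation described in the record:

```python
import math

def trans_mass_sq(m_vis, pt_vis, m_inv, qt):
    # Squared transverse mass of one decay branch: visible system
    # (m_vis, pt_vis) paired with an invisible child of mass m_inv and
    # hypothesised transverse momentum qt.
    et_vis = math.sqrt(m_vis**2 + pt_vis[0]**2 + pt_vis[1]**2)
    et_inv = math.sqrt(m_inv**2 + qt[0]**2 + qt[1]**2)
    return (m_vis**2 + m_inv**2
            + 2.0 * (et_vis * et_inv - pt_vis[0]*qt[0] - pt_vis[1]*qt[1]))

def mt2_grid(m1, p1, m2, p2, met, m_inv=0.0, n=61, reach=3.0):
    # Minimise max(mT_a, mT_b) over splittings q + (met - q) of the missing
    # transverse momentum by a coarse grid scan; an illustration only, not
    # a production minimiser.
    best = float("inf")
    for i in range(n):
        for j in range(n):
            fx = -reach + (1.0 + 2.0 * reach) * i / (n - 1)
            fy = -reach + (1.0 + 2.0 * reach) * j / (n - 1)
            q = (fx * met[0], fy * met[1])
            r = (met[0] - q[0], met[1] - q[1])
            m = max(trans_mass_sq(m1, p1, m_inv, q),
                    trans_mass_sq(m2, p2, m_inv, r))
            best = min(best, m)
    return math.sqrt(max(best, 0.0))
```

Because the even split of the missing momentum lies on the grid, the scanned minimum can never exceed the objective evaluated at that split, which gives a cheap sanity check.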
NASA Technical Reports Server (NTRS)
Nyangweso, Emmanuel; Bole, Brian
2014-01-01
Successful prediction and management of battery life using prognostic algorithms through ground and flight tests is important for performance evaluation of electrical systems. This paper details the design of test beds suitable for replicating loading profiles that would be encountered in deployed electrical systems. The test bed data will be used to develop and validate prognostic algorithms for predicting battery discharge time and battery failure time. Online battery prognostic algorithms will enable health management strategies. The platform used for algorithm demonstration is the EDGE 540T electric unmanned aerial vehicle (UAV). The fully designed test beds developed and detailed in this paper can be used to conduct battery life tests by controlling current and recording voltage and temperature to develop a model that makes a prediction of end-of-charge and end-of-life of the system based on rapid state of health (SOH) assessment.
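As a toy illustration of end-of-discharge prediction from voltage telemetry, a least-squares linear extrapolation of recent voltage samples to a cutoff voltage can serve as a baseline. This is an assumption-laden stand-in for discussion, not the model-based prognostic algorithms the paper develops:

```python
def predict_end_of_discharge(times, voltages, cutoff):
    """Fit a line to recent (time, voltage) samples by least squares and
    extrapolate to the cutoff voltage; returns the predicted time, or None
    if the pack is not discharging (non-negative slope)."""
    n = len(times)
    tm = sum(times) / n
    vm = sum(voltages) / n
    num = sum((t - tm) * (v - vm) for t, v in zip(times, voltages))
    den = sum((t - tm) ** 2 for t in times)
    slope = num / den
    if slope >= 0:
        return None            # voltage not falling; no prediction
    return tm + (cutoff - vm) / slope
```

On a synthetic linear discharge (4.2 V falling at 10 mV per unit time), the predicted crossing of a 3.0 V cutoff is recovered exactly.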
The Navy at a Tipping Point: Maritime Dominance at Stake?
2010-03-01
USN deployment strategy; future global environment for USN operations; external and internal drivers on USN options; five means for a "Global Navy": CARAT deployments, HCA cruises, counter-drug ops, NAVSO/4th Fleet patrols, NAVCENT/5th Fleet, Global Fleet Station (GFS), Horn of ...; ... against advanced air defenses, conduct and enable littoral/amphibious operations in opposed environments, and establish blue-water dominance against ...
Juneau Airport Doppler Lidar Deployment: Extraction of Accurate Turbulent Wind Statistics
NASA Technical Reports Server (NTRS)
Hannon, Stephen M.; Frehlich, Rod; Cornman, Larry; Goodrich, Robert; Norris, Douglas; Williams, John
1999-01-01
A 2 micrometer pulsed Doppler lidar was deployed to the Juneau Airport in 1998 to measure turbulence and wind shear in and around the departure and arrival corridors. The primary objective of the measurement program was to demonstrate and evaluate the capability of a pulsed coherent lidar to remotely and unambiguously measure wind turbulence. Lidar measurements were coordinated with flights of an instrumented research aircraft operated by representatives of the University of North Dakota (UND) under the direction of the National Center for Atmospheric Research (NCAR). The data collected is expected to aid both turbulence characterization as well as airborne turbulence detection algorithm development activities within NASA and the FAA. This paper presents a summary of the deployment and results of analysis and simulation which address important issues regarding the measurement requirements for accurate turbulent wind statistics extraction.
Modeling and analysis of a large deployable antenna structure
NASA Astrophysics Data System (ADS)
Chu, Zhengrong; Deng, Zongquan; Qi, Xiaozhi; Li, Bing
2014-02-01
One kind of large deployable antenna (LDA) structure is proposed in this paper by combining a number of basic deployable units. In order to avoid vibration caused by a fast deployment speed of the mechanism, a braking system is used to control the spring-actuated system. Comparisons between the LDA structure and a similar structure used by the large deployable reflector (LDR) indicate that the former has potential for use in antennas with up to 30 m aperture due to its lighter weight. The LDA structure is designed to form a spherical surface, found by the least-squares fitting method, so that it can be symmetrical. In this case, the positions of the terminal points in the structure are determined by two principles. A method to calculate the cable network stretched on the LDA structure is developed, which combines the original force density method with a parabolic surface constraint. A genetic algorithm is applied to ensure that each cable reaches a desired tension, which effectively avoids the non-convergence issue. We find that the pattern for the front and rear cable nets must be the same when finding the shape of the rear cable net; otherwise, an anticlastic surface would be generated.
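The genetic-algorithm step (driving each cable toward a desired tension) can be caricatured as follows: minimise the squared deviation of cable tensions from their targets over a vector of force densities. The tension model here is a deliberate toy (tension = force density times cable length), not the paper's force-density formulation:

```python
import random

def fitness(densities, lengths, targets):
    # Toy model: cable tension = force density x cable length; score is the
    # summed squared deviation from the desired tensions (lower is better).
    return sum((d * L - t) ** 2
               for d, L, t in zip(densities, lengths, targets))

def ga_minimise(lengths, targets, pop_size=40, gens=200, seed=3):
    """Elitist GA: keep the best quarter, breed the rest by averaging
    crossover plus Gaussian mutation."""
    rng = random.Random(seed)
    dim = len(lengths)
    pop = [[rng.uniform(0.0, 10.0) for _ in range(dim)]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda ind: fitness(ind, lengths, targets))
        elite = pop[:pop_size // 4]
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.choice(elite), rng.choice(elite)
            child = [(x + y) / 2 + rng.gauss(0.0, 0.1)
                     for x, y in zip(a, b)]
            children.append(child)
        pop = elite + children
    return min(pop, key=lambda ind: fitness(ind, lengths, targets))
```

On a separable test case with an exact solution, the elitist population drives the residual close to zero within a few hundred generations.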
Achieving Passive Localization with Traffic Light Schedules in Urban Road Sensor Networks
Niu, Qiang; Yang, Xu; Gao, Shouwan; Chen, Pengpeng; Chan, Shibing
2016-01-01
Localization is crucial for the monitoring applications of cities, such as road monitoring, environment surveillance, vehicle tracking, etc. In urban road sensor networks, sensors are often sparsely deployed due to hardware cost. Under this sparse deployment, sensors cannot communicate with each other via ranging hardware or one-hop connectivity, rendering the existing localization solutions ineffective. To address this issue, this paper proposes a novel Traffic Lights Schedule-based localization algorithm (TLS), which is built on the fact that vehicles move through an intersection with a known traffic light schedule. We can first obtain this law from binary vehicle detection time stamps and describe it as a matrix, called a detection matrix. At the same time, we can use the known traffic light information to construct matrices, which form a collection called the known matrix collection. The detection matrix is then matched against the known matrix collection to identify where sensors are located on urban roads. We evaluate our algorithm by extensive simulation. The results show that the localization accuracy of intersection sensors can reach more than 90%. In addition, we compare it with a state-of-the-art algorithm and show that it has a wider operational region. PMID:27735871
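The matrix-matching step can be illustrated with a minimal sketch: the observed binary detection matrix is compared against each schedule-derived candidate, and the closest intersection is returned. The function names and the choice of Hamming distance are illustrative assumptions, not the TLS algorithm's actual matching rule:

```python
def hamming(a, b):
    # Element-wise disagreement count between two equal-shape binary matrices.
    return sum(x != y for ra, rb in zip(a, b) for x, y in zip(ra, rb))

def locate_sensor(detection_matrix, known_matrices):
    """Return the intersection id whose schedule-derived matrix is closest
    (minimum Hamming distance) to the observed detection matrix."""
    return min(known_matrices,
               key=lambda k: hamming(detection_matrix, known_matrices[k]))
```

With two candidate intersections and one flipped observation bit, the sensor is still assigned to the correct intersection.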
Robotics for Nuclear Material Handling at LANL:Capabilities and Needs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harden, Troy A; Lloyd, Jane A; Turner, Cameron J
Nuclear material processing operations present numerous challenges for effective automation. Confined spaces, hazardous materials and processes, particulate contamination, radiation sources, and corrosive chemical operations are but a few of the significant hazards. However, automated systems represent a significant safety advance when deployed in place of manual tasks performed by human workers. The replacement of manual operations with automated systems has been desirable for nearly 40 years, yet only recently are automated systems becoming increasingly common for nuclear materials handling applications. This paper reviews several automation systems which are deployed or about to be deployed at Los Alamos National Laboratory for nuclear material handling operations. Highlighted are the current social and technological challenges faced in deploying automated systems into hazardous material handling environments and the opportunities for future innovations.
Irizarry, Daniel; Wadman, Michael C; Bernhagen, Mary A; Miljkovic, Nikola; Boedeker, Ben H
2012-01-01
This work describes the use of Adobe Connect software along with algorithm software to provide the necessary audio visual communication platform for telementoring a complex medical procedure to novice providers located at a distant site.
NASA Astrophysics Data System (ADS)
Hibert, Clement; Stumpf, André; Provost, Floriane; Malet, Jean-Philippe
2017-04-01
In the past decades, the increasing quality of seismic sensors and the capability to transfer large quantities of data remotely have led to a fast densification of local, regional and global seismic networks for near real-time monitoring of crustal and surface processes. This technological advance permits the use of seismology to document geological and natural/anthropogenic processes (volcanoes, ice-calving, landslides, snow and rock avalanches, geothermal fields), but has also led to an ever-growing quantity of seismic data. This wealth of seismic data makes the construction of complete seismicity catalogs, which include earthquakes but also other sources of seismic waves, more challenging and very time-consuming, as this critical pre-processing stage is classically done by human operators and because hundreds of thousands of seismic signals have to be processed. To overcome this issue, the development of automatic methods for the processing of continuous seismic data appears to be a necessity. The classification algorithm should satisfy the need for a method that is robust, precise and versatile enough to be deployed to monitor the seismicity in very different contexts. In this study, we evaluate the ability of machine learning algorithms to analyze seismic sources at the Piton de la Fournaise volcano, namely Random Forest and Deep Neural Network classifiers. We gathered a catalog of more than 20,000 events belonging to 8 classes of seismic sources. We define 60 attributes, based on the waveform, the frequency content and the polarization of the seismic waves, to parameterize the recorded seismic signals. We show that both algorithms provide similar positive classification rates, with values exceeding 90% of the events. When trained with a sufficient number of events, the rate of positive identification can reach 99%.
These very high rates of positive identification open the perspective of an operational implementation of these algorithms for near-real time monitoring of mass movements and other environmental sources at the local, regional and even global scale.
Abdullahi, Mohammed; Ngadi, Md Asri
2016-01-01
Cloud computing has attracted significant attention from the research community because of the rapid migration rate of Information Technology services to its domain. Advances in virtualization technology have made cloud computing very popular as a result of easier deployment of application services. Tasks are submitted to cloud datacenters to be processed on a pay-as-you-go basis. Task scheduling is one of the significant research challenges in the cloud computing environment. The current formulation of task scheduling problems has been shown to be NP-complete, hence finding the exact solution, especially for large problem sizes, is intractable. The heterogeneous and dynamic nature of cloud resources makes optimum task scheduling non-trivial. Therefore, efficient task scheduling algorithms are required for optimum resource utilization. Symbiotic Organisms Search (SOS) has been shown to perform competitively with Particle Swarm Optimization (PSO). The aim of this study is to optimize task scheduling in the cloud computing environment based on a proposed Simulated Annealing (SA) based SOS (SASOS) in order to improve the convergence rate and solution quality of SOS. The SOS algorithm has a strong global exploration capability and uses fewer parameters. The systematic reasoning ability of SA is employed to find better solutions in local solution regions, hence adding exploitation ability to SOS. Also, a fitness function is proposed which takes into account the utilization level of virtual machines (VMs), which reduced the makespan and the degree of imbalance among VMs. The CloudSim toolkit was used to evaluate the efficiency of the proposed method using both synthetic and standard workloads. Simulation results showed that the hybrid SOS performs better than SOS in terms of convergence speed, response time, degree of imbalance, and makespan. PMID:27348127
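The simulated-annealing ingredient can be illustrated with a toy makespan scheduler: single-task reassignments are accepted when they help, or with Boltzmann probability otherwise. This is a hypothetical sketch of the annealing idea only, not the SASOS algorithm or its CloudSim evaluation:

```python
import math
import random

def makespan(assign, task_len, n_vm):
    # Completion time of the busiest VM under a task-to-VM assignment.
    loads = [0.0] * n_vm
    for task, vm in enumerate(assign):
        loads[vm] += task_len[task]
    return max(loads)

def sa_schedule(task_len, n_vm, steps=2000, t0=10.0, cooling=0.995, seed=1):
    rng = random.Random(seed)
    assign = [rng.randrange(n_vm) for _ in task_len]
    cur = best = makespan(assign, task_len, n_vm)
    best_assign = list(assign)
    temp = t0
    for _ in range(steps):
        task = rng.randrange(len(task_len))
        old_vm = assign[task]
        assign[task] = rng.randrange(n_vm)   # propose a single-task move
        new = makespan(assign, task_len, n_vm)
        # Accept improvements always, worse moves with Boltzmann probability.
        if new <= cur or rng.random() < math.exp(-(new - cur) / temp):
            cur = new
            if new < best:
                best, best_assign = new, list(assign)
        else:
            assign[task] = old_vm            # undo the rejected move
        temp *= cooling
    return best, best_assign
```

The best-so-far makespan is always retained, so the returned schedule is never worse than the random initial one and is bounded below by total work divided by the number of VMs.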
Control fast or control smart: When should invading pathogens be controlled?
Thompson, Robin N; Gilligan, Christopher A; Cunniffe, Nik J
2018-02-01
The intuitive response to an invading pathogen is to start disease management as rapidly as possible, since this would be expected to minimise the future impacts of disease. However, since more spread data become available as an outbreak unfolds, processes underpinning pathogen transmission can almost always be characterised more precisely later in epidemics. This allows the future progression of any outbreak to be forecast more accurately, and so enables control interventions to be targeted more precisely. There is also the chance that the outbreak might die out without any intervention whatsoever, making prophylactic control unnecessary. Optimal decision-making involves continuously balancing these potential benefits of waiting against the possible costs of further spread. We introduce a generic, extensible data-driven algorithm based on parameter estimation and outbreak simulation for making decisions in real time concerning when and how to control an invading pathogen. The Control Smart Algorithm (CSA) resolves the trade-off between the competing advantages of controlling as soon as possible and controlling later when more information has become available. We show, using a generic mathematical model representing the transmission of a pathogen of agricultural animals or plants through a population of farms or fields, how the CSA allows the timing and level of deployment of vaccination or chemical control to be optimised. In particular, the algorithm outperforms simpler strategies such as intervening when the outbreak size reaches a pre-specified threshold, or controlling when the outbreak has persisted for a threshold length of time. This remains the case even if the simpler methods are fully optimised in advance. Our work highlights the potential benefits of giving careful consideration to the question of when to start disease management during emerging outbreaks, and provides a concrete framework to allow policy-makers to make this decision.
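The trade-off the CSA resolves can be caricatured with a Monte-Carlo sketch: compare a fixed immediate-control cost against the simulated expected cost of waiting, under a toy linear birth-death spread model in which some outbreaks fade out unaided. All names, model choices, and cost terms are illustrative assumptions, not the authors' algorithm:

```python
import random

def simulate_outbreak(n0, p_birth, p_death, horizon, rng):
    # Toy linear birth-death model: per step, each infected unit transmits
    # with probability p_birth and recovers with probability p_death.
    n = n0
    for _ in range(horizon):
        births = sum(1 for _ in range(n) if rng.random() < p_birth)
        deaths = sum(1 for _ in range(n) if rng.random() < p_death)
        n = max(0, n + births - deaths)
        if n == 0:
            break                  # outbreak died out without intervention
    return n

def control_now_or_wait(n0, p_birth, p_death, horizon,
                        control_cost, cost_per_case, runs=200, seed=7):
    """Intervene now if the fixed control cost undercuts the Monte-Carlo
    expected cost of waiting until the horizon."""
    rng = random.Random(seed)
    expected_wait_cost = sum(
        cost_per_case * simulate_outbreak(n0, p_birth, p_death, horizon, rng)
        for _ in range(runs)) / runs
    return "control" if control_cost < expected_wait_cost else "wait"
```

With no transmission the expected waiting cost collapses and the rule recommends waiting; with strong transmission the expected case burden dominates the control cost and the rule recommends immediate control.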
Automated Cloud Observation for Ground Telescope Optimization
NASA Astrophysics Data System (ADS)
Lane, B.; Jeffries, M. W., Jr.; Therien, W.; Nguyen, H.
As the number of man-made objects placed in space each year increases with advancements in the commercial, academic, and industrial sectors, the number of objects required to be detected, tracked, and characterized continues to grow at an exponential rate. Commercial companies, such as ExoAnalytic Solutions, have deployed ground-based sensors to maintain track custody of these objects. For the ExoAnalytic Global Telescope Network (EGTN), observations of such objects are collected at a rate of over 10 million unique observations per month (as of September 2017). Currently, the EGTN does not optimally collect data on nights with significant cloud levels. However, a majority of these nights prove to be partially cloudy, providing clear portions of the sky for EGTN sensors to observe. It proves useful for a telescope to utilize these clear areas to continue resident space object (RSO) observation. By dynamically updating the tasking with the varying cloud positions, the number of observations could potentially increase dramatically due to increased persistence, cadence, and revisit. This paper will discuss the recent algorithms being implemented within the EGTN, including the motivation, need, and general design. The use of automated image processing and various edge detection methods, including Canny, Sobel, and Marching Squares, on real-time large-FOV images of the sky to enhance the tasking and scheduling of a ground-based telescope is discussed in Section 2. Implementations of these algorithms on a single telescope, expanding to multiple telescopes, will be explored. Results of applying these algorithms to the EGTN in real time and a comparison to non-optimized EGTN tasking are presented in Section 3. Finally, in Section 4 we explore future work in applying these throughout the EGTN as well as other optical telescopes.
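As one concrete ingredient, a Sobel gradient-magnitude pass of the kind named above can be sketched in pure Python; thresholding the result yields a cloud-boundary mask a scheduler could use to steer tasking toward clear sky. This sketch is illustrative and is not the EGTN image-processing code:

```python
def sobel_magnitude(img):
    # 3x3 Sobel gradient magnitude; border pixels are left at zero.
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = (img[y-1][x+1] + 2*img[y][x+1] + img[y+1][x+1]
                  - img[y-1][x-1] - 2*img[y][x-1] - img[y+1][x-1])
            gy = (img[y+1][x-1] + 2*img[y+1][x] + img[y+1][x+1]
                  - img[y-1][x-1] - 2*img[y-1][x] - img[y-1][x+1])
            out[y][x] = (gx * gx + gy * gy) ** 0.5
    return out

def edge_mask(img, threshold):
    # Binary cloud-boundary mask: gradient magnitude above threshold.
    mag = sobel_magnitude(img)
    return [[1 if v > threshold else 0 for v in row] for row in mag]
```

On a synthetic frame with a vertical brightness step (clear sky on one side, cloud on the other), the mask lights up only along the step.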
Performance of 3-Component Nodes in the IRIS Community Wavefield Demonstration Experiment
NASA Astrophysics Data System (ADS)
Sweet, J. R.; Anderson, K. R.; Woodward, R.
2017-12-01
In June 2016, a field crew of 50 students, faculty, industry personnel, and IRIS staff deployed a total of 390 stations as part of a community seismic experiment above an active seismic lineament in north-central Oklahoma. The goals of the experiment were to test new instrumentation and deployment strategies that record the full seismic wavefield, and to advance understanding of earthquake source processes and regional lithospheric structure. The crew deployed 363 3-component, 5Hz Generation 2 Fairfield Z-Land nodes along three seismic lines and in a seven-layer nested gradiometer array. The seismic lines spanned a region 13 km long by 5 km wide. A broadband, 18 station "Golay 3x6" array with an aperture of approximately 5 km was deployed around the gradiometer and seismic lines to collect waveform data from local and regional events. In addition, 9 infrasound stations were deployed in order to capture and identify acoustic events that might be recorded by the seismic array. The variety and geometry of instrumentation deployed was intended to capture the full seismic wavefield generated by the local and regional seismicity beneath the array and the surrounding region. Additional details on the instrumentation and how it was deployed can be found by visiting our website www.iris.edu/wavefields. We present a detailed analysis of noise across the array—including station performance, as well as noise from nearby sources (wind turbines, automobiles, etc.). We report a clear reduction in noise for buried 3-component nodes compared to co-located surface nodes (see Figure). Using the IRIS DMC's ISPAQ client, we present a variety of metrics to evaluate the network's performance. We also present highlights from student projects at the recently-held IRIS advanced data processing short course, which focused on analyzing the wavefield dataset using array processing techniques.
Semiautomated tremor detection using a combined cross-correlation and neural network approach
Horstmann, Tobias; Harrington, Rebecca M.; Cochran, Elizabeth S.
2013-01-01
Despite observations of tectonic tremor in many locations around the globe, the emergent phase arrivals, low-amplitude waveforms, and variable event durations make automatic detection a nontrivial task. In this study, we employ a new method to identify tremor in large data sets using a semiautomated technique. The method first reduces the data volume with an envelope cross-correlation technique, followed by a Self-Organizing Map (SOM) algorithm to identify and classify event types. The method detects tremor in an automated fashion after calibrating for a specific data set, hence we refer to it as being "semiautomated". We apply the semiautomated detection algorithm to a newly acquired data set of waveforms from a temporary deployment of 13 seismometers near Cholame, California, from May 2010 to July 2011. We manually identify tremor events in a 3 week long test data set and compare to the SOM output and find a detection accuracy of 79.5%. Detection accuracy improves with increasing signal-to-noise ratios and number of available stations. We find detection completeness of 96% for tremor events with signal-to-noise ratios above 3 and optimal results when data from at least 10 stations are available. We compare the SOM algorithm to the envelope correlation method of Wech and Creager and find the SOM performs significantly better, at least for the data set examined here. Using the SOM algorithm, we detect 2606 tremor events with a cumulative signal duration of nearly 55 h during the 13 month deployment. Overall, the SOM algorithm is shown to be a flexible new method that utilizes characteristics of the waveforms to identify tremor from noise or other seismic signals.
Semiautomated tremor detection using a combined cross-correlation and neural network approach
NASA Astrophysics Data System (ADS)
Horstmann, T.; Harrington, R. M.; Cochran, E. S.
2013-09-01
Despite observations of tectonic tremor in many locations around the globe, the emergent phase arrivals, low-amplitude waveforms, and variable event durations make automatic detection a nontrivial task. In this study, we employ a new method to identify tremor in large data sets using a semiautomated technique. The method first reduces the data volume with an envelope cross-correlation technique, followed by a Self-Organizing Map (SOM) algorithm to identify and classify event types. The method detects tremor in an automated fashion after calibrating for a specific data set, hence we refer to it as being "semiautomated". We apply the semiautomated detection algorithm to a newly acquired data set of waveforms from a temporary deployment of 13 seismometers near Cholame, California, from May 2010 to July 2011. We manually identify tremor events in a 3 week long test data set and compare to the SOM output and find a detection accuracy of 79.5%. Detection accuracy improves with increasing signal-to-noise ratios and number of available stations. We find detection completeness of 96% for tremor events with signal-to-noise ratios above 3 and optimal results when data from at least 10 stations are available. We compare the SOM algorithm to the envelope correlation method of Wech and Creager and find the SOM performs significantly better, at least for the data set examined here. Using the SOM algorithm, we detect 2606 tremor events with a cumulative signal duration of nearly 55 h during the 13 month deployment. Overall, the SOM algorithm is shown to be a flexible new method that utilizes characteristics of the waveforms to identify tremor from noise or other seismic signals.
Autonomous Vision-Based Tethered-Assisted Rover Docking
NASA Technical Reports Server (NTRS)
Tsai, Dorian; Nesnas, Issa A.D.; Zarzhitsky, Dimitri
2013-01-01
Many intriguing science discoveries on planetary surfaces, such as the seasonal flows on crater walls and skylight entrances to lava tubes, are at sites that are currently inaccessible to state-of-the-art rovers. The in situ exploration of such sites is likely to require a tethered platform both for mechanical support and for providing power and communication. Mother/daughter architectures have been investigated where a mother deploys a tethered daughter into extreme terrains. Deploying and retracting a tethered daughter requires undocking and re-docking of the daughter to the mother, with the latter being the challenging part. In this paper, we describe a vision-based tether-assisted algorithm for the autonomous re-docking of a daughter to its mother following an extreme terrain excursion. The algorithm uses fiducials mounted on the mother to improve the reliability and accuracy of estimating the pose of the mother relative to the daughter. The tether that is anchored by the mother helps the docking process and increases the system's tolerance to pose uncertainties by mechanically aligning the mating parts in the final docking phase. A preliminary version of the algorithm was developed and field-tested on the Axel rover in the JPL Mars Yard. The algorithm achieved an 80% success rate in 40 experiments in both firm and loose soils and starting from up to 6 m away at up to 40 deg radial angle and 20 deg relative heading. The algorithm does not rely on an initial estimate of the relative pose. The preliminary results are promising and help retire the risk associated with the autonomous docking process, enabling its consideration in future Martian and lunar missions.
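The relative-pose estimation from fiducial observations described above can be illustrated with a standard least-squares rigid alignment (the Kabsch/Procrustes method). This 2-D sketch is a generic stand-in, not the flight algorithm; the function name and the planar simplification are assumptions.

```python
import numpy as np

def estimate_pose_2d(fiducials_model, fiducials_observed):
    """Least-squares 2-D rigid transform (rotation R, translation t) mapping
    known fiducial positions on the mother rover to their observed positions
    in the daughter's frame, so that observed ~= R @ model + t.
    """
    P = np.asarray(fiducials_model, float)
    Q = np.asarray(fiducials_observed, float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)            # cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, d]) @ U.T
    t = cq - R @ cp
    return R, t
```

With three or more non-collinear fiducials the transform is over-determined, which is what makes the estimate robust to per-fiducial detection noise.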
Algorithms for Lightweight Key Exchange.
Alvarez, Rafael; Caballero-Gil, Cándido; Santonja, Juan; Zamora, Antonio
2017-06-27
Public-key cryptography is too slow for general-purpose encryption, so most applications limit its use as much as possible. Some secure protocols, especially those that enable forward secrecy, make much heavier use of public-key cryptography, increasing the demand for lightweight cryptosystems that can be implemented in low-powered or mobile devices. These performance requirements are even more significant in critical infrastructure and emergency scenarios, where peer-to-peer networks are deployed for increased availability and resiliency. We benchmark several public-key key-exchange algorithms, determine those best suited to the requirements of critical infrastructure and emergency applications, propose a security framework based on these algorithms, and study its application to decentralized node and sensor networks.
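A minimal benchmark harness in the spirit of the paper's key-exchange comparison might look like the following. It times a toy finite-field Diffie-Hellman exchange over a small Mersenne prime; the parameters are illustrative only and far too small to be secure, and a real study would substitute production algorithms such as ECDH with standardized groups.

```python
import secrets
import time

def dh_keypair(p, g):
    """Toy finite-field Diffie-Hellman keypair (illustrative, NOT secure)."""
    priv = secrets.randbelow(p - 2) + 1
    return priv, pow(g, priv, p)

def dh_shared(priv, peer_pub, p):
    """Shared secret: peer_pub^priv mod p."""
    return pow(peer_pub, priv, p)

def bench(p, g, rounds=50):
    """Average wall-clock time of one full two-party key exchange."""
    t0 = time.perf_counter()
    for _ in range(rounds):
        a_priv, a_pub = dh_keypair(p, g)
        b_priv, b_pub = dh_keypair(p, g)
        # Both sides must derive the same secret.
        assert dh_shared(a_priv, b_pub, p) == dh_shared(b_priv, a_pub, p)
    return (time.perf_counter() - t0) / rounds
```

Running `bench` across candidate algorithms and parameter sizes is the basic measurement the paper's comparison rests on: cost per exchange on the target (possibly low-powered) hardware.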
TRL-6 for JWST wavefront sensing and control
NASA Astrophysics Data System (ADS)
Feinberg, Lee D.; Dean, Bruce H.; Aronstein, David L.; Bowers, Charles W.; Hayden, William; Lyon, Richard G.; Shiri, Ron; Smith, J. Scott; Acton, D. Scott; Carey, Larkin; Contos, Adam; Sabatke, Erin; Schwenker, John; Shields, Duncan; Towell, Tim; Shi, Fang; Meza, Luis
2007-09-01
NASA's Technology Readiness Level (TRL)-6 is documented for the James Webb Space Telescope (JWST) Wavefront Sensing and Control (WFSC) subsystem. The WFSC subsystem is needed to align the Optical Telescope Element (OTE) after all deployments have occurred, and achieves that requirement through a robust commissioning sequence consisting of unique commissioning algorithms, all of which are part of the WFSC algorithm suite. This paper identifies the technology need and algorithm heritage, describes the finished TRL-6 design platform, and summarizes the TRL-6 test results and compliance. Additionally, the performance requirements needed to satisfy JWST science goals, as well as the criteria that relate to the TRL-6 Testbed Telescope (TBT) performance requirements, are discussed.
TRL-6 for JWST Wavefront Sensing and Control
NASA Technical Reports Server (NTRS)
Feinberg, Lee; Dean, Bruce; Smith, Scott; Aronstein, David; Shiri, Ron; Lyon, Rick; Hayden, Bill; Bowers, Chuck; Acton, D. Scott; Shields, Duncan;
2007-01-01
NASA's Technology Readiness Level (TRL)-6 is documented for the James Webb Space Telescope (JWST) Wavefront Sensing and Control (WFSC) subsystem. The WFSC subsystem is needed to align the Optical Telescope Element (OTE) after all deployments have occurred, and achieves that requirement through a robust commissioning sequence consisting of unique commissioning algorithms, all of which are part of the WFSC algorithm suite. This paper identifies the technology need and algorithm heritage, describes the finished TRL-6 design platform, and summarizes the TRL-6 test results and compliance. Additionally, the performance requirements needed to satisfy JWST science goals, as well as the criteria that relate to the TRL-6 Testbed Telescope (TBT) performance requirements, are discussed.
NASA Astrophysics Data System (ADS)
Gittinger, Jaxon M.; Jimenez, Edward S.; Holswade, Erica A.; Nunna, Rahul S.
2017-02-01
This work demonstrates the implementation of traditional and non-traditional visualizations of x-ray images for aviation security applications that will be feasible with open system architecture initiatives such as the Open Threat Assessment Platform (OTAP). Anomalies of interest to aviation security are fluid: their characteristic signals can evolve rapidly. OTAP is a limited-scope, open-architecture baggage screening prototype that intends to allow third-party vendors to develop and easily implement, integrate, and deploy detection algorithms and specialized hardware on a field-deployable screening technology [13]. In this study, stereoscopic images were created using an unmodified, field-deployed system and rendered on the Oculus Rift, a commercial virtual reality video gaming headset. The example described in this work is not dependent on the Oculus Rift, and is possible using any comparable hardware configuration capable of rendering stereoscopic images. The depth information provided by viewing the images will aid in the detection of characteristic signals from anomalies of interest. If successful, OTAP has the potential to allow aviation security to become more fluid in its adaptation to the evolution of anomalies of interest. This work demonstrates one example that is easily implemented using the OTAP platform, which could lead to the next generation of ATR algorithms and data visualization approaches.
Optimization model of conventional missile maneuvering route based on improved Floyd algorithm
NASA Astrophysics Data System (ADS)
Wu, Runping; Liu, Weidong
2018-04-01
Missile combat plays a crucial role in victory under high-tech conditions. According to the characteristics of the maneuver tasks of conventional missile units in combat operations, the factors influencing road maneuvering are analyzed, including road distance, road conflicts, launching device speed, position requirements, launch device deployment, and concealment. A shortest-time optimization model was built to examine road-conflict situations and conflict-resolution strategies. The results suggest that when resolving a road conflict, waiting at a node is more effective than detouring onto another route. In this study, we analyze the deficiency of the traditional Floyd algorithm, which may rule out the optimal way of resolving a road conflict, put forward an improved Floyd algorithm, and design an algorithm flow that outperforms the traditional Floyd algorithm. Finally, through a numerical example, the model and the algorithm are shown to be reliable and effective.
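The improved algorithm builds on the classical Floyd-Warshall all-pairs shortest-path recurrence, which can be sketched as follows. This is the baseline only; the paper's road-conflict modifications are not reproduced here, and edge weights stand in for road maneuver times.

```python
import math

def floyd_warshall(dist):
    """Classical Floyd-Warshall all-pairs shortest paths.

    dist: square matrix where dist[i][j] is the direct travel cost from
    node i to node j, or math.inf if no direct road exists.
    Returns a new matrix of shortest-path costs between all node pairs.
    """
    n = len(dist)
    d = [row[:] for row in dist]  # do not mutate the input
    for k in range(n):            # allow intermediate nodes 0..k
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    return d
```

The O(n^3) triple loop is where an improved variant would hook in conflict checks, for example by penalizing edges whose use at a given time would collide with another unit's route.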
Advanced Networks in Motion Mobile Sensorweb
NASA Technical Reports Server (NTRS)
Ivancic, William D.; Stewart, David H.
2011-01-01
Advanced mobile networking technology applicable to mobile sensor platforms was developed, deployed and demonstrated. A two-tier sensorweb design was developed. The first tier utilized mobile network technology to provide mobility. The second tier, which sits above the first tier, utilizes 6LowPAN (Internet Protocol version 6 Low Power Wireless Personal Area Networks) sensors. The entire network was IPv6 enabled. Successful mobile sensorweb system field tests took place in late August and early September of 2009. The entire network utilized IPv6 and was monitored and controlled using a remote Web browser via IPv6 technology. This paper describes the mobile networking and 6LowPAN sensorweb design, implementation, deployment and testing as well as wireless systems and network monitoring software developed to support testing and validation.
STS-93 Crew Interview: Michel Tognini
NASA Technical Reports Server (NTRS)
1999-01-01
This NASA Johnson Space Center (JSC) video release presents a one-on-one interview with Mission Specialist 3, Michel Tognini (Col., French Air Force and Centre Nacional Etudes Spatiales (CNES) Astronaut). Subjects discussed include early influences that made Michel want to be a pilot and astronaut, his experience as a French military pilot and his flying history. Also discussed were French participation in building the International Space Station (ISS), the STS-93 primary mission objective, X-ray observation using the Advanced X-ray Astrophysics Facility (AXAF), and failure scenarios associated with AXAF deployment. The STS-93 mission objective was to deploy the Advanced X-ray Astrophysics Facility (AXAF), later renamed the Chandra X-Ray Observatory in honor of the late Indian-American Nobel Laureate Subrahmanyan Chandrasekhar.
Modern Advances in Ablative TPS
NASA Technical Reports Server (NTRS)
Venkatapathy, Ethiraj
2013-01-01
Topics covered include: Physics of Hypersonic Flow and TPS Considerations. Destinations, Missions and Requirements. State of the Art Thermal Protection Systems Capabilities. Modern Advances in Ablative TPS. Entry Systems Concepts. Flexible TPS for Hypersonic Inflatable Aerodynamic Decelerators. Conformal TPS for Rigid Aeroshell. 3-D Woven TPS for Extreme Entry Environment. Multi-functional Carbon Fabric for Mechanically Deployable.
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
The U.S. Department of Energy’s (DOE’s) Wind Energy Technologies Office (WETO) works to accelerate the development and deployment of wind power. The office provides information for researchers, developers, businesses, manufacturers, communities, and others seeking various types of federal assistance available for advancing wind projects.
DOT National Transportation Integrated Search
1999-01-01
In 1997, the Ann Arbor (Michigan) Transportation Authority began deploying advanced public transportation systems (APTS) technologies in its fixed route and paratransit operations. The project's concept is the integration of a range of such technolog...
DOT National Transportation Integrated Search
1999-01-01
In 1997, the Ann Arbor (Michigan) Transportation Authority (AATA) began deploying advanced public transportation systems (APTS) technologies in its fixed route and paratransit operations. The project's concept is the integration of a range of such te...
NREL Facilitates Installment of Advanced Hydrogen Fuel Station in
The U.S. Department of Energy's (DOE's) Fuel Cell Technologies Office and the Department of the Interior's National Park Service, in the first phase of their collaborative efforts to accelerate deployment of advanced hydrogen fuel cell technologies, are showcasing and using fuel cell electric vehicle (FCEV) technologies throughout the D.C. metro area.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Palmintier, Bryan; Broderick, Robert; Mather, Barry
2016-05-01
This report analyzes distribution-integration challenges, solutions, and research needs in the context of distributed generation from PV (DGPV) deployment to date and the much higher levels of deployment expected with achievement of the U.S. Department of Energy's SunShot targets. Recent analyses have improved estimates of the DGPV hosting capacities of distribution systems. This report uses these results to statistically estimate the minimum DGPV hosting capacity of the contiguous United States at approximately 170 GW using traditional inverters, without distribution system modifications. This hosting capacity roughly doubles if advanced inverters are used to manage local voltage, and additional minor, low-cost changes could further increase these levels substantially. Key to achieving these deployment levels at minimum cost is siting DGPV based on local hosting capacities, suggesting opportunities for regulatory, incentive, and interconnection innovation. Already, pre-computed hosting capacity is beginning to expedite DGPV interconnection requests and installations in select regions; however, realizing SunShot-scale deployment will require further improvements to DGPV interconnection processes, standards and codes, and compensation mechanisms so they embrace the contributions of DGPV to system-wide operations. SunShot-scale DGPV deployment will also require unprecedented coordination of the distribution and transmission systems.
This includes harnessing DGPV's ability to relieve congestion and reduce system losses by generating closer to loads; minimizing system operating costs and reserve deployments through improved DGPV visibility; developing communication and control architectures that incorporate DGPV into system operations; providing frequency response, transient stability, and synthesized inertia with DGPV in the event of large-scale system disturbances; and potentially managing reactive power requirements due to large-scale deployment of advanced inverter functions. Finally, additional local and system-level value could be provided by integrating DGPV with energy storage and 'virtual storage,' which exploits improved management of electric vehicle charging, building energy systems, and other large loads. Together, continued innovation across this rich distribution landscape can enable the very-high deployment levels envisioned by SunShot.
Advanced Nuclear Fuel Cycle Transitions: Optimization, Modeling Choices, and Disruptions
NASA Astrophysics Data System (ADS)
Carlsen, Robert W.
Many nuclear fuel cycle simulators have evolved over time to help understand the nuclear industry/ecosystem at a macroscopic level. Cyclus is one of the first fuel cycle simulators to accommodate larger-scale analysis with its liberal open-source licensing and first-class Linux support. Cyclus also has features that uniquely enable investigating the effects of modeling choices on fuel cycle simulators and scenarios. This work is divided into three experiments focusing on optimization, effects of modeling choices, and fuel cycle uncertainty. Effective optimization techniques are developed for automatically determining desirable facility deployment schedules with Cyclus. A novel method for mapping optimization variables to deployment schedules is developed. This allows relationships between reactor types and scenario constraints to be represented implicitly in the variable definitions, enabling the use of optimizers lacking constraint support. It also prevents wasting computational resources evaluating infeasible deployment schedules. Deployed power capacity over time and deployment of non-reactor facilities are also included as optimization variables. There are many fuel cycle simulators built with different combinations of modeling choices. Comparing results between them is often difficult. Cyclus' flexibility allows comparing the effects of many such modeling choices. Reactor refueling cycle synchronization and inter-facility competition, among other effects, are compared in four cases, each using combinations of fleet-based or individually modeled reactors with 1-month or 3-month time steps. There are noticeable differences in results for the different cases. The largest differences occur during periods of constrained reactor fuel availability. This and similar work can help improve the quality of fuel cycle analysis generally. There is significant uncertainty associated with deploying new nuclear technologies, such as time-frames for technology availability and the cost of building advanced reactors.
Historically, fuel cycle analysis has focused on answering questions of fuel cycle feasibility and optimality. However, not much work has been done to address uncertainty in fuel cycle analysis, helping answer questions of fuel cycle robustness. This work develops and demonstrates a methodology for evaluating deployment strategies while accounting for uncertainty. Techniques are developed for measuring the hedging properties of deployment strategies under uncertainty. Additionally, methods for using optimization to automatically find good hedging strategies are demonstrated.
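One way to read the variable-to-schedule mapping described above is to decode each optimization variable against the capacity still permitted by scenario constraints, so that every variable vector yields a feasible schedule. A sketch of that idea follows; it is a hypothetical illustration, not the Cyclus implementation, and the function name and greedy fill rule are assumptions.

```python
def decode_schedule(fractions, demand, unit_cap):
    """Map optimizer variables in [0, 1] to a feasible reactor build schedule.

    fractions: one variable per time step; each is the fraction of the
               *remaining* capacity shortfall to fill with new builds.
    demand:    required power capacity at each time step.
    unit_cap:  capacity of a single reactor.
    Returns the number of reactors built per step; installed capacity
    never exceeds demand, so no variable vector is infeasible.
    """
    builds, installed = [], 0.0
    for f, d in zip(fractions, demand):
        shortfall = max(d - installed, 0.0)
        n = int(f * shortfall // unit_cap)  # whole reactors only
        builds.append(n)
        installed += n * unit_cap
    return builds
```

Because the constraint is baked into the decoding, an unconstrained optimizer can search the unit hypercube directly and never wastes a simulation run on an infeasible schedule, which is the benefit the abstract describes.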
NASA Astrophysics Data System (ADS)
Parker, Tim; Devanney, Peter; Bainbridge, Geoff; Townsend, Bruce
2017-04-01
The march to make every type of seismometer, weak to strong motion, reliable and economically deployable in any terrestrial environment continues with the availability of three new sensors and seismic systems including ones with over 200dB of dynamic range. Until recently there were probably 100 pier type broadband sensors for every observatory type pier, not the types of deployments geoscientists are needing to advance science and monitoring capability. Deeper boreholes are now the recognized quieter environments for best observatory class instruments and these same instruments can now be deployed in direct burial environments which is unprecedented. The experiences of facilities in large deployments of broadband seismometers in continental scale rolling arrays proves the utility of packaging new sensors in corrosion resistant casings and designing in the robustness needed to work reliably in temporary deployments. Integrating digitizers and other sensors decreases deployment complexity, decreases acquisition and deployment costs, increases reliability and utility. We'll discuss the informed evolution of broadband pier instruments into the modern integrated field tools that enable economic densification of monitoring arrays along with supporting new ways to approach geoscience research in a field environment.
Satellite services system analysis study. Volume 4: Service equipment concepts
NASA Technical Reports Server (NTRS)
1981-01-01
Payload deployment equipment is discussed, including payload separation, retention structures, the remote manipulator system, tilt tables, the payload installation and deployment aid, the handling and positioning aid, and spin tables. Close proximity retrieval, and on-orbit servicing equipment is discussed. Backup and contingency equipment is also discussed. Delivery and retrieval of high-energy payloads are considered. Earth return equipment, the aft flight deck, optional, and advanced equipment are also discussed.
The Test and Evaluation of Unmanned and Autonomous Systems
2008-12-01
robotic/intelligent machines for the U.S. Department of Defense (DoD). Although the technology is still nascent and advancing, we are faced with the...evolutionary nature of UAS acquisition must be met with evolutionary test capabilities yet to be discovered and developed. Test capabilities must be deployed...at a faster pace than UAS deployment to satisfy the demand for warfighter improvements. The DoD is stimulating this new area of innovation with
ALFA MHK Biological Monitoring Stationary deployment
Horne, John
2016-10-01
Acoustic backscatter data from a WBAT operating at 70 kHz deployed at PMEC-SETS from April to September of 2016. 180 pings were collected at 1 Hz every two hours, as part of the Advanced Laboratory and Field Arrays (ALFA) for Marine Energy project. Data were subjected to preliminary processing: noise removal, application of a -75 dB threshold, and removal of surface turbulence and of data within 0.5 m of the bottom.
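The preliminary processing described above (thresholding at -75 dB and excluding near-bottom samples) could be sketched as follows; the array layout, function name, and parameter names are assumptions, not the project's actual processing code.

```python
import numpy as np

def clean_backscatter(sv_db, depths, bottom_depth,
                      threshold_db=-75.0, bottom_buffer_m=0.5):
    """Mask acoustic backscatter below a dB threshold and near the seabed.

    sv_db:  2-D volume-backscatter array in dB (depth bins x pings).
    depths: 1-D array of bin depths in meters matching sv_db's first axis.
    Returns a masked array; masked cells are excluded from later analysis.
    """
    mask = sv_db < threshold_db                      # weak-signal threshold
    near_bottom = depths >= (bottom_depth - bottom_buffer_m)
    mask |= near_bottom[:, None]                     # drop near-bottom bins
    return np.ma.masked_array(sv_db, mask=mask)
```

Surface-turbulence removal would add an analogous mask over the shallowest bins; it is omitted here because the exclusion depth is not stated.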
Return to contingency: developing a coherent strategy for future R2E/R3 land medical capabilities.
Ingram, Mike; Mahan, J
2015-03-01
Key to deploying forces in the future will be the provision of a rapidly deployable Deployed Hospital Capability. Developing this capability has been the focus of 34 Field Hospital and 2nd Medical Brigade over the last 18 months, and this paper describes a personal account of this development work to date. Future contingent Deployed Hospital Capability must meet the requirements of Defence; that is, to be rapidly deployable while delivering a hospital standard of care. The excellence seen in clinical delivery on recent operations is intensive in personnel, equipment, infrastructure and sustainment. The challenge in developing a coherent capability has been in balancing clinical capability and capacity against strategic load in light of recent advances in battlefield medicine. This paper explores the issues encountered and solutions found to date in reconstituting a Very High Readiness Deployed Hospital Capability.
Field testing of a next generation pointer/tracker for IRCM
NASA Astrophysics Data System (ADS)
Chapman, Stuart; Wildgoose, Iain; McDonald, Eric; Duncan, Stuart
2008-10-01
SELEX Galileo has been involved in the development, manufacture and support of high performance electro-optic pointing and stabilisation systems for over forty years. The Company currently supplies the pointer/trackers for the AN/AAQ-24(V) NEMESIS DIRCM system, for which over 1,000 combat-proven units have been produced and deployed in the US, the UK and other nations. In 2007, SELEX Galileo embarked on an internally funded programme to develop ECLIPSE, a new advanced, lightweight, low-cost IRCM pointer/tracker, exploiting the extensive knowledge and experience gained from previous targeting and IRCM programmes. The ECLIPSE design is centred on a low inertia, two-axis servo mechanism with a strap-down inertial sensor and advanced sightline control algorithms, allowing effective tracking through the nadir and providing superior sightline performance. The programme involved the production of three demonstrator units in 2007, and two pre-production units in 2008. The demonstrator units were first trialled as part of a NEMESIS DIRCM system in late 2007, and in April 2008 100% success was achieved in jamming live-fire demonstrations. Helicopter installation and ground testing of a UK-only trials system is complete, initial flight testing has just begun, and the airborne test and evaluation scheduled for late summer 2008 will bring the ECLIPSE System to technology readiness level 7 (TRL 7). This paper describes the Eclipse performance demonstrated to date.
Semantic interoperability--HL7 Version 3 compared to advanced architecture standards.
Blobel, B G M E; Engel, K; Pharow, P
2006-01-01
To meet the challenge for high quality and efficient care, highly specialized and distributed healthcare establishments have to communicate and co-operate in a semantically interoperable way. Information and communication technology must be open, flexible, scalable, knowledge-based and service-oriented as well as secure and safe. For enabling semantic interoperability, a unified process for defining and implementing the architecture, i.e. structure and functions of the cooperating systems' components, as well as the approach for knowledge representation, i.e. the used information and its interpretation, algorithms, etc. have to be defined in a harmonized way. Deploying the Generic Component Model, systems and their components, underlying concepts and applied constraints must be formally modeled, strictly separating platform-independent from platform-specific models. As HL7 Version 3 claims to represent the most successful standard for semantic interoperability, HL7 has been analyzed regarding the requirements for model-driven, service-oriented design of semantic interoperable information systems, thereby moving from a communication to an architecture paradigm. The approach is compared with advanced architectural approaches for information systems such as OMG's CORBA 3 or EHR systems such as GEHR/openEHR and CEN EN 13606 Electronic Health Record Communication. HL7 Version 3 is maturing towards an architectural approach for semantic interoperability. Despite current differences, there is a close collaboration between the teams involved guaranteeing a convergence between competing approaches.
A review of recent advances in data analytics for post-operative patient deterioration detection.
Petit, Clemence; Bezemer, Rick; Atallah, Louis
2018-06-01
Most deaths occurring due to a surgical intervention happen postoperatively rather than during surgery. The current standard of care in many hospitals cannot fully cope with detecting and addressing post-surgical deterioration in time. For millions of patients, this deterioration is left unnoticed, leading to increased mortality and morbidity. Postoperative deterioration detection currently relies on general scores that are not fully able to cater for the complex post-operative physiology of surgical patients. In the last decade however, advanced risk and warning scoring techniques have started to show encouraging results in terms of using the large amount of data available peri-operatively to improve postoperative deterioration detection. Relevant literature has been carefully surveyed to provide a summary of the most promising approaches as well as how they have been deployed in the perioperative domain. This work also aims to highlight the opportunities that lie in personalizing the models developed for patient deterioration for these particular post-surgical patients and make the output more actionable. The integration of pre- and intra-operative data, e.g. comorbidities, vitals, lab data, and information about the procedure performed, in post-operative early warning algorithms would lead to more contextualized, personalized, and adaptive patient modelling. This, combined with careful integration in the clinical workflow, would result in improved clinical decision support and better post-surgical care outcomes.
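As a toy illustration of the aggregated warning scores discussed above, the sketch below sums per-vital subscores. The thresholds are invented for illustration only; this is not a validated clinical instrument such as MEWS or NEWS, and real deterioration models would also fold in the pre- and intra-operative context the review advocates.

```python
def toy_warning_score(vitals):
    """Toy aggregated early-warning score (illustrative thresholds only).

    vitals: dict with 'hr' (beats/min), 'sbp' (mmHg), 'rr' (breaths/min),
    and 'temp' (deg C). Higher scores indicate more abnormal physiology.
    """
    score = 0
    hr = vitals["hr"]
    score += 0 if 50 <= hr <= 100 else (1 if 40 <= hr <= 130 else 2)
    sbp = vitals["sbp"]
    score += 0 if 100 <= sbp <= 180 else (1 if 80 <= sbp < 100 else 2)
    rr = vitals["rr"]
    score += 0 if 9 <= rr <= 20 else (1 if 21 <= rr <= 29 else 2)
    temp = vitals["temp"]
    score += 0 if 36.0 <= temp <= 38.0 else 1
    return score
```

Personalizing such a score, as the review suggests, would mean replacing these fixed population thresholds with patient- and procedure-specific baselines.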
Aerosol and Cloud Experiments in Eastern North Atlantic (ACE-ENA) Science Plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Jian; Dong, Xiquan; Wood, Robert
With their extensive coverage, low clouds greatly impact global climate. Presently, low clouds are poorly represented in global climate models (GCMs), and the response of low clouds to changes in atmospheric greenhouse gases and aerosols remains the major source of uncertainty in climate simulations. The poor representations of low clouds in GCMs are in part due to inadequate observations of their microphysical and macrophysical structures, radiative effects, and the associated aerosol distribution and budget in regions where the aerosol impact is the greatest. The Eastern North Atlantic (ENA) is a region of persistent but diverse subtropical marine boundary-layer (MBL) clouds, whose albedo and precipitation are highly susceptible to perturbations in aerosol properties. Boundary-layer aerosol in the ENA region is influenced by a variety of sources, leading to strong variations in cloud condensation nuclei (CCN) concentration and aerosol optical properties. Recently a permanent ENA site was established by the U.S. Department of Energy (DOE)’s Atmospheric Radiation Measurement (ARM) Climate Research Facility on Graciosa Island in the Azores, providing invaluable information on MBL aerosol and low clouds. At the same time, the vertical structures and horizontal variabilities of aerosol, trace gases, cloud, drizzle, and atmospheric thermodynamics are critically needed for understanding and quantifying the budget of MBL aerosol, the radiative properties, precipitation efficiency, and lifecycle of MBL clouds, and the cloud response to aerosol perturbations. Much of this data can be obtained only through aircraft-based measurements. In addition, the interconnected aerosol and cloud processes are best investigated by a study involving simultaneous in situ aerosol, cloud, and thermodynamics measurements. Furthermore, in situ measurements are also necessary for validating and improving ground-based retrieval algorithms at the ENA site.
This project is motivated by the need for comprehensive in situ characterizations of boundary-layer structure, and associated vertical distributions and horizontal variabilities of low clouds and aerosol over the Azores. ARM Aerial Facility (AAF) Gulfstream-1 (G-1) aircraft will be deployed at the ENA site during two intensive operational periods (IOPs) of early summer (June to July) of 2017 and winter (January to February) of 2018, respectively. Deployments during both seasons allow for examination of key aerosol and cloud processes under a variety of representative meteorological and cloud conditions. The science themes for the deployments include: 1) Budget of MBL CCN and its seasonal variation; 2) Effects of aerosol on cloud and precipitation; 3) Cloud microphysical and macrophysical structures, and entrainment mixing; 4) Advancing retrievals of turbulence, cloud, and drizzle; and 5) Model evaluation and processes studies. A key advantage of the deployments is the strong synergy between the measurements onboard the G-1 and the routine measurements at the ENA site, including state-of-the-art profiling and scanning radars. The 3D cloud structures provided by the scanning radars will put the detailed in situ measurements into mesoscale and cloud lifecycle contexts. On the other hand, high quality in situ measurements will enable validation and improvements of ground-based retrieval algorithms at the ENA site, leading to high-quality and statistically robust data sets from the routine measurements. The deployments, combined with the routine measurements at the ENA site, will have a long lasting impact on the research and modeling of low clouds and aerosols in the remote marine environment.
Fast, Inclusive Searches for Geographic Names Using Digraphs
Donato, David I.
2008-01-01
An algorithm specifies how to quickly identify names that approximately match any specified name when searching a list or database of geographic names. Based on comparisons of the digraphs (ordered letter pairs) contained in geographic names, this algorithmic technique identifies approximately matching names by applying an artificial but useful measure of name similarity. A digraph index enables computer name searches that are carried out using this technique to be fast enough for deployment in a Web application. This technique, which is a member of the class of n-gram algorithms, is related to, but distinct from, the soundex, PHONIX, and metaphone phonetic algorithms. Despite this technique's tendency to return some counterintuitive approximate matches, it is an effective aid for fast, inclusive searches for geographic names when the exact name sought, or its correct spelling, is unknown.
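The digraph technique described above can be sketched in a few lines. This is an illustrative reconstruction, not the author's implementation: it scores similarity with the Dice coefficient over each name's multiset of ordered letter pairs, and the function names and the 0.5 threshold are arbitrary choices for the sketch.

```python
# Illustrative digraph-based name matching (not the published implementation):
# similarity is the Dice coefficient over the multisets of ordered letter
# pairs (digraphs) contained in each name.
from collections import Counter

def digraphs(name: str) -> Counter:
    """Return the multiset of ordered letter pairs in a name."""
    s = "".join(ch for ch in name.lower() if ch.isalpha())
    return Counter(s[i:i + 2] for i in range(len(s) - 1))

def similarity(a: str, b: str) -> float:
    """Dice coefficient: 2 * shared digraphs / total digraphs."""
    da, db = digraphs(a), digraphs(b)
    shared = sum((da & db).values())          # multiset intersection
    total = sum(da.values()) + sum(db.values())
    return 2.0 * shared / total if total else 0.0

def search(query: str, names: list, threshold: float = 0.5) -> list:
    """Return names whose digraph similarity to the query meets the threshold,
    best matches first."""
    scored = [(similarity(query, n), n) for n in names]
    return [n for score, n in sorted(scored, reverse=True) if score >= threshold]
```

A production search would additionally precompute a digraph index mapping each letter pair to the names containing it, so that only candidate names sharing digraphs with the query need to be scored; that index is what makes the technique fast enough for a Web application.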
Tien, Col Homer; Beckett, Maj Andrew; Garraway, LCol Naisan; Talbot, LCol Max; Pannell, Capt Dylan; Alabbasi, Thamer
2015-01-01
Medical support to deployed field forces is increasingly becoming a shared responsibility among allied nations. National military medical planners face several key challenges, including fiscal restraints, raised expectations of standards of care in the field and a shortage of appropriately trained specialists. Even so, medical services are now in high demand, and the availability of medical support may become the limiting factor that determines how and where combat units can deploy. The influence of medical factors on operational decisions is therefore leading to an increasing requirement for multinational medical solutions. Nations must agree on the common standards that govern the care of the wounded. These standards will always need to take into account increased public expectations regarding the quality of care. The purpose of this article is to both review North Atlantic Treaty Organization (NATO) policies that govern multinational medical missions and to discuss how recent scientific advances in prehospital battlefield care, damage control resuscitation and damage control surgery may inform how countries within NATO choose to organize and deploy their field forces in the future. PMID:26100784
Engineering challenges of operating year-round portable seismic stations at high-latitude
NASA Astrophysics Data System (ADS)
Beaudoin, Bruce; Carpenter, Paul; Hebert, Jason; Childs, Dean; Anderson, Kent
2017-04-01
Remote portable seismic stations are, in most cases, constrained by logistics and cost. High-latitude operations introduce environmental, technical, and logistical challenges that require substantially more engineering work to ensure robust, high-quality data return. Since 2006, IRIS PASSCAL has been funded by NSF to develop, deploy, and maintain a pool of polar-specific seismic stations. Here, we describe our latest advancements to mitigate the challenges of high-latitude, year-round station operation. The IRIS PASSCAL program has supported high-latitude deployments since the late 1980s. These early deployments were largely controlled-source, summer-only experiments. In the early 2000s PASSCAL users began proposing year-round deployments of broadband stations in some of the harshest environments on the planet. These early year-round deployments were stand-alone (no telemetry) stations largely designed to operate during summer months and then run as long as possible during the winter, in the hope that the stations would revive the following summer. In 2006, in collaboration with UNAVCO, we began developing communications, power systems, and enclosures to extend recording to year-round. Since this initial effort, PASSCAL has continued to refine power systems, enclosure design and manufacturability, and real-time data communications. Several sensor and data logger manufacturers have made advances in cold-weather performance and delivered newly designed instruments that have furthered our ability to successfully run portable stations at high latitudes with minimal logistics, reducing the size and weight of instruments and infrastructure. All PASSCAL polar engineering work is openly shared through our website: www.passcal.nmt.edu/content/polar
Deploying Darter - A Cray XC30 System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fahey, Mark R; Budiardja, Reuben D; Crosby, Lonnie D
The University of Tennessee, Knoxville acquired a Cray XC30 supercomputer, called Darter, with a peak performance of 248.9 Teraflops. Darter was deployed in late March of 2013 with a very aggressive production timeline - the system was deployed, accepted, and placed into production in only 2 weeks. The Spring Experiment for the Center for Analysis and Prediction of Storms (CAPS) largely drove the accelerated timeline, as the experiment was scheduled to start in mid-April. The Consortium for Advanced Simulation of Light Water Reactors (CASL) project also needed access and was able to meet their tight deadlines on the newly acquired XC30. Darter's accelerated deployment and operations schedule resulted in substantial scientific impacts within the research community as well as immediate real-world impacts such as early severe tornado warnings.
On the Use of a Range Trigger for the Mars Science Laboratory Entry Descent and Landing
NASA Technical Reports Server (NTRS)
Way, David W.
2011-01-01
In 2012, during the Entry, Descent, and Landing (EDL) of the Mars Science Laboratory (MSL) entry vehicle, a 21.5 m Viking-heritage, Disk-Gap-Band supersonic parachute will be deployed at approximately Mach 2. The baseline algorithm for commanding this parachute deployment is a navigated planet-relative velocity trigger. This paper compares the performance of an alternative range-to-go trigger (sometimes referred to as the "Smart Chute"), which can significantly reduce the landing footprint size. Numerical Monte Carlo results, predicted by the POST2-based MSL End-to-End EDL simulation, are corroborated and explained by applying propagation-of-uncertainty methods to develop an analytic estimate for the standard deviation of Mach number. A negative correlation is shown to exist between the standard deviations of wind velocity and the planet-relative velocity at parachute deploy, which mitigates the Mach number rise in the case of the range trigger.
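The propagation-of-uncertainty estimate mentioned above can be illustrated with a first-order sketch. This is a hedged illustration, not the paper's derivation: it assumes Mach number is modeled as M = (v - w)/a, with planet-relative velocity v, along-track wind component w, and a known speed of sound a; the function name and sign convention are assumptions, and the sign convention for w determines whether a given v-w correlation inflates or reduces the spread.

```python
# Hedged sketch of first-order propagation of uncertainty for Mach number
# at parachute deploy, assuming M = (v - w) / a with the speed of sound a
# treated as known.
import math

def mach_sigma(sigma_v: float, sigma_w: float, rho: float, a: float) -> float:
    """First-order standard deviation of M = (v - w) / a.

    Var(v - w) = sigma_v**2 + sigma_w**2 - 2*rho*sigma_v*sigma_w,
    where rho is the correlation between v and w.
    """
    var = sigma_v**2 + sigma_w**2 - 2.0 * rho * sigma_v * sigma_w
    return math.sqrt(var) / a

# The correlation term changes the spread markedly: with equal sigmas,
# rho = +1 collapses sigma_M to zero, while rho = -1 maximizes it.
```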
NASA Astrophysics Data System (ADS)
Alegria Mira, Lara; Thrall, Ashley P.; De Temmerman, Niels
2016-02-01
Deployable scissor structures are well equipped for temporary and mobile applications since they are able to change their form and functionality. They are structural mechanisms that transform from a compact state to an expanded, fully deployed configuration. A barrier to the current design and reuse of scissor structures, however, is that they are traditionally designed for a single purpose. Alternatively, a universal scissor component (USC)-a generalized element which can achieve all traditional scissor types-introduces an opportunity for reuse in which the same component can be utilized for different configurations and spans. In this article, the USC is optimized for structural performance. First, an optimized length for the USC is determined based on a trade-off between component weight and structural performance (measured by deflections). Then, topology optimization, using the simulated annealing algorithm, is implemented to determine a minimum weight layout of beams within a single USC component.
NASA Technical Reports Server (NTRS)
Murray, John J.; Hudnall, L. A.; Matus, A.; Krueger, A. J.; Trepte, C. r.
2010-01-01
The Aleutian Islands of Alaska are home to a number of major volcanoes which periodically present a significant hazard to aviation. During the summer of 2008, the Okmok and Kasatochi volcanoes experienced moderate eruptive events. These were followed by a dramatic, major eruption of Mount Redoubt in late March 2009. The Redoubt case is extensively covered in this paper. Volcanic ash and SO2 from each of these eruptions dispersed throughout the atmosphere. This created the potential for major problems for air traffic near the ash dispersions and at significant distances downwind. The NASA Applied Sciences Weather Program implements a wide variety of research projects to develop volcanic ash detection, characterization, and tracking applications for NASA Earth Observing System and NOAA GOES and POES satellites. Chemistry applications using NASA Aura satellite Ozone Monitoring Instrument (OMI) retrievals produced SO2 measurements to trace the dispersion of volcanic aerosol. This work was complemented by advanced multi-channel imager applications for the discrimination and height assignment of volcanic ash using NASA MODIS and NOAA GOES and POES imager data. Instruments similar to MODIS and OMI are scheduled for operational deployment on NPOESS. In addition, the NASA CALIPSO satellite provided highly accurate measurements of aerosol height and dispersion for the calibration and validation of these algorithms and for corroborative research studies. All of this work shortens the lead time for transition to operations and ensures that research satellite data and applications are operationally relevant and utilized quickly after the deployment of operational satellite systems.
Safety related drug-labelling changes: findings from two data mining algorithms.
Hauben, Manfred; Reich, Lester
2004-01-01
With increasing volumes of postmarketing safety surveillance data, data mining algorithms (DMAs) have been developed to search large spontaneous reporting system (SRS) databases for disproportional statistical dependencies between drugs and events. A crucial question is the proper deployment of such techniques within the universe of methods historically used for signal detection. One question of interest is the comparative performance of algorithms based on simple forms of disproportionality analysis versus those incorporating Bayesian modelling. A potential benefit of Bayesian methods is a reduced volume of signals, including false-positive signals. To compare the performance of two well-described DMAs (proportional reporting ratios [PRRs] and an empirical Bayesian algorithm known as the multi-item gamma Poisson shrinker [MGPS]) using commonly recommended thresholds on a diverse data set of adverse events that triggered drug labelling changes, PRRs and MGPS were retrospectively applied to a diverse sample of drug-event combinations (DECs) identified on a government Internet site for a 7-month period. Metrics for this comparative analysis included the number and proportion of these DECs that generated signals of disproportionate reporting with PRRs, MGPS, both or neither method, the differential timing of signal generation between the two methods, and the clinical nature of events that generated signals with only one, both or neither method. There were 136 relevant DECs that triggered safety-related labelling changes for 39 drugs during a 7-month period. PRRs generated a signal of disproportionate reporting with almost twice as many DECs as MGPS (77 vs 40). No DECs were flagged by MGPS only. PRRs highlighted DECs in advance of MGPS (1-15 years) and a label change (1-30 years). For 59 DECs, there was no signal with either DMA. DECs generating signals of disproportionate reporting with only PRRs were both medically serious and non-serious.
In most instances in which a DEC generated a signal of disproportionate reporting with both DMAs (almost twice as many with PRRs), the signal was generated using PRRs in advance of MGPS. No medically important events were signalled only by MGPS. It is likely that the incremental utility of DMAs is highly situation-dependent. It is clear, however, that the volume of signals generated is, by itself, an inadequate criterion for comparison, and that the clinical nature of signalled events and the differential timing of signals need to be considered. Accepting commonly recommended threshold criteria for the DMAs examined in this study as universal benchmarks for signal detection is not justified.
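The PRR method compared in the study can be illustrated with a small sketch. This is a generic textbook formulation, not the study's code: the 2x2 contingency counts and the PRR >= 2, chi-square >= 4, N >= 3 rule follow thresholds commonly recommended in the disproportionality literature, and the function names are assumptions.

```python
# Illustrative proportional reporting ratio (PRR) calculation from a 2x2
# contingency table of spontaneous reports.  Thresholds follow the commonly
# cited PRR >= 2, chi-square >= 4, N >= 3 rule, not necessarily the exact
# criteria used in the study.

def prr(a: int, b: int, c: int, d: int) -> float:
    """a: drug & event; b: drug & other events;
    c: other drugs & event; d: other drugs & other events."""
    return (a / (a + b)) / (c / (c + d))

def chi_square(a: int, b: int, c: int, d: int) -> float:
    """Pearson chi-square (no continuity correction) for the 2x2 table."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

def signals(a: int, b: int, c: int, d: int) -> bool:
    """True if the drug-event combination meets all three signal criteria."""
    return a >= 3 and prr(a, b, c, d) >= 2.0 and chi_square(a, b, c, d) >= 4.0
```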
A Decade of Remote Sensing River Bathymetry with the Experimental Advanced Airborne Research LiDAR
NASA Astrophysics Data System (ADS)
Kinzel, P. J.; Legleiter, C. J.; Nelson, J. M.; Skinner, K.
2012-12-01
Since 2002, the first generation of the Experimental Advanced Airborne Research LiDAR (EAARL-A) sensor has been deployed for mapping rivers and streams. We present and summarize the results of comparisons between ground truth surveys and bathymetry collected by the EAARL-A sensor in a suite of rivers across the United States. These comparisons include reaches on the Platte River (NE), Boise and Deadwood Rivers (ID), Blue and Colorado Rivers (CO), Klamath and Trinity Rivers (CA), and the Shenandoah River (VA). In addition to diverse channel morphologies (braided, single thread, and meandering) these rivers possess a variety of substrates (sand, gravel, and bedrock) and a wide range of optical characteristics which influence the attenuation and scattering of laser energy through the water column. Root mean square errors between ground truth elevations and those measured by the EAARL-A ranged from 0.15-m in rivers with relatively low turbidity and highly reflective sandy bottoms to over 0.5-m in turbid rivers with less reflective substrates. Mapping accuracy with the EAARL-A has proved challenging in pools where bottom returns are either absent in waveforms or are of such low intensity that they are treated as noise by waveform processing algorithms. Resolving bathymetry in shallow depths where near surface and bottom returns are typically convolved also presents difficulties for waveform processing routines. The results of these evaluations provide an empirical framework to discuss the capabilities and limitations of the EAARL-A sensor as well as previous generations of post-processing software for extracting bathymetry from complex waveforms. These experiences and field studies not only provide benchmarks for the evaluation of the next generation of bathymetric LiDARs for use in river mapping, but also highlight the importance of developing and standardizing more rigorous methods to characterize substrate reflectance and in-situ optical properties at study sites. 
They also point out the continued necessity of ground truth data for algorithm refinement and survey verification.
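The ground-truth comparisons quoted above reduce to a simple metric. As a minimal sketch (assuming paired surveyed and LiDAR-derived bed elevations in metres; the function name is an assumption):

```python
# Minimal sketch of the comparison metric quoted above: root mean square
# error between surveyed and LiDAR-derived bed elevations, in metres.
import math

def rmse(surveyed, lidar):
    """RMSE of paired elevation samples, in the same units as the inputs."""
    residuals = [s - l for s, l in zip(surveyed, lidar)]
    return math.sqrt(sum(r * r for r in residuals) / len(residuals))
```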
Lynx: Automatic Elderly Behavior Prediction in Home Telecare
Lopez-Guede, Jose Manuel; Moreno-Fernandez-de-Leceta, Aitor; Martinez-Garcia, Alexeiw; Graña, Manuel
2015-01-01
This paper introduces Lynx, an intelligent system for personal safety in home environments, oriented to elderly people living independently, which encompasses a decision support machine for automatic home risk prevention, tested in real-life environments to respond to real-time situations. The automatic system described in this paper prevents such risks through advanced analytic methods supported by an expert knowledge system. It is minimally intrusive, using plug-and-play sensors and machine learning algorithms to learn the elder's daily activity, taking into account even his or her health records. If the system detects that something unusual happens (in a wide sense), or that something is wrong relative to the user's health habits or medical recommendations, it sends a real-time alarm to the family, care center, or medical agents, without human intervention. The system feeds on information from sensors deployed in the home and knowledge of the subject's physical activities, which can be collected by mobile applications and enriched by personalized health information from clinical reports encoded in the system. The system usability and reliability have been tested in real-life conditions, with an accuracy greater than 81%. PMID:26783514
MAGID-II: a next-generation magnetic unattended ground sensor (UGS)
NASA Astrophysics Data System (ADS)
Walter, Paul A.; Mauriello, Fred; Huber, Philip
2012-06-01
A next generation magnetic sensor is being developed at L-3 Communications, Communication Systems East to enhance the ability of Army and Marine Corps unattended ground sensor (UGS) systems to detect and track targets on the battlefield. This paper describes a magnetic sensor that provides superior detection range for both armed personnel and vehicle targets, at a reduced size, weight, and level of power consumption (SWAP) over currently available magnetic sensors. The design integrates the proven technology of a flux gate magnetometer combined with advanced digital signal processing algorithms to provide the warfighter with a rapidly deployable, extremely low false-alarm-rate sensor. This new sensor improves on currently available magnetic UGS systems by providing not only target detection and direction information, but also a magnetic disturbance readout, indicating the size of the target. The sensor integrates with Government Off-the-Shelf (GOTS) systems such as the United States Army's Battlefield Anti-Intrusion System (BAIS) and the United States Marine Corps Tactical Remote Sensor System (TRSS). The system has undergone testing by the US Marine Corps, as well as extensive company testing. Results from these field tests are given.
Echocardiography in Infective Endocarditis: State of the Art.
Afonso, Luis; Kottam, Anupama; Reddy, Vivek; Penumetcha, Anirudh
2017-10-25
In this review, we examine the central role of echocardiography in the diagnosis, prognosis, and management of infective endocarditis (IE). 2D transthoracic echocardiography (TTE) and transesophageal echocardiography (TEE) have complementary roles and are unequivocally the mainstay of diagnostic imaging in IE. The advent of 3D and multiplanar imaging has greatly enhanced the ability of the imager to evaluate cardiac structure and function. Technologic advances in 3D imaging allow for the reconstruction of realistic anatomic images that in turn have positively impacted IE-related surgical planning and intervention. CT and metabolic imaging appear to be emerging as promising ancillary diagnostic tools that could be deployed in select scenarios to circumvent some of the limitations of echocardiography. Our review summarizes the indispensable and central role of various echocardiographic modalities in the management of infective endocarditis. The complementary roles of 2D TTE and TEE are discussed, and areas where 3D TEE offers incremental value are highlighted. An algorithm summarizing a contemporary approach to the workup of endocarditis is provided, and major societal guidelines for timing of surgery are reviewed.
Lynx: Automatic Elderly Behavior Prediction in Home Telecare.
Lopez-Guede, Jose Manuel; Moreno-Fernandez-de-Leceta, Aitor; Martinez-Garcia, Alexeiw; Graña, Manuel
2015-01-01
This paper introduces Lynx, an intelligent system for personal safety in home environments, oriented to elderly people living independently, which encompasses a decision support machine for automatic home risk prevention, tested in real-life environments to respond to real-time situations. The automatic system described in this paper prevents such risks through advanced analytic methods supported by an expert knowledge system. It is minimally intrusive, using plug-and-play sensors and machine learning algorithms to learn the elder's daily activity, taking into account even his or her health records. If the system detects that something unusual happens (in a wide sense), or that something is wrong relative to the user's health habits or medical recommendations, it sends a real-time alarm to the family, care center, or medical agents, without human intervention. The system feeds on information from sensors deployed in the home and knowledge of the subject's physical activities, which can be collected by mobile applications and enriched by personalized health information from clinical reports encoded in the system. The system usability and reliability have been tested in real-life conditions, with an accuracy greater than 81%.
2014-10-01
applications of present nano-/bio-technology include advanced health and fitness monitoring, high-resolution imaging, new environmental sensor platforms...other areas where nano-/bio-technology development is needed: • Sensors: Diagnostic and detection kits (gene-chips, protein-chips, lab-on-chips, etc...studies on chemo-/bio-/nano-sensors, ultra-sensitive biochips ("lab-on-a-chip" and "cells-on-chips" devices) have been prepared for routine medical
Recent development of infrasound monitoring network in Romania
NASA Astrophysics Data System (ADS)
Ghica, Daniela; Popa, Mihaela; Ionescu, Constantin
2017-04-01
The second half of 2016 was marked at the National Institute for Earth Physics (NIEP) by a significant development of infrasound monitoring infrastructure in Romania. In addition to IPLOR, the 6-element acoustic array installed at Plostina, in the central part of Romania, since 2009, two other four-element arrays were deployed. The first, the BURARI infrasound research array, was deployed in late July 2016, under a joint effort of AFTAC, USA and NIEP, in the northern part of Romania, in the Bucovina region. The sites, placed in the vicinity of the central elements of the BURAR seismic array (over 1.2 km aperture), are equipped with Chaparral Physics Model 21 microbarometers and Reftek RT 130 data loggers. The data, used mainly for research purposes within the scientific collaboration project between NIEP and AFTAC, are available to the scientific community. The second is a PTS portable infrasound array (I67RO) deployed for one year, starting at the end of September 2016, within a collaboration project between NIEP and the PTS of the Preparatory Commission for the CTBTO. This array is located in the western part of Romania, at Marisel, Cluj County, covering a 0.9 km aperture and being equipped with CEA/DAM MB2005 microbarometers and Reftek RT 130 data loggers. This joint experiment aims to contribute both to an advanced understanding of infrasound sources in Central Europe and to the ARISE design study project, as an expansion of the spatial coverage of the European infrasound network. The data recorded by the three infrasound arrays deployed in Romania during the same time interval (October - December 2016) were processed into detection arrival bulletins by applying the CEA/DASE PMCC algorithm embedded in the DTK-GPMCC (extended CTBTO NDC-in-a-box) and WinPMCC software applications.
The results were plotted and analyzed using the DTK-DIVA software (extended CTBTO NDC-in-a-box), in order to assess the detectability of each station, as well as the capacity to fuse detections in support of infrasound monitoring activity at NIEP. We present infrasound signals generated by an impulsive event (the accidental explosion of a train carrying liquefied petroleum gas in Hitrino, Bulgaria) recorded on these three arrays. The features calculated for the detected arrivals (backazimuth, arrival time, frequency, and celerity) are used to associate signals with the event and to assess each array's performance individually.
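One building block of the array processing described above, estimating backazimuth and apparent velocity from inter-sensor arrival-time delays, can be sketched as a plane-wave least-squares fit. This is a hedged illustration, not the CEA/DASE PMCC implementation; the coordinate and angle conventions (local east/north metres, azimuth clockwise from north) and the function name are assumptions.

```python
# Hedged sketch of a plane-wave fit to inter-sensor arrival-time delays,
# yielding backazimuth and apparent (trace) velocity.  Not the PMCC code.
import math

def plane_wave_fit(coords, delays):
    """coords: [(east, north)] per sensor, metres; delays: arrival time minus
    a reference sensor's, seconds.  Solves delays ~ sx*east + sy*north for the
    horizontal slowness (sx, sy) via the least-squares normal equations."""
    sxx = sum(e * e for e, _ in coords)
    syy = sum(n * n for _, n in coords)
    sxy = sum(e * n for e, n in coords)
    bx = sum(e * t for (e, _), t in zip(coords, delays))
    by = sum(n * t for (_, n), t in zip(coords, delays))
    det = sxx * syy - sxy * sxy
    sx = (syy * bx - sxy * by) / det
    sy = (sxx * by - sxy * bx) / det
    # Slowness points along propagation; backazimuth points back to the source.
    az = math.degrees(math.atan2(sx, sy))       # propagation azimuth from north
    baz = (az + 180.0) % 360.0
    speed = 1.0 / math.hypot(sx, sy)            # apparent velocity, m/s
    return baz, speed
```

For example, a 340 m/s wave arriving from due east (backazimuth 90 degrees) produces earlier arrivals at eastern sensors, and the fit recovers both the direction and the speed.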
Algorithms for Lightweight Key Exchange
Santonja, Juan; Zamora, Antonio
2017-01-01
Public-key cryptography is too slow for general-purpose encryption, so most applications limit its use as much as possible. Some secure protocols, especially those that enable forward secrecy, make much heavier use of public-key cryptography, increasing the demand for lightweight cryptosystems that can be implemented in low-powered or mobile devices. These performance requirements are even more significant in critical infrastructure and emergency scenarios, where peer-to-peer networks are deployed for increased availability and resiliency. We benchmark several public-key key-exchange algorithms, determine those best suited to the requirements of critical infrastructure and emergency applications, propose a security framework based on these algorithms, and study its application to decentralized node or sensor networks. PMID:28654006
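A key-exchange benchmark of the kind reported here can be sketched with stdlib-only classic finite-field Diffie-Hellman. This is not the paper's benchmark suite: the random 2048-bit modulus merely sizes the arithmetic for timing purposes, and a real deployment must use a standardized safe prime (e.g. an RFC 3526 group) rather than this placeholder.

```python
# Minimal benchmark sketch: times classic finite-field Diffie-Hellman key
# agreement using Python's built-in modular exponentiation.  The modulus is
# a random 2048-bit odd number chosen only to size the arithmetic; shared-
# secret agreement holds for any modulus, but security does not.
import secrets
import time

BITS = 2048
N = secrets.randbits(BITS) | (1 << (BITS - 1)) | 1   # benchmark-sized modulus
G = 2

def dh_round_trip() -> bool:
    """One full exchange: two key pairs plus both shared-secret computations."""
    a = secrets.randbits(256)              # one party's ephemeral secret
    b = secrets.randbits(256)              # the other party's ephemeral secret
    A, B = pow(G, a, N), pow(G, b, N)      # exchanged public values
    return pow(B, a, N) == pow(A, b, N)    # both sides derive the same secret

def benchmark(rounds: int = 20) -> float:
    """Mean wall-clock seconds per complete exchange."""
    start = time.perf_counter()
    for _ in range(rounds):
        assert dh_round_trip()
    return (time.perf_counter() - start) / rounds
```

The same harness can time other primitives (e.g. elliptic-curve exchanges from a third-party library) for a like-for-like comparison.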
On securing wireless sensor network--novel authentication scheme against DOS attacks.
Raja, K Nirmal; Beno, M Marsaline
2014-10-01
Wireless sensor networks (WSNs) are generally deployed for collecting data from various environments. Several application-specific sensor network cryptography algorithms have been proposed in the research literature. However, WSNs have many constraints, including low computation capability, limited memory, limited energy resources, and vulnerability to physical capture, which impose unique security challenges and call for substantial improvements. This paper presents a novel security mechanism and algorithm for wireless sensor network security, together with an application of this algorithm. The proposed scheme provides strong authentication against Denial of Service (DoS) attacks. The scheme is simulated using Network Simulator 2 (NS2) and then analyzed in terms of network packet delivery ratio; the throughput was found to improve.
NASA Technical Reports Server (NTRS)
Lockett, Tiffany Russell; Martinez, Armando; Boyd, Darren; SanSoucie, Michael; Farmer, Brandon; Schneider, Todd; Laue, Greg; Fabisinski, Leo; Johnson, Les; Carr, John A.
2015-01-01
This paper describes recent advancements of the Lightweight Integrated Solar Array and Transceiver (LISA-T) currently being developed at NASA's Marshall Space Flight Center. The LISA-T array comprises a launch stowed, orbit deployed structure on which thin-film photovoltaic (PV) and antenna devices are embedded. The system provides significant electrical power generation at low weights, high stowage efficiency, and without the need for solar tracking. Leveraging high-volume terrestrial-market PVs also gives the potential for lower array costs. LISA-T is addressing the power starvation epidemic currently seen by many small-scale satellites while also enabling the application of deployable antenna arrays. Herein, an overview of the system and its applications are presented alongside sub-system development progress and environmental testing plans.
NASA Technical Reports Server (NTRS)
Russell, Tiffany; Martinez, Armando; Boyd, Darren; SanSoucie, Michael; Farmer, Brandon; Schneider, Todd; Fabisinski, Leo; Johnson, Les; Carr, John A.
2015-01-01
This paper describes recent advancements of the Lightweight Integrated Solar Array and Transceiver (LISA-T) currently being developed at NASA's Marshall Space Flight Center. The LISA-T array comprises a launch stowed, orbit deployed structure on which thin-film photovoltaic (PV) and antenna devices are embedded. The system provides significant electrical power generation at low weights, high stowage efficiency, and without the need for solar tracking. Leveraging high-volume terrestrial-market PVs also gives the potential for lower array costs. LISA-T is addressing the power starvation epidemic currently seen by many small-scale satellites while also enabling the application of deployable antenna arrays. Herein, an overview of the system and its applications are presented alongside sub-system development progress and environmental testing plans/initial results.
Exascale computing and what it means for shock physics
NASA Astrophysics Data System (ADS)
Germann, Timothy
2015-06-01
The U.S. Department of Energy is preparing to launch an Exascale Computing Initiative, to address the myriad challenges required to deploy and effectively utilize an exascale-class supercomputer (i.e., one capable of performing 10^18 operations per second) in the 2023 timeframe. Since physical (power dissipation) requirements limit clock rates to at most a few GHz, this will necessitate the coordination of on the order of a billion concurrent operations, requiring sophisticated system and application software, and underlying mathematical algorithms, that may differ radically from traditional approaches. Even at the smaller workstation or cluster level of computation, the massive concurrency and heterogeneity within each processor will impact computational scientists. Through the multi-institutional, multi-disciplinary Exascale Co-design Center for Materials in Extreme Environments (ExMatEx), we have initiated an early and deep collaboration between domain (computational materials) scientists, applied mathematicians, computer scientists, and hardware architects, in order to establish the relationships between algorithms, software stacks, and architectures needed to enable exascale-ready materials science application codes within the next decade. In my talk, I will discuss these challenges, and what it will mean for exascale-era electronic structure, molecular dynamics, and engineering-scale simulations of shock-compressed condensed matter. In particular, we anticipate that the emerging hierarchical, heterogeneous architectures can be exploited to achieve higher physical fidelity simulations using adaptive physics refinement. This work is supported by the U.S. Department of Energy, Office of Science, Office of Advanced Scientific Computing Research.
GROMACS 4.5: a high-throughput and highly parallel open source molecular simulation toolkit
Pronk, Sander; Páll, Szilárd; Schulz, Roland; Larsson, Per; Bjelkmar, Pär; Apostolov, Rossen; Shirts, Michael R.; Smith, Jeremy C.; Kasson, Peter M.; van der Spoel, David; Hess, Berk; Lindahl, Erik
2013-01-01
Motivation: Molecular simulation has historically been a low-throughput technique, but faster computers and increasing amounts of genomic and structural data are changing this by enabling large-scale automated simulation of, for instance, many conformers or mutants of biomolecules with or without a range of ligands. At the same time, advances in performance and scaling now make it possible to model complex biomolecular interaction and function in a manner directly testable by experiment. These applications share a need for fast and efficient software that can be deployed on massive scale in clusters, web servers, distributed computing or cloud resources. Results: Here, we present a range of new simulation algorithms and features developed during the past 4 years, leading up to the GROMACS 4.5 software package. The software now automatically handles wide classes of biomolecules, such as proteins, nucleic acids and lipids, and comes with all commonly used force fields for these molecules built-in. GROMACS supports several implicit solvent models, as well as new free-energy algorithms, and the software now uses multithreading for efficient parallelization even on low-end systems, including windows-based workstations. Together with hand-tuned assembly kernels and state-of-the-art parallelization, this provides extremely high performance and cost efficiency for high-throughput as well as massively parallel simulations. Availability: GROMACS is an open source and free software available from http://www.gromacs.org. Contact: erik.lindahl@scilifelab.se Supplementary information: Supplementary data are available at Bioinformatics online. PMID:23407358
Advanced detection, isolation and accommodation of sensor failures: Real-time evaluation
NASA Technical Reports Server (NTRS)
Merrill, Walter C.; Delaat, John C.; Bruton, William M.
1987-01-01
The objective of the Advanced Detection, Isolation, and Accommodation (ADIA) Program is to improve the overall demonstrated reliability of digital electronic control systems for turbine engines by using analytical redundancy to detect sensor failures. The results of a real-time hybrid computer evaluation of the ADIA algorithm are presented. Minimum detectable levels of sensor failures for an F100 engine control system are determined. Also included are details about the microprocessor implementation of the algorithm as well as a description of the algorithm itself.
ERIC Educational Resources Information Center
Wheeldon, R.; Atkinson, R.; Dawes, A.; Levinson, R.
2012-01-01
Background and purpose: Chemistry examinations can favour the deployment of algorithmic procedures like Le Chatelier's Principle (LCP) rather than reasoning using chemical principles. This study investigated the explanatory resources which high school students use to answer equilibrium problems and whether the marks given for examination answers…
Bradbury, Kyle; Saboo, Raghav; Johnson, Timothy L.; Malof, Jordan M.; Devarajan, Arjun; Zhang, Wuming; Collins, Leslie M.; Newell, Richard G.
2016-01-01
Earth-observing remote sensing data, including aerial photography and satellite imagery, offer a snapshot of the world from which we can learn about the state of natural resources and the built environment. The components of energy systems that are visible from above can be automatically assessed with these remote sensing data when processed with machine learning methods. Here, we focus on the information gap in distributed solar photovoltaic (PV) arrays, of which there is limited public data on solar PV deployments at small geographic scales. We created a dataset of solar PV arrays to initiate and develop the process of automatically identifying solar PV locations using remote sensing imagery. This dataset contains the geospatial coordinates and border vertices for over 19,000 solar panels across 601 high-resolution images from four cities in California. Dataset applications include training object detection and other machine learning algorithms that use remote sensing imagery, developing specific algorithms for predictive detection of distributed PV systems, estimating installed PV capacity, and analysis of the socioeconomic correlates of PV deployment. PMID:27922592
NASA Astrophysics Data System (ADS)
Bradbury, Kyle; Saboo, Raghav; Johnson, Timothy L.; Malof, Jordan M.; Devarajan, Arjun; Zhang, Wuming; Collins, Leslie M.; Newell, Richard G.
2016-12-01
Earth-observing remote sensing data, including aerial photography and satellite imagery, offer a snapshot of the world from which we can learn about the state of natural resources and the built environment. The components of energy systems that are visible from above can be automatically assessed with these remote sensing data when processed with machine learning methods. Here, we focus on the information gap in distributed solar photovoltaic (PV) arrays, of which there is limited public data on solar PV deployments at small geographic scales. We created a dataset of solar PV arrays to initiate and develop the process of automatically identifying solar PV locations using remote sensing imagery. This dataset contains the geospatial coordinates and border vertices for over 19,000 solar panels across 601 high-resolution images from four cities in California. Dataset applications include training object detection and other machine learning algorithms that use remote sensing imagery, developing specific algorithms for predictive detection of distributed PV systems, estimating installed PV capacity, and analysis of the socioeconomic correlates of PV deployment.
Bradbury, Kyle; Saboo, Raghav; Johnson, Timothy L; Malof, Jordan M; Devarajan, Arjun; Zhang, Wuming; Collins, Leslie M; Newell, Richard G
2016-12-06
Earth-observing remote sensing data, including aerial photography and satellite imagery, offer a snapshot of the world from which we can learn about the state of natural resources and the built environment. The components of energy systems that are visible from above can be automatically assessed with these remote sensing data when processed with machine learning methods. Here, we focus on the information gap in distributed solar photovoltaic (PV) arrays, of which there is limited public data on solar PV deployments at small geographic scales. We created a dataset of solar PV arrays to initiate and develop the process of automatically identifying solar PV locations using remote sensing imagery. This dataset contains the geospatial coordinates and border vertices for over 19,000 solar panels across 601 high-resolution images from four cities in California. Dataset applications include training object detection and other machine learning algorithms that use remote sensing imagery, developing specific algorithms for predictive detection of distributed PV systems, estimating installed PV capacity, and analysis of the socioeconomic correlates of PV deployment.
Advances in multi-sensor data fusion: algorithms and applications.
Dong, Jiang; Zhuang, Dafang; Huang, Yaohuan; Fu, Jingying
2009-01-01
With the development of satellite and remote sensing techniques, more and more image data from airborne/satellite sensors have become available. Multi-sensor image fusion seeks to combine information from different images to obtain more inferences than can be derived from a single sensor. In image-based application fields, image fusion has emerged as a promising research area since the end of the last century. The paper presents an overview of recent advances in multi-sensor satellite image fusion. Firstly, the most popular existing fusion algorithms are introduced, with emphasis on their recent improvements. Advances in the main application fields in remote sensing, including object identification, classification, change detection and maneuvering target tracking, are described. Both the advantages and limitations of those applications are then discussed. Recommendations are offered, including: (1) Improvement of fusion algorithms; (2) Development of "algorithm fusion" methods; (3) Establishment of an automatic quality assessment scheme.
Final Scientific Report - Wireless and Sensing Solutions Advancing Industrial Efficiency
DOE Office of Scientific and Technical Information (OSTI.GOV)
Budampati, Rama; McBrady, Adam; Nusseibeh, Fouad
2009-09-28
The project team's goal for the Wireless and Sensing Solutions Advancing Industrial Efficiency award (DE-FC36-04GO14002) was to develop, demonstrate, and test a number of leading-edge technologies that could enable the emergence of wireless sensor and sampling systems for the industrial market space. This effort combined initiatives in advanced sensor development, configurable sampling and deployment platforms, and robust wireless communications to address critical obstacles in enabling enhanced industrial efficiency.
A retrospective analysis of funding and focus in US advanced fission innovation
NASA Astrophysics Data System (ADS)
Abdulla, A.; Ford, M. J.; Morgan, M. G.; Victor, D. G.
2017-08-01
Deep decarbonization of the global energy system will require large investments in energy innovation and the deployment of new technologies. While many studies have focused on the expenditure that will be needed, here we focus on how government has spent public sector resources on innovation for a key carbon-free technology: advanced nuclear. We focus on nuclear power because it has been contributing almost 20% of total US electric generation, and because the US program in this area has historically been the world’s leading effort. Using extensive data acquired through the Freedom of Information Act, we reconstruct the budget history of the Department of Energy’s program to develop advanced, non-light water nuclear reactors. Our analysis shows that—despite spending $2 billion since the late 1990s—no advanced design is ready for deployment. Even if the program had been well designed, it still would have been insufficient to demonstrate even one non-light water technology. It has violated much of the wisdom about the effective execution of innovative programs: annual funding varies fourfold, priorities are ephemeral, incumbent technologies and fuels are prized over innovation, and infrastructure spending consumes half the budget. Absent substantial changes, the possibility of US-designed advanced reactors playing a role in decarbonization by mid-century is low.
NASA Astrophysics Data System (ADS)
Senkerik, Roman; Zelinka, Ivan; Davendra, Donald; Oplatkova, Zuzana
2010-06-01
This research deals with the optimization of the control of chaos by means of evolutionary algorithms. This work explains how to use evolutionary algorithms (EAs) and how to properly define the advanced targeting cost function (CF) securing very fast and precise stabilization of the desired state for any initial conditions. As a model of a deterministic chaotic system, the one-dimensional logistic equation was used. The evolutionary algorithm Self-Organizing Migrating Algorithm (SOMA) was used in four versions. For each version, repeated simulations were conducted to outline the effectiveness and robustness of the method and the targeting CF.
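The targeting-cost-function idea can be illustrated with a minimal sketch (the function names and the tail-error penalty form are assumptions for illustration, not the authors' exact CF): iterate the controlled logistic map under a candidate control sequence and penalize the distance from the desired state over the final iterations, which an EA such as SOMA would then minimize.

```python
def logistic_map(x, r=4.0, u=0.0):
    """One step of the controlled logistic map: x' = r*x*(1 - x) + u."""
    return r * x * (1.0 - x) + u

def targeting_cost(u_seq, x0, x_target, r=4.0, settle=20):
    """Illustrative targeting CF: simulate the map under control sequence
    u_seq and sum the distance to the target over the last `settle` steps
    (fast, precise stabilization drives this tail error to zero)."""
    x, traj = x0, []
    for u in u_seq:
        x = logistic_map(x, r, u)
        traj.append(x)
    return sum(abs(x_target - xi) for xi in traj[-settle:])

# x* = 0.75 is the nonzero fixed point of the uncontrolled map at r = 4;
# with zero control, a chaotic trajectory from x0 = 0.3 never settles there.
uncontrolled = targeting_cost([0.0] * 50, x0=0.3, x_target=0.75)
```

An evolutionary algorithm would search over `u_seq` (for arbitrary initial conditions) to minimize this cost; the paper's CF additionally rewards speed of stabilization.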
NASA Astrophysics Data System (ADS)
Schott, John R.; Brown, Scott D.; Raqueno, Rolando V.; Gross, Harry N.; Robinson, Gary
1999-01-01
The need for robust image data sets for algorithm development and testing has prompted the consideration of synthetic imagery as a supplement to real imagery. The unique ability of synthetic image generation (SIG) tools to supply per-pixel truth allows algorithm writers to test difficult scenarios that would require expensive collection and instrumentation efforts. In addition, SIG data products can supply the user with `actual' truth measurements of the entire image area that are not subject to measurement error thereby allowing the user to more accurately evaluate the performance of their algorithm. Advanced algorithms place a high demand on synthetic imagery to reproduce both the spectro-radiometric and spatial character observed in real imagery. This paper describes a synthetic image generation model that strives to include the radiometric processes that affect spectral image formation and capture. In particular, it addresses recent advances in SIG modeling that attempt to capture the spatial/spectral correlation inherent in real images. The model is capable of simultaneously generating imagery from a wide range of sensors allowing it to generate daylight, low-light-level and thermal image inputs for broadband, multi- and hyper-spectral exploitation algorithms.
Resource Optimization Scheme for Multimedia-Enabled Wireless Mesh Networks
Ali, Amjad; Ahmed, Muhammad Ejaz; Piran, Md. Jalil; Suh, Doug Young
2014-01-01
Wireless mesh networking is a promising technology that can support numerous multimedia applications. Multimedia applications have stringent quality of service (QoS) requirements, i.e., bandwidth, delay, jitter, and packet loss ratio. Enabling such QoS-demanding applications over wireless mesh networks (WMNs) requires QoS-provisioning routing protocols, which lead to the network resource underutilization problem. Moreover, random topology deployment leaves some network resources unused. Therefore, resource optimization is one of the most critical design issues in multi-hop, multi-radio WMNs enabled with multimedia applications. Resource optimization has been studied extensively in the literature for wireless ad hoc and sensor networks, but existing studies have not considered resource underutilization issues caused by QoS-provisioning routing and random topology deployment. Finding a QoS-provisioned path in wireless mesh networks is an NP-complete problem. In this paper, we propose a novel Integer Linear Programming (ILP) optimization model to reconstruct the optimal connected mesh backbone topology with a minimum number of links and relay nodes which satisfies the given end-to-end QoS demands for multimedia traffic and identification of extra resources, while maintaining redundancy. We further propose a polynomial-time heuristic algorithm called Link and Node Removal Considering Residual Capacity and Traffic Demands (LNR-RCTD). Simulation studies prove that our heuristic algorithm provides near-optimal results and saves about 20% of resources from being wasted by QoS-provisioning routing and random topology deployment. PMID:25111241
Resource optimization scheme for multimedia-enabled wireless mesh networks.
Ali, Amjad; Ahmed, Muhammad Ejaz; Piran, Md Jalil; Suh, Doug Young
2014-08-08
Wireless mesh networking is a promising technology that can support numerous multimedia applications. Multimedia applications have stringent quality of service (QoS) requirements, i.e., bandwidth, delay, jitter, and packet loss ratio. Enabling such QoS-demanding applications over wireless mesh networks (WMNs) requires QoS-provisioning routing protocols, which lead to the network resource underutilization problem. Moreover, random topology deployment leaves some network resources unused. Therefore, resource optimization is one of the most critical design issues in multi-hop, multi-radio WMNs enabled with multimedia applications. Resource optimization has been studied extensively in the literature for wireless ad hoc and sensor networks, but existing studies have not considered resource underutilization issues caused by QoS-provisioning routing and random topology deployment. Finding a QoS-provisioned path in wireless mesh networks is an NP-complete problem. In this paper, we propose a novel Integer Linear Programming (ILP) optimization model to reconstruct the optimal connected mesh backbone topology with a minimum number of links and relay nodes which satisfies the given end-to-end QoS demands for multimedia traffic and identification of extra resources, while maintaining redundancy. We further propose a polynomial-time heuristic algorithm called Link and Node Removal Considering Residual Capacity and Traffic Demands (LNR-RCTD). Simulation studies prove that our heuristic algorithm provides near-optimal results and saves about 20% of resources from being wasted by QoS-provisioning routing and random topology deployment.
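As a rough illustration of the link-removal idea (this is not the paper's LNR-RCTD algorithm; the connectivity-only feasibility check, the greedy removal order, and all names here are simplifying assumptions), one can greedily drop the links with the most spare capacity as long as the backbone stays connected:

```python
from collections import defaultdict, deque

def connected(nodes, edges):
    """BFS connectivity check over an undirected edge list."""
    if not nodes:
        return True
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    start = next(iter(nodes))
    seen, queue = {start}, deque([start])
    while queue:
        for w in adj[queue.popleft()]:
            if w not in seen:
                seen.add(w)
                queue.append(w)
    return seen == set(nodes)

def greedy_prune(nodes, edges, residual):
    """Remove links in decreasing order of residual (spare) capacity,
    keeping a link whenever removing it would disconnect the backbone."""
    kept = list(edges)
    for e in sorted(edges, key=lambda e: residual[e], reverse=True):
        trial = [x for x in kept if x != e]
        if connected(nodes, trial):
            kept = trial
    return kept

# Toy 4-node mesh: a ring plus one diagonal, with per-link spare capacity.
nodes = {1, 2, 3, 4}
edges = [(1, 2), (2, 3), (3, 4), (4, 1), (1, 3)]
residual = {(1, 2): 5, (2, 3): 1, (3, 4): 2, (4, 1): 4, (1, 3): 3}
backbone = greedy_prune(nodes, edges, residual)  # 3 links remain (a spanning tree)
```

The actual heuristic additionally checks end-to-end QoS demands and preserves redundancy rather than pruning down to a bare tree; this sketch only shows the greedy removal skeleton.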
Efficient greedy algorithms for economic manpower shift planning
NASA Astrophysics Data System (ADS)
Nearchou, A. C.; Giannikos, I. C.; Lagodimos, A. G.
2015-01-01
Consideration is given to the economic manpower shift planning (EMSP) problem, an NP-hard capacity planning problem appearing in various industrial settings including the packing stage of production in process industries and maintenance operations. EMSP aims to determine the manpower needed in each available workday shift of a given planning horizon so as to complete a set of independent jobs at minimum cost. Three greedy heuristics are presented for the EMSP solution. These practically constitute adaptations of an existing algorithm for a simplified version of EMSP which had shown excellent performance in terms of solution quality and speed. Experimentation shows that the new algorithms perform very well in comparison to the results obtained by both the CPLEX optimizer and an existing metaheuristic. Statistical analysis is deployed to rank the algorithms in terms of their solution quality and to identify the effects that critical planning factors may have on their relative efficiency.
Knowledge-based vision for space station object motion detection, recognition, and tracking
NASA Technical Reports Server (NTRS)
Symosek, P.; Panda, D.; Yalamanchili, S.; Wehner, W., III
1987-01-01
Computer vision, especially color image analysis and understanding, has much to offer in the area of the automation of Space Station tasks such as construction, satellite servicing, rendezvous and proximity operations, inspection, experiment monitoring, data management and training. Knowledge-based techniques improve the performance of vision algorithms for unstructured environments because of their ability to deal with imprecise a priori information or inaccurately estimated feature data and still produce useful results. Conventional techniques using statistical and purely model-based approaches lack flexibility in dealing with the variabilities anticipated in the unstructured viewing environment of space. Algorithms developed under NASA sponsorship for Space Station applications to demonstrate the value of a hypothesized architecture for a Video Image Processor (VIP) are presented. Approaches to the enhancement of the performance of these algorithms with knowledge-based techniques and the potential for deployment of highly-parallel multi-processor systems for these algorithms are discussed.
Development of a Deployable Nonmetallic Boom for Reconfigurable Systems of Small Spacecraft
NASA Technical Reports Server (NTRS)
Rehnmark, Fredrik; Pryor, Mark; Holmes, Buck; Schaechter, David; Pedreiro, Nelson; Carrington, Connie
2007-01-01
In 2005, NASA commenced Phase 1 of the Modular Reconfigurable High Energy Technology Demonstrator (MRHE) program to investigate reconfigurable systems of small spacecraft. During that year, Lockheed Martin's Advanced Technology Center (ATC) led an accelerated effort to develop a 1-g MRHE concept demonstration featuring robotic spacecraft simulators equipped with docking mechanisms and deployable booms. The deployable boom built for MRHE was the result of a joint effort in which ATK was primarily responsible for developing and fabricating the Collapsible Rollable Tube (CRT patent pending) boom while Lockheed Martin designed and built the motorized Boom Deployment Mechanism (BDM) under a concurrent but separate IR&D program. Tight coordination was necessary to meet testbed integration and functionality requirements. This paper provides an overview of the CRT boom and BDM designs and presents preliminary results of integration and testing to support the MRHE demonstration.
NASA Technical Reports Server (NTRS)
Lake, Mark S.; Peterson, Lee D.; Hachkowski, M. Roman; Hinkle, Jason D.; Hardaway, Lisa R.
1998-01-01
The present paper summarizes results from an ongoing research program conducted jointly by the University of Colorado and NASA Langley Research Center since 1994. This program has resulted in general guidelines for the design of high-precision deployment mechanisms, and tests of prototype deployable structures incorporating these mechanisms have shown microdynamically stable behavior (i.e., dimensional stability to parts per million). These advancements have resulted from the identification of numerous heretofore unknown microdynamic and micromechanical response phenomena, and the development of new test techniques and instrumentation systems to interrogate these phenomena. In addition, recent tests have begun to interrogate nanomechanical response of materials and joints and have been used to develop an understanding of nonlinear nanodynamic behavior in microdynamically stable structures. The ultimate goal of these efforts is to enable nano-precision active control of micro-precision deployable structures (i.e., active control to a resolution of parts per billion).
Survivability of intelligent transportation systems
DOT National Transportation Integrated Search
1999-10-01
Intelligent Transportation Systems (ITS) are being deployed around the world to improve the safety and efficiency of surface transportation through the application of advanced information technology. The introduction of ITS exposes the transportation...
Can 100Gb/s wavelengths be deployed using 10Gb/s engineering rules?
NASA Astrophysics Data System (ADS)
Saunders, Ross; Nicholl, Gary; Wollenweber, Kevin; Schmidt, Ted
2007-09-01
A key challenge set by carriers for 40Gb/s deployments was that the 40Gb/s wavelengths should be deployable over existing 10Gb/s DWDM systems, using 10Gb/s link engineering design rules. Typical 10Gb/s link engineering rules are: 1. Polarization Mode Dispersion (PMD) tolerance of 10ps (mean); 2. Chromatic Dispersion (CD) tolerance of +/-700ps/nm; 3. Operation at 50GHz channel spacing, including transit through multiple cascaded [R]OADMs; 4. Optical reach up to 2,000km. By using a combination of advanced modulation formats and adaptive dispersion compensation (technologies rarely seen at 10Gb/s outside of the submarine systems space), vendors did respond to the challenge and broadly met this requirement. As we now start to explore feasible technologies for 100Gb/s optical transport, driven by 100GE port availability on core IP routers, the carrier challenge remains the same: 100Gb/s links should be deployable over existing 10Gb/s DWDM systems using 10Gb/s link engineering rules (as listed above). To meet this challenge, optical transport technology must evolve to yet another level of complexity/maturity in both modulation formats and adaptive compensation techniques. Many clues as to how this might be achieved can be gained by first studying sister telecommunications industries, e.g. satellite (QPSK, QAM, LDPC FEC codes), wireless (advanced DSP, MSK), HDTV (TCM), etc. The optical industry is not a pioneer of new ideas in modulation schemes and coding theory; we will always be followers. However, we do have the responsibility of developing the highest capacity "modems" on the planet to carry the core backbone traffic of the Internet. As such, the key to our success will be to analyze the pros and cons of advanced modulation/coding techniques and balance this with the practical limitations of high-speed electronics processing speed and the challenges of real-world optical layer impairments.
This invited paper will present a view on what advanced technologies are likely candidates to support 100GE optical IP transport over existing 10Gb/s DWDM systems, using 10Gb/s link engineering rules.
Distributed Load Shedding over Directed Communication Networks with Time Delays
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Tao; Wu, Di
When generation is insufficient to support all loads under emergencies, effective and efficient load shedding needs to be deployed in order to maintain the supply-demand balance. This paper presents a distributed load shedding algorithm, which makes efficient decisions based on the discovered global information. In the global information discovery process, each load communicates only with its neighboring loads via directed communication links, possibly with arbitrarily large but bounded time-varying communication delays. We propose a novel distributed information discovery algorithm based on ratio consensus. Simulation results are used to validate the proposed method.
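A minimal sketch of the ratio-consensus building block (the delay handling of the paper is omitted, and the weight matrix and 3-node network are illustrative assumptions): each node iterates a numerator and a denominator with the same column-stochastic weights over a strongly connected directed graph, and the per-node ratio converges to the network-wide average on which a load-shedding decision can be based.

```python
import numpy as np

def ratio_consensus(x0, W, iters=200):
    """Ratio consensus with column-stochastic weights W on a strongly
    connected digraph: y_i / z_i converges to mean(x0) at every node."""
    y = np.array(x0, dtype=float)
    z = np.ones_like(y)
    for _ in range(iters):
        y = W @ y  # column-stochastic updates preserve sum(y) and sum(z)
        z = W @ z
    return y / z

# Directed 3-node ring: each node keeps half its value and sends half to
# its successor (every column of W sums to 1).
W = np.array([[0.5, 0.0, 0.5],
              [0.5, 0.5, 0.0],
              [0.0, 0.5, 0.5]])
shed = ratio_consensus([30.0, 60.0, 90.0], W)  # every node learns the mean, 60
```

The paper's contribution is making this discovery step robust to bounded time-varying delays on the directed links (e.g. via augmented delayed states), which this undelayed sketch does not attempt.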
A framework for porting the NeuroBayes machine learning algorithm to FPGAs
NASA Astrophysics Data System (ADS)
Baehr, S.; Sander, O.; Heck, M.; Feindt, M.; Becker, J.
2016-01-01
The NeuroBayes machine learning algorithm is deployed for online data reduction at the pixel detector of Belle II. In order to test, characterize and easily adapt its implementation on FPGAs, a framework was developed. Within the framework an HDL model, written in Python using MyHDL, is used for fast exploration of possible configurations. Using input data from physics simulations, figures of merit such as throughput, accuracy and resource demand of the implementation are evaluated in a fast and flexible way. Functional validation is supported by unit tests and HDL simulation of chosen configurations.
Advanced Civilian Aeronautical Concepts
NASA Technical Reports Server (NTRS)
Bushnell, Dennis M.
1996-01-01
Paper discusses alternatives to currently deployed systems which could provide revolutionary improvements in metrics applicable to civilian aeronautics. Specific missions addressed include subsonic transports, supersonic transports and personal aircraft. These alternative systems and concepts are enabled by recent and envisaged advancements in electronics, communications, computing and Designer Fluid Mechanics in conjunction with a design approach employing extensive synergistic interactions between propulsion, aerodynamics and structures.
ERIC Educational Resources Information Center
Mukolwe, Joseph O.; Okwara, Michael; Ajowi, O. Jack
2016-01-01
Worldwide, women representation in management and leadership positions is marginal. Despite immense academic advancement by women, few of them do advance to management positions. In Kenya, women make up a critical portion of human resource base. However, they are grossly underrepresented at leadership positions. This situation is reflected in…
NASA Astrophysics Data System (ADS)
Gendreau, Audrey
Efficient self-organizing virtual clusterheads that supervise data collection based on their wireless connectivity, risk, and overhead costs are an important element of Wireless Sensor Networks (WSNs). This function is especially critical during deployment when system resources are allocated to a subsequent application. In the presented research, a model from the literature, used to deploy intrusion detection capability on a Local Area Network (LAN), was extended to develop a role-based hierarchical agent deployment algorithm for a WSN. The resulting model took into consideration the monitoring capability, risk, deployment distribution cost, and monitoring cost associated with each node. Changing the original LAN methodology approach to model a cluster-based sensor network depended on the ability to duplicate a specific parameter that represented the monitoring capability. Furthermore, other parameters derived from a LAN can elevate the costs and risk of deployment, as well as jeopardize the success of an application on a WSN. A key component of the approach presented in this research was to reduce the costs when established clusterheads in the network were found to be capable of hosting additional detection agents. In addition, another cost-savings component of the study addressed the reduction of vulnerabilities associated with deployment of agents to high-volume nodes. The effectiveness of the presented method was validated by comparing it against a power-based scheme that used each node's remaining energy as the deployment value. While available energy is directly related to the model used in the presented method, the study deliberately sought out nodes identified as having superior monitoring capability, costing less to create and sustain, and at low risk of attack.
This work investigated improving the efficiency of an intrusion detection system (IDS) by using the proposed model to deploy monitoring agents after a temperature sensing application had established the network traffic flow to the sink. The same scenario was repeated using a power-based IDS to compare it against the proposed model. To identify a clusterhead's ability to host monitoring agents after the temperature sensing application terminated, the deployed IDS utilized the communication history and other network factors in order to rank the nodes. Similarly, using the node's communication history, the deployed power-based IDS ranked nodes based on their remaining power. For each individual scenario, and after the IDS application was deployed, the temperature sensing application was run for a second time. This time, to monitor the temperature sensing agents as the data flowed towards the sink, the network traffic was rerouted through the new intrusion detection clusterheads. Consequently, if the clusterheads were shared, the re-routing step was not performed. Experimental results in this research demonstrated the effectiveness of applying a robust deployment metric to improve upon the energy efficiency of a deployed application in a multi-application WSN. It was found that the scenarios in which the intrusion detection application utilized the proposed model resulted in more remaining energy than the scenarios that implemented the power-based IDS. The algorithm especially had a positive impact on the small, dense, and more homogeneous networks. This finding was reinforced by the smaller percentage of new clusterheads that was selected. Essentially, the energy cost of the route to the sink was reduced because the network traffic was rerouted through fewer new clusterheads. Additionally, it was found that the intrusion detection topology that used the proposed approach formed smaller and more connected sets of clusterheads than the power-based IDS.
As a consequence, this proposed approach essentially achieved the research objective for enhancing energy use in a multi-application WSN.
Road to Grid Parity through Deployment of Low-Cost 21.5% N-Type Si Solar Cells
DOE Office of Scientific and Technical Information (OSTI.GOV)
Velundur, Vijay
This project seeks to develop and deploy differentiated 21.5% efficient n-type Si solar cells while reaching the SunShot module cost goal of ≤ $0.50/W. This objective hinges on development of enabling low-cost technologies that simplify the manufacturing process and reduce overall processing costs. These comprise (1) Boron emitter formation and passivation; (2) Simplified processing for the emitter and BSF layers; and (3) Advanced metallization for the front and back contacts.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hansen, Timothy M.; Kadavil, Rahul; Palmintier, Bryan
The 21st century electric power grid is transforming, with an unprecedented increase in demand and an influx of new technologies. In the United States Energy Independence and Security Act of 2007, Title XIII sets the tenets for modernizing the electricity grid through what is known as the 'Smart Grid Initiative.' This initiative calls for increased design, deployment, and integration of distributed energy resources, smart technologies and appliances, and advanced storage devices. The deployment of these new technologies requires rethinking and re-engineering the traditional boundaries between different electric power system domains.
Wind deployment in the United States: states, resources, policy, and discourse.
Wilson, Elizabeth J; Stephens, Jennie C
2009-12-15
A transformation in the way the United States produces and uses energy is needed to achieve greenhouse gas reduction targets for climate change mitigation. Wind power is an important low-carbon technology and the most rapidly growing renewable energy technology in the U.S. Despite recent advances in wind deployment, significant state-by-state variation in wind power distribution cannot be explained solely by wind resource patterns nor by state policy. Other factors embedded within the state-level socio-political context also contribute to wind deployment patterns. We explore this socio-political context in four U.S. states by integrating multiple research methods. Through comparative state-level analysis of the energy system, energy policy, and public discourse as represented in the media, we examine variation in the context for wind deployment in Massachusetts, Minnesota, Montana, and Texas. Our results demonstrate that these states have different patterns of wind deployment, are engaged in different debates about wind power, and appear to frame the risks and benefits of wind power in different ways. This comparative assessment highlights the complex variation of the state-level socio-political context and contributes depth to our understanding of energy technology deployment processes, decision-making, and outcomes.
ERIC Educational Resources Information Center
Pliszka, Steven R.; Crismon, M. Lynn; Hughes, Carroll W.; Corners, C. Keith; Emslie, Graham J.; Jensen, Peter S.; McCracken, James T.; Swanson, James M.; Lopez, Molly
2006-01-01
Objective: In 1998, the Texas Department of Mental Health and Mental Retardation developed algorithms for medication treatment of attention-deficit/hyperactivity disorder (ADHD). Advances in the psychopharmacology of ADHD and results of a feasibility study of algorithm use in community mental health centers caused the algorithm to be modified and…
Signal and image processing algorithm performance in a virtual and elastic computing environment
NASA Astrophysics Data System (ADS)
Bennett, Kelly W.; Robertson, James
2013-05-01
The U.S. Army Research Laboratory (ARL) supports the development of classification, detection, tracking, and localization algorithms using multiple sensing modalities including acoustic, seismic, E-field, magnetic field, PIR, and visual and IR imaging. Multimodal sensors collect large amounts of data in support of algorithm development. The resulting large amount of data, and its associated high-performance computing needs, increases demands on and challenges existing computing infrastructures. Purchasing computer power as a commodity using a Cloud service offers low-cost, pay-as-you-go pricing models, scalability, and elasticity that may provide solutions for developing and optimizing algorithms without having to procure additional hardware and resources. This paper provides a detailed look at using a commercial cloud service provider, such as Amazon Web Services (AWS), to develop and deploy simple signal and image processing algorithms in a cloud and run the algorithms on a large set of data archived in the ARL Multimodal Signatures Database (MMSDB). Analytical results will provide performance comparisons with existing infrastructure. A discussion on using cloud computing with government data will address the best security practices that exist within cloud services such as AWS.
A Comparison of Three Algorithms for Orion Drogue Parachute Release
NASA Technical Reports Server (NTRS)
Matz, Daniel A.; Braun, Robert D.
2015-01-01
The Orion Multi-Purpose Crew Vehicle is susceptible to flipping apex forward between drogue parachute release and main parachute inflation. A smart drogue release algorithm is required to select a drogue release condition that will not result in an apex-forward main parachute deployment. The baseline algorithm is simple and elegant, but does not perform as well as desired in drogue failure cases. A simple modification to the baseline algorithm can improve performance, but can also sometimes fail to identify a good release condition. A new algorithm employing simplified rotational dynamics and a numeric predictor to minimize a rotational energy metric is proposed. A Monte Carlo analysis of a drogue failure scenario is used to compare the performance of the algorithms. The numeric predictor prevents more of the cases from flipping apex forward, and also results in an improvement in the capsule attitude at main bag extraction. The sensitivity of the numeric predictor to aerodynamic dispersions, errors in the navigated state, and execution rate is investigated, showing little degradation in performance.
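The numeric-predictor approach can be sketched as follows. The single-axis damped-oscillator dynamics, the quadratic energy metric, and the function names below are illustrative assumptions standing in for the capsule's actual rotational dynamics, not the Orion algorithm itself.

```python
def rotational_energy(theta, omega, inertia=1.0, k=1.0):
    """Quadratic metric: rotational kinetic energy plus an attitude penalty.
    (Toy weights; not the flight metric.)"""
    return 0.5 * inertia * omega**2 + 0.5 * k * theta**2

def best_release_time(theta0, omega0, candidates, dt=0.05, damping=0.3, stiffness=1.0):
    """Numerically propagate a toy pitch model to each candidate drogue release
    time and pick the one that minimizes the rotational energy metric."""
    best_t, best_e = None, float("inf")
    for t_release in sorted(candidates):
        theta, omega, t = theta0, omega0, 0.0
        while t < t_release:              # semi-implicit Euler propagation
            omega += (-damping * omega - stiffness * theta) * dt
            theta += omega * dt
            t += dt
        e = rotational_energy(theta, omega)
        if e < best_e:
            best_t, best_e = t_release, e
    return best_t, best_e
```

Because the drogues damp the motion, later candidates generally score lower in this toy model; the flight problem adds the aerodynamic dispersions and navigation errors that the paper evaluates via Monte Carlo.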
NASA Technical Reports Server (NTRS)
Russell, B. Don
1989-01-01
This research concentrated on the application of advanced signal processing, expert system, and digital technologies for the detection and control of low grade, incipient faults on spaceborne power systems. The researchers have considerable experience in the application of advanced digital technologies and the protection of terrestrial power systems. This experience was used in the current contracts to develop new approaches for protecting the electrical distribution system in spaceborne applications. The project was divided into three distinct areas: (1) investigate the applicability of fault detection algorithms developed for terrestrial power systems to the detection of faults in spaceborne systems; (2) investigate the digital hardware and architectures required to monitor and control spaceborne power systems with full capability to implement new detection and diagnostic algorithms; and (3) develop a real-time expert operating system for implementing diagnostic and protection algorithms. Significant progress has been made in each of the above areas. Several terrestrial fault detection algorithms were modified to better adapt to spaceborne power system environments. Several digital architectures were developed and evaluated in light of the fault detection algorithms.
Bonnet, Stéphane; Gonzalez, F; Mathieu, L; Boddaert, G; Hornez, E; Bertani, A; Avaro, J-P; Durand, X; Rongieras, F; Balandraud, P; Rigal, S; Pons, F
2016-10-01
The composition of a French Forward Surgical Team (FST) has remained constant since its creation in the early 1950s: 12 personnel, including a general and an orthopaedic surgeon. The training of military surgeons, however, has had to evolve to adapt to the growing complexities of modern warfare injuries in the context of increasing subspecialisation within surgery. The Advanced Course for Deployment Surgery (ACDS)-called Cours Avancé de Chirurgie en Mission Extérieure (CACHIRMEX)-has been designed to extend, reinforce and adapt the surgical skill set of the FST that will be deployed. Created in 2007 by the French Military Health Service Academy (Ecole du Val-de-Grâce), this annual course is composed of five modules. The surgical knowledge and skills necessary to manage complex military trauma and give medical support to populations during deployment are provided through a combination of didactic lectures, deployment experience reports and hands-on workshops. The course is now a compulsory component of initial surgical training for junior military surgeons and part of the Continuous Medical Education programme for senior military surgeons. From 2012, the standardised content of the ACDS paved the way for the development of two more team-training courses: the FST and the Special Operation Surgical Team training. The content of this original French military war surgery course is described, emphasising its practical implications and future prospects. Military surgical training needs to be regularly assessed to deliver the best quality of care in the context of evolving modern warfare casualties. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
Advanced Physiological Estimation of Cognitive Status (APECS)
2009-09-15
Advanced Physiological Estimation of Cognitive Status (APECS) Final Report. …fitness and transmit data to command and control systems. Some of the signals that the physiological sensors measure are readily interpreted, such as… …electroencephalogram (EEG) and other signals requires a complex series of mathematical transformations or algorithms. Overall, research on algorithms…
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2012-05-01
Biannual newsletter for the U.S. Department of Energy's Clean Cities initiative. The newsletter includes feature stories on advanced vehicle deployment, idle reduction, and articles on Clean Cities coalition successes across the country.
Boosting Manufacturing through Modular Chemical Process Intensification
None
2018-06-12
Manufacturing USA's Rapid Advancement in Process Intensification Deployment Institute will focus on developing breakthrough technologies to boost domestic energy productivity and energy efficiency by 20 percent in five years through manufacturing processes.
Boosting Manufacturing through Modular Chemical Process Intensification
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2016-12-09
Manufacturing USA's Rapid Advancement in Process Intensification Deployment Institute will focus on developing breakthrough technologies to boost domestic energy productivity and energy efficiency by 20 percent in five years through manufacturing processes.
Recent Results from the MicroMAS Global Environmental MonitoringNanosatellite Mission
NASA Astrophysics Data System (ADS)
Blackwell, W. J.; Cahoy, K.
2014-12-01
The Micro-sized Microwave Atmospheric Satellite (MicroMAS) is a dual-spinning 3U CubeSat equipped with a passive microwave radiometer that observes in nine channels near the 118.75-GHz oxygen absorption line. MicroMAS is designed to observe convective thunderstorms, tropical cyclones, and hurricanes from a mid-inclination orbit. The MicroMAS flight unit was developed by MIT Lincoln Laboratory and the MIT Space Systems Laboratory and was launched to the International Space Station on July 13, 2014, and scheduled for an early September deployment for a ~90-day mission. The payload is housed in the "lower" 1U of the dual-spinning 3U CubeSat and mechanically rotated approximately once per second as the spacecraft orbits the Earth, resulting in a cross-track scanned beam with a full-width half-max (FWHM) beamwidth of 2.4 degrees and an approximately 17-km diameter footprint at nadir incidence from a nominal altitude of 400 km. The relatively low cost of MicroMAS enables the deployment of a constellation of sensors, spaced equally around several orbit planes. A small fleet of MicroMAS systems could yield high-resolution global temperature and water vapor profiles, as well as cloud microphysical and precipitation parameters.
Significant advancements were made in the Assembly, Integration, and Test phase of the project development lifecycle. The flight software and communications architecture was refined and tested in relevant lab facilities. The power subsystem was modified to include additional required inhibits for the ISS launch. Hardware-in-the-loop tests as well as simulations of the attitude determination and control system (ADCS) were performed to validate the unique dual-spinning, local vertical, local horizontal (LVLH) stabilized flight design. ADCS algorithms were tested on a 3-axis air bearing and custom rig inside a 3-axis programmable Helmholtz cage.
Finally, the integrated spacecraft underwent a series of environmental tests in order to verify the results of thermal modeling analyses, prove the performance of critical design components in relevant environmental conditions, and validate the software and concept-of-operations developed for flight. We present these advancements, lessons learned in developing a science-oriented CubeSat system, and any available launch/on-orbit updates.
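As a consistency check, the quoted ~17-km nadir footprint follows directly from the 2.4-degree FWHM beamwidth and the 400-km nominal altitude under a flat-Earth approximation (the function name here is ours):

```python
import math

def nadir_footprint_km(altitude_km, fwhm_deg):
    """Beam footprint diameter at nadir from simple flat-Earth geometry."""
    return 2.0 * altitude_km * math.tan(math.radians(fwhm_deg) / 2.0)

print(round(nadir_footprint_km(400.0, 2.4), 1))  # 16.8, i.e. the ~17 km quoted
```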
Cross-platform validation and analysis environment for particle physics
NASA Astrophysics Data System (ADS)
Chekanov, S. V.; Pogrebnyak, I.; Wilbern, D.
2017-11-01
A multi-platform validation and analysis framework for public Monte Carlo simulation for high-energy particle collisions is discussed. The front-end of this framework uses the Python programming language, while the back-end is written in Java, which provides a multi-platform environment that can be run from a web browser and can easily be deployed at the grid sites. The analysis package includes all major software tools used in high-energy physics, such as Lorentz vectors, jet algorithms, histogram packages, graphic canvases, and tools for providing data access. This multi-platform software suite, designed to minimize OS-specific maintenance and deployment time, is used for online validation of Monte Carlo event samples through a web interface.
End-to-end commissioning demonstration of the James Webb Space Telescope
NASA Astrophysics Data System (ADS)
Acton, D. Scott; Towell, Timothy; Schwenker, John; Shields, Duncan; Sabatke, Erin; Contos, Adam R.; Hansen, Karl; Shi, Fang; Dean, Bruce; Smith, Scott
2007-09-01
The one-meter Testbed Telescope (TBT) has been developed at Ball Aerospace to facilitate the design and implementation of the wavefront sensing and control (WFSC) capabilities of the James Webb Space Telescope (JWST). We have recently conducted an "end-to-end" demonstration of the flight commissioning process on the TBT. This demonstration started with the Primary Mirror (PM) segments and the Secondary Mirror (SM) in random positions, traceable to the worst-case flight deployment conditions. The commissioning process detected and corrected the deployment errors, resulting in diffraction-limited performance across the entire science FOV. This paper will describe the commissioning demonstration and the WFSC algorithms used at each step in the process.
A roadmap for nuclear energy technology
NASA Astrophysics Data System (ADS)
Sofu, Tanju
2018-01-01
The prospects for the future use of nuclear energy worldwide can best be understood within the context of global population growth, urbanization, rising energy need and associated pollution concerns. As the world continues to urbanize, sustainable development challenges are expected to be concentrated in cities of the lower-middle-income countries where the pace of urbanization is fastest. As these countries continue their trajectory of economic development, their energy need will also outpace their population growth adding to the increased demand for electricity. OECD IEA's energy system deployment pathway foresees doubling of the current global nuclear capacity by 2050 to reduce the impact of rapid urbanization. The pending "retirement cliff" of the existing U.S. nuclear fleet, representing over 60 percent of the nation's emission-free electricity, also poses a large economic and environmental challenge. To meet the challenge, the U.S. DOE has developed the vision and strategy for development and deployment of advanced reactors. As part of that vision, the U.S. government pursues programs that aim to expand the use of nuclear power by supporting sustainability of the existing nuclear fleet, deploying new water-cooled large and small modular reactors to enable nuclear energy to help meet the energy security and climate change goals, conducting R&D for advanced reactor technologies with alternative coolants, and developing sustainable nuclear fuel cycle strategies. Since the current path relying heavily on water-cooled reactors and "once-through" fuel cycle is not sustainable, next generation nuclear energy systems under consideration aim for significant advances over existing and evolutionary water-cooled reactors. Among the spectrum of advanced reactor options, closed-fuel-cycle systems using reactors with fast-neutron spectrum to meet the sustainability goals offer the most attractive alternatives. 
However, unless the new public-private partnership models emerge to tackle the licensing and demonstration challenges for these advanced reactor concepts, realization of their enormous potential is not likely, at least in the U.S.
Final Technical Report: Commercial Advanced Lighting Control (ALC) Demonstration and Deployment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arnold, Gabe
This three-year demonstration and deployment project sought to address market barriers to accelerating the adoption of Advanced Lighting Controls (ALCs), an underutilized technology with low market penetration. ALCs are defined as networked, addressable lighting control systems that utilize software or intelligent controllers to combine multiple energy-saving lighting control strategies in a single space (e.g., smart-time scheduling, daylight harvesting, task tuning, occupancy control, personal control, variable load-shedding, and plug-load control). The networked intelligent aspect of these systems allows applicable lighting control strategies to be combined in a single space, layered over one another, maximizing overall energy-savings. The project included five real building demonstrations of ALCs across the Northeast US region. The demonstrations provided valuable data and experience to support deployment tasks that are necessary to overcome market barriers. These deployment tasks included development of training resources for building designers, installers, and trades, as well as development of new energy efficiency rebates for the technology from Efficiency Forward’s utility partners. Educating designers, installers, and trades on ALCs is a critical task for reducing the cost of the technology that is currently inflated due to perceived complexity and unfamiliarity with how to design and install the systems. Further, utility and non-utility energy efficiency programs continue to relegate the technology to custom or ill-suited prescriptive program designs that do not effectively deploy the technology at scale. This project developed new, scalable rebate approaches for the technology. Efficiency Forward utilized their DesignLights Consortium® (DLC) brand and network of 81 DLC member utilities to develop and deploy the results of the project.
The outputs of the project have included five published case studies, a six-hour ALC technology training curriculum that has already been deployed in five US states, and new rebates offered for the technology that have been deployed by a dozen utilities across the US. Widespread adoption of ALC technology in commercial buildings would provide tremendous benefits. The current market penetration of ALC systems is estimated at <0.1% in commercial buildings. If ALC systems were installed in all commercial buildings, approximately 1,051 TBtu of energy could be saved. This would translate into customer cost savings of approximately $10.7 billion annually.
Stitzel, Joel D; Weaver, Ashley A; Talton, Jennifer W; Barnard, Ryan T; Schoell, Samantha L; Doud, Andrea N; Martin, R Shayn; Meredith, J Wayne
2016-06-01
Advanced Automatic Crash Notification algorithms use vehicle telemetry measurements to predict risk of serious motor vehicle crash injury. The objective of the study was to develop an Advanced Automatic Crash Notification algorithm to reduce response time, increase triage efficiency, and improve patient outcomes by minimizing undertriage (<5%) and overtriage (<50%), as recommended by the American College of Surgeons. A list of injuries associated with a patient's need for Level I/II trauma center treatment known as the Target Injury List was determined using an approach based on 3 facets of injury: severity, time sensitivity, and predictability. Multivariable logistic regression was used to predict an occupant's risk of sustaining an injury on the Target Injury List based on crash severity and restraint factors for occupants in the National Automotive Sampling System - Crashworthiness Data System 2000-2011. The Advanced Automatic Crash Notification algorithm was optimized and evaluated to minimize triage rates, per American College of Surgeons recommendations. The following rates were achieved: <50% overtriage and <5% undertriage in side impacts and 6% to 16% undertriage in other crash modes. Nationwide implementation of our algorithm is estimated to improve triage decisions for 44% of undertriaged and 38% of overtriaged occupants. Annually, this translates to more appropriate care for >2,700 seriously injured occupants and reduces unnecessary use of trauma center resources for >162,000 minimally injured occupants. The algorithm could be incorporated into vehicles to inform emergency personnel of recommended motor vehicle crash triage decisions. Lower under- and overtriage was achieved, and nationwide implementation of the algorithm would yield improved triage decision making for an estimated 165,000 occupants annually. Copyright © 2016. Published by Elsevier Inc.
76 FR 46892 - Agency Information Collection Activity Under OMB Review
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-03
... development and deployment of clean fuel and advanced propulsion technologies for transit buses. To meet... propulsion technologies for transit buses by providing funds for clean fuel vehicles and facilities. To meet...
Clean Cities Now Vol. 17, No. 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2013-05-24
Biannual newsletter for the U.S. Department of Energy's Clean Cities initiative. The newsletter includes feature stories on advanced vehicle deployment, idle reduction, and articles on Clean Cities coalition successes across the country.
RF Technologies for Advancing Space Communication Infrastructure
NASA Technical Reports Server (NTRS)
Romanofsky, Robert R.; Bibyk, Irene K.; Wintucky, Edwin G.
2006-01-01
This paper will address key technologies under development at the NASA Glenn Research Center designed to provide architecture-level impacts. Specifically, we will describe deployable antennas, a new type of phased array antenna, and novel power amplifiers. The evaluation of architectural influence can be conducted from two perspectives: the architecture can be analyzed from the top down, to determine the areas where technology improvements will be most beneficial, or from the bottom up, where each technology's performance advancement can affect the overall architecture's performance. This paper will take the latter approach, with focus on some technology improvement challenges, and address architecture impacts. For example, using data rate as a performance metric, future exploration scenarios are expected to demand data rates possibly exceeding 1 Gbps. To support these advancements in a Mars scenario, as an example, Ka-band and antenna aperture sizes on the order of 10 meters will be required from Mars areostationary platforms. Key technical challenges for a large deployable antenna include maximizing the ratio of deployed-to-packaged volume, minimizing areal density, maintaining RMS surface accuracy to within 1/20 of a wavelength or better, and developing reflector rigidization techniques. Moreover, the high frequencies and large apertures manifest a problem for microwave engineers that is familiar to optical communications specialists: pointing. The fine beam widths and long ranges dictate the need for electronic or mechanical feed articulation to compensate for spacecraft attitude control limitations.
Campos, Andre N.; Souza, Efren L.; Nakamura, Fabiola G.; Nakamura, Eduardo F.; Rodrigues, Joel J. P. C.
2012-01-01
Target tracking is an important application of wireless sensor networks. The network's ability to locate and track an object is directly linked to the nodes' ability to locate themselves. Consequently, localization systems are essential for target tracking applications. In addition, sensor networks are often deployed in remote or hostile environments. Therefore, density control algorithms are used to increase network lifetime while maintaining its sensing capabilities. In this work, we analyze the impact of localization algorithms (RPE and DPE) and density control algorithms (GAF, A3 and OGDC) on target tracking applications. We adapt the density control algorithms to address the k-coverage problem. In addition, we analyze the impact of network density, residual integration with density control, and k-coverage on both target tracking accuracy and network lifetime. Our results show that DPE is a better choice for target tracking applications than RPE. Moreover, among the evaluated density control algorithms, OGDC is the best option. Although the choice of density control algorithm has little impact on tracking precision, OGDC outperforms GAF and A3 in terms of tracking time. PMID:22969329
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lechman, Jeremy B.; Battaile, Corbett Chandler.; Bolintineanu, Dan
This report summarizes a project in which the authors sought to develop and deploy: (i) experimental techniques to elucidate the complex, multiscale nature of thermal transport in particle-based materials; and (ii) modeling approaches to address current challenges in predicting performance variability of materials (e.g., identifying and characterizing physical-chemical processes and their couplings across multiple length and time scales, modeling information transfer between scales, and statically and dynamically resolving material structure and its evolution during manufacturing and device performance). Experimentally, several capabilities were successfully advanced. As discussed in Chapter 2, a flash diffusivity capability for measuring homogeneous thermal conductivity of pyrotechnic powders (and beyond) was advanced, leading to enhanced characterization of pyrotechnic materials and properties impacting component development. Chapter 4 describes success for the first time, although preliminary, in resolving thermal fields at speeds and spatial scales relevant to energetic components. Chapter 7 summarizes the first ever (as far as the authors know) application of TDTR to actual pyrotechnic materials. This is the first attempt to actually characterize these materials at the interfacial scale. On the modeling side, new capabilities in image processing of experimental microstructures and direct numerical simulation on complicated structures were advanced (see Chapters 3 and 5). In addition, modeling work described in Chapter 8 led to improved prediction of interface thermal conductance from first principles calculations.
Toward the second point, for a model system of packed particles, significant headway was made in implementing numerical algorithms and collecting data to justify the approach in terms of highlighting the phenomena at play and pointing the way forward in developing and informing the kind of modeling approach originally envisioned (see Chapter 6). In both cases much more remains to be accomplished.
Dynamic analysis of the large deployable reflector
NASA Technical Reports Server (NTRS)
Calleson, Robert E.; Scott, A. Don
1987-01-01
The Large Deployable Reflector (LDR) is to be an astronomical observatory orbiting above Earth's obscuring atmosphere and operating in the spectral range between 30 microns and 1000 microns wavelength. The LDR will be used to study such astronomical phenomena as stellar and galactic formation, cosmology, and planetary atmospheres. The LDR will be the first observatory to be erected and assembled in space. This distinction brings with it several major technological challenges such as the development of ultra-lightweight deployable mirrors, advanced mirror fabrication techniques, advanced structures, and control of vibrations due to various sources of excitation. The purpose of this analysis is to provide an assessment of the vibrational response due to secondary mirror chopping and LDR slewing. The dynamic response of two 20-m LDR configurations was studied. Two mirror support configurations were investigated for the Ames concept, the first employs a six-strut secondary mirror support structure, while the second uses a triple-bipod support design. All three configurations were modeled using a tetrahedral truss design for the primary mirror support structure. Response resulting from secondary mirror chopping was obtained for the two Ames configurations, and the response of the primary mirror from slewing was obtained for all three configurations.
Fusion And Inference From Multiple And Massive Disparate Distributed Dynamic Data Sets
2017-07-01
…principled methodology for two-sample graph testing; designed a provably almost-surely perfect vertex clustering algorithm for block model graphs; proved… …dimensional Euclidean space – allows the full arsenal of statistical and machine learning methodology for multivariate Euclidean data to be deployed for…
Golden Rays - March 2017 | Solar Research | NREL
, test and deploy a data enhanced hierarchical control architecture that adopts a hybrid approach to grid control. A centralized control layer will be complemented by distributed control algorithms for solar inverters and autonomous control of grid edge devices. The other NREL project will develop a novel control
2004-01-01
…items are too long or bulky to be stored in the limited number of containers, trailers, or flatbeds that are used to deploy most stocks. • Demand pattern…
Accessing eSDO Solar Image Processing and Visualization through AstroGrid
NASA Astrophysics Data System (ADS)
Auden, E.; Dalla, S.
2008-08-01
The eSDO project is funded by the UK's Science and Technology Facilities Council (STFC) to integrate Solar Dynamics Observatory (SDO) data, algorithms, and visualization tools with the UK's Virtual Observatory project, AstroGrid. In preparation for the SDO launch in January 2009, the eSDO team has developed nine algorithms covering coronal behaviour, feature recognition, and global / local helioseismology. Each of these algorithms has been deployed as an AstroGrid Common Execution Architecture (CEA) application so that they can be included in complex VO workflows. In addition, the PLASTIC-enabled eSDO "Streaming Tool" online movie application allows users to search multi-instrument solar archives through AstroGrid web services and visualise the image data through galleries, an interactive movie viewing applet, and QuickTime movies generated on-the-fly.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jamieson, Kevin; Davis, IV, Warren L.
Active learning methods automatically adapt data collection by selecting the most informative samples in order to accelerate machine learning. Because of this, real-world testing and comparing of active learning algorithms requires collecting new datasets (adaptively), rather than simply applying algorithms to benchmark datasets, as is the norm in (passive) machine learning research. To facilitate the development, testing and deployment of active learning for real applications, we have built an open-source software system for large-scale active learning research and experimentation. The system, called NEXT, provides a unique platform for real-world, reproducible active learning research. This paper details the challenges of building the system and demonstrates its capabilities with several experiments. The results show how experimentation can help expose strengths and weaknesses of active learning algorithms, in sometimes unexpected and enlightening ways.
Algorithm to find distant repeats in a single protein sequence
Banerjee, Nirjhar; Sarani, Rangarajan; Ranjani, Chellamuthu Vasuki; Sowmiya, Govindaraj; Michael, Daliah; Balakrishnan, Narayanasamy; Sekar, Kanagaraj
2008-01-01
Distant repeats in a protein sequence play an important role in various aspects of protein analysis. A careful analysis of distant repeats makes it possible to establish a firm relation between the repeats and their function and three-dimensional structure during the evolutionary process, and further sheds light on the diversity of duplication during evolution. To this end, an algorithm has been developed to find all distant repeats in a protein sequence. Scores from the Point Accepted Mutation (PAM) matrix are used to identify amino acid substitutions while detecting the distant repeats. Owing to the biological importance of distant repeats, the proposed algorithm will be of value to structural biologists, molecular biologists, biochemists and researchers involved in phylogenetic and evolutionary studies. PMID:19052663
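The core idea, scoring pairs of sequence windows with a substitution matrix rather than requiring exact identity, can be sketched as below. The identity-style matrix and default threshold are toy stand-ins for the PAM scores used by the published algorithm, and all names are illustrative.

```python
def window_score(a, b, sub):
    """Score two equal-length windows with a substitution matrix."""
    return sum(sub.get((x, y), sub.get((y, x), -1)) for x, y in zip(a, b))

def distant_repeats(seq, k=4, threshold=None, sub=None):
    """Report (i, j, score) for non-overlapping k-length window pairs whose
    substitution-matrix score meets a threshold.

    `sub` defaults to a toy matrix (match=+2, mismatch=-1); the published
    algorithm scores substitutions with a PAM matrix instead.
    """
    if sub is None:
        sub = {(c, c): 2 for c in set(seq)}
    if threshold is None:
        threshold = 2 * k - 2              # tolerate roughly one substitution
    hits = []
    for i in range(len(seq) - k + 1):
        for j in range(i + k, len(seq) - k + 1):   # non-overlapping pairs only
            s = window_score(seq[i:i+k], seq[j:j+k], sub)
            if s >= threshold:
                hits.append((i, j, s))
    return hits
```

For example, `distant_repeats("MKVLAXYZMKVLA", k=5)` recovers the repeated "MKVLA" at positions 0 and 8; swapping in real PAM scores would additionally admit conservative substitutions.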
Experimental and simulated control of lift using trailing edge devices
NASA Astrophysics Data System (ADS)
Cooperman, A.; Blaylock, M.; van Dam, C. P.
2014-12-01
Two active aerodynamic load control (AALC) devices coupled with a control algorithm are shown to decrease the change in lift force experienced by an airfoil during a change in freestream velocity. Microtabs are small (1% chord) surfaces deployed perpendicular to an airfoil, while microjets are pneumatic jets with flow perpendicular to the surface of the airfoil near the trailing edge. Both devices are capable of producing a rapid change in an airfoil's lift coefficient. A control algorithm for microtabs has been tested in a wind tunnel using a modified S819 airfoil, and a microjet control algorithm has been simulated for a NACA 0012 airfoil using OVERFLOW. In both cases, the AALC devices have shown the ability to mitigate the changes in lift during a gust.
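The load-mitigation goal can be illustrated with a minimal feedforward sketch: since lift is proportional to V²·Cl, holding lift constant as freestream velocity changes requires a compensating change in lift coefficient, clipped to the device's control authority. The function, the authority value, and the purely feedforward form are assumptions for illustration, not the control algorithm tested in the paper.

```python
def tab_delta_cl(cl0, v0, v, cl_authority=0.15):
    """Feedforward Cl command to hold lift constant as freestream V varies.

    L = 0.5*rho*V^2*S*Cl, so constant lift requires Cl*V^2 = Cl0*V0^2.
    The command is clipped to an assumed device authority of +/-0.15.
    """
    target = cl0 * (v0 / v) ** 2 - cl0     # required change in Cl
    return max(-cl_authority, min(cl_authority, target))
```

A 10% gust in velocity calls for a small negative Cl increment (a deployed microtab or active microjet), while a large gust saturates the device, which is why AALC devices mitigate rather than eliminate lift transients.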
Non-traditional Infrasound Deployment
NASA Astrophysics Data System (ADS)
McKenna, M. H.; McComas, S.; Simpson, C. P.; Diaz-Alvarez, H.; Costley, R. D.; Hayward, C.; Golden, P.; Endress, A.
2017-12-01
Historically, infrasound arrays have been deployed in rural environments where anthropogenic noise sources are limited. As interest in monitoring low-energy sources at local distances grows in the infrasound community, it will be vital to understand how to monitor infrasound sources in an urban environment. Arrays deployed in urban centers have to overcome a decreased signal-to-noise ratio and a reduced amount of real estate available to deploy an array. To advance the understanding of monitoring infrasound sources in urban environments, local and regional infrasound arrays were deployed on building rooftops on the campus of Southern Methodist University (SMU), and data were collected for one seasonal cycle. The data were evaluated for structural source signals (continuous-wave packets), and when a signal was identified, the back azimuth to the source was determined through frequency-wavenumber analysis. This information was used to identify hypothesized structural sources; these sources were verified through direct measurement and dynamic structural analysis modeling. In addition to the rooftop arrays, a camouflaged infrasound sensor was installed on the SMU campus and evaluated to determine its effectiveness for wind noise reduction. Permission to publish was granted by Director, Geotechnical and Structures Laboratory.
Proposed data compression schemes for the Galileo S-band contingency mission
NASA Technical Reports Server (NTRS)
Cheung, Kar-Ming; Tong, Kevin
1993-01-01
The Galileo spacecraft is currently on its way to Jupiter and its moons. In April 1991, the high gain antenna (HGA) failed to deploy as commanded. In case the current efforts to deploy the HGA fail, communications during the Jupiter encounters will be through one of two low gain antennas (LGA) on an S-band (2.3 GHz) carrier. Considerable effort has been, and will continue to be, devoted to attempts to open the HGA, and various options for improving Galileo's telemetry downlink performance are being evaluated in the event that the HGA does not open by Jupiter arrival. Among the viable options, the most promising and powerful is to perform image and non-image data compression in software onboard the spacecraft. This involves in-flight reprogramming of the existing flight software of Galileo's Command and Data Subsystem processors and Attitude and Articulation Control System (AACS) processor, which have very limited computational and memory resources. In this article we describe the proposed data compression algorithms and give their respective compression performance. The planned image compression algorithm is a 4 x 4 or an 8 x 8 multiplication-free integer cosine transform (ICT) scheme, which can be viewed as an integer approximation of the popular discrete cosine transform (DCT). The implementation complexity of the ICT schemes is much lower than that of DCT-based schemes, yet the performance of the two algorithms is indistinguishable. The proposed non-image compression algorithm is a Lempel-Ziv-Welch (LZW) variant, a lossless universal compression algorithm based on a dynamic dictionary lookup table. We developed a simple and efficient hashing function to perform the string search.
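The article names an LZW variant for non-image data but the flight implementation (including its hashing scheme) is not shown here. The core dictionary-based idea, in which codes for the longest known prefix are emitted and the dictionary grows as data streams through, can be sketched as:

```python
def lzw_compress(data: str):
    """Lossless LZW: emit the dictionary index of the longest known
    prefix, then add prefix + next-character to the dictionary."""
    table = {chr(i): i for i in range(256)}
    w, out = "", []
    for c in data:
        wc = w + c
        if wc in table:
            w = wc
        else:
            out.append(table[w])
            table[wc] = len(table)
            w = c
    if w:
        out.append(table[w])
    return out

def lzw_decompress(codes):
    """Rebuild the same dictionary on the fly; the k == len(table) case
    handles the classic cScSc pattern where a code is used before it is
    formally entered."""
    table = {i: chr(i) for i in range(256)}
    w = table[codes[0]]
    out = [w]
    for k in codes[1:]:
        entry = table[k] if k in table else w + w[0]
        out.append(entry)
        table[len(table)] = w + entry[0]
        w = entry
    return "".join(out)
```

On repetitive input the code stream is shorter than the input, which is the property that mattered for the bandwidth-starved S-band link.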
Bao, Xu; Li, Haijian; Xu, Dongwei; Jia, Limin; Ran, Bin; Rong, Jian
2016-11-06
The jam flow condition is one of the main traffic states in traffic flow theory and the most difficult state for sectional traffic information acquisition. Since traffic information acquisition is the basis for intelligent transportation system applications, research on vehicle counting methods for jam flow conditions is worthwhile. A low-cost, energy-efficient, multi-function wireless traffic magnetic sensor was designed and developed; among its advantages, it is suitable for large-scale deployment and sustained detection for traffic information acquisition. Based on the traffic magnetic sensor, a basic vehicle detection algorithm (DWVDA) with low computational complexity was introduced for vehicle counting in low traffic volume conditions. To improve detection performance in jam flow conditions with a "tailgating effect" between front and rear vehicles, an improved vehicle detection algorithm (SA-DWVDA) was proposed and applied in field traffic environments. By deploying traffic magnetic sensor nodes in field traffic scenarios, two field experiments were conducted to test and verify the DWVDA and SA-DWVDA algorithms. The experimental results show that both algorithms yield satisfactory performance in low traffic volume conditions (scenario I), with mean absolute percent errors below 1%. For jam flow conditions with heavy traffic volumes (scenario II), however, the SA-DWVDA achieved better results: its mean absolute percent error was 2.54%, versus 7.07% for the DWVDA. The results indicate that the proposed SA-DWVDA enables efficient and accurate vehicle detection in jam flow conditions and can be employed in field traffic environments. PMID:27827974
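The abstract does not spell out the DWVDA internals. As a generic illustration only, a two-threshold (hysteresis) counter over magnetic-signature samples shows the kind of detection involved, and also why closely tailgating vehicles can merge into a single event, the weakness the SA-DWVDA targets. All thresholds here are hypothetical:

```python
def count_vehicles(samples, on_thresh=30, off_thresh=15, min_len=2):
    """Count vehicles in a stream of magnetic-field magnitudes with a
    two-threshold state machine: an event starts when the signal rises
    above on_thresh and ends when it falls below off_thresh; events
    shorter than min_len samples are discarded as noise."""
    count, run, active = 0, 0, False
    for s in samples:
        if not active:
            if s >= on_thresh:
                active, run = True, 1
        elif s >= off_thresh:
            run += 1
        else:
            if run >= min_len:
                count += 1
            active, run = False, 0
    if active and run >= min_len:  # event still open at end of stream
        count += 1
    return count

# Two well-separated vehicles are counted correctly:
print(count_vehicles([0, 0, 40, 45, 38, 5, 0, 50, 60, 20, 18, 4, 0]))  # → 2
# A tailgating pair whose signatures never drop below off_thresh merges:
print(count_vehicles([0, 40, 42, 20, 35, 40, 5, 0]))  # → 1
```

Resolving the second case requires looking at the signature shape between the peaks, which is the kind of refinement an improved algorithm must supply.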
Effects of Deployment Investment on the Growth of the Biofuels Industry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vimmerstedt, Laura J.; Bush, Brian W.
2013-12-01
In support of the national goals for biofuel use in the United States, numerous technologies have been developed that convert biomass to biofuels. Some of these biomass to biofuel conversion technology pathways are operating at commercial scales, while others are in earlier stages of development. The advancement of a new pathway toward commercialization involves various types of progress, including yield improvements, process engineering, and financial performance. Actions of private investors and public programs can accelerate the demonstration and deployment of new conversion technology pathways. These investors (both private and public) will pursue a range of pilot, demonstration, and pioneer scale biorefinery investments; the most cost-effective set of investments for advancing the maturity of any given biomass to biofuel conversion technology pathway is unknown. In some cases, whether or not the pathway itself will ultimately be technically and financially successful is also unknown. This report presents results from the Biomass Scenario Model -- a system dynamics model of the biomass to biofuels system -- that estimate effects of investments in biorefineries at different maturity levels and operational scales. The report discusses challenges in estimating effects of such investments and explores the interaction between this deployment investment and a volumetric production incentive. Model results show that investments in demonstration and deployment have a substantial positive effect on the development of the biofuels industry. Results also show that other conditions, such as supportive policies, have major impacts on the effectiveness of such investments.
Biomass enables the transition to a carbon-negative power system across western North America
NASA Astrophysics Data System (ADS)
Sanchez, Daniel L.; Nelson, James H.; Johnston, Josiah; Mileva, Ana; Kammen, Daniel M.
2015-03-01
Sustainable biomass can play a transformative role in the transition to a decarbonized economy, with potential applications in electricity, heat, chemicals and transportation fuels. Deploying bioenergy with carbon capture and sequestration (BECCS) results in a net reduction in atmospheric carbon. BECCS may be one of the few cost-effective carbon-negative opportunities available should anthropogenic climate change be worse than anticipated or emissions reductions in other sectors prove particularly difficult. Previous work, primarily using integrated assessment models, has identified the critical role of BECCS in long-term (pre- or post-2100 time frames) climate change mitigation, but has not investigated the role of BECCS in power systems in detail, or in aggressive time frames, even though commercial-scale facilities are starting to be deployed in the transportation sector. Here, we explore the economic and deployment implications for BECCS in the electricity system of western North America under aggressive (pre-2050) time frames and carbon emissions limitations, with rich technology representation and physical constraints. We show that BECCS, combined with aggressive renewable deployment and fossil-fuel emission reductions, can enable a carbon-negative power system in western North America by 2050 with up to 145% emissions reduction from 1990 levels. In most scenarios, the offsets produced by BECCS are found to be more valuable to the power system than the electricity it provides. Advanced biomass power generation employs similar system design to advanced coal technology, enabling a transition strategy to low-carbon energy.
An intelligent surveillance platform for large metropolitan areas with dense sensor deployment.
Fernández, Jorge; Calavia, Lorena; Baladrón, Carlos; Aguiar, Javier M; Carro, Belén; Sánchez-Esguevillas, Antonio; Alonso-López, Jesus A; Smilansky, Zeev
2013-06-07
This paper presents an intelligent surveillance platform based on the usage of large numbers of inexpensive sensors designed and developed inside the European Eureka Celtic project HuSIMS. With the aim of maximizing the number of deployable units while keeping monetary and resource/bandwidth costs at a minimum, the surveillance platform is based on inexpensive visual sensors which apply efficient motion detection and tracking algorithms to transform the video signal into a set of motion parameters. In order to automate the analysis of the myriad of data streams generated by the visual sensors, the platform's control center includes an alarm detection engine which comprises three components applying three different Artificial Intelligence strategies in parallel. These strategies are generic, domain-independent approaches which are able to operate in several domains (traffic surveillance, vandalism prevention, perimeter security, etc.). The architecture is completed with a versatile communication network which facilitates data collection from the visual sensors and alarm and video stream distribution toward the emergency teams. The resulting surveillance system is extremely suitable for deployment in metropolitan areas, smart cities, and large facilities, mainly because cheap visual sensors and autonomous alarm detection facilitate dense sensor network deployments for wide and detailed coverage.
DOE Office of Scientific and Technical Information (OSTI.GOV)
2018-01-23
Deploying an ADMS or looking to optimize its value? NREL offers a low-cost, low-risk evaluation platform for assessing ADMS performance. The National Renewable Energy Laboratory (NREL) has developed a vendor-neutral advanced distribution management system (ADMS) evaluation platform and is expanding its capabilities. The platform uses actual grid-scale hardware, large-scale distribution system models, and advanced visualization to simulate real-world conditions for the most accurate ADMS evaluation and experimentation.
ERIC Educational Resources Information Center
Laszewski, Audrey; Wichman, Christina L.; Doering, Jennifer J.; Maletta, Kristyn; Hammel, Jennifer
2016-01-01
Early childhood professionals do many things to support young families. This is true now more than ever, as researchers continue to discover the long-term benefits of early, healthy, nurturing relationships. This article provides an overview of the development of an advanced practice perinatal depression algorithm created as a step-by-step guide…
Estimating traffic volumes for signalized intersections using connected vehicle data
Zheng, Jianfeng; Liu, Henry X.
2017-04-17
Recently, connected vehicle (CV) technology has received significant attention thanks to active pilot deployments supported by the US Department of Transportation (USDOT). At signalized intersections, CVs may serve as mobile sensors, providing opportunities to reduce dependence on conventional vehicle detectors for signal operation. However, most existing studies focus on scenarios in which the penetration rate of CVs reaches a certain level, e.g., 25%, which may not be feasible in the near future. How to utilize data from a small number of CVs to improve traffic signal operation remains an open question. In this work, we develop an approach to estimate traffic volume, a key input to many signal optimization algorithms, using GPS trajectory data from CV or navigation devices under low market penetration rates. To estimate traffic volumes, we model vehicle arrivals at signalized intersections as a time-dependent Poisson process, which can account for signal coordination. The estimation problem is formulated as a maximum likelihood problem given multiple observed trajectories from CVs approaching the intersection. An expectation maximization (EM) procedure is derived to solve the estimation problem. Two case studies were conducted to validate our estimation algorithm. One uses the CV data from the Safety Pilot Model Deployment (SPMD) project, in which around 2800 CVs were deployed in the City of Ann Arbor, MI. The other uses vehicle trajectory data from users of a commercial navigation service in China. The mean absolute percentage error (MAPE) of the estimation is found to be 9-12%, based on benchmark data collected manually and data from loop detectors. Finally, considering the existing scale of CV deployments, the proposed approach could be of significant help to traffic management agencies for evaluating and operating traffic signals, paving the way for using CVs for detector-free signal operation in the future.
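The paper's time-dependent EM formulation is not reproduced in the abstract. Under the much stronger simplifying assumption of a known, constant penetration rate p, Poisson thinning makes the maximum-likelihood volume estimate simply the observed CV count divided by p; the sketch below illustrates that reduced case with simulated data (all numbers are illustrative, not the SPMD figures):

```python
import random

def estimate_volume(cv_counts, penetration, interval_s=3600):
    """Estimate total hourly traffic volume from hourly CV counts.
    If total arrivals are Poisson(lambda * T) and each vehicle is a CV
    independently with probability p (thinning), the MLE of lambda is
    observed_count / (p * T). Returns vehicles per hour."""
    total_cv = sum(cv_counts)
    total_time = interval_s * len(cv_counts)
    lam = total_cv / (penetration * total_time)  # vehicles per second
    return lam * 3600

# Simulated ground truth: 600 veh/h, 5% penetration, 24 one-hour counts.
random.seed(1)
counts = [sum(random.random() < 0.05 for _ in range(600)) for _ in range(24)]
est = estimate_volume(counts, penetration=0.05)
print(round(est))  # close to the true 600 veh/h
```

The EM machinery in the paper is needed precisely because, unlike here, the penetration rate and the time-varying arrival profile are not known in advance.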
Evaluation of the towplow for Caltrans operations.
DOT National Transportation Integrated Search
2015-09-30
Caltrans requested that the Advanced Highway Maintenance and Construction Technology Research Center (AHMCT) configure, procure, and deploy two Viking-Cives TowPlow systems and conduct an extensive evaluation to determine the most ben...
Clean Cities Now: Vol. 17, No. 1, Spring 2013 (Brochure)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sutor, J.; Tucker, E.; Thomas, J.
2013-05-01
Biannual newsletter for the U.S. Department of Energy's Clean Cities initiative. The newsletter includes feature stories on advanced vehicle deployment, idle reduction, and articles on Clean Cities coalition successes across the country.
Clean Cities Now: Vol. 16, No. 1, May 2012 (Brochure)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2012-05-01
Biannual newsletter for the U.S. Department of Energy's Clean Cities initiative. The newsletter includes feature stories on advanced vehicle deployment, idle reduction, and articles on Clean Cities coalition successes across the country.
Indianapolis Area ITS Early Deployment Plan Final Report
DOT National Transportation Integrated Search
1996-07-01
Public-private partnership, traffic signal control, regional multimodal travel information, advanced rural transportation systems (ARTS): this document lays out a 20-year schedule for the implementation of ITS in the Indianapolis area. The repor...
Integrated Modeling for Road Condition Prediction (IMRCP)
DOT National Transportation Integrated Search
2018-01-17
Intelligent transportation system deployments have enabled great advances in operational awareness and response based on the data they gather on the current state of the roadways. Operators have better access to traffic and weather condition informat...
Kentucky commercial vehicle safety applications evaluation : technical report.
DOT National Transportation Integrated Search
2008-01-31
An advanced-technology Integrated Safety and Security Enforcement System (ISSES), now deployed at three commercial vehicle inspection sites along interstate highways in Kentucky, was evaluated from the point of view of system performance, potential e...
A Distributive, Non-Destructive, Real-Time Approach to Snowpack Monitoring
NASA Technical Reports Server (NTRS)
Frolik, Jeff; Skalka, Christian
2012-01-01
This invention is designed to ascertain the snow water equivalence (SWE) of snowpacks with better spatial and temporal resolution than present techniques. The approach is ground-based, as opposed to some techniques that are air-based. In addition, the approach is compact and non-destructive, and can be communicated with remotely, and thus can be deployed in areas not possible with current methods. Presently there are two principal ground-based techniques for obtaining SWE measurements. The first is manual snow core measurement of the snowpack. This approach is labor-intensive, destructive, and has poor temporal resolution. The second approach is to deploy a large (e.g., 3 x 3 m) snow pillow, which requires significant infrastructure, is potentially hazardous [it uses an approximately 200-gallon (760 L) antifreeze-filled bladder], and requires deployment in a large, flat area. High deployment costs necessitate few installations, thus yielding poor spatial resolution of data. Both approaches have limited usefulness in complex and/or avalanche-prone terrains. This approach is compact, non-destructive to the snowpack, provides high temporal resolution data, and, due to its potential low cost, can be deployed with high spatial resolution. The invention consists of three primary components: a robust wireless network and computing platform designed for harsh climates, new SWE sensing strategies, and algorithms for smart sampling, data logging, and SWE computation.
Development of Bonded Joint Technology for a Rigidizable-Inflatable Deployable Truss
NASA Technical Reports Server (NTRS)
Smeltzer, Stanley S., III
2006-01-01
Microwave and Synthetic Aperture Radar antenna systems have been developed as instrument systems using truss structures as their primary support and deployment mechanism for over a decade. NASA Langley Research Center has been investigating fabrication, modular assembly, and deployment methods for lightweight rigidizable/inflatable linear truss structures during that time for large spacecraft systems. The primary goal of the research at Langley Research Center is to advance these existing state-of-the-art joining and deployment concepts to achieve prototype system performance in a relevant space environment. During 2005, the development, fabrication, and testing of a 6.7 meter multi-bay, deployable linear truss was conducted at Langley Research Center to demonstrate functional and precision metrics of a rigidizable/inflatable truss structure. The present paper summarizes aspects of the bonded joint technology developed for the 6.7 meter deployable linear truss structure while providing a brief overview of the entire truss fabrication, assembly, and deployment methodology. The basic joint design, surface preparation investigations, and experimental testing of component joint test articles are described. Specifically, two room-temperature adhesives were investigated to obtain qualitative data from tube folding tests and quantitative data from tensile shear strength tests. It was determined from the testing that a polyurethane-based adhesive best met the rigidizable/inflatable truss project requirements.
Software for Simulating Air Traffic
NASA Technical Reports Server (NTRS)
Sridhar, Banavar; Bilimoria, Karl; Grabbe, Shon; Chatterji, Gano; Sheth, Kapil; Mulfinger, Daniel
2006-01-01
Future Air Traffic Management Concepts Evaluation Tool (FACET) is a system of software for performing computational simulations to evaluate advanced air-traffic-management concepts. FACET includes a program that generates a graphical user interface plus programs and databases that implement computational models of weather, airspace, airports, navigation aids, aircraft performance, and aircraft trajectories. Examples of concepts studied with FACET include aircraft self-separation for free flight; prediction of air-traffic-controller workload; decision support for direct routing; integration of spacecraft-launch operations into the U.S. national airspace system; and traffic-flow management using rerouting, metering, and ground delays. Aircraft can be modeled as flying along either flight-plan routes or great-circle routes as they climb, cruise, and descend according to their individual performance models. The FACET software is modular and is written in the Java and C programming languages. The architecture of FACET strikes a balance between flexibility and fidelity; as a consequence, FACET can be used to model systemwide airspace operations over the contiguous U.S., involving as many as 10,000 aircraft, all on a single desktop or laptop computer running any of a variety of operating systems. Two notable applications of FACET are: (1) reroute conformance monitoring algorithms that have been implemented in one of the Federal Aviation Administration's nationally deployed, real-time, operational systems; and (2) the licensing and integration of FACET with the commercially available Flight Explorer, an Internet-based, real-time flight-tracking system.
NASA Astrophysics Data System (ADS)
Wu, H.; Zhou, L.; Xu, T.; Fang, W. L.; He, W. G.; Liu, H. M.
2017-11-01
In order to improve voltage violations caused by the grid connection of photovoltaic (PV) systems in a distribution network, a bi-level programming model is proposed for battery energy storage system (BESS) deployment. The objective of the inner-level program is to minimize voltage violation, with the power of the PV and BESS as the variables. The objective of the outer-level program is to minimize a comprehensive function derived from the inner-level program and the BESS operating parameters, with the capacity and rated power of the BESS as the variables. The differential evolution (DE) algorithm is applied to solve the model. Based on distribution network operation scenarios with photovoltaic generation under multiple alternative output modes, simulation results on the IEEE 33-bus system show that the proposed BESS deployment strategy adapts well to voltage violation regulation in variable distribution network operation scenarios. It contributes to regulating voltage violations in the distribution network and to improving the utilization of PV systems.
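The abstract names differential evolution as the solver but gives no algorithmic detail. A minimal DE/rand/1/bin loop on a toy objective shows the mechanics; the population size, F, CR, and the sphere test function are arbitrary choices for illustration, not the paper's setup:

```python
import random

def differential_evolution(f, bounds, pop_size=20, F=0.8, CR=0.9,
                           generations=200, seed=0):
    """Minimal DE/rand/1/bin sketch: mutate each member with a scaled
    difference of two others, apply binomial crossover, keep the trial
    only if it is at least as good (greedy selection)."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    fit = [f(x) for x in pop]
    for _ in range(generations):
        for i in range(pop_size):
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            jr = rng.randrange(dim)  # forced-crossover index
            trial = []
            for j in range(dim):
                if rng.random() < CR or j == jr:
                    v = pop[a][j] + F * (pop[b][j] - pop[c][j])
                    lo, hi = bounds[j]
                    v = min(max(v, lo), hi)  # clip to box constraints
                else:
                    v = pop[i][j]
                trial.append(v)
            ft = f(trial)
            if ft <= fit[i]:
                pop[i], fit[i] = trial, ft
    best = min(range(pop_size), key=fit.__getitem__)
    return pop[best], fit[best]

# Toy run: minimize the 3-D sphere function over [-5, 5]^3.
x, fx = differential_evolution(lambda v: sum(t * t for t in v), [(-5, 5)] * 3)
```

In the paper's setting the decision vector would hold BESS siting, capacity, and rated power, and f would be the outer-level comprehensive function evaluated through the inner-level power-flow problem.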
Riddle, Mark S.; Savarino, Stephen J.; Sanders, John W.
2015-01-01
Infectious diarrhea has been among the most common maladies of military deployments throughout time. The U.S. military experienced a significant burden from this disease in the Middle Eastern and North African campaigns of World War II (WWII). This article compares patterns of disease experienced in WWII with the recent military deployments to the same region for Operation Iraqi Freedom and Operation Enduring Freedom (OIF/OEF). Remarkable similarities in prevalence and risk factors were noted, which belie the assumed improvements in prevention against these infections. In both campaigns, peaks of diarrhea occurred shortly after the arrival of new personnel, were seasonally associated, and were linked to initial lapses in field sanitation and hygiene. It is important to reassess current strategies, especially in light of emerging evidence of the chronic sequelae of these common infections, to include a reemphasis on or reexamination of vaccine development, rapid field diagnostics, treatment algorithms, and antimicrobial prophylaxis. PMID:26350450
Directly data processing algorithm for multi-wavelength pyrometer (MWP).
Xing, Jian; Peng, Bo; Ma, Zhao; Guo, Xin; Dai, Li; Gu, Weihong; Song, Wenlong
2017-11-27
Data processing for a multi-wavelength pyrometer (MWP) is a difficult problem because of unknown emissivity. The solutions developed so far generally assume particular mathematical relations for emissivity versus wavelength or emissivity versus temperature. Owing to the deviation between these hypotheses and the actual situation, the inversion results can be seriously affected. A direct data processing algorithm for the MWP that does not need an assumed spectral emissivity model in advance is therefore the main aim of this study. Two new data processing algorithms for the MWP, a Gradient Projection (GP) algorithm and an Internal Penalty Function (IPF) algorithm, neither of which requires fixing an emissivity model in advance, are proposed. The core idea is that the MWP data processing problem is transformed into a constrained optimization problem, which can then be solved by the GP or IPF algorithms. Comparison of simulation results for some typical spectral emissivity models shows that the IPF algorithm is superior to the GP algorithm in both accuracy and efficiency. Rocket nozzle temperature experiments likewise show that true temperature inversion results from the IPF algorithm agree well with the theoretical design temperature. The proposed combination of the IPF algorithm with the MWP is thus expected to provide a direct data processing algorithm that clears the unknown-emissivity obstacle for MWP.
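The IPF method itself is not specified beyond its name in the abstract. The general interior-penalty idea it alludes to, replacing a constrained problem with a sequence of unconstrained ones by adding r/g(x) for a shrinking weight r, can be sketched on a one-dimensional toy problem; the solver and all constants below are illustrative, not the authors':

```python
def golden_section(f, a, b, iters=80):
    """1-D minimization of a unimodal function on [a, b]."""
    g = (5 ** 0.5 - 1) / 2
    for _ in range(iters):
        c, d = b - g * (b - a), a + g * (b - a)
        if f(c) < f(d):
            b = d
        else:
            a = c
    return (a + b) / 2

def interior_penalty(f, g_ineq, lo, hi, r=1.0, shrink=0.2, rounds=12):
    """Minimize f(x) subject to g_ineq(x) > 0 by solving a sequence of
    unconstrained problems f(x) + r / g_ineq(x): the barrier term blows
    up at the constraint boundary, and shrinking r lets the iterates
    approach the constrained optimum from inside the feasible region."""
    x = (lo + hi) / 2
    for _ in range(rounds):
        def pen(x):
            return f(x) + r / g_ineq(x) if g_ineq(x) > 0 else float("inf")
        x = golden_section(pen, lo, hi)
        r *= shrink
    return x

# Toy problem: minimize x^2 subject to x > 1; constrained optimum is x = 1.
x_star = interior_penalty(lambda x: x * x, lambda x: x - 1.0, 1.001, 5.0)
```

In the MWP setting the unknowns would be the temperature and per-wavelength emissivities, with the physical bounds 0 < emissivity <= 1 supplying the inequality constraints.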
High-Resolution Monitoring of Himalayan Glacier Dynamics Using Unmanned Aerial Vehicles
NASA Astrophysics Data System (ADS)
Immerzeel, W.; Kraaijenbrink, P. D. A.; Shea, J.; Shrestha, A. B.; Pellicciotti, F.; Bierkens, M. F.; de Jong, S. M.
2014-12-01
Himalayan glacier tongues are commonly debris-covered and play an important role in modulating the glacier response to climate. However, they remain relatively unstudied because of the inaccessibility of the terrain and the difficulties in field work caused by the thick debris mantles. Observations of debris-covered glaciers are therefore limited to point locations, and airborne remote sensing may bridge the gap between scarce point field observations and coarse-resolution space-borne remote sensing. In this study we deploy an Unmanned Aerial Vehicle (UAV) on two debris-covered glaciers in the Nepalese Himalayas, the Lirung and Langtang glaciers, during four field campaigns in 2013 and 2014. Based on stereo imaging and the structure-from-motion algorithm, we derive highly detailed ortho-mosaics and digital elevation models (DEMs), which we geometrically correct using differential GPS observations collected in the field. Based on DEM differencing and manual feature tracking, we derive the mass loss and the surface velocity of the glacier at high spatial resolution and accuracy. We also assess spatiotemporal changes in supra-glacial lakes and ice cliffs based on the imagery. On average, mass loss is limited and the surface velocity is very small. However, the spatial variability of melt rates is very high, and ice cliffs and supra-glacial ponds show mass losses that can be an order of magnitude higher than the average. We suggest that future research should focus on the interaction between supra-glacial ponds, ice cliffs, and englacial hydrology to further understand the dynamics of debris-covered glaciers. Finally, we conclude that UAV deployment has large potential in glaciology and represents a substantial advancement over methods currently applied in studying glacier surface features.
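The DEM-differencing step the abstract relies on reduces, at its core, to a per-cell elevation change times cell area times an assumed ice/firn density. The study's actual processing chain (co-registration, dGPS correction, uncertainty handling) is far more involved; this is only the arithmetic skeleton, and the 850 kg/m^3 density is a commonly used conversion value, not a figure from the paper:

```python
import numpy as np

def glacier_mass_change(dem_t1, dem_t2, cell_size_m, ice_density=850.0):
    """DEM differencing: summed elevation change (m) times cell area
    (m^2) gives volume change (m^3); multiplied by an assumed density
    (kg/m^3) it yields mass change in kg. Negative = mass loss."""
    dh = np.asarray(dem_t2, dtype=float) - np.asarray(dem_t1, dtype=float)
    volume = dh.sum() * cell_size_m ** 2
    return volume * ice_density

# Toy 2x2 DEMs with a uniform 0.5 m surface lowering on 10 m cells.
dem_a = [[100.0, 101.0], [102.0, 103.0]]
dem_b = [[99.5, 100.5], [101.5, 102.5]]
print(glacier_mass_change(dem_a, dem_b, cell_size_m=10))  # → -170000.0 (kg)
```

Real DEM pairs must first be co-registered, since a small horizontal shift over steep debris-covered terrain produces spurious elevation differences far larger than the melt signal.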
An optimized web-based approach for collaborative stereoscopic medical visualization
Kaspar, Mathias; Parsad, Nigel M; Silverstein, Jonathan C
2013-01-01
Objective Medical visualization tools have traditionally been constrained to tethered imaging workstations or proprietary client viewers, typically part of hospital radiology systems. To improve accessibility to real-time, remote, interactive, stereoscopic visualization and to enable collaboration among multiple viewing locations, we developed an open source approach requiring only a standard web browser with no added client-side software. Materials and Methods Our collaborative, web-based, stereoscopic, visualization system, CoWebViz, has been used successfully for the past 2 years at the University of Chicago to teach immersive virtual anatomy classes. It is a server application that streams server-side visualization applications to client front-ends, comprised solely of a standard web browser with no added software. Results We describe optimization considerations, usability, and performance results, which make CoWebViz practical for broad clinical use. We clarify technical advances including: enhanced threaded architecture, optimized visualization distribution algorithms, a wide range of supported stereoscopic presentation technologies, and the salient theoretical and empirical network parameters that affect our web-based visualization approach. Discussion The implementations demonstrate usability and performance benefits of a simple web-based approach for complex clinical visualization scenarios. Using this approach overcomes technical challenges that require third-party web browser plug-ins, resulting in the most lightweight client. Conclusions Compared to special software and hardware deployments, unmodified web browsers enhance remote user accessibility to interactive medical visualization. 
Whereas local hardware and software deployments may provide better interactivity than remote applications, our implementation demonstrates that a simplified, stable, client approach using standard web browsers is sufficient for high quality three-dimensional, stereoscopic, collaborative and interactive visualization. PMID:23048008
Investigation of iterative image reconstruction in three-dimensional optoacoustic tomography
Wang, Kun; Su, Richard; Oraevsky, Alexander A; Anastasio, Mark A
2012-01-01
Iterative image reconstruction algorithms for optoacoustic tomography (OAT), also known as photoacoustic tomography, have the ability to improve image quality over analytic algorithms due to their ability to incorporate accurate models of the imaging physics, instrument response, and measurement noise. However, to date, there have been few reported attempts to employ advanced iterative image reconstruction algorithms for improving image quality in three-dimensional (3D) OAT. In this work, we implement and investigate two iterative image reconstruction methods for use with a 3D OAT small animal imager: namely, a penalized least-squares (PLS) method employing a quadratic smoothness penalty and a PLS method employing a total variation norm penalty. The reconstruction algorithms employ accurate models of the ultrasonic transducer impulse responses. Experimental data sets are employed to compare the performances of the iterative reconstruction algorithms to that of a 3D filtered backprojection (FBP) algorithm. By use of quantitative measures of image quality, we demonstrate that the iterative reconstruction algorithms can mitigate image artifacts and preserve spatial resolution more effectively than FBP algorithms. These features suggest that the use of advanced image reconstruction algorithms can improve the effectiveness of 3D OAT while reducing the amount of data required for biomedical applications. PMID:22864062
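The penalized least-squares formulation can be illustrated in one dimension: minimize ||Hf - g||^2 + lambda ||Df||^2, which for a quadratic smoothness penalty has a closed-form normal-equations solution. The sketch below uses a toy blurring operator; H, D, and lambda are illustrative stand-ins, not the paper's 3-D OAT imaging model:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 32

# Hypothetical system matrix H: a local 3-tap blur standing in for the
# transducer/imaging response.
H = np.zeros((n, n))
for i in range(n):
    for j in range(max(0, i - 1), min(n, i + 2)):
        H[i, j] = 1.0 / 3.0

# Ground-truth object (a boxcar) and noisy simulated measurement.
f_true = np.zeros(n)
f_true[10:20] = 1.0
g = H @ f_true + 0.01 * rng.standard_normal(n)

# First-difference operator D as the quadratic smoothness penalty.
D = np.diff(np.eye(n), axis=0)
lam = 0.1

# PLS with a quadratic penalty is solved in closed form:
# (H^T H + lam D^T D) f = H^T g
f_hat = np.linalg.solve(H.T @ H + lam * (D.T @ D), H.T @ g)

err = np.linalg.norm(f_hat - f_true) / np.linalg.norm(f_true)
```

The total-variation variant replaces ||Df||^2 with ||Df||_1 and requires an iterative solver rather than this closed form.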
A novel association rule mining approach using TID intermediate itemset.
Aqra, Iyad; Herawan, Tutut; Abdul Ghani, Norjihan; Akhunzada, Adnan; Ali, Akhtar; Bin Razali, Ramdan; Ilahi, Manzoor; Raymond Choo, Kim-Kwang
2018-01-01
Designing an efficient association rule mining (ARM) algorithm for multilevel knowledge-based transactional databases that is appropriate for real-world deployments is of paramount concern. However, dynamic decision making that needs to modify the threshold either to minimize or maximize the output knowledge forces the extant state-of-the-art algorithms to rescan the entire database. Consequently, the process incurs heavy computation cost and is not feasible for real-time applications. This paper efficiently addresses the problem of dynamic threshold updating for a given purpose. The paper contributes by presenting a novel ARM approach that creates an intermediate itemset and applies a threshold to extract categorical frequent itemsets with diverse threshold values, improving overall efficiency as we no longer need to scan the whole database. After the entire itemset is built, we are able to obtain real support without rebuilding the itemset (e.g. itemset lists are intersected to obtain the actual support). Moreover, the algorithm supports extracting many frequent itemsets according to a pre-determined minimum support with an independent purpose. Additionally, the experimental results of our proposed approach demonstrate the capability to be deployed in any mining system in a fully parallel mode, consequently increasing the efficiency of the real-time association rule discovery process. The proposed approach outperforms the extant state of the art and shows promising results that reduce computation cost, increase accuracy, and produce all possible itemsets.
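The TID-list intersection that yields actual support without rescanning the database can be sketched as follows; the transactions and items are hypothetical:

```python
# Toy transactional database: transaction ID -> set of items.
transactions = {
    1: {"a", "b", "c"},
    2: {"a", "c"},
    3: {"a", "d"},
    4: {"b", "c", "e"},
}

# Build the intermediate structure once: item -> set of TIDs containing it.
tid = {}
for t, items in transactions.items():
    for item in items:
        tid.setdefault(item, set()).add(t)

def support(itemset):
    """Actual support of an itemset by intersecting TID lists;
    no rescan of the underlying database is needed."""
    tids = set.intersection(*(tid[i] for i in itemset))
    return len(tids) / len(transactions)

support({"a", "c"})  # contained in transactions 1 and 2 -> 0.5
```

Because the TID structure is built once, the minimum-support threshold can be changed afterwards and frequent itemsets re-extracted without touching the raw database, which is the efficiency argument made above.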
A novel association rule mining approach using TID intermediate itemset
Ali, Akhtar; Bin Razali, Ramdan; Ilahi, Manzoor; Raymond Choo, Kim-Kwang
2018-01-01
Designing an efficient association rule mining (ARM) algorithm for multilevel knowledge-based transactional databases that is appropriate for real-world deployments is of paramount concern. However, dynamic decision making that needs to modify the threshold either to minimize or maximize the output knowledge forces the extant state-of-the-art algorithms to rescan the entire database. Consequently, the process incurs heavy computation cost and is not feasible for real-time applications. This paper efficiently addresses the problem of dynamic threshold updating for a given purpose. The paper contributes by presenting a novel ARM approach that creates an intermediate itemset and applies a threshold to extract categorical frequent itemsets with diverse threshold values, improving overall efficiency as we no longer need to scan the whole database. After the entire itemset is built, we are able to obtain real support without rebuilding the itemset (e.g. itemset lists are intersected to obtain the actual support). Moreover, the algorithm supports extracting many frequent itemsets according to a pre-determined minimum support with an independent purpose. Additionally, the experimental results of our proposed approach demonstrate the capability to be deployed in any mining system in a fully parallel mode, consequently increasing the efficiency of the real-time association rule discovery process. The proposed approach outperforms the extant state of the art and shows promising results that reduce computation cost, increase accuracy, and produce all possible itemsets. PMID:29351287
NASA Technical Reports Server (NTRS)
Fernandez, Juan M.
2017-01-01
State of the art deployable structures are mainly being designed for medium to large size satellites. The lack of reliable deployable structural systems for low cost, small volume, rideshare-class spacecraft severely constrains the potential for using small satellite platforms for affordable deep space science and exploration precursor missions that could be realized with solar sails. There is thus a need for reliable, lightweight, high packaging efficiency deployable booms that can serve as the supporting structure for a wide range of small satellite systems including solar sails for propulsion. The National Aeronautics and Space Administration (NASA) is currently investing in the development of a new class of advanced deployable shell-based composite booms to support future deep space small satellite missions using solar sails. The concepts are being designed to: meet the unique requirements of small satellites, maximize ground testability, permit the use of low-cost manufacturing processes that will benefit scalability, be scalable for use as elements of hierarchical structures (e.g. trusses), allow long duration storage, have high deployment reliability, and have controlled deployment behavior and predictable deployed dynamics. This paper will present the various rollable boom concepts that are being developed for 5-20 m class size deployable structures that include solar sails with the so-called High Strain Composites (HSC) materials. The deployable composite booms to be presented are being developed to expand the portfolio of available rollable booms for small satellites and maximize their length for a given packaged volume. Given that solar sails are a great example of volume and mass optimization, the booms were designed to comply with nominal solar sail system requirements for 6U CubeSats, which are a good compromise between those of smaller form factors (1U, 2U and 3U CubeSats) and larger ones (12U and 27U future CubeSats, and ESPA-class microsatellites). 
Solar sail missions for such composite boom systems are already under consideration and development at NASA, as well as mission studies that will benefit from planned scaled-up versions of the composite boom technologies to be introduced. The paper presents ongoing research and development of thin-shell rollable composite booms designed under the particular stringent and challenging system requirements of relatively large solar sails housed on small satellites. These requirements will be derived and listed. Several new boom concepts are proposed and other existing ones are improved upon using thin-ply composite materials to yield unprecedented compact deployable structures. Some of these booms are shown in Fig. 1. For every boom to be introduced the scalable fabrication process developed to keep the overall boom system cost down will be shown. Finally, the initial results of purposely designed boom structural characterization test methods with gravity off-loading will be presented to compare their structural performance under expected and general load cases.
Software-Defined Architectures for Spectrally Efficient Cognitive Networking in Extreme Environments
NASA Astrophysics Data System (ADS)
Sklivanitis, Georgios
The objective of this dissertation is the design, development, and experimental evaluation of novel algorithms and reconfigurable radio architectures for spectrally efficient cognitive networking in terrestrial, airborne, and underwater environments. Next-generation wireless communication architectures and networking protocols that maximize spectrum utilization efficiency in congested/contested or low-spectral availability (extreme) communication environments can enable a rich body of applications with unprecedented societal impact. In recent years, underwater wireless networks have attracted significant attention for military and commercial applications including oceanographic data collection, disaster prevention, tactical surveillance, offshore exploration, and pollution monitoring. Unmanned aerial systems that are autonomously networked and fully mobile can assist humans in extreme or difficult-to-reach environments and provide cost-effective wireless connectivity for devices without infrastructure coverage. Cognitive radio (CR) has emerged as a promising technology to maximize spectral efficiency in dynamically changing communication environments by adaptively reconfiguring radio communication parameters. At the same time, the fast developing technology of software-defined radio (SDR) platforms has enabled hardware realization of cognitive radio algorithms for opportunistic spectrum access. However, existing algorithmic designs and protocols for shared spectrum access do not effectively capture the interdependencies between radio parameters at the physical (PHY), medium-access control (MAC), and network (NET) layers of the network protocol stack. In addition, existing off-the-shelf radio platforms and SDR programmable architectures are far from fulfilling runtime adaptation and reconfiguration across PHY, MAC, and NET layers. 
Spectrum allocation in cognitive networks with multi-hop communication requirements depends on the location, network traffic load, and interference profile at each network node. As a result, the development and implementation of algorithms and cross-layer reconfigurable radio platforms that can jointly treat space, time, and frequency as a unified resource to be dynamically optimized according to inter- and intra-network interference constraints is of fundamental importance. In the next chapters, we present novel algorithmic and software/hardware implementation developments toward the deployment of spectrally efficient terrestrial, airborne, and underwater wireless networks. In Chapter 1 we review the state of the art in commercially available SDR platforms, describe their software and hardware capabilities, and classify them based on their ability to enable rapid prototyping and advance experimental research in wireless networks. Chapter 2 discusses system design and implementation details toward real-time evaluation of a software-radio platform for all-spectrum cognitive channelization in the presence of narrowband or wideband primary stations. All-spectrum channelization is achieved by designing maximum signal-to-interference-plus-noise ratio (SINR) waveforms that span the whole continuum of the device-accessible spectrum, while satisfying peak power and interference temperature (IT) constraints for the secondary and primary users, respectively. In Chapter 3, we introduce the concept of all-spectrum channelization based on max-SINR optimized sparse-binary waveforms, we propose optimal and suboptimal waveform design algorithms, and evaluate their SINR and bit-error-rate (BER) performance in an SDR testbed. 
Chapter 4 considers the problem of channel estimation with minimal pilot signaling in multi-cell multi-user multi-input multi-output (MIMO) systems with very large antenna arrays at the base station, and proposes a least-squares (LS)-type algorithm that iteratively extracts channel and data estimates from a short record of data measurements. Our algorithmic developments toward spectrally-efficient cognitive networking through joint optimization of channel access code-waveforms and routes in a multi-hop network are described in Chapter 5. Algorithmic designs are software optimized on heterogeneous multi-core general-purpose processor (GPP)-based SDR architectures by leveraging a novel software-radio framework that offers self-optimization and real-time adaptation capabilities at the PHY, MAC, and NET layers of the network protocol stack. Our system design approach is experimentally validated under realistic conditions in a large-scale hybrid ground-air testbed deployment. Chapter 6 reviews the state of the art in software and hardware platforms for underwater wireless networking and proposes a software-defined acoustic modem prototype that enables (i) cognitive reconfiguration of PHY/MAC parameters, and (ii) cross-technology communication adaptation. The proposed modem design is evaluated in terms of effective communication data rate in both water tank and lake testbed setups. In Chapter 7, we present a novel receiver configuration for code-waveform-based multiple-access underwater communications. The proposed receiver is fully reconfigurable and executes (i) all-spectrum cognitive channelization, and (ii) combined synchronization, channel estimation, and demodulation. Experimental evaluation in terms of SINR and BER shows that all-spectrum channelization is a powerful proposition for underwater communications. At the same time, the proposed receiver design can significantly enhance bandwidth utilization. 
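The flavor of the iterative LS channel/data estimation described for Chapter 4 can be caricatured for a scalar BPSK channel: alternate between an LS channel re-estimate (given current symbol decisions) and symbol decisions (given the current channel), seeded by a short pilot. A toy sketch, not the dissertation's multi-cell massive-MIMO algorithm:

```python
import numpy as np

rng = np.random.default_rng(1)
n, n_pilot = 200, 4
s = rng.choice([-1.0, 1.0], size=n)          # BPSK data symbols
h = 0.8                                      # unknown channel gain
y = h * s + 0.05 * rng.standard_normal(n)    # received samples

# Seed with an LS estimate from a short pilot prefix.
h_hat = np.mean(y[:n_pilot] * s[:n_pilot])

# Alternate decision-directed symbol estimates and LS channel re-estimates.
for _ in range(5):
    s_hat = np.sign(h_hat * y)               # symbol decisions
    h_hat = (s_hat @ y) / (s_hat @ s_hat)    # LS channel re-estimate

accuracy = np.mean(s_hat == s)               # fraction of correct decisions
```

The same alternating structure scales to vector channels, where each step becomes a matrix least-squares solve instead of a scalar ratio.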
Finally, in Chapter 8, we focus on challenging practical issues that arise in underwater acoustic sensor network setups where co-located multi-antenna sensor deployment is not feasible due to power, computation, and hardware limitations, and design, implement, and evaluate an underwater receiver structure that accounts for multiple carrier frequency and timing offsets in virtual (distributed) MIMO underwater systems.
An approximate dynamic programming approach to resource management in multi-cloud scenarios
NASA Astrophysics Data System (ADS)
Pietrabissa, Antonio; Priscoli, Francesco Delli; Di Giorgio, Alessandro; Giuseppi, Alessandro; Panfili, Martina; Suraci, Vincenzo
2017-03-01
The programmability and the virtualisation of network resources are crucial to deploy scalable Information and Communications Technology (ICT) services. The increasing demand for cloud services, mainly devoted to storage and computing, requires a new functional element, the Cloud Management Broker (CMB), aimed at managing multiple cloud resources to meet the customers' requirements and, simultaneously, to optimise their usage. This paper proposes a multi-cloud resource allocation algorithm that manages the resource requests with the aim of maximising the CMB revenue over time. The algorithm is based on Markov decision process modelling and relies on reinforcement learning techniques to find an approximate solution online.
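The MDP-plus-reinforcement-learning approach can be illustrated by tabular Q-learning on a toy admission problem, where a broker accepts or rejects unit-size resource requests against finite capacity. All dynamics and parameters below are illustrative, not the paper's model:

```python
import random

random.seed(0)
C, price = 5, 1.0                 # capacity and per-request revenue
alpha, gamma, eps = 0.1, 0.9, 0.1 # learning rate, discount, exploration
Q = {(s, a): 0.0 for s in range(C + 1) for a in (0, 1)}

def step(s, a):
    """Toy environment: accepting (a=1) consumes one unit and earns revenue;
    one unit of capacity is released with probability 0.3 each step."""
    r = 0.0
    if a == 1 and s > 0:
        s, r = s - 1, price
    if s < C and random.random() < 0.3:
        s += 1
    return s, r

s = C
for _ in range(20000):
    # epsilon-greedy action selection over {reject=0, accept=1}
    if random.random() < eps:
        a = random.choice((0, 1))
    else:
        a = max((0, 1), key=lambda x: Q[(s, x)])
    s2, r = step(s, a)
    # standard one-step Q-learning update
    Q[(s, a)] += alpha * (r + gamma * max(Q[(s2, 0)], Q[(s2, 1)]) - Q[(s, a)])
    s = s2
```

With a single price and no rejection penalty, the learned policy unsurprisingly accepts whenever capacity is free; the interesting cases in the paper arise with heterogeneous requests and revenues.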
Region Based CNN for Foreign Object Debris Detection on Airfield Pavement
Cao, Xiaoguang; Wang, Peng; Meng, Cai; Gong, Guoping; Liu, Miaoming; Qi, Jun
2018-01-01
In this paper, a novel algorithm based on a convolutional neural network (CNN) is proposed to detect foreign object debris (FOD) using optical imaging sensors. It contains two modules: an improved region proposal network (RPN) and a spatial transformer network (STN) based CNN classifier. In the improved RPN, extra selection rules are designed and deployed to generate high-quality candidates in smaller numbers. Moreover, the efficiency of the CNN detector is significantly improved by introducing an STN layer. Compared to faster R-CNN and the single shot multiBox detector (SSD), the proposed algorithm achieves better results for FOD detection on airfield pavement in the experiment. PMID:29494524
Region Based CNN for Foreign Object Debris Detection on Airfield Pavement.
Cao, Xiaoguang; Wang, Peng; Meng, Cai; Bai, Xiangzhi; Gong, Guoping; Liu, Miaoming; Qi, Jun
2018-03-01
In this paper, a novel algorithm based on a convolutional neural network (CNN) is proposed to detect foreign object debris (FOD) using optical imaging sensors. It contains two modules: an improved region proposal network (RPN) and a spatial transformer network (STN) based CNN classifier. In the improved RPN, extra selection rules are designed and deployed to generate high-quality candidates in smaller numbers. Moreover, the efficiency of the CNN detector is significantly improved by introducing an STN layer. Compared to faster R-CNN and the single shot multiBox detector (SSD), the proposed algorithm achieves better results for FOD detection on airfield pavement in the experiment.
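The idea of extra selection rules (filtering region proposals by simple score and geometry criteria so that fewer, higher-quality candidates remain) can be sketched as follows; the thresholds and boxes are hypothetical, not the paper's:

```python
def select_proposals(proposals, min_score=0.7, area_range=(16, 4096),
                     max_aspect=4.0):
    """Keep proposals (x1, y1, x2, y2, score) whose confidence, area, and
    aspect ratio fall inside plausible FOD ranges; drop the rest."""
    kept = []
    for (x1, y1, x2, y2, score) in proposals:
        w, h = x2 - x1, y2 - y1
        if w <= 0 or h <= 0 or score < min_score:
            continue
        area = w * h
        aspect = max(w / h, h / w)
        if area_range[0] <= area <= area_range[1] and aspect <= max_aspect:
            kept.append((x1, y1, x2, y2, score))
    return kept

cands = [
    (10, 10, 30, 30, 0.9),    # kept: small, square, confident
    (0, 0, 200, 10, 0.95),    # dropped: extreme aspect ratio
    (5, 5, 6, 6, 0.99),       # dropped: area below range
    (40, 40, 60, 55, 0.4),    # dropped: low score
]
select_proposals(cands)  # -> only the first box survives
```

Passing fewer, better-shaped candidates to the downstream classifier is what drives the efficiency gain claimed above.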
Exploiting Structured Dependencies in the Design of Adaptive Algorithms for Underwater Communication, Award #3
2015-09-30
Associated publications include a WUWNet'14 paper (Rome, Italy, Nov. 12-14, 2014); J. Preisig, "Underwater Acoustic Communications: Enabling the Next Generation at the..."; and M. Pajovic and J. Preisig, "Performance Analytics and Optimal Design of Multichannel Equalizers for Underwater Acoustic Communications", to appear in IEEE Journal of Oceanic Engineering.
Sodium azide-associated laryngospasm after air bag deployment.
Francis, David; Warren, Samuel A; Warner, Keir J; Harris, William; Copass, Michael K; Bulger, Eileen M
2010-09-01
The advent and incorporation of the air bag into motor vehicles has resulted in the mitigation of many head and truncal injuries in motor vehicle collisions. However, air bag deployment is not risk free. We present a case of sodium azide-induced laryngospasm after air bag deployment. An unrestrained male driver was in a moderate-speed motor vehicle collision with air bag deployment. Medics found him awake, gasping for air with stridorous respirations and guarding his neck. The patient had no external signs of trauma and was presumed to have tracheal injury. The patient was met by the Anesthesiology service, which intubated him using GlideScope-assisted laryngoscopy. The patient was admitted for overnight observation and treatment of alkaline ocular injury and laryngospasm. Although air bags represent an important advance in automobile safety, their use is not without risk. Bruising and tracheal rupture secondary to air bag deployment have been reported in out-of-position occupants. Additionally, alkaline by-products from the combustion of sodium azide in air bags have been implicated in ocular injury and facial burns. Laryngospasm after sodium azide exposure presents another diagnostic challenge for providers. Therefore, it is incumbent to maintain vigilance in the physical examination and diagnosis of occult injuries after air bag deployment. Copyright © 2010 Elsevier Inc. All rights reserved.
Contextual cloud-based service oriented architecture for clinical workflow.
Moreno-Conde, Jesús; Moreno-Conde, Alberto; Núñez-Benjumea, Francisco J; Parra-Calderón, Carlos
2015-01-01
Regarding the acceptance of systems within the healthcare domain, multiple papers have highlighted the importance of integrating tools with the clinical workflow. This paper analyses how clinical context management could be deployed in order to promote the adoption of advanced cloud services within the clinical workflow. This deployment will be able to be integrated with the specifications promoted by the eHealth European Interoperability Framework. This paper proposes a cloud-based service-oriented architecture that will implement a context management system aligned with the HL7 standard known as CCOW.
Experience with ActiveX control for simple channel access
DOE Office of Scientific and Technical Information (OSTI.GOV)
Timossi, C.; Nishimura, H.; McDonald, J.
2003-05-15
Accelerator control system applications at Berkeley Lab's Advanced Light Source (ALS) are typically deployed on operator consoles running Microsoft Windows 2000 and utilize EPICS [2] channel access for data access. In an effort to accommodate the wide variety of Windows based development tools and developers with little experience in network programming, ActiveX controls have been deployed on the operator stations. Use of ActiveX controls in the accelerator control environment has been presented previously [1]. Here we report on some of our experiences with the use and development of these controls.
An advanced analysis method of initial orbit determination with too short arc data
NASA Astrophysics Data System (ADS)
Li, Binzhe; Fang, Li
2018-02-01
This paper studies initial orbit determination (IOD) based on space-based angle measurements. Commonly, these space-based observations have short durations; as a result, classical initial orbit determination algorithms, such as the Laplace and Gauss methods, give poor results. In this paper, an advanced analysis method of initial orbit determination is developed for space-based observations. The admissible region and triangulation are introduced in the method, and a genetic algorithm is used to add constraints on the parameters. Simulation results show that the algorithm can successfully complete the initial orbit determination.
NASA Technical Reports Server (NTRS)
Delaat, John C.; Merrill, Walter C.
1990-01-01
The objective of the Advanced Detection, Isolation, and Accommodation Program is to improve the overall demonstrated reliability of digital electronic control systems for turbine engines. For this purpose, an algorithm was developed which detects, isolates, and accommodates sensor failures by using analytical redundancy. The performance of this algorithm was evaluated on a real time engine simulation and was demonstrated on a full scale F100 turbofan engine. The real time implementation of the algorithm is described. The implementation used state-of-the-art microprocessor hardware and software, including parallel processing and high order language programming.
Evaluation of the Metropolitan Atlanta Rapid Transit Authority intelligent transportation system
DOT National Transportation Integrated Search
2000-07-01
This report documents the implementation and operation of the Metropolitan Atlanta Rapid Transit Authority's Advanced Public Transportation System (ITS MARTA '96) as part of a showcase of Intelligent Transportation Systems technologies deployed for t...
Region 4 ATMS local evaluation report
DOT National Transportation Integrated Search
2005-07-01
In March, 1996, the Rochester Areawide Advanced Transportation Management System Report (6) established the need for an ITS as well as a strategic implementation/deployment plan. This plan has, in part, been implemented through the design and constru...
Transportation planning for electric vehicles and associated infrastructure.
DOT National Transportation Integrated Search
2017-05-01
Planning is the key to successful adoption and deployment of any new technology, and : it is particularly important when that advancement involves a paradigm shift such as : electrified transportation. At its core, electric transportation is largely ...
Promising transit applications of fuel cells and alternative fuels
DOT National Transportation Integrated Search
2002-06-01
For over a decade, the Volpe Center has been providing technical support to the Federal Transit Administration (FTA) Office of Research, Demonstration and Innovation towards the development, deployment, field test and safety evaluation of advanced tr...
Transit bus applications of lithium ion batteries : progress and prospects
DOT National Transportation Integrated Search
2012-12-31
This report provides an overview of diverse transit bus applications of advanced Lithium Ion Batteries (LIBs). The report highlights and illustrates several FTA programs that fostered the successful development, demonstration, and deployment of fuel-...
Scalable Deployment of Advanced Building Energy Management Systems
2013-05-01
The system augments an existing BMS with additional sensors/meters and uses a reduced-order model and diagnostic software to make performance deviations visible.
Big data analytics to aid developing livable communities.
DOT National Transportation Integrated Search
2015-12-31
In transportation, ubiquitous deployment of low-cost sensors combined with powerful : computer hardware and high-speed network makes big data available. USDOT defines big : data research in transportation as a number of advanced techniques applied to...
Feasibility of advanced vehicle control systems for transit buses
DOT National Transportation Integrated Search
1997-01-01
In the course of developing automated vehicle-roadway systems, opportunities to deploy vehicle control systems at intermediate stages of development may emerge. Some of these systems may provide a significant efficiency or safety enhancement to exist...
OKI evaluation of intelligent transportation system
DOT National Transportation Integrated Search
2000-06-01
The Advanced Regional Traffic Interactive Management & Information System (ARTIMIS) is one of the earliest ITS systems deployed in the US with preliminary studies being initiated in the late 1980s and early 1990s. ARTIMIS provides traffic management ...
The Gateway Garden — A Prototype Food Production Facility for Deep Space Exploration
NASA Astrophysics Data System (ADS)
Fritsche, R. F.; Romeyn, M. W.; Massa, G.
2018-02-01
CIS-lunar space provides a unique opportunity to perform deep space microgravity crop science research while also addressing and advancing food production technologies that will be deployed on the Deep Space Transport.
Evaluation of Cyber Sensors for Enhancing Situational Awareness in the ICS environment
2013-06-01
For this research a logging algorithm is deployed across three platforms: a baseline laptop, a Gumstix Overo Earth COM, and a Raspberry Pi. This research is limited to two cyber sensors, the Gumstix Overo Earth COM and the Raspberry Pi. Additionally, one custom Snort signature is
Dynamic Modeling and Grid Interaction of a Tidal and River Generator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Muljadi, Eduard; Gevorgian, Vahan; Donegan, James
This presentation provides a high-level overview of the deployment of a river generator installed in a small system. The turbine dynamics of a river generator, electrical generator, and power converter are modeled in detail. Various simulations can be exercised, and the impact of different control algorithms, failures of power switches, and corresponding impacts can be examined.
Art concept of Magellan spacecraft in cruise configuration
NASA Technical Reports Server (NTRS)
1988-01-01
Magellan spacecraft cruise configuration is illustrated in this artist concept. With solar panels deployed and having jettisoned the inertial upper stage (IUS), Magellan approaches the sun which it will orbit approximately 1.6 times before encountering Venus. Magellan, named after the 16th century Portuguese explorer, will orbit Venus about once every three hours, acquiring radar data for 37 minutes of each orbit when it is closest to the surface. Using an advanced instrument called a synthetic aperture radar (SAR), it will map more than 90 per cent of the surface with resolution ten times better than the best from prior spacecraft. Magellan is managed by the Jet Propulsion Laboratory (JPL); Martin Marietta Aerospace is developing the spacecraft and Hughes Aircraft Company, the advanced imaging radar. Magellan will be deployed from payload bay (PLB) of Atlantis, Orbiter Vehicle (OV) 104, during the STS-30 mission.
Advances in on-line drinking water quality monitoring and early warning systems.
Storey, Michael V; van der Gaag, Bram; Burns, Brendan P
2011-01-01
Significant advances have been made in recent years in technologies to monitor drinking water quality for source water protection, treatment operations, and distribution system management, in the event of accidental (or deliberate) contamination. Reports prepared through the Global Water Research Coalition (GWRC) and United States Environment Protection Agency (USEPA) agree that while many emerging technologies show promise, they are still some years from being deployed on a large scale. Further underpinning their viability is a need to interpret data in real time and implement a management strategy in response. This review presents the findings of an international study into the state of the art in this field. These results are based on visits to leading water utilities, research organisations and technology providers throughout Europe, the United States and Singapore involved in the development and deployment of on-line monitoring technology for the detection of contaminants in water. Copyright © 2010 Elsevier Ltd. All rights reserved.
A manipulator arm for zero-g simulations
NASA Technical Reports Server (NTRS)
Brodie, S. B.; Grant, C.; Lazar, J. J.
1975-01-01
A 12-ft counterbalanced Slave Manipulator Arm (SMA) was designed and fabricated to be used for resolving the questions of operational applications, capabilities, and limitations for such remote manned systems as the Payload Deployment and Retrieval Mechanism (PDRM) for the shuttle, the Free-Flying Teleoperator System, the Advanced Space Tug, and Planetary Rovers. As a developmental tool for the shuttle manipulator system (or PDRM), the SMA represents an approximate one-quarter scale working model for simulating and demonstrating payload handling, docking assistance, and satellite servicing. For the Free-Flying Teleoperator System and the Advanced Tug, the SMA provides a near full-scale developmental tool for satellite servicing, docking, and deployment/retrieval procedures, techniques, and support equipment requirements. For the Planetary Rovers, it provides an oversize developmental tool for sample handling and soil mechanics investigations. The design of the SMA was based on concepts developed for a 40-ft NASA technology arm to be used for zero-g shuttle manipulator simulations.
Advanced In-Pile Instrumentation for Materials Testing Reactors
NASA Astrophysics Data System (ADS)
Rempe, J. L.; Knudson, D. L.; Daw, J. E.; Unruh, T. C.; Chase, B. M.; Davis, K. L.; Palmer, A. J.; Schley, R. S.
2014-08-01
The U.S. Department of Energy sponsors the Advanced Test Reactor (ATR) National Scientific User Facility (NSUF) program to promote U.S. research in nuclear science and technology. By attracting new research users - universities, laboratories, and industry - the ATR NSUF facilitates basic and applied nuclear research and development, advancing U.S. energy security needs. A key component of the ATR NSUF effort is to design, develop, and deploy new in-pile instrumentation techniques that are capable of providing real-time measurements of key parameters during irradiation. This paper describes the strategy developed by the Idaho National Laboratory (INL) for identifying instrumentation needed for ATR irradiation tests and the program initiated to obtain these sensors. New sensors developed from this effort are identified, and the progress of other development efforts is summarized. As reported in this paper, INL researchers are currently involved in several tasks to deploy real-time length and flux detection sensors, and efforts have been initiated to develop a crack growth test rig. Tasks evaluating 'advanced' technologies, such as fiber-optics based length detection and ultrasonic thermometers, are also underway. In addition, specialized sensors for real-time detection of temperature and thermal conductivity are not only being provided to NSUF reactors, but are also being provided to several international test reactors.
Cross-platform validation and analysis environment for particle physics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chekanov, S. V.; Pogrebnyak, I.; Wilbern, D.
A multi-platform validation and analysis framework for public Monte Carlo simulation for high-energy particle collisions is discussed. The front-end of this framework uses the Python programming language, while the back-end is written in Java, which provides a multi-platform environment that can be run from a web browser and can easily be deployed at the grid sites. The analysis package includes all major software tools used in high-energy physics, such as Lorentz vectors, jet algorithms, histogram packages, graphic canvases, and tools for providing data access. This multi-platform software suite, designed to minimize OS-specific maintenance and deployment time, is used for online validation of Monte Carlo event samples through a web interface.
Membrane Shell Reflector Segment Antenna
NASA Technical Reports Server (NTRS)
Fang, Houfei; Im, Eastwood; Lin, John; Moore, James
2012-01-01
The mesh reflector is the only type of large, in-space deployable antenna that has successfully flown in space. However, state-of-the-art large deployable mesh antenna systems are RF-frequency-limited by both global shape accuracy and local surface quality. The limitations of mesh reflectors stem from two factors. First, at higher frequencies, the porosity and surface roughness of the mesh result in loss and scattering of the signal. Second, the mesh material does not have any bending stiffness and thus cannot be formed into true parabolic (or other desired) shapes. To advance the deployable reflector technology at high RF frequencies from the current state-of-the-art, significant improvements need to be made in three major aspects: a high-stability and high-precision deployable truss; a continuously curved RF reflecting surface (both the surface and its first derivative are continuous); and an RF reflecting surface made of a continuous material. To meet these three requirements, the Membrane Shell Reflector Segment (MSRS) antenna was developed.