Sample records for distributed utility application

  1. Wireless remote control clinical image workflow: utilizing a PDA for offsite distribution

    NASA Astrophysics Data System (ADS)

    Liu, Brent J.; Documet, Luis; Documet, Jorge; Huang, H. K.; Muldoon, Jean

    2004-04-01

    Last year at RSNA we presented an application to perform wireless remote control of PACS image distribution utilizing a handheld device such as a Personal Digital Assistant (PDA). This paper describes the clinical experiences, including workflow scenarios, of implementing the PDA application to route exams from the clinical PACS archive server to various locations for offsite distribution of clinical PACS exams. By utilizing this remote control application, radiologists can manage image workflow distribution with a single wireless handheld device without impacting their clinical workflow on diagnostic PACS workstations. A PDA application was designed and developed to let a physician issue DICOM Query and C-Move requests from a clinical PACS archive to a CD-burning device for automatic burning of PACS data for offsite distribution. In addition, it was also used for convenient routing of historical PACS exams to the local web server, local workstations, and teleradiology systems. The application was evaluated by radiologists as well as other clinical staff who need to distribute PACS exams to offsite referring physicians' offices and offsite radiologists. An application for image workflow management utilizing wireless technology was implemented and evaluated in a clinical environment. A PDA application was successfully utilized to perform DICOM Query and C-Move requests from the clinical PACS archive to various offsite exam distribution devices. Clinical staff can utilize the PDA to manage image workflow and PACS exam distribution conveniently for offsite consultations by referring physicians and radiologists. This solution allows radiologists to expand their effectiveness in health care delivery both within the radiology department and offsite by improving their clinical workflow.

  2. Developing Use Cases for Evaluation of ADMS Applications to Accelerate Technology Adoption: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Veda, Santosh; Wu, Hongyu; Martin, Maurice

    Grid modernization for distribution systems comprises the ability to effectively monitor and manage unplanned events while ensuring reliable operations. Integration of Distributed Energy Resources (DERs) and the proliferation of autonomous smart controllers such as microgrids and smart inverters in distribution networks challenge the status quo of distribution system operations. Advanced Distribution Management System (ADMS) technologies are being increasingly deployed to manage the complexities of operating distribution systems. The ability to evaluate ADMS applications in specific utility environments and for future scenarios will accelerate wider adoption of the ADMS and will lower the risks and costs of implementation. This paper addresses the first step: identifying and defining the use cases for evaluating these applications. The applications selected for this discussion include Volt-VAr Optimization (VVO); Fault Location, Isolation, and Service Restoration (FLISR); Online Power Flow (OLPF)/Distribution System State Estimation (DSSE); and Market Participation. A technical description and general operational requirements for each of these applications are presented. The test scenarios that are most relevant to the utility challenges are also addressed.

  3. System and Method for Providing a Climate Data Analytic Services Application Programming Interface Distribution Package

    NASA Technical Reports Server (NTRS)

    Tamkin, Glenn S. (Inventor); Duffy, Daniel Q. (Inventor); Schnase, John L. (Inventor)

    2016-01-01

    A system, method and computer-readable storage devices for providing a climate data analytic services application programming interface distribution package. The example system can provide various components. The system provides a climate data analytic services application programming interface library that enables software applications running on a client device to invoke the capabilities of a climate data analytic service. The system provides a command-line interface that provides a means of interacting with a climate data analytic service by issuing commands directly to the system's server interface. The system provides sample programs that call on the capabilities of the application programming interface library and can be used as templates for the construction of new client applications. The system can also provide test utilities, build utilities, service integration utilities, and documentation.

  4. 2012 Market Report on Wind Technologies in Distributed Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Orrell, Alice C.

    2013-08-01

    An annual report on U.S. wind power in distributed applications, expanded to include small, mid-size, and utility-scale installations, with key statistics on economics, installations, capacity, generation, and more.

  5. Selection and development of small solar thermal power applications

    NASA Technical Reports Server (NTRS)

    Bluhm, S. A.; Kuehn, T. J.; Gurfield, R. M.

    1979-01-01

    The paper discusses the approach of the JPL Point Focusing Thermal and Electric Power Applications Project to selecting and developing applications for point-focusing distributed-receiver solar thermal electric power systems. Six application categories are defined. Results of application studies of U.S. utilities are presented. The economic value of solar thermal power systems was found to range from $900 to $2100/kWe in small community utilities of the Southwest.

  6. Reconfiguration in Robust Distributed Real-Time Systems Based on Global Checkpoints

    DTIC Science & Technology

    1991-12-01

    achieved by utilizing distributed systems in which a single application program executes on multiple processors, connected to a network. The distributed ... single application program executes on multiple processors, connected to a network. The distributed nature of such systems makes it possible to ... resident at every node. However, the responsibility for execution of a particular function is assigned to only one node in this framework. This function

  7. Distributed utility technology cost, performance, and environmental characteristics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wan, Y; Adelman, S

    1995-06-01

    Distributed Utility (DU) is an emerging concept in which modular generation and storage technologies sited near customer loads in distribution systems, and specifically targeted demand-side management programs, are used to supplement conventional central station generation plants to meet customer energy service needs. Research has shown that implementation of the DU concept could provide substantial benefits to utilities. This report summarizes the cost, performance, and environmental and siting characteristics of existing and emerging modular generation and storage technologies that are applicable under the DU concept. It is intended to be a practical reference guide for utility planners and engineers seeking information on DU technology options. This work was funded by the Office of Utility Technologies of the US Department of Energy.

  8. Data Transparency | Distributed Generation Interconnection Collaborative

    Science.gov Websites

    ... quality and availability are increasingly vital for reducing the costs of distributed generation ... completion in certain areas, increasing accountability for utility application processing. As distributed PV ... NREL, HECO, TSRG: Improving Data Transparency for the Distributed PV Interconnection Process: Emergent ...

  9. Optimal planning and design of a renewable energy based supply system for microgrids

    DOE PAGES

    Hafez, Omar; Bhattacharya, Kankar

    2012-03-03

    This paper presents a technique for optimal planning and design of hybrid renewable energy systems for microgrid applications. The Distributed Energy Resources Customer Adoption Model (DER-CAM) is used to determine the optimal size and type of distributed energy resources (DERs) and their operating schedules for a sample utility distribution system. Using the DER-CAM results, the electrical performance of the distribution circuit is evaluated with the DERs selected by the DER-CAM optimization analyses incorporated. Results of analyses of the economic benefits of utilizing the optimal locations identified for the selected DERs within the system are also presented. The actual Brookhaven National Laboratory (BNL) campus electrical network is used as an example to show the effectiveness of this approach. The results show that these technical and economic analyses of hybrid renewable energy systems are essential for the efficient utilization of renewable energy resources for microgrid applications.

  10. Supplemental Information for New York State Standardized Interconnection Requirements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ingram, Michael; Narang, David J.; Mather, Barry A.

    This document is intended to aid in the understanding and application of the New York State Standardized Interconnection Requirements (SIR) and Application Process for New Distributed Generators 5 MW or Less Connected in Parallel with Utility Distribution Systems, and it aims to provide supplemental information and discussion on selected topics relevant to the SIR. This guide focuses on technical issues that have to date resulted in the majority of utility findings within the context of interconnecting photovoltaic (PV) inverters. This guide provides background on the overall issue and related mitigation measures for selected topics, including substation backfeeding, anti-islanding, and considerations for monitoring and controlling distributed energy resources (DER).

  11. Implementation of a Publish-Subscribe Protocol in Microgrid Islanding and Resynchronization with Self-Discovery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Starke, M.; Herron, A.; King, D.

    Communications systems and protocols are becoming second nature to utilities operating distribution systems. Traditionally, centralized communication approaches have been used, while recently in microgrid applications distributed communication and control schemas have emerged, offering advantages such as improved system reliability, plug-and-play operation, and distributed intelligence. Still, the operation and control of microgrids using distributed communication schemas have received little discussion in the literature. To address the challenge of multiple-inverter microgrid synchronization, a publish-subscribe, Data Distribution Service (DDS) based communication schema for microgrids is proposed in this paper. The communication schema is discussed in detail for individual devices such as generators, photovoltaic systems, energy storage systems, and the microgrid point of common coupling switch, along with supporting applications. In conclusion, islanding and resynchronization of a microgrid are demonstrated on a test bed utilizing this schema.

  12. Implementation of a Publish-Subscribe Protocol in Microgrid Islanding and Resynchronization with Self-Discovery

    DOE PAGES

    Starke, M.; Herron, A.; King, D.; ...

    2017-08-24

    Communications systems and protocols are becoming second nature to utilities operating distribution systems. Traditionally, centralized communication approaches have been used, while recently in microgrid applications distributed communication and control schemas have emerged, offering advantages such as improved system reliability, plug-and-play operation, and distributed intelligence. Still, the operation and control of microgrids using distributed communication schemas have received little discussion in the literature. To address the challenge of multiple-inverter microgrid synchronization, a publish-subscribe, Data Distribution Service (DDS) based communication schema for microgrids is proposed in this paper. The communication schema is discussed in detail for individual devices such as generators, photovoltaic systems, energy storage systems, and the microgrid point of common coupling switch, along with supporting applications. In conclusion, islanding and resynchronization of a microgrid are demonstrated on a test bed utilizing this schema.
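    The record above describes a DDS-based publish-subscribe schema but does not reproduce the authors' implementation. As a generic illustration of the pattern, here is a minimal topic-based bus in Python; the topic names and device records are hypothetical, and real DDS middleware adds discovery, QoS, and typed topics on top of this idea:

```python
from collections import defaultdict
from typing import Any, Callable


class Bus:
    """Minimal topic-based publish-subscribe bus (illustration only)."""

    def __init__(self) -> None:
        self._subs: dict[str, list[Callable[[Any], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[Any], None]) -> None:
        self._subs[topic].append(handler)

    def publish(self, topic: str, message: Any) -> None:
        # Deliver to every subscriber of the topic; publishers and
        # subscribers never reference each other directly.
        for handler in self._subs[topic]:
            handler(message)


# Hypothetical microgrid devices announce themselves on a shared topic,
# sketching the self-discovery idea from the abstract.
bus = Bus()
roster = []
bus.subscribe("microgrid/announce", lambda dev: roster.append(dev))
bus.publish("microgrid/announce", {"id": "pv-1", "type": "photovoltaic"})
bus.publish("microgrid/announce", {"id": "ess-1", "type": "energy_storage"})
print(roster)
```

    Because subscribers register against topics rather than specific devices, a new generator or storage unit can join the schema without reconfiguring existing participants.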

  13. ScyFlow: An Environment for the Visual Specification and Execution of Scientific Workflows

    NASA Technical Reports Server (NTRS)

    McCann, Karen M.; Yarrow, Maurice; DeVivo, Adrian; Mehrotra, Piyush

    2004-01-01

    With the advent of grid technologies, scientists and engineers are building more and more complex applications to utilize distributed grid resources. The core grid services provide a path for accessing and utilizing these resources in a secure and seamless fashion. However, what scientists need is an environment that will allow them to specify their application runs at a high organizational level, and then support efficient execution across any given set or sets of resources. We have been designing and implementing ScyFlow, a dual-interface architecture (both GUI and API) that addresses this problem. The scientist/user specifies the application tasks along with the necessary control and data flow, and monitors and manages the execution of the resulting workflow across the distributed resources. In this paper, we utilize two scenarios to provide the details of the two modules of the project, the visual editor and the runtime workflow engine.

  14. Load Segmentation for Convergence of Distribution Automation and Advanced Metering Infrastructure Systems

    NASA Astrophysics Data System (ADS)

    Pamulaparthy, Balakrishna; KS, Swarup; Kommu, Rajagopal

    2014-12-01

    Distribution automation (DA) applications are today limited to the feeder level, with no visibility beyond the substation feeder and no reach down to the low-voltage distribution network level. This has become a major obstacle to realizing many automated functions and enhancing existing DA capabilities. Advanced metering infrastructure (AMI) systems are being widely deployed by utilities across the world, creating system-wide communications access to every monitoring and service point; they collect data from smart meters and sensors at short intervals in response to utility needs. Convergence of DA and AMI systems provides unique opportunities and capabilities for distribution grid modernization, with the DA system acting as a controller and the AMI system providing feedback to it, for which DA applications have to understand and use the AMI data selectively and effectively. In this paper, we propose a load segmentation method that helps the DA system accurately understand and use the AMI data for various automation applications, with a suitable case study on power restoration.
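    The paper's specific segmentation method is not reproduced in the record. As a rough sketch of the general idea, the snippet below groups smart meters into load segments by observed peak demand; the thresholds, meter IDs, and interval readings are illustrative assumptions, not values from the paper:

```python
def segment_meters(interval_kw, light=5.0, medium=20.0):
    """Group meters into load segments by observed peak demand (kW).

    interval_kw maps meter ID -> list of interval demand readings.
    Thresholds are assumed for illustration only.
    """
    segments = {"light": [], "medium": [], "heavy": []}
    for meter_id, readings in interval_kw.items():
        peak = max(readings)
        if peak < light:
            segments["light"].append(meter_id)
        elif peak < medium:
            segments["medium"].append(meter_id)
        else:
            segments["heavy"].append(meter_id)
    return segments


# Hypothetical AMI interval data for three meters.
ami = {
    "m1": [1.2, 2.0, 3.4],     # residential-scale profile
    "m2": [8.1, 12.5, 9.9],    # small commercial
    "m3": [35.0, 41.2, 38.7],  # large load
}
print(segment_meters(ami))
```

    A DA application could then, for example, prioritize the "heavy" segment when estimating restoration impact after an outage.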

  15. Network Model of Decreased Context Utilization in Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Beversdorf, David Q.; Narayanan, Ananth; Hillier, Ashleigh; Hughes, John D.

    2007-01-01

    Individuals with autism spectrum disorders (ASD) demonstrate impaired utilization of context, which allows for superior performance on the "false memory" task. We report the application of a simplified parallel distributed processing model of context utilization to the false memory task. For individuals without ASD, experiments support a model…

  16. Innovative applications of energy storage in a restructured electricity marketplace : Phase III final report : a study for the DOE Energy Storage Systems Program.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eyer, James M.; Erdman, Bill; Iannucci, Joseph J., Jr.

    2005-03-01

    This report describes Phase III of a project entitled Innovative Applications of Energy Storage in a Restructured Electricity Marketplace. For this study, the authors assumed that it is feasible to operate an energy storage plant simultaneously for two primary applications: (1) energy arbitrage, i.e., buy low, sell high, and (2) reducing peak loads in utility "hot spots" so that the utility can defer the need to upgrade transmission and distribution (T&D) equipment. The benefits from the combined arbitrage and T&D deferral applications were estimated for five cases based on the specific requirements of two large utilities operating in the Eastern U.S. A number of parameters were estimated for the storage plant ratings required to serve the combined application: power output (capacity) and energy discharge duration (energy storage). In addition to estimating the various financial expenditures and the value of electricity that could be realized in the marketplace, the technical characteristics required for grid-connected distributed energy storage used for capacity deferral were also explored.
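    The two value streams the study combines can be sketched with toy formulas: a buy-low-sell-high margin adjusted for round-trip efficiency, and the present-value saving from postponing a T&D upgrade. All prices, costs, and rates below are hypothetical, not figures from the report:

```python
def daily_arbitrage_margin(buy_price, sell_price, delivered_mwh, efficiency=0.80):
    """Margin for one buy-low-sell-high cycle ($).

    To deliver delivered_mwh, the plant must purchase
    delivered_mwh / efficiency to cover round-trip losses.
    """
    revenue = sell_price * delivered_mwh
    cost = buy_price * delivered_mwh / efficiency
    return revenue - cost


def deferral_value(upgrade_cost, discount_rate, years_deferred):
    """Present-value saving from postponing a T&D upgrade ($)."""
    return upgrade_cost * (1.0 - 1.0 / (1.0 + discount_rate) ** years_deferred)


# Hypothetical numbers: off-peak $20/MWh, on-peak $60/MWh, 10 MWh/day delivered.
margin = daily_arbitrage_margin(20.0, 60.0, 10.0)
# Deferring a $1M upgrade for 2 years at a 10% discount rate.
deferred = deferral_value(1_000_000.0, 0.10, 2)
print(margin, round(deferred, 2))
```

    The study's point is that sizing the plant for both streams at once changes the required capacity and discharge duration relative to serving either application alone.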

  17. Distributed semantic networks and CLIPS

    NASA Technical Reports Server (NTRS)

    Snyder, James; Rodriguez, Tony

    1991-01-01

    Semantic networks of frames are commonly used as a method of reasoning in many problems. In most of these applications the semantic network exists as a single entity in a single process environment. Advances in workstation hardware provide support for more sophisticated applications involving multiple processes, interacting in a distributed environment. In these applications the semantic network may well be distributed over several concurrently executing tasks. This paper describes the design and implementation of a frame based, distributed semantic network in which frames are accessed both through C Language Integrated Production System (CLIPS) expert systems and procedural C++ language programs. The application area is a knowledge based, cooperative decision making model utilizing both rule based and procedural experts.

  18. Energy Systems Integration News - September 2016 | Energy Systems

    Science.gov Websites

    ... Smarter Grid Solutions demonstrated a new distributed energy resources (DER) software control platform ... utility interconnections require distributed generation (DG) devices to disconnect from the grid during ... OpenFMB distributed applications on the microgrid test site to locally optimize renewable energy resources ...

  19. California methanol assessment. Volume 2: Technical report

    NASA Technical Reports Server (NTRS)

    Otoole, R.; Dutzi, E.; Gershman, R.; Heft, R.; Kalema, W.; Maynard, D.

    1983-01-01

    Energy feedstock sources for methanol; methanol and other synfuels; transport, storage, and distribution; air quality impact of methanol use in vehicles, chemical methanol production and use; methanol utilization in vehicles; methanol utilization in stationary applications; and environmental and regulatory constraints are discussed.

  20. DISTRIBUTION OF CHIRAL PCBS IN SELECTED TISSUES IN THE LABORATORY RAT

    EPA Science Inventory

    Polychlorinated biphenyls (PCBs) were manufactured for a large number of technical applications including for use in transformers and capacitors. The widespread commercial utilization of PCBs and their persistence in the environment have resulted in their worldwide distribution. ...

  1. 2012 Market Report on U.S. Wind Technologies in Distributed Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Orrell, Alice C.; Flowers, L. T.; Gagne, M. N.

    2013-08-06

    At the end of 2012, U.S. wind turbines in distributed applications reached a 10-year cumulative installed capacity of more than 812 MW from more than 69,000 units across all 50 states. In 2012 alone, nearly 3,800 wind turbines totaling 175 MW of distributed wind capacity were documented in 40 states and in the U.S. Virgin Islands, with 138 MW using utility-scale turbines (i.e., greater than 1 MW in size), 19 MW using mid-size turbines (i.e., 101 kW to 1 MW in size), and 18.4 MW using small turbines (i.e., up to 100 kW in size). Distributed wind is defined in terms of technology application based on a wind project's location relative to end-use and power-distribution infrastructure, rather than on technology size or project size. Distributed wind systems are either connected on the customer side of the meter (to meet the onsite load) or directly to distribution or microgrids (to support grid operations or offset large loads nearby). The estimated capacity-weighted average cost for 2012 U.S. distributed wind installations was $2,540/kW for utility-scale wind turbines, $2,810/kW for mid-sized wind turbines, and $6,960/kW for newly manufactured (domestic and imported) small wind turbines. An emerging trend observed in 2012 was an increased use of refurbished turbines. The estimated capacity-weighted average cost of refurbished small wind turbines installed in 2012 was $4,080/kW. As a result of multiple projects using utility-scale turbines, Iowa deployed the most new overall distributed wind capacity, 37 MW, in 2012. Nevada deployed the most small wind capacity in 2012, with nearly 8 MW of small wind turbines installed in distributed applications. In the case of mid-size turbines, Ohio led all states in 2012 with 4.9 MW installed in distributed applications. State and federal policies and incentives continued to play a substantial role in the development of distributed wind projects. In 2012, U.S. Treasury Section 1603 payments and grants and loans from the U.S. Department of Agriculture's Rural Energy for America Program were the main sources of federal funding for distributed wind projects. State and local funding varied across the country, from rebates to loans, tax credits, and other incentives. Reducing utility bills and hedging against potentially rising electricity rates remain drivers of distributed wind installations. In 2012, other drivers included taking advantage of the expiring U.S. Treasury Section 1603 program and a prosperous year for farmers. While 2012 saw a large addition of distributed wind capacity, considerable barriers and challenges remain, such as a weak domestic economy, inconsistent state incentives, and very competitive solar photovoltaic and natural gas prices. The industry remains committed to improving the distributed wind marketplace by advancing the third-party certification process and introducing alternative financing models, such as third-party power purchase agreements and lease-to-own agreements more typical in the solar photovoltaic market. Continued growth is expected in 2013.

  2. Photovoltaic technology for sustainability: An investigation of the distributed utility concept as a policy framework

    NASA Astrophysics Data System (ADS)

    Letendre, Steven Emery

    The U.S. electric utility sector in its current configuration is unsustainable. The majority of electricity in the United States is produced using finite fossil fuels. In addition, significant potential exists to improve the nation's efficient use of energy. A sustainable electric utility sector will be characterized by increased use of renewable energy sources and high levels of end-use efficiency. This dissertation analyzes two alternative policy approaches designed to move the U.S. electric utility sector toward sustainability. One approach, labeled incremental, involves maintaining the centralized structure of the electric utility sector while facilitating the introduction of renewable energy and efficiency into the electrical system through the pricing mechanism. A second policy approach is described in which structural changes are encouraged based on the emerging distributed utility (DU) concept. A structural policy orientation attempts to capture the unique localized benefits that distributed renewable resources and energy efficiency offer to electric utility companies and their customers. A market penetration analysis of PV in centralized energy supply and distributed peak-shaving applications is conducted for a case-study electric utility company. Sensitivity analysis was performed based on the incremental and structural policy orientations. The analysis provides compelling evidence that policies designed to bring about structural change in the electric utility sector are needed to move the industry toward sustainability. Specifically, the analysis demonstrates that PV technology, a key renewable energy option likely to play an important role in a renewable energy future, will begin to penetrate the electrical system in distributed peak-shaving applications long before the technology is introduced as a centralized energy supply option. 
Most policies to date, which I term incremental, attempt to encourage energy efficiency and renewables through the pricing system. Based on past policy experience, it is unlikely that such an approach would allow PV to compete in Delaware as an energy supply option in the next ten to twenty years. Alternatively, a market-based, or green pricing, approach will not create significant market opportunities for PV as a centralized energy supply option. However, structural policies designed to encourage the explicit recognition of the localized benefits of distributed resources could result in PV being introduced into the electrical system early in the next century.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agalgaonkar, Yashodhan P.; Hammerstrom, Donald J.

    The Pacific Northwest Smart Grid Demonstration (PNWSGD) was a smart grid technology performance evaluation project that included multiple U.S. states and cooperation from multiple electric utilities in the northwest region. One of the local objectives for the project was to achieve improved distribution system reliability. Toward this end, some PNWSGD utilities automated their distribution systems, including the application of fault detection, isolation, and restoration, and advanced metering infrastructure. In light of this investment, a major challenge was to establish a correlation between implementation of these smart grid technologies and actual improvements in distribution system reliability. This paper proposes using Welch's t-test to objectively determine and quantify whether distribution system reliability is improving over time. The proposed methodology is generic, and it can be implemented by any utility after calculation of the standard reliability indices. The effectiveness of the proposed hypothesis testing approach is demonstrated through comprehensive practical results. It is believed that wider adoption of the proposed approach can help utilities to evaluate the realistic long-term performance of smart grid technologies.
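    Welch's t-test itself is a standard statistic for two samples with unequal variances. A minimal pure-Python version, applied here to hypothetical monthly reliability-index values (not project data), looks like this:

```python
from statistics import mean, variance


def welch_t(sample_a, sample_b):
    """Welch's t statistic and Welch-Satterthwaite degrees of freedom
    for two independent samples with possibly unequal variances."""
    na, nb = len(sample_a), len(sample_b)
    ma, mb = mean(sample_a), mean(sample_b)
    va, vb = variance(sample_a), variance(sample_b)  # sample variances
    se2 = va / na + vb / nb                          # squared standard error
    t = (ma - mb) / se2 ** 0.5
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df


# Hypothetical monthly SAIFI values before and after automation.
before = [1.31, 1.18, 1.40, 1.25, 1.36, 1.29]
after = [1.05, 0.98, 1.12, 1.01, 1.08, 0.95]
t, df = welch_t(before, after)
print(round(t, 2), round(df, 1))
```

    A large positive t here would indicate that the post-automation index is significantly lower (better) than before, which is the kind of comparison the paper applies to standard reliability indices over time.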

  4. A METHODOLOGY FOR ESTIMATING UNCERTAINTY OF A DISTRIBUTED HYDROLOGIC MODEL: APPLICATION TO POCONO CREEK WATERSHED

    EPA Science Inventory

    Utility of distributed hydrologic and water quality models for watershed management and sustainability studies should be accompanied by rigorous model uncertainty analysis. However, the use of complex watershed models primarily follows the traditional {calibrate/validate/predict}...

  5. PVUSA: The value of photovoltaics in the distribution system. The Kerman Grid-Support Project

    NASA Astrophysics Data System (ADS)

    Wenger, Howard J.; Hoff, Thomas E.

    1995-05-01

    As part of the Photovoltaics for Utility Scale Applications (PVUSA) Project, Pacific Gas and Electric Company (PG&E) built the Kerman 500-kW photovoltaic power plant. Located near the end of a distribution feeder in a rural section of Fresno County, the plant was built not so much to demonstrate PV technology as to evaluate its interaction with the local distribution grid and quantify available nontraditional grid-support benefits (those other than energy and capacity). As demand for new generation began to languish in the 1980s, and siting and permitting of power plants and transmission lines became more involved, utilities began considering smaller, distributed power sources. Potential benefits include shorter construction lead time, less capital outlay, and better utilization of existing assets. The results of a 1990/1991 PG&E study of the benefits of a PV system to the distribution grid prompted the PVUSA Project to construct a plant at Kerman. Completed in 1993, the plant is believed to be the first one specifically built to evaluate the multiple benefits to the grid of a strategically sited plant. Each of nine discrete benefits was evaluated in detail by first establishing the technical impact, then translating the results into present economic value. Benefits span the entire system from distribution feeder to the generation fleet. This work breaks new ground in the evaluation of distributed resources, and suggests that resource planning practices be expanded to account for these nontraditional benefits.

  6. Single-shot coherent diffraction imaging of microbunched relativistic electron beams for free-electron laser applications.

    PubMed

    Marinelli, A; Dunning, M; Weathersby, S; Hemsing, E; Xiang, D; Andonian, G; O'Shea, F; Miao, Jianwei; Hast, C; Rosenzweig, J B

    2013-03-01

    With the advent of coherent x rays provided by the x-ray free-electron laser (FEL), strong interest has been kindled in sophisticated diffraction imaging techniques. In this Letter, we exploit such techniques for the diagnosis of the density distribution of the intense electron beams typically utilized in an x-ray FEL itself. We have implemented this method by analyzing the far-field coherent transition radiation emitted by an inverse-FEL microbunched electron beam. This analysis utilizes an oversampling phase retrieval method on the transition radiation angular spectrum to reconstruct the transverse spatial distribution of the electron beam. This application of diffraction imaging represents a significant advance in electron beam physics, having critical applications to the diagnosis of high-brightness beams, as well as the collective microbunching instabilities afflicting these systems.

  7. Airblast Simulator Studies.

    DTIC Science & Technology

    1984-02-01

    RAREFACTION WAVE ELIMINATOR CONSIDERATIONS ... 5.1 FLIP CALCULATIONS ... 5.2 A PASSIVE/ACTIVE RWE ... 6 DISTRIBUTED FUEL AIR EXPLOSIVES ... REFERENCES ... conventional and distributed-charge fuel-air explosive charges used in a study of the utility of distributed-charge FAE systems for blast simulation. The ... limited investigation of distributed-charge fuel-air explosive configurations for blast simulator applications. During the course of this study

  8. A practical large scale/high speed data distribution system using 8 mm libraries

    NASA Technical Reports Server (NTRS)

    Howard, Kevin

    1993-01-01

    Eight mm tape libraries are known primarily for their small size, large storage capacity, and low cost. However, many applications require an additional attribute which, heretofore, has been lacking: high transfer rate. Transfer rate is particularly important in a large scale data distribution environment, an environment in which 8 mm tape should play a very important role. Data distribution is a natural application for 8 mm for several reasons: most large laboratories have access to 8 mm tape drives, 8 mm tapes are upwardly compatible, 8 mm media are very inexpensive, 8 mm media are lightweight (important for shipping purposes), and 8 mm media densely pack data (5 gigabytes now and 15 gigabytes on the horizon). If the transfer rate issue were resolved, 8 mm could offer a good solution to the data distribution problem. To that end Exabyte has analyzed four ways to increase its transfer rate: native drive transfer rate increases, data compression at the drive level, tape striping, and homogeneous drive utilization. Exabyte is actively pursuing native drive transfer rate increases and drive level data compression. However, for non-transmitted bulk data applications (which include data distribution) the other two methods (tape striping and homogeneous drive utilization) hold promise.

  9. Standby Rates for Combined Heat and Power Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sedano, Richard; Selecky, James; Iverson, Kathryn

    2014-02-01

Improvements in technology, low natural gas prices, and more flexible and positive attitudes in government and utilities are making distributed generation more viable. With more distributed generation, notably combined heat and power, comes an increase in the importance of standby rates: the cost of services utilities provide when customer generation is not operating or is insufficient to meet full load. This work examines existing utility standby tariffs in five states, using these rates and terms to showcase practices that demonstrate a sound application of regulatory principles as well as ones that do not. The paper also addresses areas for improvement in standby rates.

  10. Multi-agent systems and their applications

    DOE PAGES

    Xie, Jing; Liu, Chen-Ching

    2017-07-14

The number of distributed energy components and devices continues to increase globally. As a result, distributed control schemes are desirable for managing and utilizing these devices, together with the large amount of data they produce. In recent years, agent-based technology has become a powerful tool for engineering applications. As a computational paradigm, multi-agent systems (MASs) provide a good solution for distributed control. In this paper, MASs and their applications are discussed. A state-of-the-art literature survey is conducted on system architectures, consensus algorithms, and multi-agent platforms, frameworks, and simulators. In addition, a distributed under-frequency load shedding (UFLS) scheme is proposed using the MAS. Simulation results for a case study are presented, and the future of MASs is discussed in the conclusion.
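The consensus algorithms such surveys cover can be illustrated with a minimal average-consensus update: each agent repeatedly averages its value with its neighbors', and all values converge to the global mean. This is a generic sketch, not the paper's specific UFLS scheme; the ring topology, step size, and starting values are invented for illustration.

```python
# Minimal average-consensus iteration used in many multi-agent control
# schemes (generic sketch; topology, step size, and values are invented).

def consensus_step(values, neighbors, eps=0.3):
    """One synchronous iteration: x_i += eps * sum_j (x_j - x_i)."""
    return [
        x + eps * sum(values[j] - x for j in neighbors[i])
        for i, x in enumerate(values)
    ]

# Four agents on a ring, each holding a local estimate (e.g. a measured
# frequency deviation); repeated local averaging drives them to agreement.
neighbors = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
values = [0.8, 0.2, 0.5, 0.1]
for _ in range(50):
    values = consensus_step(values, neighbors)
print(values)  # all entries approach the global mean, 0.4
```

Because the update weights are symmetric and sum to one, the network average is preserved at every step, which is what makes the fixed point the true mean.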

  11. Multi-agent systems and their applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xie, Jing; Liu, Chen-Ching

The number of distributed energy components and devices continues to increase globally. As a result, distributed control schemes are desirable for managing and utilizing these devices, together with the large amount of data they produce. In recent years, agent-based technology has become a powerful tool for engineering applications. As a computational paradigm, multi-agent systems (MASs) provide a good solution for distributed control. In this paper, MASs and their applications are discussed. A state-of-the-art literature survey is conducted on system architectures, consensus algorithms, and multi-agent platforms, frameworks, and simulators. In addition, a distributed under-frequency load shedding (UFLS) scheme is proposed using the MAS. Simulation results for a case study are presented, and the future of MASs is discussed in the conclusion.

  12. A Transparent Translation from Legacy System Model into Common Information Model: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ding, Fei; Simpson, Jeffrey; Zhang, Yingchen

Advances in smart grid technology are pushing utilities toward better monitoring, control, and analysis of distribution systems, and require extensive cyber-based intelligent systems and applications to realize various functionalities. The ability of systems, or components within systems, to interact and exchange services or information with each other is key to the success of smart grid technologies, and it requires an efficient information exchange and data sharing infrastructure. The Common Information Model (CIM) is a standard that allows different applications to exchange information about an electrical system, and it has become a widely accepted solution for information exchange among different platforms and applications. However, most existing legacy systems were not developed using CIM but use their own languages. Integrating such legacy systems is a challenge for utilities, and the appropriate utilization of the integrated legacy systems is even more intricate. Thus, this paper develops an approach and open-source tool to translate legacy system models into CIM format. The developed tool is tested on a commercial distribution management system, and simulation results prove its effectiveness.

  13. MONOCHLORAMINE MICROELECTRODE FOR IN SITU APPLICATION WITHIN THE BIOFILM OF CHLORAMINATED DRINKING WATER DISTRIBUTION SYSTEMS- Abstract

    EPA Science Inventory

    Many utilities in the United States are using monochloramine as a secondary disinfectant as a result of the implementation of the Stage 1 and Stage 2 Disinfectants and Disinfection Byproduct Rules. A recent survey suggests that an additional 12% of drinking water utilities are c...

  14. A Novel 24 Ghz One-Shot Rapid and Portable Microwave Imaging System (Camera)

    NASA Technical Reports Server (NTRS)

    Ghasr, M.T.; Abou-Khousa, M.A.; Kharkovsky, S.; Zoughi, R.; Pommerenke, D.

    2008-01-01

A novel 2D microwave imaging system at 24 GHz based on modulated scatterer technique (MST) methods is presented. Sensitivity and SNR are enhanced by utilizing PIN diode-loaded resonant slots, with a specific slot and array design to increase transmission and reduce cross-coupling. The system provides real-time imaging at a rate in excess of 30 images per second, with both reflection and transmission mode capabilities. It has utility and application for electric field distribution mapping related to nondestructive testing (NDT), imaging applications (SAR, holography), and antenna pattern measurements.

  15. Application of Autonomous Smart Inverter Volt-VAR Function for Voltage Reduction Energy Savings and Power Quality in Electric Distribution Systems: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ding, Fei; Nagarajan, Adarsh; Baggu, Murali

This paper evaluates the impact of the smart inverter Volt-VAR function on voltage reduction energy savings and power quality in electric power distribution systems. A methodology for voltage reduction optimization was developed by controlling the substation load tap changer (LTC) and capacitor banks and having smart inverters participate through their autonomous Volt-VAR control. In addition, a power quality scoring methodology was proposed and utilized to quantify the effect on distribution system power quality. These methodologies were applied to a utility distribution system model to evaluate voltage reduction energy savings and power quality under various PV penetrations and smart inverter densities.
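An autonomous Volt-VAR function of the kind evaluated here is essentially a piecewise-linear droop curve: the inverter injects reactive power when voltage is low, absorbs it when voltage is high, and does nothing in a deadband around nominal. The breakpoints and VAR limit below are illustrative assumptions, not the paper's actual settings.

```python
# Minimal sketch of a smart-inverter Volt-VAR droop curve. Breakpoints
# (v1..v4) and the reactive-power limit q_max are illustrative
# assumptions, not the paper's settings.

def volt_var(v_pu, v1=0.95, v2=0.98, v3=1.02, v4=1.05, q_max=0.44):
    """Reactive power command (p.u. of inverter rating) for a measured
    per-unit voltage v_pu. Positive = injection, negative = absorption."""
    if v_pu <= v1:
        return q_max                            # full injection (capacitive)
    if v_pu < v2:
        return q_max * (v2 - v_pu) / (v2 - v1)  # ramp down to deadband
    if v_pu <= v3:
        return 0.0                              # deadband around nominal
    if v_pu < v4:
        return -q_max * (v_pu - v3) / (v4 - v3) # ramp into absorption
    return -q_max                               # full absorption (inductive)

print(volt_var(1.00))  # 0.0: inside the deadband
print(volt_var(1.04))  # ~ -0.29: absorbing VARs to pull the voltage down
```

Because each inverter reacts only to its own terminal voltage, the function is autonomous: no communication with the LTC or capacitor controls is required, which is what lets it participate in the voltage-reduction scheme described above.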

  16. ADMS State of the Industry and Gap Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agalgaonkar, Yashodhan P.; Marinovici, Maria C.; Vadari, Subramanian V.

    2016-03-31

An advanced distribution management system (ADMS) is a platform for optimized distribution system operational management. This platform comprises distribution management system (DMS) applications, supervisory control and data acquisition (SCADA), an outage management system (OMS), and a distributed energy resource management system (DERMS). One of the primary objectives of this work is to study and analyze several ADMS component and auxiliary systems. All the important component and auxiliary systems (SCADA, GIS, DMS, AMR/AMI, OMS, and DERMS) are discussed in this report: their current-generation technologies are analyzed, and their integration with (or evolution into) an ADMS is discussed. An ADMS state-of-the-art and gap analysis is also presented. Two technical gaps are observed. The integration challenge between the component operational systems is the single largest challenge for ADMS design and deployment. Another significant challenge concerns essential ADMS applications such as fault location, isolation, and service restoration (FLISR) and volt-VAR optimization (VVO): because ADMS software platforms are not open source, there are relatively few ADMS application developers. A third critical gap, while not technical in nature compared to the two above, is still important to consider: the data models currently residing in utility GIS systems are often incomplete, inaccurate, or both. These data are essential for planning and operations because they are typically a primary source from which power system models are created, and the ability to execute accurate power flow solutions is an important prerequisite for achieving the full potential of an ADMS. These critical gaps are hindering wider utility adoption of ADMS technology. The development of an open architecture platform could eliminate many of these barriers and aid seamless integration of distribution utility legacy systems with an ADMS.

  17. Regulation of distribution network business

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roman, J.; Gomez, T.; Munoz, A.

    1999-04-01

The traditional distribution function actually comprises two separate activities: the distribution network and retailing. Retailing, also termed supply, consists of trading electricity at the wholesale level and selling it to end users. The distribution network business, or merely distribution, is a natural monopoly and must be regulated. Increasing attention is presently being paid to the regulation of distribution pricing, which comprises two major tasks: global remuneration of the distribution utility and tariff setting by allocation of the total costs among all users of the network services. In this paper, the basic concepts for establishing the global remuneration of a distribution utility are presented. A remuneration scheme is proposed that recognizes adequate investment and operation costs, promotes loss reduction, and incentivizes control of the quality-of-service level. Efficient investment and operation costs are calculated using different types of strategic planning and regression analysis models. Application examples that were used during the distribution regulation process in Spain are also presented.

  18. Technology survey of electrical power generation and distribution for MIUS application

    NASA Technical Reports Server (NTRS)

    Gill, W. L.; Redding, T. E.

    1975-01-01

    Candidate electrical generation power systems for the modular integrated utility systems (MIUS) program are described. Literature surveys were conducted to cover both conventional and exotic generators. Heat-recovery equipment associated with conventional power systems and supporting equipment are also discussed. Typical ranges of operating conditions and generating efficiencies are described. Power distribution is discussed briefly. Those systems that appear to be applicable to MIUS have been indicated, and the criteria for equipment selection are discussed.

  19. National Utility Rate Database: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ong, S.; McKeel, R.

    2012-08-01

When modeling solar energy technologies and other distributed energy systems, high-quality, comprehensive electricity rate data are essential. The National Renewable Energy Laboratory (NREL) developed a utility rate platform for entering, storing, updating, and accessing a large collection of utility rates from around the United States. This utility rate platform lives on the Open Energy Information (OpenEI) website, OpenEI.org, allowing the data to be accessed programmatically from a web browser using an application programming interface (API). The semantic-based utility rate platform currently holds records for 1,885 utility rates and covers over 85% of the electricity consumption in the United States.

  20. 34 CFR 12.7 - How is surplus Federal real property disposed of when there is more than one applicant?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Education DISPOSAL AND UTILIZATION OF SURPLUS FEDERAL REAL PROPERTY FOR EDUCATIONAL PURPOSES Distribution of...'s proposed educational use at the site of the surplus Federal real property; (2) Considering the quality of each applicant's proposed program and plan of use; and (3) Considering each applicant's ability...

  1. Distribution System Reliability Analysis for Smart Grid Applications

    NASA Astrophysics Data System (ADS)

    Aljohani, Tawfiq Masad

Reliability of power systems is a key aspect in modern power system planning, design, and operation. The ascendance of the smart grid concept has raised hopes of developing an intelligent network capable of self-healing, able to overcome the interruption problems that face utilities and cost them tens of millions of dollars in repairs and losses. To address these reliability concerns, power utilities and interested parties have spent extensive time and effort analyzing and studying the reliability of the generation and transmission sectors of the power grid. Only recently has attention shifted to improving the reliability of the distribution network, the connection point between power providers and consumers where most electricity problems occur. In this work, we examine the effect of smart grid applications in improving the reliability of power distribution networks. The test system used in this thesis is the IEEE 34-node test feeder, released in 2003 by the Distribution System Analysis Subcommittee of the IEEE Power Engineering Society. The objective is to analyze the feeder for optimal placement of automatic switching devices and quantify their proper installation based on the performance of the distribution system, measured by the changes in the system reliability indices, including SAIDI, SAIFI, and EUE. A further goal is to design and simulate the effect of installing distributed generators (DGs) on the utility's distribution system and measure the potential improvement in its reliability. The software used in this work is DISREL, an intelligent power distribution tool developed by General Reliability Co.
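Two of the indices the thesis tracks can be computed directly from outage records using the standard IEEE 1366 definitions (EUE additionally requires unserved-energy estimates, so it is omitted here). The event data below are made up for illustration.

```python
# SAIFI and SAIDI per the IEEE 1366 definitions; outage events are
# invented for illustration.

def saifi(interruptions, customers_served):
    """System Average Interruption Frequency Index:
    total customer interruptions / total customers served."""
    return sum(n for n, _ in interruptions) / customers_served

def saidi(interruptions, customers_served):
    """System Average Interruption Duration Index:
    total customer-minutes interrupted / total customers served."""
    return sum(n * minutes for n, minutes in interruptions) / customers_served

# (customers affected, outage duration in minutes) for each event
events = [(500, 90), (120, 45), (2000, 30)]
total_customers = 10_000
print(saifi(events, total_customers))  # 0.262 interruptions per customer
print(saidi(events, total_customers))  # 11.04 minutes per customer
```

Automatic switching devices improve these indices by shrinking either the customer count per event (sectionalizing) or the duration (faster restoration), which is exactly the trade-off the optimal-placement analysis quantifies.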

  2. 40 CFR 423.10 - Applicability.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... ELECTRIC POWER GENERATING POINT SOURCE CATEGORY § 423.10 Applicability. The provisions of this part are... engaged in the generation of electricity for distribution and sale which results primarily from a process utilizing fossil-type fuel (coal, oil, or gas) or nuclear fuel in conjunction with a thermal cycle employing...

  3. 40 CFR 423.10 - Applicability.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... ELECTRIC POWER GENERATING POINT SOURCE CATEGORY § 423.10 Applicability. The provisions of this part are... engaged in the generation of electricity for distribution and sale which results primarily from a process utilizing fossil-type fuel (coal, oil, or gas) or nuclear fuel in conjunction with a thermal cycle employing...

  4. 40 CFR 423.10 - Applicability.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... ELECTRIC POWER GENERATING POINT SOURCE CATEGORY § 423.10 Applicability. The provisions of this part are... engaged in the generation of electricity for distribution and sale which results primarily from a process utilizing fossil-type fuel (coal, oil, or gas) or nuclear fuel in conjunction with a thermal cycle employing...

  5. 40 CFR 423.10 - Applicability.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... ELECTRIC POWER GENERATING POINT SOURCE CATEGORY § 423.10 Applicability. The provisions of this part are... engaged in the generation of electricity for distribution and sale which results primarily from a process utilizing fossil-type fuel (coal, oil, or gas) or nuclear fuel in conjunction with a thermal cycle employing...

  6. 40 CFR 423.10 - Applicability.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... ELECTRIC POWER GENERATING POINT SOURCE CATEGORY § 423.10 Applicability. The provisions of this part are... engaged in the generation of electricity for distribution and sale which results primarily from a process utilizing fossil-type fuel (coal, oil, or gas) or nuclear fuel in conjunction with a thermal cycle employing...

  7. Final Technical Report for Contract No. DE-EE0006332, "Integrated Simulation Development and Decision Support Tool-Set for Utility Market and Distributed Solar Power Generation"

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cormier, Dallas; Edra, Sherwin; Espinoza, Michael

This project enables utilities to develop long-term strategic plans that integrate high levels of renewable energy generation and to better plan power system operations under high renewable penetration. The program developed forecast data streams for decision support and effective integration of centralized and distributed solar power generation in utility operations. The toolset focused on real-time simulation of distributed power generation within utility grids, with emphasis on potential applications in day-ahead (market) and real-time (reliability) utility operations. The project team developed and demonstrated methodologies for quantifying the impact of distributed solar generation on core utility operations, identified protocols for internal data communication requirements, and worked with utility personnel to adapt the new distributed generation (DG) forecasts seamlessly within existing load and generation procedures through a sophisticated distribution management system (DMS). This project supported the objectives of the SunShot Initiative and SUNRISE by enabling core utility operations to enhance their simulation capability to analyze and prepare for the impacts of high penetrations of solar on the power grid. The impact of high-penetration solar PV on utility operations is not limited to control centers but extends across many core operations. The benefits of an enhanced DMS using state-of-the-art solar forecast data were demonstrated within this project and yielded immediate direct operational cost savings for energy marketing for day-ahead generation commitments, real-time operations, load forecasting (at an aggregate system level for day ahead), demand response, long-term planning (asset management), distribution operations, and core ancillary services as required for balancing and reliability. This provided power system operators with the necessary tools and processes to operate the grid in a reliable manner under high renewable penetration.

  8. Solid State Remote Power Controllers for high voltage DC distribution systems

    NASA Technical Reports Server (NTRS)

    Billings, W. W.; Sundberg, G. R.

    1977-01-01

Presently, hybrid Remote Power Controllers (RPCs) are in production, and prototype units are available for systems utilizing 28 VDC, 120 VDC, 115 VAC/400 Hz, and 230 VAC/400 Hz. This paper describes RPC development in a new area of application: HVDC distribution systems utilizing 270/300 VDC. Two RPC current ratings, 1 A and 2 A, were selected for development, as they are adequate to control 90% of projected system loads. The various aspects and trade-offs encountered in circuit development are discussed, with special focus on the circuits that see the duress of the high DC potentials. Comprehensive evaluation tests are summarized, which confirmed RPC compliance with the specification and with system/load compatibility requirements. In addition, present technology status and new applications are summarized.

  9. Evaluation of two typical distributed energy systems

    NASA Astrophysics Data System (ADS)

    Han, Miaomiao; Tan, Xiu

    2018-03-01

This paper evaluates two natural gas distributed energy systems, one driven by a gas engine and one by a gas turbine. The first and second laws of thermodynamics are used to assess each distributed energy system from the two perspectives of "quantity" and "quality". The calculation results show that the internal combustion engine driven distributed energy station has a higher primary energy utilization rate but a low energy quality efficiency, while the gas turbine driven distributed energy station has a high energy quality efficiency but a relatively low primary energy utilization rate. When configuring such a system, the applicable natural gas distributed energy technology plan and unit configuration plan should be determined according to the actual load factors of the project and factors such as its location, background, and environmental requirements. Based on the "quality" measure, an energy efficiency index for waste heat utilization is proposed.

  10. SMAC: A soft MAC to reduce control overhead and latency in CDMA-based AMI networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garlapati, Shravan; Kuruganti, Teja; Buehrer, Michael R.

The utilization of state-of-the-art 3G cellular CDMA technologies in a utility-owned AMI network results in a large amount of control traffic relative to data traffic and increases the average packet delay; hence, these technologies are not an appropriate choice for smart grid distribution applications. Like the CDG, we consider a utility-owned cellular-like CDMA network for smart grid distribution applications and classify distribution smart grid data as scheduled data and random data. We also propose the SMAC protocol, which changes its mode of operation based on the type of data being collected to reduce data collection latency and control overhead compared to the 3G cellular CDMA2000 MAC. The reduction in data collection latency and control overhead increases the number of smart meters served by a base station within the periodic data collection interval, which in turn reduces the number of base stations needed by a utility or the bandwidth needed to collect data from all the smart meters. The reduction in the number of base stations and/or the data transmission bandwidth reduces the CAPital EXpenditure (CAPEX) and OPerational EXpenditure (OPEX) of the AMI network. Finally, the proposed SMAC protocol is analyzed using a Markov chain; analytical expressions for average throughput and average packet delay are derived, and simulation results are provided to verify the analysis.

  11. Application of laser velocimetry to aircraft wake-vortex measurements

    NASA Technical Reports Server (NTRS)

    Ciffone, D. L.; Orloff, K. L.

    1977-01-01

    The theory and use of a laser velocimeter that makes simultaneous measurements of vertical and longitudinal velocities while rapidly scanning a flow field laterally are described, and its direct application to trailing wake-vortex research is discussed. Pertinent measurements of aircraft wake-vortex velocity distributions obtained in a wind tunnel and water towing tank are presented. The utility of the velocimeter to quantitatively assess differences in wake velocity distributions due to wake dissipating devices and span loading changes on the wake-generating model is also demonstrated.

  12. Distributed File System Utilities to Manage Large Datasets, Version 0.5

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2014-05-21

FileUtils provides a suite of tools to manage large datasets typically created by large parallel MPI applications. They are written in C and use standard POSIX I/O calls. The current suite consists of tools to copy, compare, remove, and list. The tools provide dramatic speedup over existing Linux tools, which often run as a single process.
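FileUtils itself is a set of MPI-based C tools; as a rough single-node analogue (an assumption for illustration, not the actual implementation), the sketch below parallelizes a recursive copy across a thread pool, showing why many concurrent workers beat a single-process `cp -r` on large file sets.

```python
# Parallelized recursive copy: enumerate files, then copy them
# concurrently. A single-node illustration of the idea behind FileUtils,
# not its MPI implementation.
import os
import shutil
from concurrent.futures import ThreadPoolExecutor

def copy_one(src, dst):
    """Copy one file, creating destination directories as needed.
    shutil.copy2 preserves POSIX metadata (mtime, permissions)."""
    os.makedirs(os.path.dirname(dst), exist_ok=True)
    shutil.copy2(src, dst)
    return dst

def parallel_copy(src_root, dst_root, workers=8):
    """Walk src_root, then copy every file concurrently into dst_root."""
    jobs = []
    for dirpath, _, filenames in os.walk(src_root):
        for name in filenames:
            src = os.path.join(dirpath, name)
            dst = os.path.join(dst_root, os.path.relpath(src, src_root))
            jobs.append((src, dst))
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(lambda job: copy_one(*job), jobs))
```

Separating enumeration from transfer is the key structural move: once the file list exists, the copies are independent and embarrassingly parallel, which is the property the MPI tools exploit across many nodes.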

  13. 78 FR 52827 - Food Distribution Program on Indian Reservations: Income Deductions and Resource Eligibility

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-27

    ... standard deduction. Both commenters observed that an applicant's statement is acceptable as proof to receive the standard deduction under SNAP. SNAP allows for self-declaration of shelter/utility expenses at or below the applicable standard. However in SNAP, all expenses a household wishes to claim or which...

  14. Modelling healthcare systems with phase-type distributions.

    PubMed

    Fackrell, Mark

    2009-03-01

    Phase-type distributions constitute a very versatile class of distributions. They have been used in a wide range of stochastic modelling applications in areas as diverse as telecommunications, finance, biostatistics, queueing theory, drug kinetics, and survival analysis. Their use in modelling systems in the healthcare industry, however, has so far been limited. In this paper we introduce phase-type distributions, give a survey of where they have been used in the healthcare industry, and propose some ideas on how they could be further utilized.
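A phase-type distribution is the time to absorption of a continuous-time Markov chain, which makes it easy to simulate. The sketch below uses the simplest case, a series of exponential phases (an Erlang distribution, chosen purely as an illustrative special case, e.g. a patient passing through successive stages of care):

```python
# Simulating a phase-type distribution as the time to absorption of a
# continuous-time Markov chain. The series-of-exponential-phases case
# (an Erlang distribution) is an illustrative example, not from the paper.
import random

def sample_phase_type(rates, rng):
    """Total time to traverse a series of exponential phases in order."""
    return sum(rng.expovariate(r) for r in rates)

rng = random.Random(42)
samples = [sample_phase_type([2.0, 2.0], rng) for _ in range(100_000)]
mean = sum(samples) / len(samples)
print(mean)  # close to the theoretical mean 1/2 + 1/2 = 1.0
```

General phase-type distributions also allow branching and feedback between phases, which is what gives the class the versatility the abstract describes.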

  15. Modular Cascaded H-Bridge Multilevel PV Inverter with Distributed MPPT for Grid-Connected Applications

    DOE PAGES

    Xiao, Bailu; Hang, Lijun; Mei, Jun; ...

    2014-09-04

This paper presents a modular cascaded H-bridge multilevel photovoltaic (PV) inverter for single- or three-phase grid-connected applications. The modular cascaded multilevel topology helps to improve the efficiency and flexibility of PV systems. To realize better utilization of PV modules and maximize solar energy extraction, a distributed maximum power point tracking (MPPT) control scheme is applied to both single-phase and three-phase multilevel inverters, allowing independent control of each dc-link voltage. For three-phase grid-connected applications, PV mismatches may introduce unbalanced supplied power, leading to unbalanced grid current. To solve this issue, a control scheme with modulation compensation is also proposed. An experimental three-phase 7-level cascaded H-bridge inverter was built utilizing nine H-bridge modules (three modules per phase), with each module connected to a 185 W solar panel. Simulation and experimental results are presented to verify the feasibility of the proposed approach.
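The level count follows directly from the module count: each H-bridge contributes -Vdc, 0, or +Vdc, so N cascaded modules per phase sum to 2N + 1 distinct phase-voltage levels. A one-line check against the build described above:

```python
# Output levels of a cascaded H-bridge phase leg: each of the N modules
# switches -Vdc, 0, or +Vdc, and the phase voltage is their sum, giving
# 2N + 1 distinct levels.

def output_levels(modules_per_phase):
    """Distinct output voltage levels of a cascaded H-bridge phase leg."""
    return 2 * modules_per_phase + 1

print(output_levels(3))  # 3 modules per phase -> 7 levels
```

More levels mean a finer staircase approximation of the grid sinusoid, which is why adding modules improves harmonic content as well as letting each module run its own MPPT.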

  16. Introduction to Concurrent Engineering: Electronic Circuit Design and Production Applications

    DTIC Science & Technology

    1992-09-01

STD-1629. Failure mode distribution data for many different types of parts may be found in RAC publication FMD-91. FMEA utilizes inductive logic in a ... contrasts with a Fault Tree Analysis (FTA), which utilizes deductive logic in a "top down" approach. In FTA, a system failure is assumed and traced down ... Analysis (FTA) is a graphical method of risk analysis used to identify critical failure modes within a system or equipment. Utilizing a pictorial approach

  17. SMAC: A soft MAC to reduce control overhead and latency in CDMA-based AMI networks

    DOE PAGES

    Garlapati, Shravan; Kuruganti, Teja; Buehrer, Michael R.; ...

    2015-10-26

The utilization of state-of-the-art 3G cellular CDMA technologies in a utility-owned AMI network results in a large amount of control traffic relative to data traffic and increases the average packet delay; hence, these technologies are not an appropriate choice for smart grid distribution applications. Like the CDG, we consider a utility-owned cellular-like CDMA network for smart grid distribution applications and classify distribution smart grid data as scheduled data and random data. We also propose the SMAC protocol, which changes its mode of operation based on the type of data being collected to reduce data collection latency and control overhead compared to the 3G cellular CDMA2000 MAC. The reduction in data collection latency and control overhead increases the number of smart meters served by a base station within the periodic data collection interval, which in turn reduces the number of base stations needed by a utility or the bandwidth needed to collect data from all the smart meters. The reduction in the number of base stations and/or the data transmission bandwidth reduces the CAPital EXpenditure (CAPEX) and OPerational EXpenditure (OPEX) of the AMI network. Finally, the proposed SMAC protocol is analyzed using a Markov chain; analytical expressions for average throughput and average packet delay are derived, and simulation results are provided to verify the analysis.

  18. Solar thermal power systems point-focusing thermal and electric applications projects. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    Marriott, A.

    1980-01-01

The activities of the Point-Focusing Thermal and Electric Applications (PFTEA) project for fiscal year 1979 are summarized. The main thrust of the PFTEA project, the small community solar thermal power experiment, was completed. Concept definition studies included a small central receiver approach, a point-focusing distributed receiver system with central power generation, and a point-focusing distributed receiver concept with distributed power generation. The first experiment in the Isolated Application Series was initiated, and planning for the third engineering experiment series, which addresses the industrial market sector, was also begun. In addition to the experiment-related activities, several contracts to industry were let, and studies were conducted to explore the market potential for point-focusing distributed receiver (PFDR) systems. System analysis studies were completed that compared PFDR technology with other small power system technology candidates for the utility market sector.

  19. Fiber in the Local Loop: The Role of Electric Utilities

    NASA Astrophysics Data System (ADS)

    Meehan, Charles M.

    1990-01-01

Electric utilities are beginning to make heavy use of fiber for a number of applications beyond the transmission of voice and data among operating centers and plant facilities, which employed fiber on the electric transmission systems. These additional uses include load management and automatic meter reading. Thus, utilities are beginning to place fiber on the electric distribution systems, which, in many cases, cover the same customer base as the "local loop". This shift to fiber on the distribution system is due to the advantages offered by fiber and to congestion in the radio bands used for load management. It has been facilitated by a regulatory policy permitting utilities to lease reserve capacity on their fiber systems on an unregulated basis. This, in turn, has interested electric utilities in building fiber to their residential and commercial customers for voice, data, and video, which would also provide for sophisticated load management systems and, possibly, generation of revenue.

  20. 7 CFR 1951.9 - Distribution of payments when a borrower owes more than one type of FmHA or its successor agency...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ...-COOPERATIVE SERVICE, RURAL UTILITIES SERVICE, AND FARM SERVICE AGENCY, DEPARTMENT OF AGRICULTURE (CONTINUED... Public Law 103-354 loans. In making such distribution consider the principal balance outstanding on each...) Application of payments. After the decision is reached as to the amount of each payment that is to be...

  1. Real-Time Multiprocessor Programming Language (RTMPL) user's manual

    NASA Technical Reports Server (NTRS)

    Arpasi, D. J.

    1985-01-01

    A real-time multiprocessor programming language (RTMPL) has been developed to provide for high-order programming of real-time simulations on systems of distributed computers. RTMPL is a structured, engineering-oriented language. The RTMPL utility supports a variety of multiprocessor configurations and types by generating assembly language programs according to user-specified targeting information. Many programming functions are assumed by the utility (e.g., data transfer and scaling) to reduce the programming chore. This manual describes RTMPL from a user's viewpoint. Source generation, applications, utility operation, and utility output are detailed. An example simulation is generated to illustrate many RTMPL features.

  2. Probability Distributome: A Web Computational Infrastructure for Exploring the Properties, Interrelations, and Applications of Probability Distributions.

    PubMed

    Dinov, Ivo D; Siegrist, Kyle; Pearl, Dennis K; Kalinin, Alexandr; Christou, Nicolas

    2016-06-01

    Probability distributions are useful for modeling, simulation, analysis, and inference on a variety of natural processes and physical phenomena. There are uncountably many probability distributions. However, a few dozen families of distributions are commonly defined and are frequently used in practice for problem solving, experimental applications, and theoretical studies. In this paper, we present a new computational and graphical infrastructure, the Distributome, which facilitates the discovery, exploration and application of diverse spectra of probability distributions. The extensible Distributome infrastructure provides interfaces for (human and machine) traversal, search, and navigation of all common probability distributions. It also enables distribution modeling, applications, investigation of inter-distribution relations, as well as their analytical representations and computational utilization. The entire Distributome framework is designed and implemented as an open-source, community-built, and Internet-accessible infrastructure. It is portable, extensible and compatible with HTML5 and Web 2.0 standards (http://Distributome.org). We demonstrate two types of applications of the probability Distributome resources: computational research and science education. The Distributome tools may be employed to address five complementary computational modeling applications (simulation, data analysis and inference, model fitting, examination of the analytical, mathematical and computational properties of specific probability distributions, and exploration of the inter-distributional relations). Many high school and college science, technology, engineering and mathematics (STEM) courses may be enriched by the use of modern pedagogical approaches and technology-enhanced methods. The Distributome resources provide enhancements for blended STEM education by improving student motivation, augmenting the classical curriculum with interactive webapps, and overhauling the learning assessment protocols.
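One kind of inter-distribution relation such an infrastructure catalogs can be checked numerically. The short Python sketch below is illustrative only; the helper names are ours, not part of the Distributome API. It verifies that the gamma distribution with shape parameter 1 collapses to the exponential distribution:

```python
import math

# Hypothetical helper names for this sketch; not the Distributome API.
def expon_pdf(x, scale):
    """Exponential density with mean `scale`."""
    return math.exp(-x / scale) / scale

def gamma_pdf(x, shape, scale):
    """Gamma density with the given shape and scale parameters."""
    return (x ** (shape - 1) * math.exp(-x / scale)) / (math.gamma(shape) * scale ** shape)

# The gamma distribution with shape = 1 collapses to the exponential:
for x in [0.1, 0.5, 1.0, 2.0, 5.0]:
    assert abs(expon_pdf(x, 2.0) - gamma_pdf(x, 1, 2.0)) < 1e-12
```

Relations of this kind (special cases, limits, transformations) are exactly what the Distributome navigation interfaces expose for traversal.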

  3. Probability Distributome: A Web Computational Infrastructure for Exploring the Properties, Interrelations, and Applications of Probability Distributions

    PubMed Central

    Dinov, Ivo D.; Siegrist, Kyle; Pearl, Dennis K.; Kalinin, Alexandr; Christou, Nicolas

    2015-01-01

    Probability distributions are useful for modeling, simulation, analysis, and inference on a variety of natural processes and physical phenomena. There are uncountably many probability distributions. However, a few dozen families of distributions are commonly defined and are frequently used in practice for problem solving, experimental applications, and theoretical studies. In this paper, we present a new computational and graphical infrastructure, the Distributome, which facilitates the discovery, exploration and application of diverse spectra of probability distributions. The extensible Distributome infrastructure provides interfaces for (human and machine) traversal, search, and navigation of all common probability distributions. It also enables distribution modeling, applications, investigation of inter-distribution relations, as well as their analytical representations and computational utilization. The entire Distributome framework is designed and implemented as an open-source, community-built, and Internet-accessible infrastructure. It is portable, extensible and compatible with HTML5 and Web 2.0 standards (http://Distributome.org). We demonstrate two types of applications of the probability Distributome resources: computational research and science education. The Distributome tools may be employed to address five complementary computational modeling applications (simulation, data analysis and inference, model fitting, examination of the analytical, mathematical and computational properties of specific probability distributions, and exploration of the inter-distributional relations). Many high school and college science, technology, engineering and mathematics (STEM) courses may be enriched by the use of modern pedagogical approaches and technology-enhanced methods. The Distributome resources provide enhancements for blended STEM education by improving student motivation, augmenting the classical curriculum with interactive webapps, and overhauling the learning assessment protocols. PMID:27158191

  4. Satellite fixed communications service: A forecast of potential domestic demand through the year 2000. Volume 3: Appendices

    NASA Technical Reports Server (NTRS)

    Kratochvil, D.; Bowyer, J.; Bhushan, C.; Steinnagel, K.; Kaushal, D.; Al-Kinani, G.

    1983-01-01

    Voice applications, data applications, video applications, impacted baseline forecasts, market distribution model, net long haul forecasts, trunking earth station definition and costs, trunking space segment cost, trunking entrance/exit links, trunking network costs and crossover distances with terrestrial tariffs, net addressable forecasts, capacity requirements, improving spectrum utilization, satellite system market development, and the 30/20 net accessible market are considered.

  5. Satellite fixed communications service: A forecast of potential domestic demand through the year 2000. Volume 3: Appendices

    NASA Astrophysics Data System (ADS)

    Kratochvil, D.; Bowyer, J.; Bhushan, C.; Steinnagel, K.; Kaushal, D.; Al-Kinani, G.

    1983-09-01

    Voice applications, data applications, video applications, impacted baseline forecasts, market distribution model, net long haul forecasts, trunking earth station definition and costs, trunking space segment cost, trunking entrance/exit links, trunking network costs and crossover distances with terrestrial tariffs, net addressable forecasts, capacity requirements, improving spectrum utilization, satellite system market development, and the 30/20 net accessible market are considered.

  6. Volume effect of non-polar solvent towards the synthesis of hydrophilic polymer nanoparticles prepares via inverse miniemulsion polymerization

    NASA Astrophysics Data System (ADS)

    Kamaruddin, Nur Nasyita; Kassim, Syara; Harun, Noor Aniza

    2017-09-01

    Polymeric nanoparticles have drawn tremendous attention from researchers and have been utilized in diverse fields, especially in biomedical applications. Nevertheless, questions have been raised about the safety and hydrophilicity of nanoparticles to be utilized in medical and biological applications. One promising solution to this problem is to develop biodegradable polymeric nanoparticles with improved hydrophilicity. This study focuses on developing safer and "greener" polymeric nanoparticles via the inverse miniemulsion polymerization technique, a robust and convenient method to produce water-soluble polymer nanoparticles. Acrylamide (Am), acrylic acid (AA) and methacrylic acid (MAA) monomers were chosen, as they are biocompatible, non-toxic and environmentally benign. The effects of different volumes of cyclohexane on the formation of polymer nanoparticles, particle size, particle size distribution and morphology of the polymer nanoparticles are investigated. The formation and morphology of the polymer nanoparticles are determined using FTIR and SEM, respectively. The mean diameters of the polymer nanoparticles were in a range of 80-250 nm with broad particle size distributions, as determined by dynamic light scattering (DLS). Hydrophilic polyacrylamide (pAm), poly(acrylic acid) (pAA) and poly(methacrylic acid) (pMAA) nanoparticles were successfully achieved by inverse miniemulsion polymerization and have the potential to be further utilized in the fabrication of hybrid polymer composite nanoparticles, especially in biological and medical applications.

  7. Photovoltaics as a terrestrial energy source. Volume 1: An introduction

    NASA Technical Reports Server (NTRS)

    Smith, J. L.

    1980-01-01

    Photovoltaic (PV) systems are examined for their potential for terrestrial application and future development. Photovoltaic technology, existing and potential photovoltaic applications, and the National Photovoltaics Program are reviewed. The competitive environment for this electrical source, affected by the presence or absence of utility-supplied power, is evaluated in terms of system prices. The roles of technological breakthroughs, directed research and technology development, learning curves, and commercial demonstrations in the National Program are discussed. The potential for photovoltaics to displace oil consumption is examined, as are the potential benefits of employing PV in either central-station or non-utility-owned, small, distributed systems.

  8. A Web-based Distributed Voluntary Computing Platform for Large Scale Hydrological Computations

    NASA Astrophysics Data System (ADS)

    Demir, I.; Agliamzanov, R.

    2014-12-01

    Distributed volunteer computing can enable researchers and scientists to form large parallel computing environments that utilize the computing power of the millions of computers on the Internet and apply it toward running large-scale environmental simulations and models to serve the common good of local communities and the world. Recent developments in web technologies and standards allow client-side scripting languages to run at speeds close to those of native applications and to utilize the power of Graphics Processing Units (GPUs). Using a client-side scripting language, JavaScript, we have developed an open distributed computing framework that makes it easy for researchers to write their own hydrologic models and run them on volunteer computers. Users can easily enable their websites so that visitors can volunteer their computer resources to help run advanced hydrological models and simulations. Using a web-based system allows users to start volunteering their computational resources within seconds, without installing any software. The framework distributes the model simulation to thousands of nodes in small spatial and computational units. A relational database system is utilized for managing data connections and queue management for the distributed computing nodes. In this paper, we present a web-based distributed volunteer computing platform that enables large-scale hydrological simulations and model runs in an open and integrated environment.
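The work-distribution pattern the abstract describes can be sketched in a language-neutral way. The framework itself runs as JavaScript in the browser with a relational database as the queue; the Python below is an illustrative sketch only, and all names in it are hypothetical:

```python
import queue

def make_tasks(grid_size, tile):
    """Split a grid_size x grid_size model domain into tile x tile work units."""
    tasks = queue.Queue()
    for i in range(0, grid_size, tile):
        for j in range(0, grid_size, tile):
            tasks.put((i, j, tile))
    return tasks

def volunteer_worker(tasks, results):
    """A volunteer node repeatedly takes a work unit, computes, and reports back."""
    while not tasks.empty():
        i, j, t = tasks.get()
        results.append(((i, j), t * t))  # placeholder for a per-tile simulation

tasks = make_tasks(100, 25)  # a 100 x 100 domain in 25 x 25 tiles -> 16 units
results = []
volunteer_worker(tasks, results)
```

In the real system a database-backed queue replaces the in-memory `queue.Queue`, so that many browsers can pull work units concurrently and failed units can be re-queued.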

  9. Application of a Fiber Optic Distributed Strain Sensor System to Woven E-Glass Composite

    NASA Technical Reports Server (NTRS)

    Anastasi, Robert F.; Lopatin, Craig

    2001-01-01

    A distributed strain sensing system utilizing a series of identically written Bragg gratings along an optical fiber is examined for potential application to Composite Armored Vehicle health monitoring. A vacuum assisted resin transfer molding process was used to fabricate a woven fabric E-glass/composite panel with an embedded fiber optic strain sensor. Test samples machined from the panel were mechanically tested in 4-point bending. Experimental results are presented that show the mechanical strain from foil strain gages comparing well to optical strain from the embedded sensors. Also, it was found that the distributed strain along the sample length was consistent with the loading configuration.

  10. Area-Specific Marginal Costing for Electric Utilities: a Case Study of Transmission and Distribution Costs

    NASA Astrophysics Data System (ADS)

    Orans, Ren

    1990-10-01

    Existing procedures used to develop marginal costs for electric utilities were not designed for applications in an increasingly competitive market for electric power. The utility's value of receiving power, or the costs of selling power, however, depend on the exact location of the buyer or seller, the magnitude of the power and the period of time over which the power is used. Yet no electric utility in the United States has disaggregate marginal costs that reflect differences in costs due to the time, size or location of the load associated with their power or energy transactions. The existing marginal costing methods used by electric utilities were developed in response to the Public Utilities Regulatory Policy Act (PURPA) in 1978. The "ratemaking standards" (Title 1) established by PURPA were primarily concerned with the appropriate segmentation of total revenues to various classes-of-service, designing time-of-use rating periods, and the promotion of efficient long-term resource planning. By design, the methods were very simple and inexpensive to implement. Now, more than a decade later, the costing issues facing electric utilities are becoming increasingly complex, and the benefits of developing more specific marginal costs will outweigh the costs of developing this information in many cases. This research develops a framework for estimating total marginal costs that vary by the size, timing, and the location of changes in loads within an electric distribution system. To complement the existing work at the Electric Power Research Institute (EPRI) and Pacific Gas and Electric Company (PGandE) on estimating disaggregate generation and transmission capacity costs, this dissertation focuses on the estimation of distribution capacity costs. While the costing procedure is suitable for the estimation of total (generation, transmission and distribution) marginal costs, the empirical work focuses on the geographic disaggregation of marginal costs related to electric utility distribution investment. The study makes use of data from an actual distribution planning area, located within PGandE's service territory, to demonstrate the important characteristics of this new costing approach. The most significant result of this empirical work is that geographic differences in the cost of capacity in distribution systems can be as much as four times larger than the current system average utility estimates. Furthermore, lumpy capital investment patterns can lead to significant cost differences over time.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schauder, C.

    This subcontract report was completed under the auspices of the NREL/SCE High-Penetration Photovoltaic (PV) Integration Project, which is co-funded by the U.S. Department of Energy (DOE) Office of Energy Efficiency and Renewable Energy (EERE) and the California Solar Initiative (CSI) Research, Development, Demonstration, and Deployment (RD&D) program funded by the California Public Utility Commission (CPUC) and managed by Itron. This project is focused on modeling, quantifying, and mitigating the impacts of large utility-scale PV systems (generally 1-5 MW in size) that are interconnected to the distribution system. This report discusses the concerns utilities have when interconnecting large PV systems that interconnect using PV inverters (a specific application of frequency converters). Additionally, a number of capabilities of PV inverters are described that could be implemented to mitigate the distribution system-level impacts of high-penetration PV integration. Finally, the main issues that need to be addressed to ease the interconnection of large PV systems to the distribution system are presented.

  12. Distributed Wind Market Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Forsyth, T.; Baring-Gould, I.

    2007-11-01

    Distributed wind energy systems provide clean, renewable power for on-site use and help relieve pressure on the power grid while providing jobs and contributing to energy security for homes, farms, schools, factories, private and public facilities, distribution utilities, and remote locations. America pioneered small wind technology in the 1920s, and it is the only renewable energy industry segment that the United States still dominates in technology, manufacturing, and world market share. The series of analyses covered by this report were conducted to assess some of the most likely ways that advanced wind turbines could be utilized apart from large, central-station power systems. Each chapter represents a final report on a specific market segment written by leading experts in this field. As such, this document does not speak with one voice but is rather a compendium of different perspectives, documented by a variety of people in the U.S. distributed wind field.

  13. Telecommunications: Opportunities in the emerging technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schultheis, R.W.

    1994-12-31

    A series of slides presents opportunities for utility telecommunications. The following aspects are covered: (1) Technology in a Period of Revolution; (2) Technology Management by the Energy Utility; (3) Contemporary Telecommunication Network Architectures; (4) Opportunity Management; (5) Strategic Planning for Profits and Growth; (6) the Energy Industry in a Period of Challenge. Management topics and applications are presented in a matrix for generation, transmission, distribution, customer service, and new business revenue growth subjects.

  14. Dark Fiber and Distributed Acoustic Sensing: Applications to Monitoring Seismicity and Near-Surface Properties

    NASA Astrophysics Data System (ADS)

    Ajo Franklin, J. B.; Lindsey, N.; Dou, S.; Freifeld, B. M.; Daley, T. M.; Tracy, C.; Monga, I.

    2017-12-01

    "Dark Fiber" refers to the large number of fiber-optic lines installed for telecommunication purposes but not currently utilized. With the advent of distributed acoustic sensing (DAS), these unused fibers have the potential to become a seismic sensing network with unparalleled spatial extent and density, with applications to monitoring both natural seismicity and near-surface soil properties. While the utility of DAS for seismic monitoring has now been conclusively shown on built-for-purpose networks, dark fiber deployments have been challenged by the heterogeneity of fiber installation procedures in telecommunication as well as by access limitations. However, the potential of telecom networks to augment existing broadband monitoring stations provides a strong incentive to explore their utilization. We present preliminary results demonstrating the application of DAS to seismic monitoring on a 20 km run of "dark" telecommunications fiber between West Sacramento, CA, and Woodland, CA, part of the Dark Fiber Testbed maintained by the DOE's ESnet user facility. We show a small catalog of local and regional earthquakes detected by the array and evaluate fiber coupling by using variations in recorded frequency content. Considering the low density of broadband stations across much of the Sacramento Basin, such DAS recordings could provide a crucial data source to constrain small-magnitude local events. We also demonstrate the application of ambient noise interferometry using DAS-recorded waveforms to estimate soil properties under selected sections of the dark fiber transect; the success of this test suggests that the network could be utilized for environmental monitoring at the basin scale. The combination of these two examples demonstrates the exciting potential for combining DAS with ubiquitous dark fiber to greatly extend the reach of existing seismic monitoring networks.

  15. Optimization of over-provisioned clouds

    NASA Astrophysics Data System (ADS)

    Balashov, N.; Baranov, A.; Korenkov, V.

    2016-09-01

    The functioning of modern applications in cloud centers generates a huge variety of computational workloads. This causes uneven workload distribution and, as a result, leads to ineffective utilization of cloud-center hardware. This article addresses possible ways to solve this issue and demonstrates the necessity of optimizing cloud-center hardware utilization. As one possible way to solve the problem of inefficient resource utilization in heterogeneous cloud environments, an algorithm for dynamic re-allocation of virtual resources is suggested.

  16. Composites review

    NASA Technical Reports Server (NTRS)

    Hordonneau, A.

    1987-01-01

    The properties and applications of composite materials are reviewed. Glass, carbon, Kevlar, ceramic, whisker, and metal fibers are discussed along with polyester, epoxy, polyimide, Peek, carbon, ceramic, and metal matrices. The quantitative distribution of high technology fiber in various applications is given. The role of aerospace industry in the development and promotion of composite utilization is discussed. Consumption trends indicate a rapid development of the composite market.

  17. Inter-Vehicle Communication System Utilizing Autonomous Distributed Transmit Power Control

    NASA Astrophysics Data System (ADS)

    Hamada, Yuji; Sawa, Yoshitsugu; Goto, Yukio; Kumazawa, Hiroyuki

    In ad-hoc networks such as inter-vehicle communication (IVC) systems, safety applications in which vehicles periodically broadcast information such as velocity and position are considered. In these applications, if many vehicles broadcast data in a communication area, congestion decreases communication reliability. We propose an autonomous distributed transmit power control method to maintain high communication reliability. In this method, each vehicle controls its transmit power using feedback control. Furthermore, we design a communication protocol to realize the proposed method, and we evaluate its effectiveness using computer simulation.

  18. Eliciting and Combining Decision Criteria Using a Limited Palette of Utility Functions and Uncertainty Distributions: Illustrated by Application to Pest Risk Analysis.

    PubMed

    Holt, Johnson; Leach, Adrian W; Schrader, Gritta; Petter, Françoise; MacLeod, Alan; van der Gaag, Dirk Jan; Baker, Richard H A; Mumford, John D

    2014-01-01

    Utility functions in the form of tables or matrices have often been used to combine discretely rated decision-making criteria. Matrix elements are usually specified individually, so no single rule or principle can easily be stated for the utility function as a whole. A series of five matrices is presented that aggregate criteria two at a time using simple rules that express a varying degree of constraint of the lower rating over the higher. A further nine possible matrices were obtained by using a different rule on either side of the main axis of the matrix to describe situations where the criteria have a differential influence on the outcome. Uncertainties in the criteria are represented by three alternative frequency distributions from which the assessors select the most appropriate. The output of the utility function is a distribution of rating frequencies that is dependent on the distributions of the input criteria. In pest risk analysis (PRA), seven of these utility functions were required to mimic the logic by which assessors for the European and Mediterranean Plant Protection Organization arrive at an overall rating of pest risk. The framework enables the development of PRAs that are consistent and easy to understand, criticize, compare, and change. When tested in workshops, PRA practitioners thought that the approach accorded with both the logic and the level of resolution that they used in the risk assessments. © 2013 Society for Risk Analysis.
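The core mechanism, combining two discretely rated criteria through a pairwise rule and propagating input uncertainty as rating frequencies, can be sketched as follows. The `min` rule below (the lower rating fully constrains the higher) is one illustrative choice and the input frequencies are made up; neither reproduces the paper's specific matrices:

```python
from itertools import product

RATINGS = (1, 2, 3)  # e.g., low / medium / high

def combine(rule, p_a, p_b):
    """Rating-frequency distribution of rule(a, b) for independent inputs."""
    out = {r: 0.0 for r in RATINGS}
    for a, b in product(RATINGS, repeat=2):
        out[rule(a, b)] += p_a[a] * p_b[b]
    return out

p_a = {1: 0.2, 2: 0.5, 3: 0.3}  # uncertain rating of criterion A
p_b = {1: 0.1, 2: 0.3, 3: 0.6}  # uncertain rating of criterion B
result = combine(min, p_a, p_b)  # output is itself a rating-frequency distribution
```

Because the output is again a rating-frequency distribution, such combinations can be chained, which is how a handful of two-at-a-time rules can mimic a full multi-criteria assessment.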

  19. 48 CFR 41.102 - Applicability.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... interagency agreements (see 41.206); (2) Utility services obtained by purchase, exchange, or otherwise by a Federal power or water marketing agency incident to that agency's marketing or distribution program; (3) Cable television (CATV) and telecommunications services; (4) Acquisition of natural or manufactured gas...

  20. Mapping of polycrystalline films of biological fluids utilizing the Jones-matrix formalism

    NASA Astrophysics Data System (ADS)

    Ushenko, Vladimir A.; Dubolazov, Alexander V.; Pidkamin, Leonid Y.; Sakchnovsky, Michael Yu; Bodnar, Anna B.; Ushenko, Yuriy A.; Ushenko, Alexander G.; Bykov, Alexander; Meglinski, Igor

    2018-02-01

    Utilizing a polarized light approach, we reconstruct the spatial distribution of birefringence and optical activity in polycrystalline films of biological fluids. The Jones-matrix formalism is used for an accessible quantitative description of these types of optical anisotropy. We demonstrate that differentiation of polycrystalline films of biological fluids can be performed based on a statistical analysis of the distribution of rotation angles and phase shifts associated with the optical activity and birefringence, respectively. Finally, practical operational characteristics, such as sensitivity, specificity and accuracy of the Jones-matrix reconstruction of optical anisotropy, were identified with special emphasis on biomedical application, specifically for differentiation of bile films taken from healthy donors and from patients with cholelithiasis.

  1. Watershed Management Tool for Selection and Spacial Allocation of Non-Point Source Pollution Control Practices

    EPA Science Inventory

    Distributed-parameter watershed models are often utilized for evaluating the effectiveness of sediment and nutrient abatement strategies through the traditional {calibrate → validate → predict} approach. The applicability of the method is limited due to modeling approximations. In ...

  2. Adjustable Optical-Fiber Attenuator

    NASA Technical Reports Server (NTRS)

    Buzzetti, Mike F.

    1994-01-01

    Adjustable fiber-optic attenuator utilizes bending loss to reduce strength of light transmitted along it. Attenuator functions without introducing measurable back-reflection or insertion loss. Relatively insensitive to vibration and changes in temperature. Potential applications include cable television, telephone networks, other signal-distribution networks, and laboratory instrumentation.

  3. Implementation of a Wireless Time Distribution Testbed Protected with Quantum Key Distribution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bonior, Jason D; Evans, Philip G; Sheets, Gregory S

    2017-01-01

    Secure time transfer is critical for many time-sensitive applications. The Global Positioning System (GPS), which is often used for this purpose, has been shown to be susceptible to spoofing attacks. Quantum Key Distribution (QKD) offers a way to securely generate encryption keys at two locations. Through careful use of this information it is possible to create a system that is more resistant to spoofing attacks. In this paper we describe our work to create a testbed that utilizes QKD and traditional RF links. This testbed will be used for the development of more secure and spoofing-resistant time distribution protocols.

  4. Agricultural Aircraft Aid

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Farmers are increasingly turning to aerial application of pesticides, fertilizers and other materials. Sometimes uneven distribution of the chemicals is caused by worn nozzles, improper alignment of spray nozzles or system leaks. If this happens, the job must be redone at added expense to both the pilot and the customer. Traditional pattern analysis techniques take days or weeks. Utilizing NASA's wind tunnel and computer validation technology, Dr. Roth of Oklahoma State University (OSU) developed a system for providing answers within minutes. Called the Rapid Distribution Pattern Evaluation System, the OSU system consists of a 100-foot measurement frame tied in to computerized analysis and readout equipment. The system is mobile, delivered by trailer to airfields in agricultural areas where OSU conducts educational "fly-ins." A fly-in typically draws 50 to 100 aerial applicators, researchers, chemical suppliers and regulatory officials. An applicator can have his spray pattern checked. A computerized readout, available in five to 12 minutes, provides information for correcting shortcomings in the distribution pattern.

  5. States of Cybersecurity: Electricity Distribution System Discussions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pena, Ivonne; Ingram, Michael; Martin, Maurice

    State and local entities that oversee the reliable, affordable provision of electricity are faced with growing and evolving threats from cybersecurity risks to our nation's electricity distribution system. All-hazards system resilience is a shared responsibility among electric utilities and their regulators or policy-setting boards of directors. Cybersecurity presents new challenges and should be a focus for states, local governments, and Native American tribes that are developing energy-assurance plans to protect critical infrastructure. This research sought to investigate the implementation of governance and policy at the distribution utility level that facilitates cybersecurity preparedness to inform the U.S. Department of Energy (DOE), Office of Energy Policy and Systems Analysis; states; local governments; and other stakeholders on the challenges, gaps, and opportunities that may exist for future analysis. The need is urgent to identify the challenges and inconsistencies in how cybersecurity practices are being applied across the United States to inform the development of best practices, mitigations, and future research and development investments in securing the electricity infrastructure. By examining the current practices and applications of cybersecurity preparedness, this report seeks to identify the challenges and persistent gaps between policy and execution and reflect the underlying motivations of distinct utility structures as they play out at the local level. This study aims to create an initial baseline of cybersecurity preparedness within the distribution electricity sector. The focus of this study is on distribution utilities not bound by the cybersecurity guidelines of the North American Electric Reliability Corporation (NERC), to examine the range of mechanisms taken by state regulators, city councils that own municipal utilities, and boards of directors of rural cooperatives.

  6. Distribution of Software Changes for Battlefield Computer Systems: A lingering Problem

    DTIC Science & Technology

    1983-06-03

    Defense, 10 June 1963), pp. 1-4. 3 Ibid. 4Automatic Data Processing Systems, Book - 1 Introduction (U.S. Army Signal School, Fort Monmouth, New Jersey, 15...January 1960) , passim. 5Automatic Data Processing Systems, Book - 2 Army Use of ADPS (U.S. Army Signal School, Fort Monmouth, New Jersey, 15 October...execute an application or utility program. It controls how the computer functions during a given operation. Utility programs are merely general use

  7. 18 CFR 806.25 - Water conservation standards.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ..., as applicable, by all classes of users. (ii) Prepare and distribute literature to customers... meters or other suitable devices or utilize acceptable flow measuring methods for accurate determination of water use by various parts of the company operation. (3) Install flow control devices which match...

  8. 18 CFR 806.25 - Water conservation standards.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ..., as applicable, by all classes of users. (ii) Prepare and distribute literature to customers... meters or other suitable devices or utilize acceptable flow measuring methods for accurate determination of water use by various parts of the company operation. (3) Install flow control devices which match...

  9. 18 CFR 806.25 - Water conservation standards.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ..., as applicable, by all classes of users. (ii) Prepare and distribute literature to customers... meters or other suitable devices or utilize acceptable flow measuring methods for accurate determination of water use by various parts of the company operation. (3) Install flow control devices which match...

  10. Demonstration and Evaluation of an Innovative Water Main Rehabilitation Technology: Spray-on Polymeric Lining

    EPA Science Inventory

    Many utilities are seeking innovative rehabilitation technologies to extend the life and fix larger portions of their water distribution systems with current funding levels. The information on the capabilities and applicability of new technologies is not always readily available...

  11. Alternatives to the 15% Rule

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Broderick, Robert Joseph; Quiroz, Jimmy Edward; Reno, Matthew J.

    2015-11-01

    The third solicitation of the California Solar Initiative (CSI) Research, Development, Demonstration and Deployment (RD&D) Program established by the California Public Utility Commission (CPUC) is supporting the Electric Power Research Institute (EPRI), National Renewable Energy Laboratory (NREL), and Sandia National Laboratories (SNL), with collaboration from Pacific Gas and Electric (PG&E), Southern California Edison (SCE), and San Diego Gas and Electric (SDG&E), in research to improve the Utility Application Review and Approval process for interconnecting distributed energy resources to the distribution system. Currently this process is the most time-consuming of any step on the path to generating power on the distribution system. This CSI RD&D Solicitation 3 project has completed the tasks of collecting data from the three utilities, clustering feeder characteristic data to attain representative feeders, detailed modeling of 16 representative feeders, analysis of PV impacts to those feeders, refinement of current screening processes, and validation of those suggested refinements. In this report each task is summarized to produce a final summary of all components of the overall project.

  12. An update on Lab Rover: A hospital material transporter

    NASA Technical Reports Server (NTRS)

    Mattaboni, Paul

    1994-01-01

    The development of a hospital material transporter, 'Lab Rover', is described. Conventional material transport now utilizes people power, push carts, pneumatic tubes and tracked vehicles. Hospitals are faced with enormous pressure to reduce operating costs. Cyberotics, Inc. developed an Autonomous Intelligent Vehicle (AIV). This battery operated service robot was designed specifically for health care institutions. Applications for the AIV include distribution of clinical lab samples, pharmacy drugs, administrative records, x-ray distribution, meal tray delivery, and certain emergency room applications. The first AIV was installed at Lahey Clinic in Burlington, Mass. Lab Rover was beta tested for one year and has been 'on line' for an additional 2 years.

  13. Application of a stepwise method for analyzing fouling in shell-and-tube exchangers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prieto, M.M.; Miranda, J.; Sigales, B.

    1999-12-01

    This article presents the results of the application of a quite simple method for analyzing shell-side fouling in shell-and-tube exchangers, capable of taking into account the formation of irregular fouling deposits with variable thermal conductivity. This method, based on the utilization of elementary heat exchangers, has been implemented for E-shell TEMA-type heat exchangers with two tube passes. Several fouling deposit distributions have been simulated so as to ascertain their effects on the heat transfer rate. These distributions consider that fouling is concentrated in zones where the temperature of the fluids is maximum or minimum.

  14. JPL - Small Power Systems Applications Project. [for solar thermal power plant development and commercialization

    NASA Technical Reports Server (NTRS)

    Ferber, R. R.; Marriott, A. T.; Truscello, V.

    1978-01-01

    The Small Power Systems Applications (SPSA) Project has been established to develop and commercialize small solar thermal power plants. The technologies of interest include all distributed and central receiver technologies which are potentially economically viable in power plant sizes of one to 10 MWe. The paper presents an overview of the SPSA Project and briefly discusses electric utility involvement in the Project.

  15. Research and Development of Zinc Air Fuel Cell To Achieve Commercialization Final Report CRADA No. TC-1544-98

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cooper, J. F.; Haley, H. D.

    The specific goal of this project was to advance the development of the zinc air fuel cell (ZAFC) towards commercial readiness in different mobile applications, including motor bikes, passenger cars, vans, buses and off-road vehicles (golf carts, factory equipment), and different stationary applications, including generator sets, uninterruptible power systems, and electric utility load leveling and distributed power.

  16. [Applications of GIS in biomass energy source research].

    PubMed

    Su, Xian-Ming; Wang, Wu-Kui; Li, Yi-Wei; Sun, Wen-Xiang; Shi, Hai; Zhang, Da-Hong

    2010-03-01

    Biomass resources have the characteristics of widespread and dispersed distribution, which have close relations to the environment, climate, soil, and land use, etc. Geographic information system (GIS) has the functions of spatial analysis and the flexibility of integrating with other application models and algorithms, being of predominance to the biomass energy source research. This paper summarized the researches on the GIS applications in biomass energy source research, with the focus in the feasibility study of bioenergy development, assessment of biomass resources amount and distribution, layout of biomass exploitation and utilization, evaluation of gaseous emission from biomass burning, and biomass energy information system. Three perspectives of GIS applications in biomass energy source research were proposed, i. e., to enrich the data source, to improve the capacity on data processing and decision-support, and to generate the online proposal.

  17. Applications of space observations to the management and utilization of coastal fishery resources

    NASA Technical Reports Server (NTRS)

    Kemmerer, A. J.; Savastano, K. J.; Faller, K. H.

    1977-01-01

    Information needs of those concerned with the harvest and management of coastal fishery resources can be satisfied in part through applications of satellite remote sensing. Recently completed and ongoing investigations have demonstrated potentials for defining fish distribution patterns from multispectral data, monitoring fishing distribution and effort with synthetic aperture radar systems, forecasting recruitment of certain estuarine-dependent species, and tracking marine mammals. These investigations, which are reviewed in this paper, have relied on Landsat 1 and 2, Skylab-3, and Nimbus-6 supported sensors and sensors carried by aircraft and mounted on surface platforms to simulate applications from Seasat-A and other future spacecraft systems. None of the systems are operational as all were designed to identify and demonstrate applications and to aid in the specification of requirements for future spaceborne systems.

  18. Remote mineral mapping using AVIRIS data at Summitville, Colorado and the adjacent San Juan Mountains

    NASA Technical Reports Server (NTRS)

    King, Trude V. V.; Clark, Roger N.; Ager, Cathy; Swayze, Gregg A.

    1995-01-01

    We have demonstrated the unique utility of imaging spectroscopy in mapping mineral distribution. In the Summitville mining region we have shown that the mine site does not contribute clay minerals to the Alamosa River, but does contribute Fe-bearing minerals. Such minerals have the potential to carry heavy metals. This application illustrates only one specific environmental application of imaging spectroscopy data. For instance, the types of minerals we can map with confidence are those frequently associated with environmental problems related to active and abandoned mine lands. Thus, the potential utility of this technology to the field of environmental science has yet to be fully explored.

  19. Proceedings of the American Power Conference. Volume 58-I

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McBride, A.E.

    1996-10-01

    This is volume 58-I of the proceedings of the American Power Conference, 1996, Technology for Competition and Globalization. The topics of the papers include power plant DC issues; cost of environmental compliance; advanced coal systems -- environmental performance; technology for competition in dispersed generation; superconductivity technologies for electric utility applications; power generation trends and challenges in China; aging in nuclear power plants; innovative and competitive repowering options; structural examinations, modifications and repairs; electric load forecasting; distribution planning; EMF effects; fuzzy logic and neural networks for power plant applications; electrokinetic decontamination of soils; integrated gasification combined cycle; advances in fusion; cooling towers; relays; plant controls; flue gas desulfurization; waste product utilization; and improved technologies.

  20. Energy loss analysis of an integrated space power distribution system

    NASA Technical Reports Server (NTRS)

    Kankam, M. D.; Ribeiro, P. F.

    1992-01-01

    The results of studies related to conceptual topologies of an integrated utility-like space power system are described. The system topologies are comparatively analyzed by considering their transmission energy losses as functions of mainly distribution voltage level and load composition. The analysis is expedited by use of Distribution System Analysis and Simulation (DSAS) software. This recently developed computer program by the Electric Power Research Institute (EPRI) uses improved load models to solve the power flow within the system. However, present shortcomings of the software with regard to space applications, and incompletely defined characteristics of a space power system, make the results applicable to only the fundamental trends of energy losses of the topologies studied. Accounting for the effects of the various parameters on system performance, as done here, can form part of a planning tool for a space power distribution system.

  1. Developments in Characterizing Capture Zone Distributions in Island Growth

    NASA Astrophysics Data System (ADS)

    Einstein, T. L.; Pimpinelli, Alberto; González, Diego Luis; Sathiyanarayanan, Rajesh

    2013-03-01

    The utility of using the distribution of capture zones (CZD) to characterize epitaxial growth continues to mount. For non-Poisson deposition (i.e. when island nucleation is not fully random) the areas of these Voronoi cells (proximity polygons) can be well described by the generalized Wigner distribution (GWD), particularly in the central region around the mean area. We discuss several recent applications to experimental systems, showing how this perspective leads to insights about the critical nucleus size. In contrast, several studies have shown that the GWD may not describe the numerical data from painstaking simulations in both tails. We discuss some refinements that have been proposed. Finally, we comment on applications to social phenomena such as area distributions of secondary administrative units (like counties) and of Voronoi cells around Metro stops. Work at UMD supported by NSF-MRSEC Grant DMR 05-20471 and NSF CHE 07-49949
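
    The generalized Wigner distribution cited above has a simple closed form, P_β(s) = a_β s^β exp(−b_β s²), where s is the capture-zone area divided by its mean and the constants a_β and b_β are fixed by requiring that P integrate to one and have unit mean. A minimal sketch (the value β = 4 below is illustrative, not a value fitted in the paper):

```python
import math

def gwd(s, beta):
    """Generalized Wigner distribution P(s) = a * s**beta * exp(-b * s**2).
    The constants a and b are fixed by normalization and unit mean;
    s is the capture-zone area scaled by its average."""
    b = (math.gamma((beta + 2) / 2) / math.gamma((beta + 1) / 2)) ** 2
    a = 2 * b ** ((beta + 1) / 2) / math.gamma((beta + 1) / 2)
    return a * s ** beta * math.exp(-b * s ** 2)

# Numerical check that the distribution is normalized with unit mean
ds = 1e-4
grid = [i * ds for i in range(1, 60000)]   # s from ~0 to 6
norm = sum(gwd(s, 4) for s in grid) * ds
mean = sum(s * gwd(s, 4) for s in grid) * ds
```

    Fits of β to measured capture-zone distributions are what allow the critical-nucleus-size inference discussed in the abstract.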

  2. Interconnecting PV on New York City's Secondary Network Distribution System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, K; Coddington, M; Burman, K

    2009-11-01

    The U.S. Department of Energy (DOE) has teamed with cities across the country through the Solar America Cities (SAC) partnership program to help reduce barriers and accelerate implementation of solar energy. The New York City SAC team is a partnership between the City University of New York (CUNY), the New York City Mayor's Office of Long-term Planning and Sustainability, and the New York City Economic Development Corporation (NYCEDC). The New York City SAC team is working with DOE's National Renewable Energy Laboratory (NREL) and Con Edison, the local utility, to develop a roadmap for photovoltaic (PV) installations in the five boroughs. The city set a goal to increase its installed PV capacity from 1.1 MW in 2005 to 8.1 MW by 2015. A key barrier to reaching this goal, however, is the complexity of the interconnection process with the local utility. Unique challenges are associated with connecting distributed PV systems to secondary network distribution systems (simplified to "networks" in this report). Although most areas of the country use simpler radial distribution systems to distribute electricity, larger metropolitan areas like New York City typically use networks to increase reliability in large load centers. Unlike the radial distribution system, where each customer receives power through a single line, a network uses a grid of interconnected lines to deliver power to each customer through several parallel circuits and sources. This redundancy improves reliability, but it also requires more complicated coordination and protection schemes that can be disrupted by energy exported from distributed PV systems. Currently, Con Edison studies each potential PV system in New York City to evaluate the system's impact on the network, but this is time-consuming for utility engineers and may delay the customer's project or add cost for larger installations.
    City leaders would like to streamline this process to facilitate faster, simpler, and less expensive distributed PV system interconnections. To assess ways to improve the interconnection process, NREL conducted a four-part study with support from DOE. The NREL team then compiled the final reports from each study into this report. In Section 1, "PV Deployment Analysis for New York City," we analyze the technical potential for rooftop PV systems in the city. This analysis evaluates potential PV power production in ten Con Edison networks of various locations and building densities (ranging from high density apartments to lower density single family homes). Next, we compare the potential power production to network loads to determine where and when PV generation is most likely to exceed network load and disrupt network protection schemes. The results of this analysis may assist Con Edison in evaluating future PV interconnection applications and in planning future network protection system upgrades. This analysis may also assist other utilities interconnecting PV systems to networks by defining a method for assessing the technical potential of PV in the network and its impact on network loads. Section 2, "A Briefing for Policy Makers on Connecting PV to a Network Grid," presents an overview intended for nontechnical stakeholders. This section describes the issues associated with interconnecting PV systems to networks, along with possible solutions. Section 3, "Technical Review of Concerns and Solutions to PV Interconnection in New York City," summarizes common concerns of utility engineers and network experts about interconnecting PV systems to secondary networks. This section also contains detailed descriptions of nine solutions, including advantages and disadvantages, potential impacts, and road maps for deployment. Section 4, "Utility Application Process Review," looks at utility interconnection application processes across the country and identifies administrative best practices for efficient PV interconnection.

  3. Compulsory Birth Control and Fertility Measures in India.

    ERIC Educational Resources Information Center

    Halli, S. S.

    1983-01-01

    Discussion of possible applications of the microsimulation approach to analysis of population policy proposes compulsory sterilization policy for all of India. Topics covered include India's population problem, methods for generating a distribution of couples to be sterilized, model validation, data utilized, data analysis, program limitations,…

  4. Solar thermal plant impact analysis and requirements definition

    NASA Technical Reports Server (NTRS)

    Gupta, Y. P.

    1980-01-01

    Progress is summarized on a continuing study, comprising ten tasks, directed at defining the impact and requirements of solar thermal power systems (SPS) of 1 to 10 MWe capacity each, installed from 1985 through 2000 in utility or nonutility loads in the United States. The point focus distributed receiver (PFDR) solar power systems are emphasized. Tasks 1 through 4, completed to date, include development of a comprehensive data base on SPS configurations (their performance, cost, availability, and potential applications), user loads, and regional characteristics, together with an analytic methodology that incorporates the generally accepted utility financial planning methods and several unique modifications to treat the significant and specific characteristics of solar power systems deployed in either central or distributed power generation modes.

  5. The development and performance of smud grid-connected photovoltaic projects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Osborn, D.E.; Collier, D.E.

    1995-11-01

    The utility grid-connected market has been identified as a key market to be developed to accelerate the commercialization of photovoltaics. The Sacramento Municipal Utility District (SMUD) has completed the first two years of a continuing commercialization effort based on the sustained, orderly development of the grid-connected, utility PV market. This program is aimed at developing the experience needed to successfully integrate PV as distributed generation into the utility system and to stimulate the collaborative processes needed to accelerate the cost reductions necessary for PV to be cost-effective in these applications by the year 2000. In the first two years, SMUD has installed over 240 residential and commercial building, grid-connected, rooftop "PV Pioneer" systems totaling over 1 MW of capacity and four substation-sited, grid-support PV systems totaling 600 kW, bringing the SMUD distributed PV power systems to over 3.7 MW. The 1995 SMUD PV Program will add approximately another 800 kW of PV systems to the District's distributed PV power system. SMUD also established a partnership with its customers through the PV Pioneer "green pricing" program to advance PV commercialization.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tweedie, A.; Doris, E.

    Establishing interconnection to the grid is a recognized barrier to the deployment of distributed energy generation. This report compares interconnection processes for photovoltaic projects in California and Germany. This report summarizes the steps of the interconnection process for developers and utilities, the average length of time utilities take to process applications, and paperwork required of project developers. Based on a review of the available literature, this report finds that while the interconnection procedures and timelines are similar in California and Germany, differences in the legal and regulatory frameworks are substantial.

  7. Tools for the IDL widget set within the X-windows environment

    NASA Technical Reports Server (NTRS)

    Turgeon, B.; Aston, A.

    1992-01-01

    New tools using the IDL widget set are presented. In particular, a utility allowing the easy creation and update of slide presentations, XSlideManager, is explained in detail and examples of its application are shown. In addition to XSlideManager, other mini-utilities are discussed. These various pieces of software follow the philosophy of the X-Windows distribution system and are made available to anyone within the Internet network. Acquisition procedures through anonymous ftp are clearly explained.

  8. Comparing the Performance of Two Dynamic Load Distribution Methods

    NASA Technical Reports Server (NTRS)

    Kale, L. V.

    1987-01-01

    Parallel processing of symbolic computations on a message-passing multi-processor presents one challenge: to effectively utilize the available processors, the load must be distributed uniformly to all the processors. However, the structure of these computations cannot be predicted in advance, so static scheduling methods are not applicable. In this paper, we compare the performance of two dynamic, distributed load balancing methods with extensive simulation studies. The two schemes are: the Contracting Within a Neighborhood (CWN) scheme proposed by us, and the Gradient Model proposed by Lin and Keller. We conclude that although simpler, the CWN is significantly more effective at distributing the work than the Gradient Model.

  9. Methods to Determine Recommended Feeder-Wide Advanced Inverter Settings for Improving Distribution System Performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rylander, Matthew; Reno, Matthew J.; Quiroz, Jimmy E.

    This paper describes methods that a distribution engineer could use to determine advanced inverter settings to improve distribution system performance. These settings are for fixed power factor, volt-var, and volt-watt functionality. Depending on the level of detail that is desired, different methods are proposed to determine single settings applicable for all advanced inverters on a feeder or unique settings for each individual inverter. Seven distinctly different utility distribution feeders are analyzed to simulate the potential benefit in terms of hosting capacity, system losses, and reactive power attained with each method to determine the advanced inverter settings.
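
    Of the three functions named above, volt-var is the most parameter-rich: reactive power follows a piecewise-linear curve of terminal voltage with a deadband around nominal. A minimal sketch of such a curve, with illustrative breakpoints and reactive-power limit rather than any settings recommended in the paper:

```python
def volt_var(v_pu, v1=0.95, v2=0.98, v3=1.02, v4=1.05, q_max=0.44):
    """Piecewise-linear volt-var curve: per-unit voltage in, per-unit
    reactive power out (positive = injecting vars, negative = absorbing).
    Breakpoints v1..v4 and q_max are illustrative placeholders."""
    if v_pu <= v1:
        return q_max                                  # full var injection
    if v_pu < v2:
        return q_max * (v2 - v_pu) / (v2 - v1)        # ramp down to zero
    if v_pu <= v3:
        return 0.0                                    # deadband at nominal
    if v_pu < v4:
        return -q_max * (v_pu - v3) / (v4 - v3)       # ramp into absorption
    return -q_max                                     # full var absorption
```

    Sweeping this curve over a feeder's voltage profile is one way a distribution engineer could preview how a feeder-wide setting shifts reactive power before running full simulations.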

  10. Technology-enhanced Interactive Teaching of Marginal, Joint and Conditional Probabilities: The Special Case of Bivariate Normal Distribution

    PubMed Central

    Dinov, Ivo D.; Kamino, Scott; Bhakhrani, Bilal; Christou, Nicolas

    2014-01-01

    Data analysis requires subtle probability reasoning to answer questions like "What is the chance of event A occurring, given that event B was observed?" This generic question arises in discussions of many intriguing scientific questions such as "What is the probability that an adolescent weighs between 120 and 140 pounds given that they are of average height?" and "What is the probability of (monetary) inflation exceeding 4% and housing price index below 110?" To address such problems, learning some applied, theoretical or cross-disciplinary probability concepts is necessary. Teaching such courses can be improved by utilizing modern information technology resources. Students' understanding of multivariate distributions, conditional probabilities, correlation and causation can be significantly strengthened by employing interactive web-based science educational resources. Independent of the type of a probability course (e.g. majors, minors or service probability course, rigorous measure-theoretic, applied or statistics course) student motivation, learning experiences and knowledge retention may be enhanced by blending modern technological tools within the classical conceptual pedagogical models. We have designed, implemented and disseminated a portable open-source web-application for teaching multivariate distributions, marginal, joint and conditional probabilities using the special case of bivariate Normal distribution. A real adolescent height and weight dataset is used to demonstrate the classroom utilization of the new web-application to address problems of parameter estimation, univariate and multivariate inference. PMID:25419016
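
    The height-weight question in the abstract reduces to the standard conditional-normal formula: if (H, W) is bivariate normal, then W | H = h is normal with mean μ_W + ρ(σ_W/σ_H)(h − μ_H) and variance σ_W²(1 − ρ²). A minimal sketch with hypothetical parameter values (the estimates from the paper's real dataset are not given in the abstract):

```python
import math

def norm_cdf(x, mu, sigma):
    """Normal CDF via the error function."""
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

# Hypothetical bivariate-normal parameters, for illustration only
mu_h, sd_h = 65.0, 3.0      # height (inches)
mu_w, sd_w = 125.0, 15.0    # weight (pounds)
rho = 0.5                   # height-weight correlation

def conditional_weight_prob(lo, hi, h):
    """P(lo <= weight <= hi | height = h) under the bivariate normal."""
    mu_c = mu_w + rho * (sd_w / sd_h) * (h - mu_h)   # conditional mean
    sd_c = sd_w * math.sqrt(1 - rho ** 2)            # conditional sd
    return norm_cdf(hi, mu_c, sd_c) - norm_cdf(lo, mu_c, sd_c)

# The abstract's example: weight in [120, 140] given average height
p = conditional_weight_prob(120, 140, mu_h)
```

    With these illustrative parameters the conditional probability is about 0.53; the web-application described in the paper computes the same quantity interactively from fitted parameters.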

  11. Technology-enhanced Interactive Teaching of Marginal, Joint and Conditional Probabilities: The Special Case of Bivariate Normal Distribution.

    PubMed

    Dinov, Ivo D; Kamino, Scott; Bhakhrani, Bilal; Christou, Nicolas

    2013-01-01

    Data analysis requires subtle probability reasoning to answer questions like "What is the chance of event A occurring, given that event B was observed?" This generic question arises in discussions of many intriguing scientific questions such as "What is the probability that an adolescent weighs between 120 and 140 pounds given that they are of average height?" and "What is the probability of (monetary) inflation exceeding 4% and housing price index below 110?" To address such problems, learning some applied, theoretical or cross-disciplinary probability concepts is necessary. Teaching such courses can be improved by utilizing modern information technology resources. Students' understanding of multivariate distributions, conditional probabilities, correlation and causation can be significantly strengthened by employing interactive web-based science educational resources. Independent of the type of a probability course (e.g. majors, minors or service probability course, rigorous measure-theoretic, applied or statistics course) student motivation, learning experiences and knowledge retention may be enhanced by blending modern technological tools within the classical conceptual pedagogical models. We have designed, implemented and disseminated a portable open-source web-application for teaching multivariate distributions, marginal, joint and conditional probabilities using the special case of bivariate Normal distribution. A real adolescent height and weight dataset is used to demonstrate the classroom utilization of the new web-application to address problems of parameter estimation, univariate and multivariate inference.

  12. A network architecture supporting consistent rich behavior in collaborative interactive applications.

    PubMed

    Marsh, James; Glencross, Mashhuda; Pettifer, Steve; Hubbold, Roger

    2006-01-01

    Network architectures for collaborative virtual reality have traditionally been dominated by client-server and peer-to-peer approaches, with peer-to-peer strategies typically being favored where minimizing latency is a priority, and client-server where consistency is key. With increasingly sophisticated behavior models and the demand for better support for haptics, we argue that neither approach provides sufficient support for these scenarios and, thus, a hybrid architecture is required. We discuss the relative performance of different distribution strategies in the face of real network conditions and illustrate the problems they face. Finally, we present an architecture that successfully meets many of these challenges and demonstrate its use in a distributed virtual prototyping application which supports simultaneous collaboration for assembly, maintenance, and training applications utilizing haptics.

  13. Power Budget Analysis for High Altitude Airships

    NASA Technical Reports Server (NTRS)

    Choi, Sang H.; Elliott, James R.; King, Glen C.

    2006-01-01

    The High Altitude Airship (HAA) has various potential applications and mission scenarios that require onboard energy harvesting and power distribution systems. The energy source considered for the HAA's power budget is solar photon energy, which allows the use of either photovoltaic (PV) cells or advanced thermoelectric (ATE) converters. Both PV cells and an ATE system utilizing high-performance thermoelectric materials were briefly compared to identify the advantages of ATE for HAA applications in this study. The ATE can generate a higher quantity of harvested energy than PV cells by utilizing the cascaded efficiency of a three-staged ATE in a tandem-mode configuration. Assuming that each stage of ATE material has a figure of merit of 5, the cascaded efficiency of a three-staged ATE system approaches an overall conversion efficiency greater than 60%. Based on this estimated efficiency, the configuration of a HAA and the power utility modules are defined.
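
    The cascading logic can be illustrated with the standard single-stage thermoelectric efficiency η = (1 − T_c/T_h)(√(1+ZT) − 1)/(√(1+ZT) + T_c/T_h) and the series rule that unconverted heat fractions multiply. The stage temperatures below are illustrative assumptions (the abstract does not state them), so the resulting number differs from the paper's >60% figure, which depends on the actual temperature drops assumed:

```python
import math

def te_stage_efficiency(t_hot, t_cold, zt):
    """Standard single-stage thermoelectric conversion efficiency."""
    carnot = 1 - t_cold / t_hot
    m = math.sqrt(1 + zt)
    return carnot * (m - 1) / (m + t_cold / t_hot)

def cascaded_efficiency(stage_temps, zt):
    """Stages in series: each stage converts a fraction of the heat
    passing through it, so the unconverted fractions multiply."""
    unconverted = 1.0
    for t_hot, t_cold in stage_temps:
        unconverted *= 1 - te_stage_efficiency(t_hot, t_cold, zt)
    return 1 - unconverted

# Hypothetical stage temperatures (K) for a three-stage tandem cascade
stages = [(1200, 900), (900, 600), (600, 300)]
eta = cascaded_efficiency(stages, zt=5.0)   # roughly 0.43 for these temps
```

    The point of the sketch is structural: the cascade always beats its best single stage, which is why the tandem configuration raises the harvested energy.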

  14. 76 FR 62762 - Certain Large Diameter Carbon and Alloy Seamless Standard, Line and Pressure Pipe From Japan...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-11

    ... line applications such as oil, gas, or water pipeline, or utility distribution systems. Seamless pressure pipes are intended for the conveyance of water, steam, petrochemicals, chemicals, oil products... Fahrenheit, at various American Society of Mechanical Engineers (``ASME'') code stress levels. Alloy pipes...

  15. DGIC Interconnection Insights | Distributed Generation Interconnection

    Science.gov Websites

    time and resources from utilities, customers, and local permitting authorities. Past research by the interconnection processes can benefit all parties by reducing the financial and time commitments involved. In this susceptible to time-consuming setbacks-for example, if an application is submitted with incomplete information

  16. A Learning Management System Enhanced with Internet of Things Applications

    ERIC Educational Resources Information Center

    Mershad, Khaleel; Wakim, Pilar

    2018-01-01

    A breakthrough in the development of online learning occurred with the utilization of Learning Management Systems (LMS) as a tool for creating, distributing, tracking, and managing various types of educational and training material. Since the appearance of the first LMS, major technological enhancements transformed this tool into a powerful…

  17. Demonstration and evaluation of an innovative water main rehabilitation technology: Cured-in-Place Pipe (CIPP) lining

    EPA Science Inventory

    As many water utilities are seeking new and innovative rehabilitation technologies to extend the life of their water distribution systems, information on the capabilities and applicability of new technologies is not always readily available from an independent source. The U.S. E...

  18. Nowcasting Earthquakes: A Comparison of Induced Earthquakes in Oklahoma and at the Geysers, California

    NASA Astrophysics Data System (ADS)

    Luginbuhl, Molly; Rundle, John B.; Hawkins, Angela; Turcotte, Donald L.

    2018-01-01

    Nowcasting is a new method of statistically classifying seismicity and seismic risk (Rundle et al. 2016). In this paper, the method is applied to the induced seismicity at the Geysers geothermal region in California and the induced seismicity due to fluid injection in Oklahoma. Nowcasting utilizes the catalogs of seismicity in these regions. Two earthquake magnitudes are selected: one large, say M_λ ≥ 4, and one small, say M_σ ≥ 2. The method utilizes the number of small earthquakes that occur between pairs of large earthquakes, and the cumulative probability distribution of these values is obtained. The earthquake potential score (EPS) is defined by the number of small earthquakes that have occurred since the last large earthquake; the point where this number falls on the cumulative probability distribution of interevent counts defines the EPS. A major advantage of nowcasting is that it utilizes "natural time", earthquake counts between events, rather than clock time. Thus, it is not necessary to decluster aftershocks, and the results are applicable if the level of induced seismicity varies in time. The application of natural time to the accumulation of the seismic hazard depends on the applicability of Gutenberg-Richter (GR) scaling. The increasing number of small earthquakes that occur after a large earthquake can be scaled to give the risk of a large earthquake occurring. To illustrate our approach, we utilize the number of M_σ ≥ 2.75 earthquakes in Oklahoma to nowcast the number of M_λ ≥ 4.0 earthquakes in Oklahoma. The applicability of the scaling is illustrated during the rapid build-up of injection-induced seismicity between 2012 and 2016, and the subsequent reduction in seismicity associated with a reduction in fluid injections. The same method is applied to the geothermal-induced seismicity at the Geysers, California, for comparison.
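
    The EPS calculation described above can be sketched directly: tally small-event counts between successive large events, then report where the current count sits on the empirical cumulative distribution of those interevent counts. The toy catalog below is invented for illustration; a real application would use the full regional catalog:

```python
def earthquake_potential_score(catalog, m_small, m_large):
    """Nowcasting sketch: count small events between successive large
    events, then evaluate the empirical CDF of those interevent counts
    at the count accumulated since the most recent large event."""
    counts, current = [], 0
    for _, mag in catalog:          # catalog sorted by time
        if mag >= m_large:
            counts.append(current)  # close out one interevent interval
            current = 0
        elif mag >= m_small:
            current += 1            # "natural time" tick
    if not counts:
        return None                 # no large events yet: EPS undefined
    # EPS = fraction of historical interevent counts <= current count
    return sum(1 for c in counts if c <= current) / len(counts)

# Invented (time, magnitude) catalog, mirroring the paper's thresholds
mags = [2.8, 3.0, 4.1, 2.9, 2.8, 3.2, 4.3, 2.9, 3.1, 2.8, 4.0, 2.9, 3.0]
catalog = list(enumerate(mags))
eps = earthquake_potential_score(catalog, m_small=2.75, m_large=4.0)
```

    Because only event counts enter, the score is unchanged if the injection rate (and hence seismicity rate) varies in clock time, which is the declustering-free property the abstract highlights.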

  19. PECASE: Nanostructure Hybrid Organic/Inorganic Materials for Active Opto-Electronic Devices

    DTIC Science & Technology

    2011-01-03

    FWHM = 30 nm), green-emitting core–shell material (4 nm in diameter) suitable for QD-LED display applications (Figure 1b). An alloyed material for...electroluminescence (EL) that can be of use in fields as diverse as optical communications, spectroscopy, and environmental and industrial sensing. The RC structure...variety of QD size distributions (of Gaussian size profile). Such QD monolayers have already been utilized in a number of thin-film applications, QD

  20. A quantum leap into the IED age

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patterson, R.C.

    1996-11-01

    The integration of pattern recognition, artificial intelligence and advanced communication technologies in utility substation IEDs (Intelligent Electronic Devices) has opened the door to practical and cost-effective automation of power distribution systems. A major driver for the application of these new technologies has been the research directed toward the detection of high-impedance faults. The commercial products which embody these complex detection functions have already expanded to include most of the protection, control, and monitoring required at a utility substation. These new Super-IEDs enable major utility initiatives, such as power quality management, improved public safety, operation and maintenance productivity, and power system automation.

  1. Cost Benefit and Alternatives Analysis of Distribution Systems with Energy Storage Systems: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, Tom; Nagarajan, Adarsh; Baggu, Murali

    This paper explores monetized and non-monetized benefits from storage interconnected to the distribution system, through use cases illustrating potential applications for energy storage in California's electric utility system. This work supports SDG&E in its efforts to quantify, summarize, and compare the cost and benefit streams related to the implementation and operation of energy storage on its distribution feeders. The effort develops a cost-benefit and alternatives analysis platform, integrates it with quasi-static time-series (QSTS) feeder simulation capability, and analyzes use cases to explore the cost-benefit of implementing and operating energy storage for feeder support and market participation.

  2. Energy-efficient sensing in wireless sensor networks using compressed sensing.

    PubMed

    Razzaque, Mohammad Abdur; Dobson, Simon

    2014-02-12

    Sensing of the application environment is the main purpose of a wireless sensor network. Most existing energy management strategies and compression techniques assume that the sensing operation consumes significantly less energy than radio transmission and reception. This assumption does not hold in a number of practical applications. Sensing energy consumption in these applications may be comparable to, or even greater than, that of the radio. In this work, we support this claim by a quantitative analysis of the main operational energy costs of popular sensors, radios and sensor motes. In light of the importance of sensing level energy costs, especially for power hungry sensors, we consider compressed sensing and distributed compressed sensing as potential approaches to provide energy efficient sensing in wireless sensor networks. Numerical experiments investigating the effectiveness of compressed sensing and distributed compressed sensing using real datasets show their potential for efficient utilization of sensing and overall energy costs in wireless sensor networks. It is shown that, for some applications, compressed sensing and distributed compressed sensing can provide greater energy efficiency than transform coding and model-based adaptive sensing in wireless sensor networks.
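As a rough illustration of the compressed-sensing idea the paper evaluates, the sketch below recovers a sparse signal from far fewer random measurements than samples, using textbook orthogonal matching pursuit. The dimensions, sensing matrix, and recovery routine are invented for illustration; this is not the authors' experimental setup:

```python
import numpy as np

def omp(Phi, y, k):
    """Orthogonal Matching Pursuit (textbook sketch, not from the paper):
    recover a k-sparse x from underdetermined measurements y = Phi @ x."""
    residual, support = y.copy(), []
    for _ in range(k):
        # pick the column most correlated with the current residual
        j = int(np.argmax(np.abs(Phi.T @ residual)))
        support.append(j)
        # least-squares fit on the selected columns, then update residual
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coef
    x = np.zeros(Phi.shape[1])
    x[support] = coef
    return x

rng = np.random.default_rng(1)
n, m, k = 256, 64, 5                 # ambient dim, measurements, sparsity
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
Phi = rng.standard_normal((m, n)) / np.sqrt(m)
y = Phi @ x_true                     # only m << n values sensed/transmitted
x_hat = omp(Phi, y, k)
```

The energy argument in the paper is that sensing only `m` projections instead of `n` samples saves acquisition energy whenever the sensor, not the radio, dominates the power budget.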

  3. Vaccines and IP Rights: A Multifaceted Relationship.

    PubMed

    Durell, Karen

    2016-01-01

    Just as there are many forms of vaccines and components to vaccines-particular compositions, delivery systems, components, and distribution networks-there are a variety of intellectual property (IP) protections applicable for vaccines. IP rights such as patent, copyright, trademarks, plant breeders' rights, and trade secrets may all be applicable to vaccines. Thus, discussion of IP rights and vaccines should not begin and end with the application of one IP right to a vaccine. The discussion should engage considerations of multiple IP rights applicable to a vaccine and how these can be utilized in an integrated manner in a strategy aimed at supporting the development and distribution of the vaccine. Such an approach to IP rights to vaccines allows for the integrated rights to be considered in light of the justifications for protecting vaccines with IP rights, as well as the issues relating to specific IP rights for vaccines, such as compulsory license regimes, available humanitarian purpose IP credits, etc. To view vaccines as the subject of multiple IP protections involves a refocusing, but the outcome can provide significant benefits for vaccine development and distribution.

  4. Interpreting drinking water quality in the distribution system using Dempster-Shafer theory of evidence.

    PubMed

    Sadiq, Rehan; Rodriguez, Manuel J

    2005-04-01

    Interpreting water quality data routinely generated for control and monitoring purposes in water distribution systems is a complicated task for utility managers. In fact, data for diverse water quality indicators (physico-chemical and microbiological) are generated at different times and at different locations in the distribution system. To simplify and improve the understanding and interpretation of water quality, methodologies for the aggregation and fusion of data must be developed. In this paper, the Dempster-Shafer theory, also called the theory of evidence, is introduced as a potential methodology for interpreting water quality data. The conceptual basis of this methodology and the process for its implementation are presented through two applications. The first application deals with the fusion of spatial water quality data, while the second deals with the development of a water quality index based on key monitored indicators. Based on the obtained results, the authors discuss the potential contribution of the theory of evidence as a decision-making tool for water quality management.
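Dempster's rule of combination, the core fusion step in the theory of evidence, can be sketched in a few lines. The frame of discernment and the mass assignments below are hypothetical water-quality examples, not values from the paper:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination (standard formulation).
    Mass functions map frozenset hypotheses -> belief mass."""
    combined, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y          # mass assigned to contradictory evidence
    # normalize by the non-conflicting mass
    return {h: v / (1.0 - conflict) for h, v in combined.items()}

G, M, P = "good", "marginal", "poor"
theta = frozenset({G, M, P})           # frame of discernment
# evidence from two water-quality indicators (hypothetical masses)
m_turbidity = {frozenset({G}): 0.6, frozenset({G, M}): 0.3, theta: 0.1}
m_coliform = {frozenset({G}): 0.5, frozenset({M, P}): 0.2, theta: 0.3}
fused = dempster_combine(m_turbidity, m_coliform)
```

Masses on non-singleton sets express ignorance between states; normalizing by 1 − conflict redistributes the mass lost to contradictory evidence.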

  5. Remote voice training: A case study on space shuttle applications, appendix C

    NASA Technical Reports Server (NTRS)

    Mollakarimi, Cindy; Hamid, Tamin

    1990-01-01

    The Tile Automation System includes applications of automation and robotics technology to all aspects of the Shuttle tile processing and inspection system. An integrated set of rapid prototyping testbeds was developed which include speech recognition and synthesis, laser imaging systems, distributed Ada programming environments, distributed relational data base architectures, distributed computer network architectures, multi-media workbenches, and human factors considerations. Remote voice training in the Tile Automation System is discussed. The user is prompted over a headset by synthesized speech for the training sequences. The voice recognition units and the voice output units are remote from the user and are connected by Ethernet to the main computer system. A supervisory channel is used to monitor the training sequences. Discussions include the training approaches as well as the human factors problems and solutions for this system utilizing remote training techniques.

  6. Automated power distribution system hardware. [for space station power supplies

    NASA Technical Reports Server (NTRS)

    Anderson, Paul M.; Martin, James A.; Thomason, Cindy

    1989-01-01

    An automated power distribution system testbed for the space station common modules has been developed. It incorporates automated control and monitoring of a utility-type power system. Automated power system switchgear, control and sensor hardware requirements, hardware design, test results, and potential applications are discussed. The system is designed so that the automated control and monitoring of the power system is compatible with both a 208-V, 20-kHz single-phase AC system and a high-voltage (120 to 150 V) DC system.

  7. A probabilistic approach to photovoltaic generator performance prediction

    NASA Astrophysics Data System (ADS)

    Khallat, M. A.; Rahman, S.

    1986-09-01

    A method for predicting the performance of a photovoltaic (PV) generator based on long term climatological data and expected cell performance is described. The equations for cell model formulation are provided. Use of the statistical model for characterizing the insolation level is discussed. The insolation data is fitted to appropriate probability distribution functions (Weibull, beta, normal). The probability distribution functions are utilized to evaluate the capacity factors of PV panels or arrays. An example is presented revealing the applicability of the procedure.
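The capacity-factor evaluation from a fitted insolation distribution can be sketched as follows. The Weibull parameters, the 1 kW/m² rating point, and the linear clipped PV model are illustrative assumptions, not the paper's cell model:

```python
import numpy as np

def weibull_capacity_factor(c, scale, rated=1.0, n=200_000, seed=2):
    """Capacity factor of a PV array under Weibull-distributed insolation
    (Monte-Carlo sketch; parameters are invented, not from the paper).
    PV output is modeled as linear in insolation, clipped at the rating."""
    rng = np.random.default_rng(seed)
    u = rng.random(n)
    # inverse-CDF sampling of Weibull(shape=c, scale)
    insolation = scale * (-np.log1p(-u)) ** (1.0 / c)
    # capacity factor = mean output as a fraction of rated output
    return float(np.clip(insolation / rated, 0.0, 1.0).mean())

cf = weibull_capacity_factor(c=2.2, scale=0.55)   # insolation in kW/m^2
```

In the paper's procedure the shape and scale parameters would come from fitting long-term climatological insolation records rather than being assumed.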

  8. Towards Noise Tomography and Passive Monitoring Using Distributed Acoustic Sensing

    NASA Astrophysics Data System (ADS)

    Paitz, P.; Fichtner, A.

    2017-12-01

    Distributed Acoustic Sensing (DAS) has the potential to revolutionize the field of seismic data acquisition. Thanks to their cost-effectiveness, fiber-optic cables may have the capability of complementing conventional geophones and seismometers by filling a niche of applications utilizing large amounts of data. Therefore, DAS may serve as an additional tool to investigate the internal structure of the Earth and its changes over time, on scales ranging from hydrocarbon or geothermal reservoirs to the entire globe. Additional potential lies in the large fiber networks already deployed for telecommunication purposes, which could serve as distributed seismic antennas. We investigate theoretically how ambient noise tomography may be used with DAS data. For this we extend the theory of seismic interferometry to the measurement of strain. With numerical, 2D finite-difference examples we investigate the impact of source and receiver effects. We study the effect of heterogeneous source distributions and of cable orientation by assessing similarities and differences to the Green's function. We also compare the interferometric waveforms obtained from strain interferometry to displacement interferometric wave fields obtained with existing methods. Intermediate results show that the obtained interferometric waveforms can be connected to the Green's functions and provide consistent information about the propagation medium. These simulations will be extended to reservoir-scale subsurface structures. Future work will include the application of the theory to real-data examples. The presented research depicts the early stage of a combination of theoretical investigations, numerical simulations, and real-world data applications. We will therefore evaluate the potential and shortcomings of DAS in reservoir monitoring and seismology at the current state, with a long-term vision of global seismic tomography utilizing DAS data from existing fiber-optic cable networks.
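The ambient-noise interferometry principle underlying this work can be illustrated in one dimension: cross-correlating diffuse noise recorded at two receivers recovers the inter-receiver travel time. This is a generic displacement-based sketch with invented numbers, not the authors' strain formulation:

```python
import numpy as np

# Noise interferometry sketch: receiver 2 records the same diffuse noise
# as receiver 1, delayed by the propagation time and slightly corrupted.
rng = np.random.default_rng(4)
n, delay = 4096, 37                      # samples; propagation delay rec1 -> rec2
noise = rng.standard_normal(n)
rec1 = noise
rec2 = np.roll(noise, delay) + 0.1 * rng.standard_normal(n)

# the peak of the cross-correlation sits at the inter-receiver travel time,
# approximating the arrival in the inter-receiver Green's function
xcorr = np.correlate(rec2, rec1, mode="full")
lag = int(np.argmax(xcorr)) - (n - 1)
```

For DAS, the paper's extension replaces the displacement records above with strain measured along the cable, which is where the orientation and source-distribution effects they study enter.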

  9. Assessment of distributed photovoltaic electric-power systems

    NASA Astrophysics Data System (ADS)

    Neal, R. W.; Deduck, P. F.; Marshall, R. N.

    1982-10-01

    A methodology was developed to assess the potential impacts of distributed photovoltaic (PV) systems on electric utility systems, including subtransmission and distribution networks, and was applied to several illustrative examples. The investigations focused upon five specific utilities. Impacts upon utility system operations and generation mix were assessed using accepted utility planning methods in combination with models that simulate PV system performance and life-cycle economics. Impacts on the utility subtransmission and distribution systems were also investigated. The economic potential of distributed PV systems was investigated for ownership by the utility as well as by the individual utility customer.

  10. mGrid: A load-balanced distributed computing environment for the remote execution of the user-defined Matlab code

    PubMed Central

    Karpievitch, Yuliya V; Almeida, Jonas S

    2006-01-01

    Background: Matlab, a powerful and productive language that allows for rapid prototyping, modeling and simulation, is widely used in computational biology. Modeling and simulation of large biological systems often require more computational resources than are available on a single computer. Existing distributed computing environments like the Distributed Computing Toolbox, MatlabMPI, Matlab*G and others allow for the remote (and possibly parallel) execution of Matlab commands with varying support for features like an easy-to-use application programming interface, load-balanced utilization of resources, extensibility over the wide area network, and minimal system administration skill requirements. However, all of these environments require some level of access to participating machines to manually distribute the user-defined libraries that the remote call may invoke. Results: mGrid augments the usual process distribution seen in other similar distributed systems by adding facilities for user code distribution. mGrid's client-side interface is an easy-to-use native Matlab toolbox that transparently executes user-defined code on remote machines (i.e. the user is unaware that the code is executing somewhere else). Run-time variables are automatically packed and distributed with the user-defined code, and automated load-balancing of remote resources enables smooth concurrent execution. mGrid is an open source environment. Apart from the programming language itself, all other components are also open source, freely available tools: light-weight PHP scripts and the Apache web server. Conclusion: Transparent, load-balanced distribution of user-defined Matlab toolboxes and rapid prototyping of many simple parallel applications can now be done with a single easy-to-use Matlab command. Because mGrid utilizes only Matlab, light-weight PHP scripts and the Apache web server, installation and configuration are very simple.
Moreover, the web-based infrastructure of mGrid allows for it to be easily extensible over the Internet. PMID:16539707

  11. mGrid: a load-balanced distributed computing environment for the remote execution of the user-defined Matlab code.

    PubMed

    Karpievitch, Yuliya V; Almeida, Jonas S

    2006-03-15

    Matlab, a powerful and productive language that allows for rapid prototyping, modeling and simulation, is widely used in computational biology. Modeling and simulation of large biological systems often require more computational resources than are available on a single computer. Existing distributed computing environments like the Distributed Computing Toolbox, MatlabMPI, Matlab*G and others allow for the remote (and possibly parallel) execution of Matlab commands with varying support for features like an easy-to-use application programming interface, load-balanced utilization of resources, extensibility over the wide area network, and minimal system administration skill requirements. However, all of these environments require some level of access to participating machines to manually distribute the user-defined libraries that the remote call may invoke. mGrid augments the usual process distribution seen in other similar distributed systems by adding facilities for user code distribution. mGrid's client-side interface is an easy-to-use native Matlab toolbox that transparently executes user-defined code on remote machines (i.e. the user is unaware that the code is executing somewhere else). Run-time variables are automatically packed and distributed with the user-defined code, and automated load-balancing of remote resources enables smooth concurrent execution. mGrid is an open source environment. Apart from the programming language itself, all other components are also open source, freely available tools: light-weight PHP scripts and the Apache web server. Transparent, load-balanced distribution of user-defined Matlab toolboxes and rapid prototyping of many simple parallel applications can now be done with a single easy-to-use Matlab command. Because mGrid utilizes only Matlab, light-weight PHP scripts and the Apache web server, installation and configuration are very simple.
Moreover, the web-based infrastructure of mGrid allows for it to be easily extensible over the Internet.

  12. Utility of the Mantel-Haenszel Procedure for Detecting Differential Item Functioning in Small Samples

    ERIC Educational Resources Information Center

    Fidalgo, Angel M.; Ferreres, Doris; Muniz, Jose

    2004-01-01

    Sample-size restrictions limit the contingency table approaches based on asymptotic distributions, such as the Mantel-Haenszel (MH) procedure, for detecting differential item functioning (DIF) in many practical applications. Within this framework, the present study investigated the power and Type I error performance of empirical and inferential…

  13. 76 FR 51401 - Manufacturer of Controlled Substances; Notice of Application

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-18

    ... Methamphetamine (1105), a basic class of controlled substance listed in schedule II. The above listed controlled... substance. The methamphetamine will not be sold as a commercial product. The company plans to utilize a bulk... substance, and further distribution to its customers. The methamphetamine will not be sold. Any other such...

  14. Application of the remote-sensing communication model to a time-sensitive wildfire remote-sensing system

    Treesearch

    Christopher D. Lippitt; Douglas A. Stow; Philip J. Riggan

    2016-01-01

    Remote sensing for hazard response requires a priori identification of sensor, transmission, processing, and distribution methods to permit the extraction of relevant information in timescales sufficient to allow managers to make a given time-sensitive decision. This study applies and demonstrates the utility of the Remote Sensing Communication...

  15. Renewables-Friendly Grid Development Strategies. Experience in the United States, Potential Lessons for China

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hurlbut, David; Zhou, Ella; Porter, Kevin

    2015-10-01

    This report aims to help China's reform effort by providing a concise summary of experience in the United States with "renewables-friendly" grid management, focusing on experiences that might be applicable to China. It focuses on utility-scale renewables and sets aside issues related to distributed generation.

  16. Automatic Data Distribution for CFD Applications on Structured Grids

    NASA Technical Reports Server (NTRS)

    Frumkin, Michael; Yan, Jerry

    2000-01-01

    Data distribution is an important step in the implementation of any parallel algorithm. The data distribution determines data traffic and utilization of the interconnection network, and affects the overall code efficiency. In recent years a number of data distribution methods have been developed and used in real programs for improving data traffic. We use some of these methods for translating data dependence and affinity relations into data distribution directives. We describe an automatic data alignment and placement tool (ADAPT) which implements these methods, and show its results for some CFD codes (NPB and ARC3D). The algorithms for program analysis and derivation of data distribution implemented in ADAPT are efficient three-pass algorithms. Most have linear complexity, with the exception of some graph algorithms having complexity O(n^4) in the worst case.

  17. Automatic Data Distribution for CFD Applications on Structured Grids

    NASA Technical Reports Server (NTRS)

    Frumkin, Michael; Yan, Jerry

    1999-01-01

    Data distribution is an important step in the implementation of any parallel algorithm. The data distribution determines data traffic and utilization of the interconnection network, and affects the overall code efficiency. In recent years a number of data distribution methods have been developed and used in real programs for improving data traffic. We use some of these methods for translating data dependence and affinity relations into data distribution directives. We describe an automatic data alignment and placement tool (ADAPT) which implements these methods, and show its results for some CFD codes (NPB and ARC3D). The algorithms for program analysis and derivation of data distribution implemented in ADAPT are efficient three-pass algorithms. Most have linear complexity, with the exception of some graph algorithms having complexity O(n^4) in the worst case.

  18. Distributed data transmitter

    DOEpatents

    Brown, Kenneth Dewayne [Grain Valley, MO; Dunson, David [Kansas City, MO

    2006-08-08

    A distributed data transmitter (DTXR) which is an adaptive data communication microwave transmitter having a distributable architecture of modular components, and which incorporates both digital and microwave technology to provide substantial improvements in physical and operational flexibility. The DTXR has application in, for example, remote data acquisition involving the transmission of telemetry data across a wireless link, wherein the DTXR is integrated into and utilizes available space within a system (e.g., a flight vehicle). In a preferred embodiment, the DTXR broadly comprises a plurality of input interfaces; a data modulator; a power amplifier; and a power converter, all of which are modularly separate and distinct so as to be substantially independently physically distributable and positionable throughout the system wherever sufficient space is available.

  19. Distributed data transmitter

    DOEpatents

    Brown, Kenneth Dewayne [Grain Valley, MO; Dunson, David [Kansas City, MO

    2008-06-03

    A distributed data transmitter (DTXR) which is an adaptive data communication microwave transmitter having a distributable architecture of modular components, and which incorporates both digital and microwave technology to provide substantial improvements in physical and operational flexibility. The DTXR has application in, for example, remote data acquisition involving the transmission of telemetry data across a wireless link, wherein the DTXR is integrated into and utilizes available space within a system (e.g., a flight vehicle). In a preferred embodiment, the DTXR broadly comprises a plurality of input interfaces; a data modulator; a power amplifier; and a power converter, all of which are modularly separate and distinct so as to be substantially independently physically distributable and positionable throughout the system wherever sufficient space is available.

  20. Advanced Inverter Functions and Communication Protocols for Distribution Management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nagarajan, Adarsh; Palmintier, Bryan; Baggu, Murali

    2016-05-05

    This paper aims at identifying the advanced features required by distribution management systems (DMS) service providers to bring inverter-connected distributed energy resources into use as an intelligent grid resource. This work explores the standard functions needed in the future DMS for enterprise integration of distributed energy resources (DER). Important DMS functionalities such as DER management in aggregate groups, including the discovery of capabilities, status monitoring, and dispatch of real and reactive power, are addressed in this paper. It is intended to provide the industry with a point of reference for DER integration with other utility applications and to provide guidance to research and standards development organizations.

  1. Advanced algorithms for distributed fusion

    NASA Astrophysics Data System (ADS)

    Gelfand, A.; Smith, C.; Colony, M.; Bowman, C.; Pei, R.; Huynh, T.; Brown, C.

    2008-03-01

    The US Military has been undergoing a radical transition from a traditional "platform-centric" force to one capable of performing in a "Network-Centric" environment. This transformation will place all of the data needed to efficiently meet tactical and strategic goals at the warfighter's fingertips. With access to this information, the challenge of fusing data from across the battlespace into an operational picture for real-time Situational Awareness emerges. In such an environment, centralized fusion approaches will have limited application due to the constraints of real-time communications networks and computational resources. To overcome these limitations, we are developing a formalized architecture for fusion and track adjudication that allows the distribution of fusion processes over a dynamically created and managed information network. This network will support the incorporation and utilization of low level tracking information within the Army Distributed Common Ground System (DCGS-A) or Future Combat System (FCS). The framework is based on Bowman's Dual Node Network (DNN) architecture that utilizes a distributed network of interlaced fusion and track adjudication nodes to build and maintain a globally consistent picture across all assets.

  2. Application of Distribution Transformer Thermal Life Models to Electrified Vehicle Charging Loads Using Monte-Carlo Method: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuss, M.; Markel, T.; Kramer, W.

    Concentrated purchasing patterns of plug-in vehicles may result in localized distribution transformer overload scenarios. Prolonged periods of transformer overloading cause service life decrements and, in worst-case scenarios, result in tripped thermal relays and residential service outages. This analysis will review distribution transformer load models developed in the IEC 60076 standard, and apply the model to a neighborhood with plug-in hybrids. Residential distribution transformers are sized such that night-time cooling provides thermal recovery from heavy load conditions during the daytime utility peak. It is expected that PHEVs will primarily be charged at night in a residential setting. If not managed properly, some distribution transformers could become overloaded, leading to a reduction in transformer life expectancy, thus increasing costs to utilities and consumers. A Monte-Carlo scheme simulated each day of the year, evaluating 100 load scenarios as it swept through the following variables: number of vehicles per transformer, transformer size, and charging rate. A general method for determining expected transformer aging rate will be developed, based on the energy needs of plug-in vehicles loading a residential transformer.
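The Monte-Carlo sweep can be sketched as follows. The household load profile, the crude hot-spot model, and the transformer parameters are invented placeholders, and the aging expression is the IEC 60076-7 relative-aging rate for non-upgraded paper rather than the paper's full thermal model:

```python
import numpy as np

rng = np.random.default_rng(3)

def relative_aging(hotspot_c):
    """IEC 60076-7 relative aging rate, referenced to a 98 C hot spot."""
    return 2.0 ** ((hotspot_c - 98.0) / 6.0)

def daily_aging(n_vehicles, kva=25.0, charge_kw=3.3, scenarios=100):
    """Mean relative aging over random daily load scenarios.
    All load and thermal numbers are invented for illustration."""
    ages = []
    for _ in range(scenarios):
        load = 10.0 + 5.0 * rng.random(24)         # hourly household load, kW
        starts = rng.integers(18, 24, n_vehicles)  # evening charge start hours
        for s in starts:
            load[s:min(s + 4, 24)] += charge_kw    # ~4 h charge window
        per_unit = load / kva
        # crude steady-state hot-spot model: 98 C at rated load
        hotspot = 68.0 + 30.0 * per_unit ** 2
        ages.append(relative_aging(hotspot).mean())
    return float(np.mean(ages))
```

Sweeping `n_vehicles`, `kva`, and `charge_kw` over a year of such days mirrors the variables the paper's scheme explores; a value above 1.0 means the transformer is consuming insulation life faster than its rated condition.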

  3. Transmission and Distribution Efficiency Improvement Research and Development Survey.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brooks, C.L.; Westinghouse Electric Corporation. Advanced Systems Technology.

    The purpose of this study was to identify and quantify those technologies for improving transmission and distribution (T and D) system efficiency that could provide the greatest benefits for utility customers in the Pacific Northwest. Improving the efficiency of transmission and distribution systems offers a potential source of conservation within the utility sector. An extensive review of this field resulted in a list of 49 state-of-the-art technologies and 39 future technologies. Of these, 15 from the former list and 7 from the latter were chosen as the most promising and then submitted to an evaluative test - a modeled sample system for Benton County PUD, a utility with characteristics typical of a BPA customer system. Reducing end-use voltage on secondary distribution systems to decrease the energy consumption of electrical users when possible, called ''Conservation Voltage Reduction,'' was found to be the most cost-effective state-of-the-art technology. Volt-ampere reactive (var) optimization is a similarly cost-effective alternative. The most significant reduction in losses on the transmission and distribution system would be achieved through the replacement of standard transformers with high-efficiency transformers, such as amorphous steel transformers. Of the future technologies assessed, the ''Distribution Static VAR Generator'' appears to have the greatest potential for technological breakthroughs and, therefore in time, commercialization. ''Improved Dielectric Materials,'' with a relatively low cost and high potential for efficiency improvement, warrant R and D consideration. ''Extruded Three-Conductor Cable'' and ''Six- and Twelve-Phase Transmission'' programs provide only limited gains in efficiency and applicability and are therefore the least cost effective.

  4. Gamma-ray momentum reconstruction from Compton electron trajectories by filtered back-projection

    DOE PAGES

    Haefner, A.; Gunter, D.; Plimley, B.; ...

    2014-11-03

    Gamma-ray imaging utilizing Compton scattering has traditionally relied on measuring coincident gamma-ray interactions to map directional information of the source distribution. This coincidence requirement makes it an inherently inefficient process. We present an approach to gamma-ray reconstruction from Compton scattering that requires only a single electron-tracking detector, thus removing the coincidence requirement. From the Compton-scattered electron momentum distribution, our algorithm analytically computes the incident photon's correlated direction and energy distributions. Because this method maps the source energy and location, it is useful in applications where prior information about the source distribution is unknown. We demonstrate this method with electron tracks measured in a scientific Si charge-coupled device. While this method was demonstrated with electron tracks in a Si-based detector, it is applicable to any detector that can measure electron direction and energy, or equivalently the electron momentum. For example, it can increase the sensitivity to obtain energy and direction in gas-based systems that suffer from limited efficiency.

  5. Lambert W function for applications in physics

    NASA Astrophysics Data System (ADS)

    Veberič, Darko

    2012-12-01

    The Lambert W(x) function and its possible applications in physics are presented. The actual numerical implementation in C++ consists of Halley's and Fritsch's iterations with initial approximations based on branch-point expansion, asymptotic series, rational fits, and continued-logarithm recursion.
    Program summary:
    Program title: LambertW
    Catalogue identifier: AENC_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AENC_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: GNU General Public License version 3
    No. of lines in distributed program, including test data, etc.: 1335
    No. of bytes in distributed program, including test data, etc.: 25 283
    Distribution format: tar.gz
    Programming language: C++ (with suitable wrappers it can be called from C, Fortran, etc.); the supplied command-line utility is suitable for other scripting languages like sh, csh, awk, perl, etc.
    Computer: All systems with a C++ compiler.
    Operating system: All Unix flavors, Windows. It might work with others.
    RAM: Small memory footprint, less than 1 MB.
    Classification: 1.1, 4.7, 11.3, 11.9.
    Nature of problem: Find a fast and accurate numerical implementation of the Lambert W function.
    Solution method: Halley's and Fritsch's iterations with initial approximations based on branch-point expansion, asymptotic series, rational fits, and continued-logarithm recursion.
    Additional comments: The distribution file contains the command-line utility lambert-w, Doxygen comments included in the source files, and a Makefile.
    Running time: The tests provided take only a few seconds to run.
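Halley's iteration for the principal branch W0, one of the two iterations the program implements, can be sketched in Python. The initial guesses below are simple stand-ins, not the paper's branch-point, asymptotic, and rational-fit approximations:

```python
import math

def lambert_w(x, tol=1e-12):
    """Principal branch W0 of w * exp(w) = x via Halley's iteration.
    Sketch of the method only; the starting values are crude stand-ins
    for the paper's carefully tuned initial approximations."""
    if x < -1.0 / math.e:
        raise ValueError("W0 is real only for x >= -1/e")
    if x > -0.3:
        w = math.log1p(x)                  # behaves like W0 near 0 and for large x
    else:
        # branch-point expansion near x = -1/e
        w = -1.0 + math.sqrt(2.0 * (1.0 + math.e * x))
    for _ in range(50):
        ew = math.exp(w)
        f = w * ew - x
        # Halley step for f(w) = w*exp(w) - x
        step = f / (ew * (w + 1.0) - (w + 2.0) * f / (2.0 * w + 2.0))
        w -= step
        if abs(step) <= tol * (1.0 + abs(w)):
            break
    return w
```

Each Halley step converges cubically once near the root, which is why the C++ code pairs it with accurate starting approximations to reach full double precision in very few iterations.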

  6. Sputnik: ad hoc distributed computation.

    PubMed

    Völkel, Gunnar; Lausser, Ludwig; Schmid, Florian; Kraus, Johann M; Kestler, Hans A

    2015-04-15

    In bioinformatic applications, computationally demanding algorithms are often parallelized to speed up computation. Nevertheless, setting up computational environments for distributed computation is often tedious. The aim of this project was lightweight, ad hoc setup and fault-tolerant computation requiring only a Java runtime and no administrator rights, while utilizing all CPU cores effectively. The Sputnik framework provides ad hoc distributed computation on the Java Virtual Machine and fully uses all supplied CPU cores. It provides a graphical user interface for deployment setup and a web user interface displaying the status of current computation jobs. Neither a permanent setup nor administrator privileges are required. We demonstrate the utility of our approach on feature selection of microarray data. The Sputnik framework is available on GitHub at http://github.com/sysbio-bioinf/sputnik under the Eclipse Public License. Contact: hkestler@fli-leibniz.de or hans.kestler@uni-ulm.de. Supplementary data are available at Bioinformatics online.

  7. Application Characterization at Scale: Lessons learned from developing a distributed Open Community Runtime system for High Performance Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Landwehr, Joshua B.; Suetterlein, Joshua D.; Marquez, Andres

    2016-05-16

    Since 2012, the U.S. Department of Energy’s X-Stack program has been developing solutions including runtime systems, programming models, languages, compilers, and tools for Exascale system software to address crucial performance and power requirements. Fine-grain programming models and runtime systems show great potential to efficiently utilize the underlying hardware, so they are essential to many X-Stack efforts. An abundance of small tasks can better utilize the vast parallelism available on current and future machines. Moreover, finer tasks can recover faster and adapt better, due to a decrease in state and control. Nevertheless, current applications have been written to exploit old paradigms (such as Communicating Sequential Processes and Bulk Synchronous Parallel processing). To fully utilize the advantages of these new systems, applications need to be adapted to the new paradigms. As part of the applications’ porting process, in-depth characterization studies, focused on both application characteristics and runtime features, need to take place to fully understand the application performance bottlenecks and how to resolve them. This paper presents a characterization study for a novel high-performance runtime system, called the Open Community Runtime, using key HPC kernels as its vehicle. This study has the following contributions: one of the first high-performance, fine-grain, distributed-memory runtime systems implementing the OCR standard (version 0.99a); and a characterization study of key HPC kernels in terms of runtime primitives running in both intra- and inter-node environments. Running on a general-purpose cluster, we have found up to a 1635x relative speed-up for a parallel tiled Cholesky kernel on 128 nodes with 16 cores each, and a 1864x relative speed-up for a parallel tiled Smith-Waterman kernel on 128 nodes with 30 cores.

  8. Simulation of Etching in Chlorine Discharges Using an Integrated Feature Evolution-Plasma Model

    NASA Technical Reports Server (NTRS)

    Hwang, Helen H.; Bose, Deepak; Govindan, T. R.; Meyyappan, M.; Biegel, Bryan (Technical Monitor)

    2002-01-01

    To better utilize its vast collection of heterogeneous resources that are geographically distributed across the United States, NASA is constructing a computational grid called the Information Power Grid (IPG). This paper describes various tools and techniques that we are developing to measure and improve the performance of a broad class of NASA applications when run on the IPG. In particular, we are investigating the areas of grid benchmarking, grid monitoring, user-level application scheduling, and decentralized system-level scheduling.

  9. Quantum-secured blockchain

    NASA Astrophysics Data System (ADS)

    Kiktenko, E. O.; Pozhar, N. O.; Anufriev, M. N.; Trushechkin, A. S.; Yunusov, R. R.; Kurochkin, Y. V.; Lvovsky, A. I.; Fedorov, A. K.

    2018-07-01

    Blockchain is a distributed database which is cryptographically protected against malicious modifications. While promising for a wide range of applications, current blockchain platforms rely on digital signatures, which are vulnerable to attacks by means of quantum computers. The same, albeit to a lesser extent, applies to the cryptographic hash functions used in preparing new blocks, so parties with access to quantum computation would have an unfair advantage in procuring mining rewards. Here we propose a possible solution to the quantum-era blockchain challenge and report an experimental realization of a quantum-safe blockchain platform that utilizes quantum key distribution across an urban fiber network for information-theoretically secure authentication. These results address important questions about the realizability and scalability of quantum-safe blockchains for commercial and governmental applications.

  10. Crowdsourcing applications for public health.

    PubMed

    Brabham, Daren C; Ribisl, Kurt M; Kirchner, Thomas R; Bernhardt, Jay M

    2014-02-01

    Crowdsourcing is an online, distributed problem-solving and production model that uses the collective intelligence of networked communities for specific purposes. Although its use has benefited many sectors of society, it has yet to be fully realized as a method for improving public health. This paper defines the core components of crowdsourcing and proposes a framework for understanding its potential utility in the domain of public health. Four discrete crowdsourcing approaches are described (knowledge discovery and management; distributed human intelligence tasking; broadcast search; and peer-vetted creative production), and a number of potential applications of crowdsourcing for public health science and practice are enumerated.

  11. Bio-Nanobattery Development and Characterization

    NASA Technical Reports Server (NTRS)

    King, Glen C.; Choi, Sang H.; Chu, Sang-Hyon; Kim, Jae-Woo; Watt, Gerald D.; Lillehei, Peter T.; Park, Yeonjoon; Elliott, James R.

    2005-01-01

    A bio-nanobattery is an electrical energy storage device that utilizes organic materials and processes on an atomic, or nanometer, scale. The bio-nanobattery under development at NASA's Langley Research Center provides new capabilities for electrical power generation, storage, and distribution as compared to conventional power storage systems. Most currently available electronic systems and devices rely on a single, centralized power source to supply electrical power to a specified location in the circuit. As electronic devices and associated components continue to shrink in size towards the nanometer scale, a single centralized power source becomes impractical. Small systems such as these will require distributed power elements to reduce Joule heating, to minimize wiring quantities, and to allow autonomous operation of the various functions performed by the circuit. Our research involves the development and characterization of a bio-nanobattery using ferritins reconstituted with both an iron core (Fe-ferritin) and a cobalt core (Co-ferritin). Synthesis and characterization of the Co-ferritin and Fe-ferritin electrodes were performed, including reducing capability and the half-cell electrical potentials. An electrical output of nearly 0.5 V for the battery cell was measured. Ferritins utilizing other metallic cores were also considered to increase the overall electrical output. Two-dimensional ferritin arrays were produced on various substrates to demonstrate the feasibility of a thin-film, nano-scaled power storage system for distributed power storage applications. The bio-nanobattery will be ideal for nanometer-scaled electronic applications, due to its small size, high energy density, and flexible thin-film structure. A five-cell demonstration article was produced for concept verification and bio-nanobattery characterization.
Challenges to be addressed include the development of a multi-layered thin film, increasing the energy density, dry-cell bio-nanobattery development, and selection of ferritin core materials to allow the broadest range of applications. Potential applications for the distributed power system include autonomously operating intelligent chips, flexible thin-film electronic circuits, nanoelectromechanical systems (NEMS), ultra-high-density data storage devices, nanoelectromagnetics, quantum electronic devices, biochips, and nanorobots for medical applications and mechanical nano-fabrication.

  12. On service differentiation in mobile Ad Hoc networks.

    PubMed

    Zhang, Shun-liang; Ye, Cheng-qing

    2004-09-01

    A network model is proposed to support service differentiation for mobile ad hoc networks by combining a fully distributed admission control approach with the DIFS-based differentiation mechanism of IEEE 802.11. It can provide different kinds of QoS (Quality of Service) for various applications. Admission controllers determine a committed bandwidth based on the reserved bandwidth of flows and the resource utilization of the network. Packets are marked on entering the network by markers according to the committed rate. Based on the mark in the packet header, intermediate nodes handle received packets in different manners to provide applications with the QoS corresponding to the pre-negotiated profile. Extensive simulation experiments showed that the proposed mechanism can provide QoS guarantees to assured-service traffic and increase the channel utilization of the network.
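
    The abstract does not specify the marker algorithm; a token bucket is one standard way to mark packets against a committed rate. The Python sketch below illustrates the idea under that assumption (the class name, rates, and IN/OUT labels are illustrative, not from the paper):

```python
class TokenBucketMarker:
    """Marks packets as in- or out-of-profile against a committed rate (sketch)."""

    def __init__(self, rate_bps, bucket_bytes):
        self.rate = rate_bps / 8.0      # committed rate in bytes per second
        self.capacity = bucket_bytes    # burst tolerance
        self.tokens = bucket_bytes      # bucket starts full
        self.last = 0.0                 # time of last marking decision

    def mark(self, size_bytes, now):
        # refill tokens for the elapsed time, capped at bucket capacity
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= size_bytes:
            self.tokens -= size_bytes
            return "IN"   # within committed profile: assured-service handling
        return "OUT"      # exceeds profile: best-effort handling downstream

# 1000 B/s committed rate, 1000-byte burst bucket
marker = TokenBucketMarker(8000, 1000)
```

Intermediate nodes would then consult the IN/OUT mark in the packet header to choose a forwarding treatment, as the abstract describes.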

  13. Aerospace induction motor actuators driven from a 20-kHz power link

    NASA Technical Reports Server (NTRS)

    Hansen, Irving G.

    1990-01-01

    Aerospace electromechanical actuators utilizing induction motors are under development in sizes up to 40 kW. While these actuators have immediate application to the Advanced Launch System (ALS) program, several potential applications are currently under study, including the Advanced Aircraft Program. Several recent advances developed for Space Station Freedom have allowed induction motors to be selected as a first choice for such applications. Among these technologies are bi-directional electronics and high-frequency power distribution techniques. Each of these technologies is discussed, with emphasis on its impact upon induction motor operation.

  14. Application-Level Interoperability Across Grids and Clouds

    NASA Astrophysics Data System (ADS)

    Jha, Shantenu; Luckow, Andre; Merzky, Andre; Erdely, Miklos; Sehgal, Saurabh

    Application-level interoperability is defined as the ability of an application to utilize multiple distributed heterogeneous resources. Such interoperability is becoming increasingly important with increasing volumes of data, multiple sources of data, and multiple resource types. The primary aim of this chapter is to understand different ways in which application-level interoperability can be provided across distributed infrastructure. We achieve this by (i) using the canonical wordcount application, based on an enhanced version of MapReduce that scales out across clusters, clouds, and HPC resources, (ii) establishing how SAGA enables the execution of the wordcount application using MapReduce and other programming models such as Sphere concurrently, and (iii) demonstrating the scale-out of ensemble-based biomolecular simulations across multiple resources. We show user-level control of the relative placement of compute and data, and also provide simple performance measures and analysis of SAGA-MapReduce when using multiple, different, heterogeneous infrastructures concurrently for the same problem instance. Finally, we discuss Azure and some of the system-level abstractions that it provides, and show how it is used to support ensemble-based biomolecular simulations.
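
    The canonical wordcount application reduces to a map step that can be farmed out across resources and a reduce step that merges partial results. A minimal single-process Python sketch of that structure (illustrative only, not the SAGA-MapReduce API) is:

```python
from collections import Counter
from functools import reduce

def map_phase(chunk):
    # map: emit per-word counts for one text chunk (distributable unit of work)
    return Counter(chunk.split())

def reduce_phase(acc, partial):
    # reduce: merge a worker's partial counts into the accumulator
    acc.update(partial)
    return acc

# each chunk would be handled by a different cluster/cloud/HPC resource
chunks = ["the quick brown fox", "the lazy dog", "the fox"]
partials = [map_phase(c) for c in chunks]
totals = reduce(reduce_phase, partials, Counter())
print(totals["the"])  # 3
```

In an interoperable setting, only the dispatch of `map_phase` calls changes; the merge logic is independent of where each chunk was counted.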

  15. A Geographic-Information-Systems-Based Approach to Analysis of Characteristics Predicting Student Persistence and Graduation

    ERIC Educational Resources Information Center

    Ousley, Chris

    2010-01-01

    This study sought to provide empirical evidence regarding the use of spatial analysis in enrollment management to predict persistence and graduation. The research utilized data from the 2000 U.S. Census and applicant records from The University of Arizona to study the spatial distributions of enrollments. Based on the initial results, stepwise…

  16. 77 FR 27428 - Certain Large Diameter Carbon and Alloy Seamless Standard, Line, and Pressure Pipe (Over 41/2

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-10

    ... diameter seamless pipe is used primarily for line applications such as oil, gas, or water pipeline, or utility distribution systems. Seamless pressure pipes are intended for the conveyance of water, steam... (``ASME'') code stress levels. Alloy pipes made to ASTM A-335 standard must be used if temperatures and...

  17. 78 FR 64475 - Certain Large Diameter Carbon and Alloy Seamless Standard, Line, and Pressure Pipe (Over 41/2

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-29

    ... line applications such as oil, gas, or water pipeline, or utility distribution systems. Seamless pressure pipes are intended for the conveyance of water, steam, petrochemicals, chemicals, oil products... stress levels. Alloy pipes made to ASTM A-335 standard must be used if temperatures and stress levels...

  18. 76 FR 66688 - Certain Large Diameter Carbon and Alloy Seamless Standard, Line, and Pressure Pipe (Over 41/2

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-27

    ... diameter seamless pipe is used primarily for line applications such as oil, gas, or water pipeline, or utility distribution systems. Seamless pressure pipes are intended for the conveyance of water, steam... (``ASME'') code stress levels. Alloy pipes made to ASTM A-335 standard must be used if temperatures and...

  19. 76 FR 47555 - Certain Large Diameter Carbon and Alloy Seamless Standard, Line and Pressure Pipe From Japan...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-05

    ... diameter seamless pipe is used primarily for line applications such as oil, gas, or water pipeline, or utility distribution systems. Seamless pressure pipes are intended for the conveyance of water, steam... Engineers (``ASME'') code stress levels. Alloy pipes made to ASTM A-335 standard must be used if...

  20. The Motivation Impact of Open Educational Resources Utilization on Physics Learning Using Quipper School App

    ERIC Educational Resources Information Center

    Sulisworo, Dwi; Sulistyo, Eko Nur; Akhsan, Rifai Nur

    2017-01-01

    The distribution of the education quality in Indonesia is relatively uneven. This affects the quality of secondary school graduates. On the other hand, the national growth of Information Communication Technology usage in Indonesia is very high, including the use of mobile technology. This is an opportunity for the application of OER (Open…

  1. Demonstration of UV LED versatility when paired with molded UV transmitting glass optics to produce unique irradiance patterns

    NASA Astrophysics Data System (ADS)

    Jasenak, Brian

    2017-02-01

    Ultraviolet light-emitting diode (UV LED) adoption is accelerating; UV LEDs are being used in new applications such as UV curing, germicidal irradiation, nondestructive testing, and forensic analysis. In many of these applications, it is critically important to produce a uniform light distribution and consistent surface irradiance. Flat panes of fused quartz, silica, or glass are commonly used to cover and protect UV LED arrays; however, they do not offer the advantages of an optical lens design. An investigation was conducted to determine the effect of a secondary glass optic on the uniformity of the light distribution and irradiance. Glass optics capable of transmitting UV-A, UV-B, and UV-C wavelengths can improve light distribution, uniformity, and intensity. In this work, two simulation studies were created to illustrate distinct irradiance patterns desirable for potential real-world applications. The first study investigates the use of a multi-UV-LED array and optic to create a uniform irradiance pattern on a flat, two-dimensional (2D) target surface. The uniformity was improved by designing both the LED array and the molded optic to produce a homogeneous pattern. The second study investigated the use of an LED light source and molded optic to improve light uniformity on the inside of a canister. This case study illustrates the requirements for careful selection of the LED based on light distribution and subsequent design of the optics. The optic utilizes total internal reflection to create an optimized light distribution. The combination of the LED and molded optic showed significant improvement in uniformity on the inner surface of the canister. The simulations illustrate how the application of optics can significantly improve UV light distribution, which can be critical in applications such as UV curing and sterilization.

  2. Techniques and Tools for Performance Tuning of Parallel and Distributed Scientific Applications

    NASA Technical Reports Server (NTRS)

    Sarukkai, Sekhar R.; VanderWijngaart, Rob F.; Castagnera, Karen (Technical Monitor)

    1994-01-01

    Performance degradation in scientific computing on parallel and distributed computer systems can be caused by numerous factors. In this half-day tutorial we explain the important methodological issues involved in obtaining codes that have good performance potential. We then discuss the possible obstacles to realizing that potential on contemporary hardware platforms, and give an overview of the software tools currently available for identifying performance bottlenecks. Finally, some realistic examples are used to illustrate the actual use and utility of such tools.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Horiike, S.; Okazaki, Y.

    This paper describes a performance estimation tool developed for the modeling and simulation of open distributed energy management systems to support their design. The approach of discrete event simulation with detailed models is adopted for efficient performance estimation. The tool includes basic models constituting a platform, e.g., Ethernet, communication protocol, operating system, etc. Application software is modeled by specifying CPU time, disk access size, communication data size, etc. Different types of system configurations for various system activities can be easily studied. Simulation examples show how the tool is utilized for the efficient design of open distributed energy management systems.

  4. Adaptive Energy Forecasting and Information Diffusion for Smart Power Grids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simmhan, Yogesh; Agarwal, Vaibhav; Aman, Saim

    2012-05-16

    Smart Power Grids exemplify an emerging class of Cyber Physical Applications that exhibit dynamic, distributed and data intensive (D3) characteristics along with an always-on paradigm to support operational needs. Smart Grids are an outcome of instrumentation, such as Phasor Measurement Units and Smart Power Meters, that is being deployed across the transmission and distribution network of electric grids. These sensors provide utilities with improved situation awareness on near-realtime electricity usage by individual consumers, and the power quality and stability of the transmission network.

  5. Application of TIMS data in stratigraphic analysis

    NASA Technical Reports Server (NTRS)

    Lang, H. R.

    1986-01-01

    An in-progress study demonstrates the utility of Thermal Infrared Multispectral Scanner (TIMS) data for unraveling the stratigraphic sequence of a western interior, North American foreland basin. The TIMS data can be used to determine the stratigraphic distribution of minerals that are diagnostic of specific depositional environments. The Thematic Mapper (TM) and TIMS data were acquired in the Wind River/Bighorn area of central Wyoming in November 1982 and July 1983, respectively. Combined image processing, photogeologic, and spectral analysis methods were used to map strata, construct stratigraphic columns, correlate data, and identify mineralogical facies.

  6. Computer programs for calculating pressure distributions including vortex effects on supersonic monoplane or cruciform wing-body-tail combinations with round or elliptical bodies

    NASA Technical Reports Server (NTRS)

    Dillenius, M. F. E.; Nielsen, J. N.

    1979-01-01

    Computer programs are presented which are capable of calculating detailed aerodynamic loadings and pressure distributions acting on pitched and rolled supersonic missile configurations which utilize bodies of circular or elliptical cross sections. The applicable range of angle of attack is up to 20 deg, and the Mach number range is 1.3 to about 2.5. Effects of body and fin vortices are included in the methods, as well as arbitrary deflections of canard or fin panels.

  7. FINAL SCIENTIFIC REPORT Southwest United States of America – Distributed Technology Training Consortia (SWUSA-DTTC) Contract Number: DE-EE0006339

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arnold, Kodie

    The Southwest United States of America – Distributed Technology Training Consortia (SWUSA-DTTC) leveraged the highest concentration of renewable resources in the U.S., as well as operation of the leading independent microgrid installations and other distributed technologies, to collect and analyze real-time data streams, advance power system simulations and analysis, identify educational and training gaps, and develop solutions-focused curricula. The SWUSA-DTTC consortium represented a unique collaboration between universities and utilities to ensure that classes were focused on subjects and topics of interest to the utilities and ones that had practical benefit related to preparedness for accommodating high penetration of solar and other distributed energy technologies. This approach of close collaboration and shared effort to develop the course content and curriculum is unique and a significant departure from conventional course development. The coursework and training were intended to endure over a long time horizon (a 10-20 year time frame) and to include professionals over the entire Southwest region and the rest of the US, with outreach even into foreign countries. Project objectives: In order to support the increase in power systems research, development, and analytical capacity, the SWUSA-DTTC brought together respected professors in power systems education, student/professor research and development, and valuable industry and utility experience. Through this program, the partnered universities created and/or modified existing curricula available to students and professionals in the form of university courses, short courses, videos, consortia-led training, and online materials. During this time, the supporting vendors and utilities provided the SWUSA-DTTC with technical advisory roles as well as input and feedback on utility and related energy industry needs.
The goals were to create power and energy systems training, curricula, and workforce preparedness through the inclusion of data collection and analysis, power systems expertise, and application-specific training activities which build on fundamental principles, modeling and simulation tools, field-immersed training, and methods of performance validation. The intended outcome of the program was a better-prepared and greater number of graduates ready to contribute to the field of power systems, which depends upon the safe, reliable, and efficient generation sources that make up an increasingly diverse mix of renewable power. Additionally, the program was to deliver critical training modules intended to support mid-career professionals and be woven into utility training programs used all over the country. Discontinuation summary: On September 29, 2014, Electricore received notice of discontinuation of federal funding under subject Award DE-EE0006339. DOE's notice of discontinuation was the first indication provided to Electricore that the work completed against Task 1 - Program Planning and Evaluation during Budget Period 1 was inadequate. Significantly, the discontinuation was based in large part on the erroneous assertion that many of the tasks were incomplete as indicated in the Continuation Application. However, the Continuation Application was submitted July 29, 2014, while the period of performance for this last quarter in Budget Period 1 did not end until September 30, 2014. Accordingly, the performance for this last quarter in Budget Period 1 will be reported in the upcoming Quarterly Report, which is not due until October 30, 2014. Nevertheless, in response to DOE's letter and in further support of Electricore's Continuation Application, we are providing a detailed status demonstrating adherence to all program requirements for each Task 1, Budget Period 1 SOPO activity and milestone in response to the DOE review.

  8. Design and Verification of Remote Sensing Image Data Center Storage Architecture Based on Hadoop

    NASA Astrophysics Data System (ADS)

    Tang, D.; Zhou, X.; Jing, Y.; Cong, W.; Li, C.

    2018-04-01

    The data center is a new concept of data processing and application proposed in recent years. It is a new processing method based on data, parallel computing, and compatibility with different hardware clusters. While optimizing the data storage management structure, it fully utilizes cluster computing nodes and improves the efficiency of data-parallel applications. This paper used mature Hadoop technology to build a large-scale distributed image management architecture for remote sensing imagery. Using MapReduce parallel processing technology, many computing nodes are called to process image storage blocks and pyramids in the background, improving the efficiency of image reading and application and solving the need for concurrent multi-user high-speed access to remotely sensed data. The rationality, reliability, and superiority of the system design were verified by testing the storage efficiency for different image data and multiple users, and by analyzing how the distributed storage architecture improves the application efficiency of remote sensing images, through building an actual Hadoop service system.

  9. Automatic Parallelization of Numerical Python Applications using the Global Arrays Toolkit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Daily, Jeffrey A.; Lewis, Robert R.

    2011-11-30

    Global Arrays is a software system from Pacific Northwest National Laboratory that enables an efficient, portable, and parallel shared-memory programming interface to manipulate distributed dense arrays. The NumPy module is the de facto standard for numerical calculation in the Python programming language, a language whose use is growing rapidly in the scientific and engineering communities. NumPy provides a powerful N-dimensional array class as well as other scientific computing capabilities. However, like the majority of the core Python modules, NumPy is inherently serial. Using a combination of Global Arrays and NumPy, we have reimplemented NumPy as a distributed drop-in replacement called Global Arrays in NumPy (GAiN). Serial NumPy applications can become parallel, scalable GAiN applications with only minor source code changes. Scalability studies of several different GAiN applications will be presented, showing the utility of developing serial NumPy codes which can later run on more capable clusters or supercomputers.

  10. Partial Discharge Monitoring on Metal-Enclosed Switchgear with Distributed Non-Contact Sensors.

    PubMed

    Zhang, Chongxing; Dong, Ming; Ren, Ming; Huang, Wenguang; Zhou, Jierui; Gao, Xuze; Albarracín, Ricardo

    2018-02-11

    Metal-enclosed switchgear, which are widely used in the distribution of electrical energy, play an important role in power distribution networks. Their safe operation is directly related to the reliability of the power system as well as the power quality on the consumer side. Partial discharge (PD) detection is an effective way to identify potential faults and can be utilized for insulation diagnosis of metal-enclosed switchgear. The transient earth voltage (TEV) method, an effective non-intrusive method, has substantial engineering application value for estimating the insulation condition of switchgear. However, the practical effectiveness of TEV detection has not been satisfactory because of the lack of an application method for TEV detection with sufficient technical cognition and analysis. This paper proposes an innovative online PD detection system and a corresponding application strategy based on an intelligent-feedback distributed TEV wireless sensor network consisting of sensing, communication, and diagnosis layers. In the proposed system, the TEV signal or status data are wirelessly transmitted to the terminal following low-energy signal preprocessing and acquisition by TEV sensors. A central server then analyzes the correlation of the uploaded data and gives a fault warning level according to quantity, trend, parallel analysis, and phase-resolved partial discharge (PRPD) pattern recognition. In this way, a TEV detection system and strategy with distributed acquisition, unitized fault warning, and centralized diagnosis is realized. The proposed system has positive significance for reducing the fault rate of medium-voltage switchgear and improving its operation and maintenance level.

  11. Utilities bullish on meter-reading technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garner, W.L.

    1995-01-15

    By the end of 1996, the 400,000 customers of Kansas City Power & Light Company (KCPL) will have their electric meters read by a real-time wireless network that will relay electrical consumption readings back to computers at the utility's customer service office. KCPL's executives believe the new radio and cellular network will greatly improve the company's ability to control its power distribution, manage its load requirements, monitor outages, and, in the near future, allow time-of-use and off-peak pricing. The KCPL system represents the first systemwide, commercial application of wireless automated meter reading (AMR) by a U.S. utility. The article also describes other AMR systems for reading water and gas meters, and notes that $18 billion in future power plant investments can be avoided by using time-of-use pricing for residential customers.

  12. Expected Utility Distributions for Flexible, Contingent Execution

    NASA Technical Reports Server (NTRS)

    Bresina, John L.; Washington, Richard

    2000-01-01

    This paper presents a method for using expected utility distributions in the execution of flexible, contingent plans. A utility distribution maps the possible start times of an action to the expected utility of the plan suffix starting with that action. The contingent plan encodes a tree of possible courses of action and includes flexible temporal constraints and resource constraints. When execution reaches a branch point, the eligible option with the highest expected utility at that point in time is selected. The utility distributions make this selection sensitive to the runtime context, yet still efficient. Our approach uses predictions of action duration uncertainty as well as expectations of resource usage and availability to determine when an action can execute and with what probability. Execution windows and probabilities inevitably change as execution proceeds, but such changes do not invalidate the cached utility distributions; thus, dynamic updating of utility information is minimized.
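
    The branch-point selection rule can be illustrated with a toy example. The option names and utility values below are hypothetical, chosen only to show the mechanics of mapping start times to expected utilities and picking the best eligible option at execution time:

```python
# utility distribution: candidate start time -> expected utility of the plan
# suffix beginning with that action (hypothetical values, not from the paper)
option_utilities = {
    "drive_to_rock": {10: 0.9, 20: 0.7, 30: 0.2},
    "take_panorama": {10: 0.5, 20: 0.6, 30: 0.6},
}

def select_option(options, now):
    # at a branch point, pick the eligible option with the highest expected
    # utility at the current time (ties broken alphabetically for determinism)
    eligible = {name: util[now] for name, util in options.items() if now in util}
    return max(sorted(eligible), key=eligible.get)

print(select_option(option_utilities, 10))  # drive_to_rock
print(select_option(option_utilities, 30))  # take_panorama
```

Because the distributions are cached per option, the runtime choice reduces to a table lookup and a comparison, which is what keeps the selection both context-sensitive and cheap.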

  13. Use of prismatic films to control light distribution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kneipp, K.G.

    1994-12-31

    3M prismatic films are finding increasing utility in the construction of new hollow light guide fixtures which capitalize on the unique ways in which these novel materials interact with light. Often, the resulting systems provide features and end-user benefits which are difficult or impossible to achieve by alternative design or construction methods. It is apparent that the benefits may be applied to a wide variety of end-uses, and that the resulting products being developed will find utility in many diverse market areas. With the recognition that creating hollow light guide products and systems requires a substantial resource investment, and because of an existing prominent position in the traffic management market, 3M has decided to focus its current efforts on the development, manufacture, and distribution of value-added products for this market. However, through the sale of these prismatic films, a variety of companies have developed and are manufacturing and distributing other unrelated hollow light guide products which capitalize on the unique capabilities of these films in controlling and distributing light. There appears to be little doubt that the potential applications of this technology will grow both in number and in diversity.

  14. WarpEngine, a Flexible Platform for Distributed Computing Implemented in the VEGA Program and Specially Targeted for Virtual Screening Studies.

    PubMed

    Pedretti, Alessandro; Mazzolari, Angelica; Vistoli, Giulio

    2018-05-21

    The manuscript describes WarpEngine, a novel platform implemented within the VEGA ZZ suite of software for performing distributed simulations in both local and wide area networks. Despite being tailored for structure-based virtual screening campaigns, WarpEngine possesses the flexibility to carry out distributed calculations utilizing various pieces of software, which can be easily encapsulated within the platform without changing their source code. WarpEngine takes advantage of all the cheminformatics features implemented in the VEGA ZZ program as well as its largely customizable scripting architecture, thus allowing an efficient distribution of various time-demanding simulations. To illustrate WarpEngine's potential, the manuscript includes a set of virtual screening campaigns based on the ACE data set of the DUD-E collection, using PLANTS as the docking application. Benchmarking analyses revealed a satisfactory linearity of WarpEngine's performance, with speed-up values roughly equal to the number of utilized cores. Moreover, the computed scalability values showed that a vast majority (i.e., >90%) of the performed simulations benefit from the distributed platform presented here. WarpEngine can be freely downloaded along with the VEGA ZZ program at www.vegazz.net.

  15. Practical comparison of distributed ledger technologies for IoT

    NASA Astrophysics Data System (ADS)

    Red, Val A.

    2017-05-01

    Existing distributed ledger implementations - specifically, several blockchain implementations - embody a cacophony of divergent capabilities augmenting innovations of cryptographic hashes, consensus mechanisms, and asymmetric cryptography in a wide variety of applications. Whether specifically designed for cryptocurrency or otherwise, several distributed ledgers rely upon modular mechanisms such as consensus or smart contracts. These components, however, can vary substantially among implementations; differences involving proof-of-work, practical byzantine fault tolerance, and other consensus approaches exemplify distinct distributed ledger variations. Such divergence results in unique combinations of modules, performance, latency, and fault tolerance. As implementations continue to develop rapidly due to the emerging nature of blockchain technologies, this paper encapsulates a snapshot of sensor and internet of things (IoT) specific implementations of blockchain as of the end of 2016. Several technical risks and divergent approaches preclude standardization of a blockchain for sensors and IoT in the foreseeable future; such issues will be assessed alongside the practicality of IoT applications among Hyperledger, Iota, and Ethereum distributed ledger implementations suggested for IoT. This paper contributes a comparison of existing distributed ledger implementations intended for practical sensor and IoT utilization. A baseline for characterizing distributed ledger implementations in the context of IoT and sensors is proposed. Technical approaches and performance are compared considering IoT size, weight, and power limitations. Consensus and smart contracts, if applied, are also analyzed for the respective implementations' practicality and security. Overall, the maturity of distributed ledgers with respect to sensor and IoT applicability will be analyzed for enterprise interoperability.

  16. Language and Discourse Analysis with Coh-Metrix: Applications from Educational Material to Learning Environments at Scale

    ERIC Educational Resources Information Center

    Dowell, Nia M. M.; Graesser, Arthur C.; Cai, Zhiqiang

    2016-01-01

    The goal of this article is to preserve and distribute the information presented at the LASI (2014) workshop on Coh-Metrix, a theoretically grounded, computational linguistics facility that analyzes texts on multiple levels of language and discourse. The workshop focused on the utility of Coh-Metrix in discourse theory and educational practice. We…

  17. Stability analysis of spacecraft power systems

    NASA Technical Reports Server (NTRS)

    Halpin, S. M.; Grigsby, L. L.; Sheble, G. B.; Nelms, R. M.

    1990-01-01

    The problems in applying standard electric utility models, analyses, and algorithms to the study of the stability of spacecraft power conditioning and distribution systems are discussed. Both single-phase and three-phase systems are considered. Of particular concern are the load and generator models that are used in terrestrial power system studies, as well as the standard assumptions of load and topological balance that lead to the use of the positive sequence network. The standard assumptions regarding relative speeds of subsystem dynamic responses that are made in the classical transient stability algorithm, which forms the backbone of utility-based studies, are examined. The applicability of these assumptions to a spacecraft power system stability study is discussed in detail. In addition to the classical indirect method, the applicability of Liapunov's direct methods to the stability determination of spacecraft power systems is discussed. It is pointed out that while the proposed method uses a solution process similar to the classical algorithm, the models used for the sources, loads, and networks are, in general, more accurate. Some preliminary results are given for a linear-graph, state-variable-based modeling approach to the study of the stability of space-based power distribution networks.

  18. Transforming the market for commercial and industrial distribution transformers: A government, manufacturer, and utility collaboration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DeLaski, A.; Gauthier, J.; Shugars, J.

    Distribution transformers offer a largely untapped opportunity for efficiency improvements in buildings. Application of energy-efficient equipment can reduce transformer losses by about 20%, substantially cutting a facility's total electricity bill and offering typical paybacks of less than three years. Since nearly all of the electricity powering the commercial and industrial sectors is stepped down in voltage by facility-owned distribution transformers, broad application of energy-efficient equipment will lead to huge economy-wide energy and dollar savings as well as associated environmental benefits. This opportunity has led to a multi-party coordinated effort that offers a new model for national partnerships to pursue market transformation. The model, called the Informal Collaborative Model for the purposes of this paper, is characterized by voluntary commitments of multiple stakeholders to carry out key market interventions in a coordinated fashion, but without pooling resources or control. Collaborative participants are joined by a common interest in establishing and expanding the market for a new product, service, or practice that will yield substantial energy savings. This paper summarizes the technical efficiency opportunity available in distribution transformers; discusses the market barriers to widespread adoption of energy-efficient transformers; and details an overall market transformation strategy to address the identified market barriers. The respective roles of each of the diverse players--manufacturers, government agencies, and utility and regional energy efficiency programs--are given particular attention. Each of the organizations involved brings a particular set of tools and capabilities for addressing the market barriers to more efficient transformers.

  19. Foundational Report Series: Advanced Distribution Management Systems for Grid Modernization, High-Level Use Cases for DMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Jianhui; Lu, Xiaonan; Martino, Sal

    Many distribution management system (DMS) projects have achieved limited success because the electric utility did not sufficiently plan for actual use of the DMS functions in the control room environment. As a result, end users were not clear on how to use the new application software in actual production environments with existing, well-established business processes. An important first step in the DMS implementation process is development and refinement of the “to be” business processes. Development of use cases for the required DMS application functions is a key activity that leads to the formulation of the “to be” requirements. It is also an important activity that is needed to develop specifications that are used to procure a new DMS.

  20. Modeling a distributed environment for a petroleum reservoir engineering application with software product line

    NASA Astrophysics Data System (ADS)

    de Faria Scheidt, Rafael; Vilain, Patrícia; Dantas, M. A. R.

    2014-10-01

    Petroleum reservoir engineering is a complex and interesting field that requires large amounts of computational resources to achieve successful results. Usually, software environments for this field are developed without accounting for the interactions and extensibility that reservoir engineers require. In this paper, we present a research effort characterized by the design and implementation of a software product line model for a real distributed reservoir engineering environment. Experimental results indicate the successful application of this approach to the design of a distributed software architecture. In addition, all components of the proposal gave reservoir engineers greater visibility into the organization and its processes.

  1. Time-frequency representation of a highly nonstationary signal via the modified Wigner distribution

    NASA Technical Reports Server (NTRS)

    Zoladz, T. F.; Jones, J. H.; Jong, J.

    1992-01-01

    A new signal analysis technique called the modified Wigner distribution (MWD) is presented. The new signal processing tool has been very successful in determining time-frequency representations of highly non-stationary multicomponent signals in both simulations and trials involving actual Space Shuttle Main Engine (SSME) high frequency data. The MWD departs from the classic Wigner distribution (WD) in that it effectively eliminates the cross coupling among positive frequency components in a multiple component signal. This attribute of the MWD, which prevents the generation of 'phantom' spectral peaks, will undoubtedly increase the utility of the WD for real world signal analysis applications which more often than not involve multicomponent signals.

  2. Determination of the Subcellular Distribution of Liposomes Using Confocal Microscopy.

    PubMed

    Solomon, Melani A

    2017-01-01

    It is being increasingly recognized that therapeutics need to be delivered to specific organelle targets within cells. Liposomes are versatile lipid-based drug delivery vehicles that can be surface-modified to deliver the loaded cargo to specific subcellular locations within the cell. Hence, the development of such technology requires a means of measuring the subcellular distribution possibly by utilizing imaging techniques that can visualize and quantitate the extent of this subcellular localization. The apparent increase of resolution along the Z-axis offered by confocal microscopy makes this technique suitable for such studies. In this chapter, we describe the application of confocal laser scanning microscopy (CLSM) to determine the subcellular distribution of fluorescently labeled mitochondriotropic liposomes.

  3. An Empirical Study of Synchrophasor Communication Delay in a Utility TCP/IP Network

    NASA Astrophysics Data System (ADS)

    Zhu, Kun; Chenine, Moustafa; Nordström, Lars; Holmström, Sture; Ericsson, Göran

    2013-07-01

    Although there is a plethora of literature dealing with Phasor Measurement Unit (PMU) communication delay, there has not been any effort to generalize empirical delay results by identifying the distribution with the best fit. Existing studies typically assume a distribution or simply build on analogies to communication network routing delay. Specifically, this study provides insight into the characterization of the communication delay of both unprocessed PMU data and synchrophasors sorted by a Phasor Data Concentrator (PDC). The results suggest that a bi-modal distribution containing two normal distributions offers the best fit for the delay of the unprocessed data, whereas the delay profile of the sorted synchrophasors resembles a normal distribution. Based on these results, the possibility of evaluating the reliability of a synchrophasor application with respect to a particular choice of PDC timeout is discussed.
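As an illustration of the bi-modal model the study identifies, a two-component normal mixture can be fit to a sample of delay measurements with a minimal expectation-maximization (EM) routine; this sketch is not the authors' code, and the function name and initialization are assumptions:

```python
import numpy as np

def fit_two_normal_mixture(x, iters=200):
    """EM fit of a two-component normal mixture, a minimal sketch of a
    bi-modal delay model. Deterministic init at the sample extremes."""
    x = np.asarray(x, dtype=float)
    mu = np.array([x.min(), x.max()])      # component means
    sigma = np.full(2, x.std() + 1e-9)     # component std. deviations
    w = np.array([0.5, 0.5])               # mixture weights
    for _ in range(iters):
        # E-step: responsibility of each component for each sample
        pdf = np.stack([
            w[k] / (sigma[k] * np.sqrt(2 * np.pi))
            * np.exp(-0.5 * ((x - mu[k]) / sigma[k]) ** 2)
            for k in range(2)
        ])
        r = pdf / pdf.sum(axis=0)
        # M-step: re-estimate weights, means, standard deviations
        n = r.sum(axis=1)
        w = n / len(x)
        mu = (r * x).sum(axis=1) / n
        sigma = np.sqrt((r * (x - mu[:, None]) ** 2).sum(axis=1) / n) + 1e-9
    return w, mu, sigma
```

Comparing the fitted mixture against a single-normal fit (e.g., by log-likelihood) is one way to test whether the unprocessed-data delays are genuinely bi-modal.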

  4. Electric power processing, distribution and control for advanced aerospace vehicles.

    NASA Technical Reports Server (NTRS)

    Krausz, A.; Felch, J. L.

    1972-01-01

    The results of a current study program to develop a rational basis for selection of power processing, distribution, and control configurations for future aerospace vehicles including the Space Station, Space Shuttle, and high-performance aircraft are presented. Within the constraints imposed by the characteristics of power generation subsystems and the load utilization equipment requirements, the power processing, distribution and control subsystem can be optimized by selection of the proper distribution voltage, frequency, and overload/fault protection method. It is shown that, for large space vehicles which rely on static energy conversion to provide electric power, high-voltage dc distribution (above 100 V dc) is preferable to conventional 28 V dc and 115 V ac distribution per MIL-STD-704A. High-voltage dc also has advantages over conventional constant frequency ac systems in many aircraft applications due to the elimination of speed control, wave shaping, and synchronization equipment.

  5. Business Models and Regulation | Distributed Generation Interconnection

    Science.gov Websites

    Utilities and regulators are responding to the growth of distributed generation with new business models and approaches, reflecting the growing role of distributed resources in the electricity system. Examples include distributed solar programs at small utilities such as Groton Utilities; a recording of the related webinar is available.

  6. AUTOMATED UTILITY SERVICE AREA ASSESSMENT UNDER EMERGENCY CONDITIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    G. TOOLE; S. LINGER

    2001-01-01

    All electric utilities serve power to their customers through a variety of functional levels, notably substations. The majority of these components are distribution substations operating at lower voltages, while a small fraction are transmission substations. There is an associated geographical area that encompasses the customers who are served, defined as the service area. Analysis of substation service areas is greatly complicated by several factors: distribution networks are often highly interconnected, which allows a multitude of possible switching operations, and utilities dynamically alter the network topology in order to respond to emergency events. As a result, the service area for a substation can change radically. A utility will generally attempt to minimize the number of customers outaged by switching affected loads to alternate substations. In this manner, all or a portion of a disabled substation's load may be served by one or more adjacent substations. This paper describes a suite of analytical tools developed at Los Alamos National Laboratory (LANL) which address the problem of determining how a utility might respond to such emergency events. The estimated outage areas derived using the tools are overlaid onto other geographical and electrical layers in a geographic information system (GIS) software application. The effects of a power outage on a population, other infrastructures, or other physical features can be inferred from the proximity of these features to the estimated outage area.
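The core reassignment idea (loads of a disabled substation are switched to adjacent substations with spare capacity, and the remainder is outaged) can be captured in a toy model; all names and the greedy policy below are illustrative assumptions, not the LANL tools' actual algorithm:

```python
def reassign_loads(service, adjacency, capacity, failed):
    """Toy switching model: when a substation fails, move each of its
    loads to the first adjacent substation with spare capacity; loads
    that cannot be moved anywhere are outaged.

    service:   dict substation -> list of (load_id, demand)
    adjacency: dict substation -> list of adjacent substations
    capacity:  dict substation -> spare capacity (mutated in place)
    Returns the list of outaged load ids.
    """
    outaged = []
    for load, demand in service[failed]:
        for neighbor in adjacency[failed]:
            if capacity[neighbor] >= demand:
                capacity[neighbor] -= demand
                service[neighbor].append((load, demand))
                break
        else:
            outaged.append(load)   # no neighbor could absorb this load
    service[failed] = []
    return outaged
```

In a GIS workflow, the loads left in `outaged` would define the estimated outage area to overlay on geographic layers.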

  7. Recent advances in redox flow cell storage systems

    NASA Technical Reports Server (NTRS)

    Thaller, L. H.

    1979-01-01

    Several features conceived and incorporated into complete redox systems were discussed; these greatly enhance the system's ability to be kept in proper charge balance, to provide internal voltage regulation, and, in general, to be treated as a true multicell electrochemical system rather than an assembly of single cells wired together. The technology status as it relates to the two application areas of solar photovoltaic/wind systems and distributed energy storage for electric utility applications was addressed. The cost and life advantages of redox systems were also covered.

  8. [Effects of grafting and nitrogen fertilization on melon yield and nitrogen uptake and utilization].

    PubMed

    Xue, Liang; Ma, Zhong Ming; DU, Shao Ping

    2017-06-18

    A split-field design experiment was carried out using two main methods of cultivation (grafting and self-rooted cultivation) and subplots with different nitrogen application levels (0, 120, 240, and 360 kg N·hm⁻²) to investigate the effects of cultivation method and nitrogen application level on the yield and quality of melons, nitrogen transfer, nitrogen distribution, and nitrogen utilization rate. The results showed that melons produced by grafting cultivation had a 7.3% increase in yield and a 0.16%-3.28% decrease in soluble solid content, compared to those produced by self-rooted cultivation. The amount of nitrogen accumulated in melons grafted in the early growth phase was lower than that in self-rooted melons, and higher after fruiting. During harvest, nitrogen accumulation in grafted melon plants was 5.2% higher than that in self-rooted plants, and nitrogen accumulation in fruits was 10.3% higher. Grafting cultivation increased the amount of nitrogen transferred from plants to fruits by 20.9% compared to self-rooted cultivation. Nitrogen distribution in fruits was >80% in grafted melons, whereas that in self-rooted melons was <80%. Under the same level of nitrogen fertilization, melons cultivated by grafting showed a 1.3%-4.2% increase in nitrogen absorption and utilization rate, a 2.73-5.56 kg·kg⁻¹ increase in nitrogen agronomic efficiency, and a 7.39-16.18 kg·kg⁻¹ increase in nitrogen physiological efficiency, compared to self-rooted cultivation. From the combined perspective of commercial melon yield and nitrogen absorption and utilization rate, an applied nitrogen amount of 240 kg·hm⁻² is most suitable for grafting cultivation in this region.

  9. Scaling exponent and dispersity of polymers in solution by diffusion NMR.

    PubMed

    Williamson, Nathan H; Röding, Magnus; Miklavcic, Stanley J; Nydén, Magnus

    2017-05-01

    Molecular mass distribution measurements by pulsed gradient spin echo nuclear magnetic resonance (PGSE NMR) spectroscopy currently require prior knowledge of scaling parameters to convert from polymer self-diffusion coefficient to molecular mass. Reversing the problem, we utilize the scaling relation as prior knowledge to uncover the scaling exponent from within the PGSE data. Thus, the scaling exponent (a measure of polymer conformation and solvent quality) and the dispersity (Mw/Mn) are obtainable from one simple PGSE experiment. The method utilizes constraints and parametric distribution models in a two-step fitting routine involving first the mass-weighted signal and second the number-weighted signal. The method is developed using lognormal and gamma distribution models and tested on experimental PGSE attenuation of the terminal methylene signal and on the sum of all methylene signals of polyethylene glycol in D₂O. Scaling exponent and dispersity estimates agree with known values in the majority of instances, leading to the potential application of the method to polymers for which characterization is not possible with alternative techniques. Copyright © 2017 Elsevier Inc. All rights reserved.

  10. Analysis and Application of Microgrids

    NASA Astrophysics Data System (ADS)

    Yue, Lu

    New trends toward generating electricity locally and utilizing non-conventional or renewable energy sources have attracted increasing interest due to the gradual depletion of conventional fossil fuel energy sources. This new type of power generation is called Distributed Generation (DG), and the energy sources it utilizes are termed Distributed Energy Resources (DERs). With DGs embedded in them, distribution networks evolve from passive to active networks enabling bidirectional power flows. Further incorporating flexible and intelligent controllers and employing future technologies, active distribution networks will become Microgrids. A Microgrid is a small-scale, low-voltage Combined Heat and Power (CHP) supply network designed to supply electrical and heat loads for a small community. To implement Microgrids, a sophisticated Microgrid Management System must be integrated. However, because a Microgrid has multiple DERs integrated and is likely to be deregulated, the ability to perform real-time OPF and economic dispatch over a fast, advanced communication network is necessary. In this thesis, first, problems such as power system modelling, power flow solving, and power system optimization are studied. Then, Distributed Generation and Microgrids are reviewed, including a comprehensive survey of current distributed generation technologies and Microgrid Management Systems. Finally, a computer-based AC optimization method that minimizes the total transmission loss and generation cost of a Microgrid is proposed, along with a wireless communication scheme based on synchronized Code Division Multiple Access (sCDMA). The algorithm is tested on a 6-bus power system and a 9-bus power system.
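The economic dispatch problem mentioned in the abstract can be illustrated with a merit-order sketch: serve the load from the cheapest generators first. This is a deliberately simplified stand-in (the thesis proposes a fuller AC optimization that also minimizes transmission losses); the function name and data layout are assumptions:

```python
def economic_dispatch(load, generators):
    """Merit-order economic dispatch: allocate the load to generators
    in ascending order of marginal cost, up to each unit's capacity.

    generators: list of (marginal_cost, capacity) pairs.
    Returns a list of (marginal_cost, dispatched_power) pairs.
    """
    schedule = []
    remaining = load
    for cost, cap in sorted(generators):
        take = min(cap, remaining)   # dispatch up to this unit's capacity
        schedule.append((cost, take))
        remaining -= take
    if remaining > 1e-9:
        raise ValueError("insufficient generation capacity")
    return schedule
```

A full OPF would add network power-flow constraints and loss terms to this cost-minimization core.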

  11. Generalized Cross Entropy Method for estimating joint distribution from incomplete information

    NASA Astrophysics Data System (ADS)

    Xu, Hai-Yan; Kuo, Shyh-Hao; Li, Guoqi; Legara, Erika Fille T.; Zhao, Daxuan; Monterola, Christopher P.

    2016-07-01

    Obtaining a full joint distribution from individual marginal distributions with incomplete information is a non-trivial task that continues to challenge researchers from various domains including economics, demography, and statistics. In this work, we develop a new methodology, referred to as the "Generalized Cross Entropy Method" (GCEM), that is aimed at addressing this issue. The objective function is proposed to be a weighted sum of divergences between joint distributions and various references. We show that the solution of the GCEM is unique and globally optimal. Furthermore, we illustrate the applicability and validity of the method by utilizing it to recover the joint distribution of a household profile of a given administrative region. In particular, we estimate the joint distribution of household size, household dwelling type, and household home ownership in Singapore. Results show a high-accuracy estimation of the full joint distribution of the household profile under study. Finally, the impact of constraints and weights on the estimation of the joint distribution is explored.
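A classical baseline for the underlying task (recovering a joint distribution consistent with given marginals) is iterative proportional fitting; the paper's GCEM generalizes this by minimizing a weighted sum of divergences to multiple references. The sketch below is the classical baseline, not the authors' method:

```python
import numpy as np

def fit_joint_ipf(row_marginal, col_marginal, iters=100):
    """Iterative proportional fitting (IPF): starting from a uniform
    reference joint, alternately rescale rows and columns until the
    two-way joint matches both target marginals."""
    r = np.asarray(row_marginal, dtype=float)
    c = np.asarray(col_marginal, dtype=float)
    p = np.ones((len(r), len(c))) / (len(r) * len(c))  # uniform reference
    for _ in range(iters):
        p *= (r / p.sum(axis=1))[:, None]   # match the row marginal
        p *= (c / p.sum(axis=0))[None, :]   # match the column marginal
    return p
```

With more than two attributes (e.g., household size, dwelling type, and home ownership), the same rescaling idea extends to a multi-way array with one step per marginal.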

  12. Network bandwidth utilization forecast model on high bandwidth networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoo, Wuchert; Sim, Alex

    With the increasing number of geographically distributed scientific collaborations and the growth in data size, it has become more challenging for users to achieve the best possible network performance on a shared network. We have developed a forecast model to predict expected bandwidth utilization for high-bandwidth wide area networks. The forecast model can improve the efficiency of resource utilization and the scheduling of data movements on high-bandwidth networks to accommodate the ever-increasing data volume of large-scale scientific data applications. A univariate model is developed with STL and ARIMA on SNMP path utilization data. Compared with a traditional approach such as the Box-Jenkins methodology, our forecast model reduces computation time by 83.2%. It also shows resilience against abrupt changes in network usage. The accuracy of the forecast model is within the standard deviation of the monitored measurements.
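The seasonal-plus-autoregressive structure of such a forecast can be sketched with a deliberately simplified stand-in for the STL + ARIMA pipeline: remove a periodic component by per-phase averaging, fit an AR(1) model to the remainder, and extrapolate. The function name and AR(1) choice are illustrative assumptions:

```python
import numpy as np

def forecast_utilization(series, period, steps):
    """Toy seasonal + AR(1) forecast of bandwidth utilization.

    A simplified stand-in for STL decomposition followed by ARIMA:
    the seasonal part is the per-phase mean, the remainder is modeled
    as resid[t] ~ phi * resid[t-1].
    """
    series = np.asarray(series, dtype=float)
    # Seasonal component: mean utilization at each phase of the period
    seasonal = np.array([series[i::period].mean() for i in range(period)])
    reps = len(series) // period + 1
    resid = series - np.tile(seasonal, reps)[:len(series)]
    # Least-squares AR(1) coefficient on the deseasonalized remainder
    phi = np.dot(resid[1:], resid[:-1]) / (np.dot(resid[:-1], resid[:-1]) + 1e-12)
    last = resid[-1]
    forecast = []
    for h in range(1, steps + 1):
        last = phi * last                                  # decay the remainder
        forecast.append(last + seasonal[(len(series) + h - 1) % period])
    return np.array(forecast)
```

A production model would estimate the trend and seasonal components robustly (as STL does) and select ARIMA orders from the data rather than fixing AR(1).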

  13. Network Bandwidth Utilization Forecast Model on High Bandwidth Network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoo, Wucherl; Sim, Alex

    With the increasing number of geographically distributed scientific collaborations and the growth in data size, it has become more challenging for users to achieve the best possible network performance on a shared network. We have developed a forecast model to predict expected bandwidth utilization for high-bandwidth wide area networks. The forecast model can improve the efficiency of resource utilization and the scheduling of data movements on high-bandwidth networks to accommodate the ever-increasing data volume of large-scale scientific data applications. A univariate model is developed with STL and ARIMA on SNMP path utilization data. Compared with a traditional approach such as the Box-Jenkins methodology, our forecast model reduces computation time by 83.2%. It also shows resilience against abrupt changes in network usage. The accuracy of the forecast model is within the standard deviation of the monitored measurements.

  14. The Influence of Peer Reviewer Expertise on the Evaluation of Research Funding Applications

    PubMed Central

    Sullivan, Joanne H.; Glisson, Scott R.

    2016-01-01

    Although the scientific peer review process is crucial to distributing research investments, little has been reported about the decision-making processes used by reviewers. One key attribute likely to be important for decision-making is reviewer expertise. Recent data from an experimental blinded review utilizing a direct measure of expertise has found that closer intellectual distances between applicant and reviewer lead to harsher evaluations, possibly suggesting that information is differentially sampled across subject-matter expertise levels and across information type (e.g. strengths or weaknesses). However, social and professional networks have been suggested to play a role in reviewer scoring. In an effort to test whether this result can be replicated in a real-world unblinded study utilizing self-assessed reviewer expertise, we conducted a retrospective multi-level regression analysis of 1,450 individual unblinded evaluations of 725 biomedical research funding applications by 1,044 reviewers. Despite the large variability in the scoring data, the results are largely confirmatory of work from blinded reviews, by which a linear relationship between reviewer expertise and their evaluations was observed—reviewers with higher levels of self-assessed expertise tended to be harsher in their evaluations. However, we also found that reviewer and applicant seniority could influence this relationship, suggesting social networks could have subtle influences on reviewer scoring. Overall, these results highlight the need to explore how reviewers utilize their expertise to gather and weight information from the application in making their evaluations. PMID:27768760

  15. The AIRS Applications Pipeline, from Identification to Visualization to Distribution

    NASA Astrophysics Data System (ADS)

    Ray, S. E.; Pagano, T. S.; Fetzer, E. J.; Lambrigtsen, B.; Teixeira, J.

    2014-12-01

    The Atmospheric Infrared Sounder (AIRS) on NASA's Aqua spacecraft has been returning daily global observations of Earth's atmospheric constituents and properties since 2002. AIRS provides observations of temperature and water vapor along the atmospheric column and is sensitive to many atmospheric constituents in the mid-troposphere, including carbon monoxide, carbon dioxide and ozone. With a 12-year data record and daily, global observations in near real-time, we are finding that AIRS data can play a role in applications that fall under most of the NASA Applied Sciences focus areas. Currently in development are temperature inversion maps that can potentially correlate to respiratory health problems, dengue fever and West Nile virus outbreak prediction maps, maps that can be used to make assessments of air quality, and maps of volcanic ash burden. This poster communicates the Project's approach to, and efforts to date on, its applications pipeline, which includes identifying applications, utilizing science expertise, hiring outside experts to assist with development and dissemination, visualization along application themes, and leveraging existing NASA data frameworks and organizations to facilitate archiving and distribution. In addition, a new web-based browse tool being developed by the AIRS Project for easy access to application product imagery is also described.

  16. Discovering discovery patterns with Predication-based Semantic Indexing.

    PubMed

    Cohen, Trevor; Widdows, Dominic; Schvaneveldt, Roger W; Davies, Peter; Rindflesch, Thomas C

    2012-12-01

    In this paper we utilize methods of hyperdimensional computing to mediate the identification of therapeutically useful connections for the purpose of literature-based discovery. Our approach, named Predication-based Semantic Indexing, is utilized to identify empirically sequences of relationships known as "discovery patterns", such as "drug x INHIBITS substance y, substance y CAUSES disease z" that link pharmaceutical substances to diseases they are known to treat. These sequences are derived from semantic predications extracted from the biomedical literature by the SemRep system, and subsequently utilized to direct the search for known treatments for a held out set of diseases. Rapid and efficient inference is accomplished through the application of geometric operators in PSI space, allowing for both the derivation of discovery patterns from a large set of known TREATS relationships, and the application of these discovered patterns to constrain search for therapeutic relationships at scale. Our results include the rediscovery of discovery patterns that have been constructed manually by other authors in previous research, as well as the discovery of a set of previously unrecognized patterns. The application of these patterns to direct search through PSI space results in better recovery of therapeutic relationships than is accomplished with models based on distributional statistics alone. These results demonstrate the utility of efficient approximate inference in geometric space as a means to identify therapeutic relationships, suggesting a role of these methods in drug repurposing efforts. In addition, the results provide strong support for the utility of the discovery pattern approach pioneered by Hristovski and his colleagues. Copyright © 2012 Elsevier Inc. All rights reserved.

  17. Discovering discovery patterns with predication-based Semantic Indexing

    PubMed Central

    Cohen, Trevor; Widdows, Dominic; Schvaneveldt, Roger W.; Davies, Peter; Rindflesch, Thomas C.

    2012-01-01

    In this paper we utilize methods of hyperdimensional computing to mediate the identification of therapeutically useful connections for the purpose of literature-based discovery. Our approach, named Predication-based Semantic Indexing (PSI), is utilized to empirically identify sequences of relationships known as “discovery patterns”, such as “drug x INHIBITS substance y, substance y CAUSES disease z”, that link pharmaceutical substances to diseases they are known to treat. These sequences are derived from semantic predications extracted from the biomedical literature by the SemRep system, and subsequently utilized to direct the search for known treatments for a held-out set of diseases. Rapid and efficient inference is accomplished through the application of geometric operators in PSI space, allowing for both the derivation of discovery patterns from a large set of known TREATS relationships, and the application of these discovered patterns to constrain search for therapeutic relationships at scale. Our results include the rediscovery of discovery patterns that have been constructed manually by other authors in previous research, as well as the discovery of a set of previously unrecognized patterns. The application of these patterns to direct search through PSI space results in better recovery of therapeutic relationships than is accomplished with models based on distributional statistics alone. These results demonstrate the utility of efficient approximate inference in geometric space as a means to identify therapeutic relationships, suggesting a role for these methods in drug repurposing efforts. In addition, the results provide strong support for the utility of the discovery pattern approach pioneered by Hristovski and his colleagues. PMID:22841748

  18. Distributed Computing Architecture for Image-Based Wavefront Sensing and 2-D FFTs

    NASA Technical Reports Server (NTRS)

    Smith, Jeffrey S.; Dean, Bruce H.; Haghani, Shadan

    2006-01-01

    Image-based wavefront sensing (WFS) provides significant advantages over interferometric wavefront sensors, such as optical design simplicity and stability. However, the image-based approach is computationally intensive, and therefore specialized high-performance computing architectures are required in applications utilizing the image-based approach. The development and testing of these high-performance computing architectures are essential to such missions as the James Webb Space Telescope (JWST), Terrestrial Planet Finder-Coronagraph (TPF-C and CorSpec), and Spherical Primary Optical Telescope (SPOT). These specialized computing architectures require numerous two-dimensional Fourier transforms, which necessitate all-to-all communication when implemented on a distributed computational architecture. Several solutions for distributed computing are presented with an emphasis on a 64-node cluster of DSPs, multiple DSP FPGAs, and an application of low-diameter graph theory. Timing results and performance analysis will be presented. The solutions offered could be applied to other all-to-all communication and computationally complex scientific problems.
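
    The all-to-all step the abstract mentions comes from the row-column decomposition of the 2-D FFT: each node transforms its locally held rows, then a global transpose (corner turn) redistributes the data before the second pass. A single-process sketch of that decomposition:

```python
import numpy as np

def fft2_row_column(x):
    """2-D FFT as two passes of 1-D FFTs with a transpose in between.

    On a cluster, each node would hold a block of rows; the transpose is
    the all-to-all communication step the abstract refers to.
    """
    step1 = np.fft.fft(x, axis=1)      # 1-D FFTs along rows (node-local work)
    step2 = step1.T                    # transpose = all-to-all exchange
    step3 = np.fft.fft(step2, axis=1)  # 1-D FFTs along the new rows
    return step3.T                     # transpose back to original layout

x = np.random.default_rng(1).standard_normal((64, 64))
assert np.allclose(fft2_row_column(x), np.fft.fft2(x))
```

    Because the second pass cannot start until every node has received a block from every other node, the transpose dominates scaling, which is why the paper's low-diameter interconnect topologies matter.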

  19. A Development of Lightweight Grid Interface

    NASA Astrophysics Data System (ADS)

    Iwai, G.; Kawai, Y.; Sasaki, T.; Watase, Y.

    2011-12-01

    To support rapid development of Grid/Cloud-aware applications, we have developed an API to abstract distributed computing infrastructures, based on SAGA (A Simple API for Grid Applications). SAGA, which is standardized in the OGF (Open Grid Forum), defines API specifications for accessing distributed computing infrastructures such as Grid, Cloud, and local computing resources. The Universal Grid API (UGAPI), a set of command line interfaces (CLIs) and APIs, aims to offer a simpler API combining several SAGA interfaces with richer functionalities. The UGAPI CLIs offer typical functionalities required by end users for job management and file access across the different distributed computing infrastructures as well as local computing resources. We have also built a web interface for particle therapy simulation and demonstrated large-scale calculation using the different infrastructures at the same time. In this paper, we present how the web interface based on UGAPI and SAGA achieves more efficient utilization of computing resources across the different infrastructures, with technical details and practical experiences.

  20. Simulation of air velocity in a vertical perforated air distributor

    NASA Astrophysics Data System (ADS)

    Ngu, T. N. W.; Chu, C. M.; Janaun, J. A.

    2016-06-01

    Perforated pipes are utilized to divide a fluid flow into several smaller streams. Uniform flow distribution is of great concern in engineering applications because it has a significant influence on the performance of fluidic devices. For industrial applications, it is crucial to provide a uniform velocity distribution through the orifices. In this research, the flow distribution patterns of a closed-end, multiple-outlet pipe standing vertically and delivering air in the horizontal direction were simulated. Computational Fluid Dynamics (CFD), a research tool for enhancing and understanding design, was used as the simulator, and the drawing software SolidWorks was used for geometry setup. The main purpose of this work is to establish the influence of orifice size, intervals between outlets, and tube length on the uniformity of exit flows through a multi-outlet perforated tube. However, because the gravitational effect makes the compactness of the paddy increase gradually from the top to the bottom of the dryer, uniform flow was targeted for the top orifices and larger flow for the bottom orifices.

  1. Characterizing and reducing equifinality by constraining a distributed catchment model with regional signatures, local observations, and process understanding

    NASA Astrophysics Data System (ADS)

    Kelleher, Christa; McGlynn, Brian; Wagener, Thorsten

    2017-07-01

    Distributed catchment models are widely used tools for predicting hydrologic behavior. While distributed models require many parameters to describe a system, they are expected to simulate behavior that is more consistent with observed processes. However, obtaining a single set of acceptable parameters can be problematic, as parameter equifinality often results in several behavioral sets that fit observations (typically streamflow). In this study, we investigate the extent to which equifinality impacts a typical distributed modeling application. We outline a hierarchical approach to reduce the number of behavioral sets based on regional, observation-driven, and expert-knowledge-based constraints. For our application, we explore how each of these constraint classes reduced the number of behavioral parameter sets and altered distributions of spatiotemporal simulations, simulating a well-studied headwater catchment, Stringer Creek, Montana, using the distributed hydrology-soil-vegetation model (DHSVM). As a demonstrative exercise, we investigated model performance across 10 000 parameter sets. Constraints on regional signatures, the hydrograph, and two internal measurements of snow water equivalent time series reduced the number of behavioral parameter sets but still left a small number with similar goodness of fit. This subset was ultimately further reduced by incorporating pattern expectations of groundwater table depth across the catchment. Our results suggest that utilizing a hierarchical approach based on regional datasets, observations, and expert knowledge to identify behavioral parameter sets can reduce equifinality and bolster more careful application and simulation of spatiotemporal processes via distributed modeling at the catchment scale.
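
    The hierarchical reduction of behavioral sets can be sketched as successive filtering of candidate parameter sets by constraint class; the two parameters and the thresholds below are illustrative stand-ins, not DHSVM parameters:

```python
import numpy as np

rng = np.random.default_rng(3)
# 10,000 hypothetical parameter sets, each with two toy parameters in [0, 1]
params = rng.uniform(0.0, 1.0, size=(10_000, 2))

# Hierarchy of constraint classes: regional signature, local observation,
# expert knowledge. Each is a predicate on one parameter set; the
# thresholds here are made up for illustration.
constraints = [
    ("regional runoff-ratio signature", lambda p: abs(p[0] - 0.5) < 0.2),
    ("observed SWE fit",                lambda p: p[1] > 0.3),
    ("expert water-table pattern",      lambda p: p[0] + p[1] < 1.2),
]

behavioral = params
for name, ok in constraints:
    behavioral = np.array([p for p in behavioral if ok(p)])
    print(f"after {name}: {len(behavioral)} behavioral sets remain")
```

    Each constraint class shrinks the behavioral set; the paper's point is that observation- and expert-based constraints remove parameter sets that fit streamflow equally well, which goodness-of-fit alone cannot separate.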

  2. A consortium approach to commercialized Westinghouse solid oxide fuel cell technology

    NASA Astrophysics Data System (ADS)

    Casanova, Allan

    Westinghouse is developing its tubular solid oxide fuel cells (SOFCs) for a variety of applications in stationary power generation markets. By pressurizing a SOFC and integrating it with a gas turbine (GT), power systems with efficiencies as high as 70-75% can be obtained. The first such system will be tested in 1998. Because of their extraordinarily high efficiency (60-70%) even in small sizes the first SOFC products to be offered are expected to be integrated SOFC/GT power systems in the 1-7 MW range, for use in the emerging distributed generation (DG) market segment. Expansion into larger sizes will follow later. Because of their modularity, environmental friendliness and expected cost effectiveness, and because of a worldwide thrust towards utility deregulation, a ready market is forecasted for baseload distributed generation. Assuming Westinghouse can complete its technology development and reach its cost targets, the integrated SOFC/GT power system is seen as a product with tremendous potential in the emerging distributed generation market. While Westinghouse has been a leader in the development of power generation technology for over a century, it does not plan to manufacture small gas turbines. However, GTs small enough to integrate with SOFCs and address the 1-7 MW market are generally available from various manufacturers. Westinghouse will need access to a new set of customers as it brings baseload plants to the present small market mix of emergency and peaking power applications. Small cogeneration applications, already strong in some parts of the world, are also gaining ground everywhere. Small GT manufacturers already serve this market, and alliances and partnerships can enhance SOFC commercialization. Utilities also serve the DG market, especially those that have set up energy service companies and seek to grow beyond the legal and geographical confines of their current regulated business. 
Because fuel cells in general are a new product, because small baseload applications are a new segment, and because deregulation will continue to shake up the mature traditional power generation market, the commercial risks of launching a new product at this time are unique and considerable. Hence, a collaborative approach to commercialization is deemed desirable and appropriate, and collaboration with GT manufacturers and utilities will be addressed in this paper.

  3. Development and application of a real-time testbed for multiagent system interoperability: A case study on hierarchical microgrid control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cintuglu, Mehmet Hazar; Youssef, Tarek; Mohammed, Osama A.

    This article presents the development and application of a real-time testbed for multiagent system interoperability. As utility independent private microgrids are installed constantly, standardized interoperability frameworks are required to define behavioral models of the individual agents for expandability and plug-and-play operation. In this paper, we propose a comprehensive hybrid agent framework combining the foundation for intelligent physical agents (FIPA), IEC 61850, and data distribution service (DDS) standards. The IEC 61850 logical node concept is extended using FIPA based agent communication language (ACL) with application specific attributes and deliberative behavior modeling capability. The DDS middleware is adopted to enable a real-time publisher-subscriber interoperability mechanism between platforms. The proposed multi-agent framework was validated in a laboratory based testbed involving developed intelligent electronic device (IED) prototypes and actual microgrid setups. Experimental results were demonstrated for both decentralized and distributed control approaches. Secondary and tertiary control levels of a microgrid were demonstrated for decentralized hierarchical control case study. A consensus-based economic dispatch case study was demonstrated as a distributed control example. Finally, it was shown that the developed agent platform is industrially applicable for actual smart grid field deployment.

  4. Development and application of a real-time testbed for multiagent system interoperability: A case study on hierarchical microgrid control

    DOE PAGES

    Cintuglu, Mehmet Hazar; Youssef, Tarek; Mohammed, Osama A.

    2016-08-10

    This article presents the development and application of a real-time testbed for multiagent system interoperability. As utility independent private microgrids are installed constantly, standardized interoperability frameworks are required to define behavioral models of the individual agents for expandability and plug-and-play operation. In this paper, we propose a comprehensive hybrid agent framework combining the foundation for intelligent physical agents (FIPA), IEC 61850, and data distribution service (DDS) standards. The IEC 61850 logical node concept is extended using FIPA based agent communication language (ACL) with application specific attributes and deliberative behavior modeling capability. The DDS middleware is adopted to enable a real-time publisher-subscriber interoperability mechanism between platforms. The proposed multi-agent framework was validated in a laboratory based testbed involving developed intelligent electronic device (IED) prototypes and actual microgrid setups. Experimental results were demonstrated for both decentralized and distributed control approaches. Secondary and tertiary control levels of a microgrid were demonstrated for decentralized hierarchical control case study. A consensus-based economic dispatch case study was demonstrated as a distributed control example. Finally, it was shown that the developed agent platform is industrially applicable for actual smart grid field deployment.

  5. Document for 270 Volt Direct Current (270 V dc) System

    NASA Astrophysics Data System (ADS)

    1992-09-01

    The paper presents the technical design and application information established by the SAE Aerospace Recommended Practice concerning the generation, distribution, control, and utilization of aircraft 270 V dc electrical power systems and support equipment. Also presented are references and definitions making it possible to compare various electrical systems and components. A diagram of the generic 270 V dc high-voltage system is included.

  6. Utility-Scale Solar Power Converter: Agile Direct Grid Connect Medium Voltage 4.7-13.8 kV Power Converter for PV Applications Utilizing Wide Band Gap Devices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    Solar ADEPT Project: Satcon is developing a compact, lightweight power conversion device that is capable of taking utility-scale solar power and outputting it directly into the electric utility grid at distribution voltage levels—eliminating the need for large transformers. Transformers “step up” the voltage of the power that is generated by a solar power system so it can be efficiently transported through transmission lines and eventually “stepped down” to usable voltages before it enters homes and businesses. Power companies step up the voltage because less electricity is lost along transmission lines when the voltage is high and current is low. Satcon’s new power conversion devices will eliminate these heavy transformers and connect a utility-scale solar power system directly to the grid. Satcon’s modular devices are designed to ensure reliability—if one device fails it can be bypassed and the system can continue to run.

  7. Applications of Intelligent Technology to Power System Supervisory Control and Protection Systems

    NASA Astrophysics Data System (ADS)

    Nagata, Takeshi

    Power system supervisory control and protection systems provide utilities with capabilities that are key to a core business function, i.e., delivering power in a reliable and safe manner. A quality system solution is central to effective operation of a utility's most critical and costly generation, transmission, and distribution assets. The challenging issues for these systems today are not the same as they were a few years ago. Today, much more emphasis is placed on integration, the use of new IT technologies, and access to information for more purposes. This article presents applications of intelligent technology to power system supervisory control and protection systems.

  8. CHANGES IN LIPID-ENCAPSULATED MICROBUBBLE POPULATION DURING CONTINUOUS INFUSION AND METHODS TO MAINTAIN CONSISTENCY

    PubMed Central

    KAYA, MEHMET; GREGORY, THOMAS S.; DAYTON, PAUL A.

    2009-01-01

    Stabilized microbubbles are utilized as ultrasound contrast agents. These micron-sized gas capsules are injected into the bloodstream to provide contrast enhancement during ultrasound imaging. Some contrast imaging strategies, such as destruction-reperfusion, require a continuous injection of microbubbles over several minutes. Most quantitative imaging strategies rely on the ability to administer a consistent dose of contrast agent. Because of the buoyancy of these gas-filled agents, their spatial distribution within a syringe changes over time. The population of microbubbles that is pumped from a horizontal syringe outlet differs from the initial population as the microbubbles float to the syringe top. In this manuscript, we study the changes in the population of a contrast agent that is pumped from a syringe due to microbubble flotation. Results are presented in terms of change in concentration and change in mean diameter, as a function of time, suspension medium, and syringe diameter. Data illustrate that the distribution of contrast agents injected from a syringe changes in both concentration and mean diameter over several minutes without mixing. We discuss the application of a mixing system and viscosity agents to keep the contrast solution more evenly distributed in a syringe. These results are significant for researchers utilizing microbubble contrast agents in continuous-infusion applications where it is important to maintain a consistent contrast agent delivery rate, or in situations where the injection syringe cannot be mixed immediately prior to administration. PMID:19632760
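
    The size-dependent drift in the syringe population follows from buoyant rise velocity scaling with the square of bubble radius (Stokes regime); a back-of-envelope sketch with assumed water-like suspension properties, not the paper's measured values:

```python
# Stokes rise velocity for a buoyant sphere: v = 2 (rho_l - rho_g) g r^2 / (9 mu)
# Illustrative property values for a water-like suspension (assumed)
g = 9.81             # m/s^2
rho_liquid = 1000.0  # kg/m^3
rho_gas = 1.2        # kg/m^3, negligible next to the liquid
mu = 1.0e-3          # Pa*s, dynamic viscosity of the suspension

def rise_velocity(radius_m):
    return 2.0 * (rho_liquid - rho_gas) * g * radius_m**2 / (9.0 * mu)

for d_um in (1.0, 2.0, 4.0):
    v = rise_velocity(d_um * 1e-6 / 2)
    print(f"{d_um:.0f} um bubble rises ~{v * 1e6:.2f} um/s")
```

    Doubling the diameter quadruples the rise speed, so large bubbles stratify first and the mean diameter of the pumped population drifts over minutes, consistent with the paper's observation; raising viscosity (mu) or mixing counteracts this.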

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Narang, David; Ayyanar, Raja; Gemin, Paul

    APS’s renewable energy portfolio, driven in part by Arizona’s Renewable Energy Standard (RES), currently includes more than 1100 MW of installed capacity, equating to roughly 3000 GWh of annual production. Overall renewable production is expected to grow to 6000 GWh by 2025. It is expected that distributed photovoltaics, driven primarily by lower cost, will contribute much of this growth and that by 2025, distributed installations will account for half of all renewable production (3000 GWh). As solar penetration increases, additional analysis may be required for routine utility processes to ensure continued safe and reliable operation of the electric distribution network. Such processes include residential or commercial interconnection requests and load shifting during normal feeder operations. Circuits with existing high solar penetration will also have to be studied and results will need to be evaluated for adherence to utility practices or strategy. Increased distributed PV penetration may offer benefits such as load offsetting, but it also has the potential to adversely impact distribution system operation. These effects may be exacerbated by the rapid variability of PV production. Detailed effects of these phenomena in distributed PV applications continue to be studied. Comprehensive, high-resolution electrical models of the distribution system were developed to analyze the impacts of PV on distribution circuit protection systems (including coordination and anti-islanding), predict voltage regulation and phase balance issues, and develop volt/VAr control schemes. Modeling methods were refined by validating against field measurements. To augment the field measurements, methods were developed to synthesize high-resolution load and PV generation data to facilitate quasi-static time series simulations. 
The models were then extended to explore boundary conditions for PV hosting capability of the feeder and to simulate common utility practices such as feeder reconfiguration. The modeling and analysis methodology was implemented using open source tools and a process was developed to aid utility engineers in future interconnection requests. Methods to increase PV hosting capacity were also explored during the course of the study. A 700 kVA grid-supportive inverter was deployed on the feeder and each grid support mode was demonstrated. Energy storage was explored through simulation and models were developed to calculate the optimum size and placement needed to increase PV hosting capacity. A tool was developed to aid planners in assigning relative costs and benefits to various strategies for increasing PV hosting capacity beyond current levels. Following the completion of the project, APS intends to use the tools and methods to improve the framework of future PV integration on its system. The tools and methods are also expected to aid other utilities to accelerate distributed PV deployment.

  10. Foundational Report Series: Advanced Distribution Management Systems for Grid Modernization, Business Case Calculations for DMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Xiaonan; Singh, Ravindra; Wang, Jianhui

    Distribution Management System (DMS) applications require a substantial commitment of technical and financial resources. In order to proceed beyond limited-scale demonstration projects, utilities must have a clear understanding of the business case for committing these resources that recognizes the total cost of ownership. Many of the benefits provided by investments in DMSs do not translate easily into monetary terms, making cost-benefit calculations difficult. For example, Fault Location Isolation and Service Restoration (FLISR) can significantly reduce customer outage duration and improve reliability. However, there is no well-established and universally-accepted procedure for converting these benefits into monetary terms that can be compared directly to investment costs. This report presents a methodology to analyze the benefits and costs of DMS applications as fundamental to the business case.
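
    Why monetization is the crux can be seen in even a back-of-envelope business case: a FLISR reliability gain enters the calculation only once an assumed value of lost load converts avoided outage energy into dollars. All figures below are hypothetical, not from the report:

```python
# Back-of-envelope DMS business-case sketch (all inputs assumed):
# monetize a FLISR-driven reliability gain via value of lost load (VOLL),
# then compare discounted annual benefits to total cost of ownership.
voll = 10.0            # $/kWh value of lost load (assumed)
ens_avoided = 50_000   # kWh/year of energy-not-served avoided by FLISR (assumed)
opex = 200_000         # $/year licenses + support (assumed)
capex = 1_500_000      # $ up-front deployment cost (assumed)
rate, years = 0.07, 10 # discount rate and evaluation horizon

# Net present value over the horizon
npv = -capex + sum((voll * ens_avoided - opex) / (1 + rate) ** t
                   for t in range(1, years + 1))
print(f"{years}-year NPV: ${npv:,.0f}")
```

    With these made-up numbers the project clears its capital cost, but the conclusion flips entirely with the VOLL assumption, which is exactly the difficulty the report addresses.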

  11. A Distributed Dynamic Programming-Based Solution for Load Management in Smart Grids

    NASA Astrophysics Data System (ADS)

    Zhang, Wei; Xu, Yinliang; Li, Sisi; Zhou, MengChu; Liu, Wenxin; Xu, Ying

    2018-03-01

    Load management is being recognized as an important option for active user participation in the energy market. Traditional load management methods usually require a centralized powerful control center and a two-way communication network between the system operators and energy end-users. The increasing user participation in smart grids may limit their applications. In this paper, a distributed solution for load management in emerging smart grids is proposed. The load management problem is formulated as a constrained optimization problem aiming at maximizing the overall utility of users while meeting the requirement for load reduction requested by the system operator, and is solved by using a distributed dynamic programming algorithm. The algorithm is implemented via a distributed framework and thus can deliver a highly desired distributed solution. It avoids the required use of a centralized coordinator or control center, and can achieve satisfactory outcomes for load management. Simulation results with various test systems demonstrate its effectiveness.
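
    The operator-requested load reduction problem has the shape of a knapsack-style optimization; a small centralized dynamic-programming sketch of the objective (illustrative curtailment options and utilities, not the paper's distributed algorithm):

```python
# Choose per-user curtailment levels (kW) to meet a requested total
# reduction while minimizing total utility lost. Values are illustrative;
# the paper solves the same objective with a distributed algorithm.
users = [
    # (curtailment options in kW, utility lost per option)
    ([0, 1, 2], [0.0, 0.8, 2.0]),
    ([0, 1, 3], [0.0, 0.5, 2.5]),
    ([0, 2],    [0.0, 1.1]),
]
target = 4  # kW of total reduction requested by the operator

best = {0: 0.0}  # reduction achieved -> minimal utility lost
for options, costs in users:
    nxt = {}
    for achieved, lost in best.items():
        for opt, c in zip(options, costs):
            k = min(achieved + opt, target)  # reduction beyond target is also acceptable
            if k not in nxt or lost + c < nxt[k]:
                nxt[k] = lost + c
    best = nxt

print(f"minimal utility lost to shed {target} kW: {best[target]:.1f}")
```

    Maximizing overall user utility subject to the reduction constraint is equivalent to minimizing utility lost, which is what the table `best` tracks stage by stage; the distributed version spreads these stages across the users themselves.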

  12. New trends in species distribution modelling

    USGS Publications Warehouse

    Zimmermann, Niklaus E.; Edwards, Thomas C.; Graham, Catherine H.; Pearman, Peter B.; Svenning, Jens-Christian

    2010-01-01

    Species distribution modelling has its origin in the late 1970s when computing capacity was limited. Early work in the field concentrated mostly on the development of methods to model effectively the shape of a species' response to environmental gradients (Austin 1987, Austin et al. 1990). The methodology and its framework were summarized in reviews 10–15 yr ago (Franklin 1995, Guisan and Zimmermann 2000), and these syntheses are still widely used as reference landmarks in the current distribution modelling literature. However, enormous advancements have occurred over the last decade, with hundreds – if not thousands – of publications on species distribution model (SDM) methodologies and their application to a broad set of conservation, ecological and evolutionary questions. With this special issue, originating from the third of a set of specialized SDM workshops (2008 Riederalp) entitled 'The Utility of Species Distribution Models as Tools for Conservation Ecology', we reflect on current trends and the progress achieved over the last decade.

  13. An Analysis of Security and Privacy Issues in Smart Grid Software Architectures on Clouds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simmhan, Yogesh; Kumbhare, Alok; Cao, Baohua

    2011-07-09

    Power utilities globally are increasingly upgrading to Smart Grids that use bi-directional communication with the consumer to enable an information-driven approach to distributed energy management. Clouds offer features well suited for Smart Grid software platforms and applications, such as elastic resources and shared services. However, the security and privacy concerns inherent in an information rich Smart Grid environment are further exacerbated by their deployment on Clouds. Here, we present an analysis of security and privacy issues in a Smart Grids software architecture operating on different Cloud environments, in the form of a taxonomy. We use the Los Angeles Smart Grid Project that is underway in the largest U.S. municipal utility to drive this analysis that will benefit both Cloud practitioners targeting Smart Grid applications, and Cloud researchers investigating security and privacy.

  14. Spatially distributed modal signals of free shallow membrane shell structronic system

    NASA Astrophysics Data System (ADS)

    Yue, H. H.; Deng, Z. Q.; Tzou, H. S.

    2008-11-01

    Based on the smart material and structronics technology, distributed sensor and control of shell structures have been rapidly developed for the last 20 years. This emerging technology has been utilized in aerospace, telecommunication, micro-electromechanical systems and other engineering applications. However, distributed monitoring technique and its resulting global spatially distributed sensing signals of shallow paraboloidal membrane shells are not clearly understood. In this paper, modeling of free flexible paraboloidal shell with spatially distributed sensor, micro-sensing signal characteristics, and location of distributed piezoelectric sensor patches are investigated based on a new set of assumed mode shape functions. Parametric analysis indicates that the signal generation depends on modal membrane strains in the meridional and circumferential directions in which the latter is more significant than the former, when all bending strains vanish in membrane shells. This study provides a modeling and analysis technique for distributed sensors laminated on lightweight paraboloidal flexible structures and identifies critical components and regions that generate significant signals.

  15. Entangled-Pair Transmission Improvement Using Distributed Phase-Sensitive Amplification

    NASA Astrophysics Data System (ADS)

    Agarwal, Anjali; Dailey, James M.; Toliver, Paul; Peters, Nicholas A.

    2014-10-01

    We demonstrate the transmission of time-bin entangled photon pairs through a distributed optical phase-sensitive amplifier (OPSA). We utilize four-wave mixing at telecom wavelengths in a 5-km dispersion-shifted fiber OPSA operating in the low-gain limit. Measurements of two-photon interference curves show no statistically significant degradation in the fringe visibility at the output of the OPSA. In addition, coincidence counting rates are higher than direct passive transmission because of constructive interference between amplitudes of input photon pairs and those generated in the OPSA. Our results suggest that application of distributed phase-sensitive amplification to transmission of entangled photon pairs could be highly beneficial towards advancing the rate and scalability of future quantum communications systems.

  16. A non-stationary cost-benefit based bivariate extreme flood estimation approach

    NASA Astrophysics Data System (ADS)

    Qi, Wei; Liu, Junguo

    2018-02-01

    Cost-benefit analysis and flood frequency analysis have been integrated into a comprehensive framework to estimate cost-effective design values. However, previous cost-benefit based extreme flood estimation relies on stationary assumptions and analyzes dependent flood variables separately. A Non-Stationary Cost-Benefit based bivariate design flood estimation (NSCOBE) approach is developed in this study to investigate the influence of non-stationarities in both the dependence of flood variables and the marginal distributions on extreme flood estimation. The dependence is modeled utilizing copula functions. Previous design flood selection criteria are not suitable for NSCOBE since they ignore the time-varying dependence of flood variables. Therefore, a risk calculation approach is proposed based on non-stationarities in both marginal probability distributions and copula functions. A case study with 54 years of observed data is utilized to illustrate the application of NSCOBE. Results show NSCOBE can effectively integrate non-stationarities in both copula functions and marginal distributions into cost-benefit based design flood estimation. It is also found that there is a trade-off between the maximum probability of exceedance calculated from copula functions and that calculated from marginal distributions. This study provides, for the first time, a new approach towards a better understanding of the influence of non-stationarities in both copula functions and marginal distributions on extreme flood estimation, and could be beneficial to cost-benefit based non-stationary bivariate design flood estimation across the world.
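
    The role of the copula in joint flood risk can be sketched with a Gumbel copula: the AND-exceedance probability P(U > u, V > v) = 1 − u − v + C(u, v) grows with the dependence parameter θ, and θ = 1 recovers independence. In the non-stationary setting, θ and the marginal quantiles would themselves vary with time:

```python
import math

def gumbel_copula(u, v, theta):
    """Gumbel copula C(u, v); theta >= 1 controls upper-tail dependence."""
    return math.exp(-(((-math.log(u)) ** theta
                       + (-math.log(v)) ** theta) ** (1.0 / theta)))

def and_exceedance(u, v, theta):
    """P(U > u and V > v): both flood variables exceed their design quantiles."""
    return 1.0 - u - v + gumbel_copula(u, v, theta)

u = v = 0.99  # e.g. 100-year marginal quantiles of flood peak and volume
for theta in (1.0, 2.0, 5.0):   # theta = 1 is independence
    print(f"theta={theta}: joint exceedance = {and_exceedance(u, v, theta):.5f}")
```

    Under independence the joint exceedance is 0.01 × 0.01 = 10⁻⁴; stronger dependence raises it by more than an order of magnitude, which is why ignoring a time trend in θ distorts cost-benefit based design values.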

  17. Distributed computing methodology for training neural networks in an image-guided diagnostic application.

    PubMed

    Plagianakos, V P; Magoulas, G D; Vrahatis, M N

    2006-03-01

    Distributed computing is a process through which a set of computers connected by a network is used collectively to solve a single problem. In this paper, we propose a distributed computing methodology for training neural networks for the detection of lesions in colonoscopy. Our approach is based on partitioning the training set across multiple processors using a parallel virtual machine. In this way, interconnected computers of varied architectures can be used for the distributed evaluation of the error function and gradient values, and, thus, training neural networks utilizing various learning methods. The proposed methodology has large granularity and low synchronization, and has been implemented and tested. Our results indicate that the parallel virtual machine implementation of the training algorithms developed leads to considerable speedup, especially when large network architectures and training sets are used.
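
    Data-parallel training of the kind described rests on the gradient of a sum-of-errors objective decomposing into a sum of per-partition gradients; a minimal sketch with a linear least-squares error function (workers simulated locally, not PVM):

```python
import numpy as np

rng = np.random.default_rng(5)
X = rng.standard_normal((120, 3))                       # toy training inputs
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.standard_normal(120)
w = np.zeros(3)                                         # current weights

def grad(Xp, yp, w):
    # gradient of 0.5 * ||Xp w - yp||^2 over one partition of the data
    return Xp.T @ (Xp @ w - yp)

# Partition the training set across 4 "workers"
parts = np.array_split(np.arange(len(X)), 4)
partial = [grad(X[idx], y[idx], w) for idx in parts]

# The aggregated gradient equals the full-dataset gradient
g_dist = np.sum(partial, axis=0)
assert np.allclose(g_dist, grad(X, y, w))
```

    This is why the methodology has low synchronization cost: workers only exchange one gradient vector per step, regardless of partition size, so speedup grows with network and training-set size.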

  18. A distributed approach for optimizing cascaded classifier topologies in real-time stream mining systems.

    PubMed

    Foo, Brian; van der Schaar, Mihaela

    2010-11-01

    In this paper, we discuss distributed optimization techniques for configuring classifiers in a real-time, informationally-distributed stream mining system. Due to the large volume of streaming data, stream mining systems must often cope with overload, which can lead to poor performance and intolerable processing delay for real-time applications. Furthermore, optimizing over an entire system of classifiers is a difficult task, since changing the filtering process at one classifier can impact the feature values of data arriving at classifiers further downstream (and thus the classification performance achieved by the ensemble), as well as the end-to-end processing delay. To address this problem, this paper makes three main contributions: 1) Based on classification and queuing theoretic models, we propose a utility metric that captures both the performance and the delay of a binary filtering classifier system. 2) We introduce a low-complexity framework for estimating the system utility by observing, estimating, and/or exchanging parameters between the inter-related classifiers deployed across the system. 3) We provide distributed algorithms to reconfigure the system, and analyze the algorithms based on their convergence properties, optimality, information exchange overhead, and rate of adaptation to non-stationary data sources. We provide results using different video classifier systems.

  19. An Optimization-Based Framework for the Transformation of Incomplete Biological Knowledge into a Probabilistic Structure and Its Application to the Utilization of Gene/Protein Signaling Pathways in Discrete Phenotype Classification.

    PubMed

    Esfahani, Mohammad Shahrokh; Dougherty, Edward R

    2015-01-01

    Phenotype classification via genomic data is hampered by small sample sizes that negatively impact classifier design. Utilization of prior biological knowledge in conjunction with training data can improve both classifier design and error estimation via the construction of the optimal Bayesian classifier. In the genomic setting, gene/protein signaling pathways provide a key source of biological knowledge. Although these pathways are neither complete nor regulatory, and have no timing associated with them, they are capable of constraining the set of possible models representing the underlying interaction between molecules. The aim of this paper is to provide a framework and the mathematical tools to transform signaling pathways into prior probabilities governing uncertainty classes of feature-label distributions used in classifier design. Structural motifs extracted from the signaling pathways are mapped to a set of constraints on a prior probability on a Multinomial distribution. Since the Dirichlet distribution is the conjugate prior for the Multinomial distribution, we propose optimization paradigms to estimate its parameters in the Bayesian setting. The performance of the proposed methods is tested on two widely studied pathways: the mammalian cell cycle and a p53 pathway model.

  20. Lamella settler crystallizer

    DOEpatents

    Maimoni, Arturo

    1990-01-01

    A crystallizer which incorporates a lamella settler and which is particularly applicable for use in batteries and power cells for electric vehicles or stationary applications. The lamella settler can be utilized for coarse particle separation or for agglomeration, and is particularly applicable to aluminum-air batteries or power cells for solving the hydrargillite (aluminum-hydroxide) removal problems of such batteries. This invention offers very low energy consumption, turbulence, shear, cost, and maintenance requirements. Thus, due to its low shear and low turbulence, this invention is particularly effective in the control of aluminum hydroxide particle size distribution in the various sections of an aluminum-air system, as well as in other electrochemical systems requiring separation of phases of different densities.

  1. Telerobotic management system: coordinating multiple human operators with multiple robots

    NASA Astrophysics Data System (ADS)

    King, Jamie W.; Pretty, Raymond; Brothers, Brendan; Gosine, Raymond G.

    2003-09-01

    This paper describes an application called the Tele-robotic Management System (TMS) for coordinating multiple operators with multiple robots for applications such as underground mining. TMS utilizes several graphical interfaces to allow the user to define a partially ordered plan for multiple robots. This plan is then converted to a Petri net for execution and monitoring. TMS uses a distributed framework to allow robots and operators to easily integrate with the application. This framework allows robots and operators to join the network and advertise their capabilities through services. TMS then decides whether tasks should be dispatched to a robot or a remote operator based on the services offered by the robots and operators.

  2. Applications of crude incidence curves.

    PubMed

    Korn, E L; Dorey, F J

    1992-04-01

    Crude incidence curves display the cumulative number of failures of interest as a function of time. With competing causes of failure, they are distinct from cause-specific incidence curves that treat secondary types of failures as censored observations. After briefly reviewing their definition and estimation, we present five applications of crude incidence curves to show their utility in a broad range of studies. In some of these applications it is helpful to model survival-time distributions with use of two different time metameters, for example, time from diagnosis and age of the patient. We describe how one can incorporate published vital statistics into the models when secondary types of failure correspond to common causes of death.
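The crude incidence curve described above, which accumulates failures of the cause of interest while accounting for competing causes, can be sketched with a standard nonparametric estimator. This is a minimal illustration of the general technique (an Aalen-Johansen-type cumulative incidence estimator), not code from the paper; the function name and encoding of event codes are assumptions.

```python
def crude_incidence(times, events, cause=1):
    # times: event or censoring times; events: 0 = censored, else a cause code.
    # Returns (time, cumulative incidence) pairs for the cause of interest:
    # CIF(t) = sum over event times t_i <= t of S(t_i-) * d_cause / n_at_risk,
    # where S is the overall (all-cause) Kaplan-Meier survival function.
    data = sorted(zip(times, events))
    n = len(data)
    surv = 1.0        # overall survival just before the current time
    cif = 0.0
    out = []
    i = 0
    while i < n:
        t = data[i][0]
        d_all = d_cause = 0
        at_risk = n - i
        while i < n and data[i][0] == t:   # gather ties at time t
            if data[i][1] != 0:
                d_all += 1
                if data[i][1] == cause:
                    d_cause += 1
            i += 1
        cif += surv * d_cause / at_risk
        surv *= 1.0 - d_all / at_risk
        out.append((t, cif))
    return out
```

Unlike a cause-specific Kaplan-Meier complement, this estimator never exceeds the overall failure probability, which is what makes crude incidence curves appropriate under competing risks.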

  3. Time-resolved ion energy and charge state distributions in pulsed cathodic arc plasmas of Nb‑Al cathodes in high vacuum

    NASA Astrophysics Data System (ADS)

    Zöhrer, Siegfried; Anders, André; Franz, Robert

    2018-05-01

    Cathodic arcs have been utilized in various applications including the deposition of thin films and coatings, ion implantation, and high current switching. Despite substantial progress in recent decades, the physical mechanisms responsible for the observed plasma properties are still a matter of dispute, particularly for multi-element cathodes, which can play an essential role in applications. The analysis of plasma properties is complicated by the generally occurring neutral background of metal atoms, which perturbs initial ion properties. By using a time-resolved method in combination with pulsed arcs and a comprehensive Nb‑Al cathode model system, we investigate the influence of cathode composition on the plasma, while making the influence of neutrals visible for the observed time frame. The results visualize ion detections of 600 μs plasma pulses, extracted 0.27 m from the cathode, resolved in mass-per-charge, energy-per-charge and time. Ion properties are found to be strongly dependent on the cathode material in a way that cannot be deduced by simple linear extrapolation. Subsequently, current hypotheses in cathodic arc physics applying to multi-element cathodes, like the so-called ‘velocity rule’ or the ‘cohesive energy rule’, are tested for early and late stages of the pulse. Apart from their fundamental character, the findings could be useful in optimizing or designing plasma properties for applications, by actively utilizing effects on ion distributions caused by composite cathode materials and charge exchange with neutrals.

  4. Mobile healthcare information management utilizing Cloud Computing and Android OS.

    PubMed

    Doukas, Charalampos; Pliakas, Thomas; Maglogiannis, Ilias

    2010-01-01

    Cloud Computing provides functionality for managing information data in a distributed, ubiquitous and pervasive manner supporting several platforms, systems and applications. This work presents the implementation of a mobile system that enables electronic healthcare data storage, update and retrieval using Cloud Computing. The mobile application is developed using Google's Android operating system and provides management of patient health records and medical images (supporting DICOM format and JPEG2000 coding). The developed system has been evaluated using Amazon's S3 cloud service. This article summarizes the implementation details and presents initial results of the system in practice.

  5. Calculations of atmospheric refraction for spacecraft remote-sensing applications

    NASA Technical Reports Server (NTRS)

    Chu, W. P.

    1983-01-01

    Analytical solutions to the refraction integrals appropriate for ray trajectories along slant paths through the atmosphere are derived in this paper. This type of geometry is commonly encountered in remote-sensing applications utilizing an occultation technique. The solutions are obtained by evaluating higher-order terms from expansion of the refraction integral and are dependent on the vertical temperature distributions. Refraction parameters such as total refraction angles, air masses, and path lengths can be accurately computed. It is also shown that the method can be used for computing refraction parameters in astronomical refraction geometry for large zenith angles.

  6. A decision model for planetary missions

    NASA Technical Reports Server (NTRS)

    Hazelrigg, G. A., Jr.; Brigadier, W. L.

    1976-01-01

    Many techniques developed for the solution of problems in economics and operations research are directly applicable to problems involving engineering trade-offs. This paper investigates the use of utility theory for decision making in planetary exploration space missions. A decision model is derived that accounts for the objectives of the mission (science), the cost of flying the mission, and the risk of mission failure. A simulation methodology for obtaining the probability distribution of science value and costs as a function of spacecraft and mission design is presented, and an example application of the decision methodology is given for various potential alternatives in a comet Encke mission.

  7. Classification of cognitive systems dedicated to data sharing

    NASA Astrophysics Data System (ADS)

    Ogiela, Lidia; Ogiela, Marek R.

    2017-08-01

    This paper presents a classification of new cognitive information systems dedicated to cryptographic data splitting and sharing processes. Cognitive processes of semantic data analysis and interpretation will be used to describe new classes of intelligent information and vision systems. In addition, cryptographic data splitting algorithms and cryptographic threshold schemes will be used to improve processes of secure and efficient information management with application of such cognitive systems. The utility of the proposed cognitive sharing procedures and distributed data sharing algorithms will also be presented, along with a few possible applications of cognitive approaches to visual information management and encryption.

  8. Parallel processing for scientific computations

    NASA Technical Reports Server (NTRS)

    Alkhatib, Hasan S.

    1995-01-01

    The scope of this project dealt with the investigation of the requirements to support distributed computing of scientific computations over a cluster of cooperative workstations. Various experiments on computations for the solution of simultaneous linear equations were performed in the early phase of the project to gain experience in the general nature and requirements of scientific applications. A specification of a distributed integrated computing environment, DICE, based on a distributed shared memory communication paradigm has been developed and evaluated. The distributed shared memory model facilitates porting existing parallel algorithms that have been designed for shared memory multiprocessor systems to the new environment. The potential of this new environment is to provide supercomputing capability through the utilization of the aggregate power of workstations cooperating in a cluster interconnected via a local area network. Workstations, generally, do not have the computing power to tackle complex scientific applications, making them primarily useful for visualization, data reduction, and filtering as far as complex scientific applications are concerned. There is a tremendous amount of computing power that is left unused in a network of workstations. Very often a workstation is simply sitting idle on a desk. A set of tools can be developed to take advantage of this potential computing power to create a platform suitable for large scientific computations. The integration of several workstations into a logical cluster of distributed, cooperative, computing stations presents an alternative to shared memory multiprocessor systems. In this project we designed and evaluated such a system.

  9. System approach to distributed sensor management

    NASA Astrophysics Data System (ADS)

    Mayott, Gregory; Miller, Gordon; Harrell, John; Hepp, Jared; Self, Mid

    2010-04-01

    Since 2003, the US Army's RDECOM CERDEC Night Vision Electronic Sensor Directorate (NVESD) has been developing a distributed Sensor Management System (SMS) that utilizes a framework which demonstrates application layer, net-centric sensor management. The core principles of the design support distributed and dynamic discovery of sensing devices and processes through a multi-layered implementation. This results in a sensor management layer that acts as a System with defined interfaces for which the characteristics, parameters, and behaviors can be described. Within the framework, the definition of a protocol is required to establish the rules for how distributed sensors should operate. The protocol defines the behaviors, capabilities, and message structures needed to operate within the functional design boundaries. The protocol definition addresses the requirements for a device (sensors or processes) to dynamically join or leave a sensor network, dynamically describe device control and data capabilities, and allow dynamic addressing of publish and subscribe functionality. The message structure is a multi-tiered definition that identifies standard, extended, and payload representations that are specifically designed to accommodate the need for standard representations of common functions, while supporting the need for feature-based functions that are typically vendor specific. The dynamic qualities of the protocol enable a User GUI application the flexibility of mapping widget-level controls to each device based on reported capabilities in real-time. The SMS approach is designed to accommodate scalability and flexibility within a defined architecture. The distributed sensor management framework and its application to a tactical sensor network will be described in this paper.
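The multi-tiered message structure described above (standard, extended, and payload representations) can be sketched as plain data types. This is purely illustrative: the class names, fields, and message types below are assumptions for exposition, not the NVESD SMS protocol definition.

```python
from dataclasses import dataclass, field
from typing import Any, Dict, Optional

@dataclass
class StandardTier:
    # Fixed-layout representation for common functions shared by all devices.
    msg_type: str      # e.g. "JOIN", "LEAVE", "CAPABILITIES", "SUBSCRIBE" (hypothetical)
    device_id: str     # identity of the sensor or process on the network
    timestamp: float

@dataclass
class ExtendedTier:
    # Feature-based fields reported dynamically, so a GUI can map
    # widget-level controls to each device's advertised capabilities.
    capabilities: Dict[str, Any] = field(default_factory=dict)

@dataclass
class SmsMessage:
    standard: StandardTier                  # always present
    extended: Optional[ExtendedTier] = None # optional, capability-dependent
    payload: bytes = b""                    # vendor-specific data, opaque to the core
```

Separating the tiers this way lets the core protocol parse every message's standard fields while leaving vendor-specific payloads opaque, which matches the scalability goal the abstract describes.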

  10. Nowcasting Induced Seismicity at the Groningen Gas Field in the Netherlands

    NASA Astrophysics Data System (ADS)

    Luginbuhl, M.; Rundle, J. B.; Turcotte, D. L.

    2017-12-01

    The Groningen natural gas field in the Netherlands has recently been a topic of controversy for many residents in the surrounding area. The gas field provides energy for the majority of the country; however, for a minority of Dutch citizens who live nearby, the seismicity induced by the gas field is a cause for major concern. Since the early 2000s, the region has seen an increase in both the number and magnitude of events, the largest of which was a magnitude 3.6 in 2012. Earthquakes of this size and smaller easily cause infrastructural damage to older houses and farms built with single brick walls. Nowcasting is a new method of statistically classifying seismicity and seismic risk. In this paper, the method is applied to the induced seismicity at the natural gas field in Groningen, Netherlands. Nowcasting utilizes the catalogs of seismicity in these regions. Two earthquake magnitudes are selected, one large and one small. The method utilizes the number of small earthquakes that occur between pairs of large earthquakes, and the cumulative probability distribution of these interevent counts is obtained. The earthquake potential score (EPS) is defined by the number of small earthquakes that have occurred since the last large earthquake; the point where this number falls on the cumulative probability distribution of interevent counts defines the EPS. A major advantage of nowcasting is that it utilizes "natural time", earthquake counts between events, rather than clock time. Thus, it is not necessary to decluster aftershocks, and the results remain applicable if the level of induced seismicity varies in time, which it does in this case. The application of natural time to the accumulation of seismic hazard depends on the applicability of Gutenberg-Richter (GR) scaling: the increasing number of small earthquakes that occur after a large earthquake can be scaled to give the risk of a large earthquake occurring.
To illustrate our approach, we apply the method to the Groningen catalog. The applicability of the scaling is illustrated during the rapid build-up of seismicity between 2004 and 2016. It can now be used to forecast the expected reduction in seismicity associated with the reduction in gas production.
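The interevent-count nowcasting described above can be sketched in a few lines. This is a minimal illustration of the general idea under stated assumptions (the magnitude band for "small" events and the empirical-CDF evaluation are assumptions on my part), not the authors' implementation.

```python
def earthquake_potential_score(mags, m_small, m_large):
    # mags: catalog magnitudes in time order.
    # "Small" events satisfy m_small <= m < m_large; "large" events m >= m_large.
    counts, current = [], 0
    for m in mags:
        if m >= m_large:
            counts.append(current)   # close one interevent interval
            current = 0
        elif m >= m_small:
            current += 1             # natural time ticks on small events
    if not counts:
        return None                  # no large events yet, EPS undefined
    # EPS: where the current count since the last large event falls on the
    # empirical cumulative distribution of past interevent counts.
    return sum(1 for c in counts if c <= current) / len(counts)
```

An EPS near 1 means more small earthquakes have accumulated since the last large event than in almost all past interevent intervals, i.e. the region is late in its cycle in natural time.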

  11. Study of application of ERTS-A imagery to fracture related mine safety hazards in the coal mining industry

    NASA Technical Reports Server (NTRS)

    Wier, C. E.; Wobber, F. J. (Principal Investigator); Russell, O. R.; Amato, R. V.

    1973-01-01

    The author has identified the following significant results. The utility of ERTS-1/high altitude aircraft imagery to detect underground mine hazards is strongly suggested. A 1:250,000 scale mined lands map of the Vincennes Quadrangle, Indiana has been prepared. This map is a prototype for a national mined lands inventory and will be distributed to State and Federal offices.

  12. 'Renewables-Friendly' Grid Development Strategies: Experience in the United States, Potential Lessons for China (Chinese Translation) (in Chinese)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hurlbut, David; Zhou, Ella; Porter, Kevin

    2015-10-03

    This is a Chinese translation of NREL/TP-6A20-64940. This report aims to help China's reform effort by providing a concise summary of experience in the United States with 'renewables-friendly' grid management, focusing on experiences that might be applicable to China. It focuses on utility-scale renewables and sets aside issues related to distributed generation.

  13. Raman Microscopy and Imaging: Applications to Skin Pharmacology and Wound Healing

    NASA Astrophysics Data System (ADS)

    Flach, Carol R.; Zhang, Guojin; Mendelsohn, Richard

    The utility of confocal Raman microscopy to study biological events in skin is demonstrated with three examples: (i) monitoring the spatial and structural differences between native and cultured skin; (ii) tracking the permeation and biochemical transformation in skin of a Vitamin E derivative; and (iii) tracking the spatial distribution of three major skin proteins (keratin, collagen, and elastin) during wound healing in an explant skin model.

  14. Novel high power impulse magnetron sputtering enhanced by an auxiliary electrical field

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Chunwei; Tian, Xiubo

    2016-08-15

    The high power impulse magnetron sputtering (HIPIMS) technique is a novel highly ionized physical vapor deposition method with a high application potential. However, the electron utilization efficiency during sputtering is rather low and the metal particle ionization rate needs to be considerably improved to allow for a large-scale industrial application. Therefore, we enhanced the HIPIMS technique by simultaneously applying an electric field (EF-HIPIMS). The effect of the electric field on the discharge process was studied using a current sensor and an optical emission spectrometer. Furthermore, the spatial distribution of the electric potential and electric field during the EF-HIPIMS process was simulated using the ANSYS software. The results indicate that a higher electron utilization efficiency and a higher particle ionization rate could be achieved. The auxiliary anode obviously changed the distribution of the electric potential and the electric field in the discharge region, which increased the plasma density and enhanced the degree of ionization of the vanadium and argon gas. Vanadium films were deposited to further compare both techniques, and the morphology of the prepared films was investigated by scanning electron microscopy. The films showed a smaller crystal grain size and a denser growth structure when the electric field was applied during the discharge process.

  15. The feasibility of replacing or upgrading utility distribution transformers during routine maintenance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barnes, P.R.; Van Dyke, J.W.; McConnell, B.W.

    It is estimated that electric utilities use about 40 million distribution transformers in supplying electricity to customers in the United States. Although utility distribution transformers collectively have a high average efficiency, they account for approximately 61 billion kWh of the 229 billion kWh of energy lost annually in the delivery of electricity. Distribution transformers are being replaced over time by new, more efficient, lower-loss units during routine utility maintenance of power distribution systems. Maintenance is typically not performed on units in service. However, units removed from service with appreciable remaining life are often refurbished and returned to stock. Distribution transformers may be removed from service for many reasons, including failure, over- or underloading, or line upgrades such as voltage changes or rerouting. When distribution transformers are removed from service, a decision must be made whether to dispose of the transformer and purchase a lower-loss replacement or to refurbish the transformer and return it to stock for future use. This report contains findings and recommendations on replacing utility distribution transformers during routine maintenance, which is required by section 124 of the Energy Policy Act of 1992. The objectives of the study are to evaluate the practicability, cost-effectiveness, and potential energy savings of replacing or upgrading existing transformers during routine utility maintenance and to develop recommendations on ways to achieve the potential energy savings. Using survey data obtained from utilities and analyses of the economics of refurbishment versus replacement of distribution transformers that are removed from service, it is found that on average utilities are implementing reasonable decisions on refurbishment versus replacement.

  16. Space platform utilities distribution study

    NASA Technical Reports Server (NTRS)

    Lefever, A. E.

    1980-01-01

    Generic concepts for the installation of power, data, and thermal fluid distribution lines on large space platforms were discussed. Connections with central utility subsystem modules and pallet interfaces were also considered. Three system concept study platforms were used as basepoints for the detailed development. The tradeoff between high-voltage and low-voltage power distribution and the impact of fiber optics as a data distribution mechanism were analyzed. Thermal expansion and temperature control of utility lines and ducts were considered. Technology developments required for implementation of the generic distribution concepts were identified.

  17. A PDA study management tool (SMT) utilizing wireless broadband and full DICOM viewing capability

    NASA Astrophysics Data System (ADS)

    Documet, Jorge; Liu, Brent; Zhou, Zheng; Huang, H. K.; Documet, Luis

    2007-03-01

    During the last 4 years the IPI (Image Processing and Informatics) Laboratory has been developing a web-based Study Management Tool (SMT) application that allows radiologists, film librarians, and PACS-related (Picture Archiving and Communication System) users to dynamically and remotely perform Query/Retrieve operations in a PACS network. Using a regular PDA (Personal Digital Assistant), users can remotely query a PACS archive to distribute any study to an existing DICOM (Digital Imaging and Communications in Medicine) node. This application, which has proven convenient for managing the study workflow [1, 2], has been extended to include DICOM viewing capability on the PDA. With this new feature, users can take a quick view of DICOM images, providing them mobility and convenience at the same time. In addition, we are extending this application to metropolitan-area wireless broadband networks. This feature requires smart phones that are capable of working as a PDA and have access to broadband wireless services. With the extended application to wireless broadband technology and the preview of DICOM images, the Study Management Tool becomes an even more powerful tool for clinical workflow management.

  18. Spatial Signal Characteristics of Shallow Paraboloidal Shell Structronic Systems

    NASA Astrophysics Data System (ADS)

    Yue, H. H.; Deng, Z. Q.; Tzou, H. S.

    Based on smart material and structronics technology, distributed sensing and control of shell structures have developed rapidly over the last twenty years. This emerging technology has been utilized in aerospace, telecommunication, micro-electromechanical systems, and other engineering applications. However, the distributed monitoring technique and the resulting global spatially distributed sensing signals of thin flexible membrane shells are not clearly understood. In this paper, the modeling of a free thin paraboloidal shell with spatially distributed sensors, micro-sensing signal characteristics, and the location of distributed piezoelectric sensor patches are investigated based on a new set of assumed mode shape functions. Parametric analysis indicates that the signal generation depends on modal membrane strains in the meridional and circumferential directions, of which the latter is more significant than the former when all bending strains vanish in membrane shells. This study provides a modeling and analysis technique for distributed sensors laminated on lightweight paraboloidal flexible structures and identifies critical components and regions that generate significant signals.

  19. Pressure loss modulus correlation for Delta p across uniformly distributed-loss devices

    NASA Technical Reports Server (NTRS)

    Nunz, Gregory J.

    1994-01-01

    A dimensionless group, called a pressure loss modulus (N(sub PL)), is introduced that, in conjunction with an appropriately defined Reynolds number, is of considerable engineering utility in correlating steady-state Delta p vs flow calibration data and subsequently as a predictor, using the same or a different fluid, in uniformly distributed pressure loss devices. It is particularly useful under operation in the transition regime. Applications of this simple bivariate correlation to three diverse devices of particular interest for small liquid rocket engine fluid systems are discussed: large L/D capillary tube restrictors, packed granular catalyst beds, and stacked vortex-loss disk restrictors.

  20. Evaluation of power control concepts using the PMAD systems test bed. [Power Management and Distribution

    NASA Technical Reports Server (NTRS)

    Beach, R. F.; Kimnach, G. L.; Jett, T. A.; Trash, L. M.

    1989-01-01

    The Lewis Research Center's Power Management and Distribution (PMAD) System testbed and its use in the evaluation of control concepts applicable to the NASA Space Station Freedom electric power system (EPS) are described. The facility was constructed to allow testing of control hardware and software in an environment functionally similar to the space station electric power system. Control hardware and software have been developed to allow operation of the testbed power system in a manner similar to a supervisory control and data acquisition (SCADA) system employed by utility power systems for control. The system hardware and software are described.

  1. A validation study of a stochastic model of human interaction

    NASA Astrophysics Data System (ADS)

    Burchfield, Mitchel Talmadge

    The purpose of this dissertation is to validate a stochastic model of human interactions which is part of a developmentalism paradigm. Incorporating elements of ancient and contemporary philosophy and science, developmentalism defines human development as a progression of increasing competence and utilizes compatible theories of developmental psychology, cognitive psychology, educational psychology, social psychology, curriculum development, neurology, psychophysics, and physics. To validate a stochastic model of human interactions, the study addressed four research questions: (a) Does attitude vary over time? (b) What are the distributional assumptions underlying attitudes? (c) Does the stochastic model, N ∫_{-∞}^{∞} φ(χ,τ) Ψ(τ) dτ, have utility for the study of attitudinal distributions and dynamics? (d) Are the Maxwell-Boltzmann, Fermi-Dirac, and Bose-Einstein theories applicable to human groups? Approximately 25,000 attitude observations were made using the Semantic Differential Scale. Positions of individuals varied over time, and the logistic model predicted observed distributions with correlations between 0.98 and 1.0, with estimated standard errors significantly less than the magnitudes of the parameters. The results bring into question the applicability of Fisherian research designs (Fisher, 1922, 1928, 1938) for behavioral research, based on the apparent failure of two fundamental assumptions: the noninteractive nature of the objects being studied and the normal distribution of attributes. The findings indicate that individual belief structures are representable in terms of a psychological space which has the same or similar properties as physical space. The psychological space not only has dimension, but individuals interact by force equations similar to those described in theoretical physics models. Nonlinear regression techniques were used to estimate Fermi-Dirac parameters from the data.
The model explained a high degree of the variance in each probability distribution. The correlation between predicted and observed probabilities ranged from a low of 0.955 to a high of 0.998, indicating that humans behave in psychological space as fermions behave in momentum space.

  2. Charging Guidance of Electric Taxis Based on Adaptive Particle Swarm Optimization

    PubMed Central

    Niu, Liyong; Zhang, Di

    2015-01-01

    Electric taxis are playing an important role in the application of electric vehicles. The actual operational data of electric taxis in Shenzhen, China, is analyzed, and, in view of the unbalanced time availability of charging station equipment, an electric taxi charging guidance system is proposed based on charging station information and vehicle information. An electric taxi charging guidance model is established that guides charging based on the positions of taxis and charging stations, using adaptive mutation particle swarm optimization. The simulation is based on actual data from Shenzhen charging stations, and the results show that, after the charging guidance, electric taxis can be evenly distributed to the appropriate charging stations according to the number of charging piles at each station. An even distribution among the charging stations in the area will be achieved and the utilization of charging equipment will be improved, so the proposed charging guidance method is verified to be feasible. The improved utilization of charging equipment can greatly save public charging infrastructure resources. PMID:26236770

  3. Autonomous docking system for space structures and satellites

    NASA Astrophysics Data System (ADS)

    Prasad, Guru; Tajudeen, Eddie; Spenser, James

    2005-05-01

    Aximetric proposes a Distributed Command and Control (C2) architecture for autonomous on-orbit assembly in space, built around our unique vision- and sensor-driven docking mechanism. Aximetric is currently working on IP-based distributed control strategies, a docking/mating plate, alignment and latching mechanisms, umbilical structure/cord designs, and hardware/software in a closed-loop architecture for a smart autonomous demonstration, utilizing proven developments in sensor and docking technology. These technologies can be applied effectively to many transfer/conveying and on-orbit servicing applications, including the capture and coupling of space-bound vehicles and components. The autonomous system will be a "smart" system incorporating a vision system for identifying, tracking, locating, and mating the transferring device to the receiving device. A robustly designed coupler for fuel transfer will be integrated. Advanced sealing technology will be utilized for isolating and purging cavities resulting from the mating process and/or from the incorporation of other electrical and data acquisition devices used as part of the overall smart system.

  4. A review of initial investigations to utilize ERTS-1 data in determining the availability and distribution of living marine resources. [harvest and management of fisheries resources in Mississippi Sound and Gulf waters

    NASA Technical Reports Server (NTRS)

    Stevenson, W. H.; Kemmerer, A. J.; Atwell, B. H.; Maughan, P. M.

    1974-01-01

    The National Marine Fisheries Service has been studying the application of aerospace remote sensing to fisheries management and utilization for many years. The 15-month ERTS study began in July 1972 to: (1) determine the reliability of satellite and high altitude sensors to provide oceanographic parameters in coastal waters; (2) demonstrate the use of remotely-sensed oceanographic information to predict the distribution and abundance of adult menhaden; and (3) demonstrate the potential use of satellites for acquiring information for improving the harvest and management of fisheries resources. The study focused on a coastal area in the north-central portion of the Gulf of Mexico, including parts of Alabama, Mississippi, and Louisiana. The test area used in the final analysis was the Mississippi Sound and the area outside the barrier islands to approximately the 18-meter (10-fathom) curve.

  5. Energy harvesting from coupled bending-twisting oscillations in carbon-fibre reinforced polymer laminates

    NASA Astrophysics Data System (ADS)

    Xie, Mengying; Zhang, Yan; Kraśny, Marcin J.; Rhead, Andrew; Bowen, Chris; Arafa, Mustafa

    2018-07-01

    The energy harvesting capability of resonant harvesting structures, such as piezoelectric cantilever beams, can be improved by utilizing coupled oscillations that generate favourable strain mode distributions. In this work, we present the first demonstration of the use of a laminated carbon fibre reinforced polymer to create cantilever beams that undergo coupled bending-twisting oscillations for energy harvesting applications. Piezoelectric layers that operate in bending and shear mode are attached to the bend-twist coupled beam surface at the locations of maximum bending and torsional strain in the first mode of vibration, to fully exploit the strain distribution along the beam. Modelling of this new bend-twist harvesting system is presented and compares favourably with experimental results. It is demonstrated that the variety of bending and torsional modes of the harvesters can be utilized to create a harvester that operates over a wider range of frequencies, and that such multi-modal device architectures provide a unique approach to tuning the frequency response of resonant harvesting systems.

  6. Charging Guidance of Electric Taxis Based on Adaptive Particle Swarm Optimization.

    PubMed

    Niu, Liyong; Zhang, Di

    2015-01-01

    Electric taxis play an important role in the application of electric vehicles. Actual operational data of electric taxis in Shenzhen, China, are analyzed, and, in view of the unbalanced time availability of charging station equipment, an electric taxi charging guidance system is proposed based on charging station information and vehicle information. An electric taxi charging guidance model is established that guides charging according to the positions of taxis and charging stations, using adaptive mutation particle swarm optimization. The simulation is based on actual data from Shenzhen charging stations, and the results show that, after guidance, electric taxis are distributed evenly to the appropriate charging stations according to the number of charging piles at each station. Even distribution among the charging stations in the area is achieved and the utilization of charging equipment is improved, verifying that the proposed charging guidance method is feasible. The improved utilization of charging equipment can greatly conserve public charging infrastructure resources.

  7. Designing management strategies for carbon dioxide storage and utilization under uncertainty using inexact modelling

    NASA Astrophysics Data System (ADS)

    Wang, Yu; Fan, Jie; Xu, Ye; Sun, Wei; Chen, Dong

    2017-06-01

    Effective application of carbon capture, utilization and storage (CCUS) systems could help to alleviate the influence of climate change by reducing carbon dioxide (CO2) emissions. The objective of this study is to develop an equilibrium chance-constrained programming model with bi-random variables (ECCP model) for supporting CCUS management under random circumstances. The major advantage of the ECCP model is that it treats random variables as bi-random variables with a normal distribution whose mean values themselves follow a normal distribution. This avoids irrational assumptions and oversimplifications in parameter design and enriches the theory of stochastic optimization. The ECCP model is solved by an equilibrium chance-constrained programming algorithm, which makes it convenient for decision makers to rank the solution set using the natural order of the real numbers. The ECCP model is applied to a CCUS management problem, and the solutions could help managers design and generate rational CO2-allocation patterns under complexity and uncertainty.
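
    The paper's bi-random construction is not reproduced here, but the chance-constrained mechanic it builds on, replacing a probabilistic constraint with a deterministic equivalent through the normal quantile, can be sketched as follows. The objective, bounds, and all numbers are invented.

```python
import numpy as np
from scipy.optimize import linprog
from scipy.stats import norm

# Toy CO2-allocation LP: maximize 3*x1 + 2*x2 subject to the chance
# constraint P(x1 + x2 <= B) >= alpha, with capacity B ~ Normal(mu, sigma).
mu, sigma, alpha = 100.0, 10.0, 0.95

# Deterministic equivalent: x1 + x2 <= mu + sigma * Phi^{-1}(1 - alpha).
b_eff = mu + sigma * norm.ppf(1.0 - alpha)

res = linprog(c=[-3.0, -2.0],              # linprog minimizes, so negate
              A_ub=[[1.0, 1.0]], b_ub=[b_eff],
              bounds=[(0.0, 40.0), (0.0, None)])
x1, x2 = res.x
print(f"effective capacity {b_eff:.2f}, allocation ({x1:.1f}, {x2:.1f})")
```

    Demanding 95% reliability shrinks the usable capacity below its mean (here to about 83.6), which is the basic trade-off a chance-constrained CCUS allocation formalizes.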

  8. DC Microgrids Scoping Study. Estimate of Technical and Economic Benefits

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Backhaus, Scott N.; Swift, Gregory William; Chatzivasileiadis, Spyridon

    Microgrid demonstrations and deployments are expanding in US power systems and around the world. Although goals are specific to each site, these microgrids have demonstrated the ability to provide higher reliability, higher power quality, and improved energy utilization compared with utility power systems. The vast majority of these microgrids are based on AC power transfer because this has been the traditionally dominant power delivery scheme. Independently, manufacturers, power system designers and researchers are demonstrating and deploying DC power distribution systems for applications where the end-use loads are natively DC, e.g., computers, solid-state lighting, and building networks. These early DC applications may provide higher efficiency, added flexibility, and reduced capital costs over their AC counterparts. Further, when onsite renewable generation, electric vehicles and storage systems are present, DC-based microgrids may offer additional benefits. Early successes from these efforts raise a question: can a combination of microgrid concepts and DC distribution systems provide added benefits beyond what has been achieved individually?

  9. Quantum coherence via skew information and its polygamy

    NASA Astrophysics Data System (ADS)

    Yu, Chang-shui

    2017-04-01

    Quantifying coherence is a key task in both quantum-mechanical theory and practical applications. Here, a reliable quantum coherence measure is presented by utilizing the quantum skew information of the state of interest subject to a certain broken observable. This coherence measure is proven to fulfill all the criteria (especially the strong monotonicity) recently introduced in the resource theories of quantum coherence. The coherence measure has an analytic expression and an obvious operational meaning related to quantum metrology. In terms of this coherence measure, the distribution of quantum coherence, i.e., how the quantum coherence is distributed among multiple parties, is studied and a corresponding polygamy relation is proposed. As a further application, it is found that the coherence measure forms a natural upper bound for quantum correlations prepared by incoherent operations. Finally, experimental measurement of this coherence measure, as well as of the relative-entropy and lp-norm coherence measures, is studied.
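
    A skew-information quantity of this kind can be evaluated directly from the density matrix. The sketch below computes the Wigner-Yanase skew information I(rho, K) = -1/2 Tr([sqrt(rho), K]^2) for a qubit; the paper's coherence measure may combine such terms differently, so this shows only the underlying ingredient.

```python
import numpy as np

def mat_sqrt(rho):
    """Square root of a positive semidefinite matrix via eigendecomposition."""
    w, v = np.linalg.eigh(rho)
    return (v * np.sqrt(np.clip(w, 0.0, None))) @ v.conj().T

def skew_information(rho, K):
    """Wigner-Yanase skew information I(rho, K) = -1/2 Tr([sqrt(rho), K]^2)."""
    s = mat_sqrt(rho)
    c = s @ K - K @ s
    return float(-0.5 * np.trace(c @ c).real)

# Qubit example: Pauli-Z observable and a pure superposition state.
Z = np.diag([1.0, -1.0])
psi = np.array([np.sqrt(0.7), np.sqrt(0.3)])
rho_pure = np.outer(psi, psi.conj())

# For a pure state the skew information reduces to the variance of K.
var = float(psi @ (Z @ Z) @ psi - (psi @ Z @ psi) ** 2)
print(skew_information(rho_pure, Z), var)
```

    For the maximally mixed state the commutator vanishes and the skew information is zero, consistent with an incoherent state carrying no coherence with respect to the observable's eigenbasis.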

  10. Application of parallel distributed Lagrange multiplier technique to simulate coupled Fluid-Granular flows in pipes with varying Cross-Sectional area

    DOE PAGES

    Kanarska, Yuliya; Walton, Otis

    2015-11-30

    Fluid-granular flows are common phenomena in nature and industry. Here, an efficient computational technique based on the distributed Lagrange multiplier method is utilized to simulate complex fluid-granular flows. Each particle is explicitly resolved on an Eulerian grid as a separate domain, using solid volume fractions. The fluid equations are solved throughout the entire computational domain; however, Lagrange multiplier constraints are applied inside the particle domain such that the fluid within any volume associated with a solid particle moves as an incompressible rigid body. The particle-particle interactions are implemented using explicit force-displacement interactions for frictional inelastic particles, similar to the DEM method, with some modifications that use the volume of the overlapping region as an input to the contact forces. The parallel implementation of the method is based on the SAMRAI (Structured Adaptive Mesh Refinement Application Infrastructure) library.

  11. Smartphone-based distributed data collection enables rapid assessment of shorebird habitat suitability

    USGS Publications Warehouse

    Thieler, E. Robert; Zeigler, Sara; Winslow, Luke; Hines, Megan K.; Read, Jordan S.; Walker, Jordan I.

    2016-01-01

    Understanding and managing dynamic coastal landscapes for beach-dependent species requires biological and geological data across the range of relevant environments and habitats. It is difficult to acquire such information; data often have limited focus due to resource constraints, are collected by non-specialists, or lack observational uniformity. We developed an open-source smartphone application called iPlover that addresses these difficulties in collecting biogeomorphic information at piping plover (Charadrius melodus) nest sites on coastal beaches. This paper describes iPlover development and evaluates data quality and utility following two years of collection (n = 1799 data points over 1500 km of coast between Maine and North Carolina, USA). We found strong agreement between field user and expert assessments and high model skill when data were used for habitat suitability prediction. Methods used here to develop and deploy a distributed data collection system have broad applicability to interdisciplinary environmental monitoring and modeling.

  12. Method of phase space beam dilution utilizing bounded chaos generated by rf phase modulation

    DOE PAGES

    Pham, Alfonse N.; Lee, S. Y.; Ng, K. Y.

    2015-12-10

    This paper explores the physics of chaos in a localized phase-space region produced by rf phase modulation applied to a double rf system. The study can be exploited to produce rapid particle bunch broadening with a uniform longitudinal particle distribution. Hamiltonian models and particle-tracking simulations are introduced to understand the mechanism and applicability of controlled particle diffusion. When phase modulation is applied to the double rf system, regions of localized chaos are produced through the disruption and overlapping of parametric resonant islands, and are configured to be bounded by well-behaved invariant tori to prevent particle loss. The condition of chaoticity and the degree of particle dilution can be controlled by the rf parameters. As a result, the method has applications in alleviating adverse space-charge effects in high-intensity beams, particle bunch distribution uniformization, and industrial radiation-effects experiments.

  13. Spatial Point Pattern Analysis of Neurons Using Ripley's K-Function in 3D

    PubMed Central

    Jafari-Mamaghani, Mehrdad; Andersson, Mikael; Krieger, Patrik

    2010-01-01

    The aim of this paper is to apply a non-parametric statistical tool, Ripley's K-function, to analyze the 3-dimensional distribution of pyramidal neurons. Ripley's K-function is a widely used tool in spatial point pattern analysis. There are several approaches in 2D domains in which this function is executed and analyzed. Drawing consistent inferences on the underlying 3D point pattern distributions in various applications is of great importance, as the acquisition of 3D biological data now poses less of a challenge due to technological progress. To date, most applications of Ripley's K-function in 3D domains do not address edge correction, which is discussed thoroughly in this paper. The main goal is to extend the theoretical and practical utilization of Ripley's K-function and corresponding tests based on bootstrap resampling from 2D to 3D domains. PMID:20577588
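
    The estimator at the heart of the method can be sketched without edge correction; comparing it against the closed-form expectation under complete spatial randomness (CSR), K(r) = (4/3) pi r^3, makes the boundary bias that edge correction addresses visible. The point pattern below is synthetic.

```python
import numpy as np

def ripley_k_3d(points, r, volume):
    """Naive Ripley's K estimate in 3D (no edge correction)."""
    n = len(points)
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    pairs = (d <= r).sum() - n          # drop the zero-distance self-pairs
    return volume * pairs / (n * (n - 1))

rng = np.random.default_rng(0)
pts = rng.uniform(0.0, 1.0, (500, 3))   # CSR sample in the unit cube
r = 0.1
k_hat = ripley_k_3d(pts, r, volume=1.0)
k_csr = 4.0 / 3.0 * np.pi * r ** 3      # theoretical K(r) under CSR
print(k_hat, k_csr)
```

    Because spheres around points near the cube faces are partly empty, the naive estimate tends to sit below the CSR value, which is exactly the bias that 3D edge-corrected estimators and the paper's bootstrap-based tests are designed to handle.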

  14. Smartphone-Based Distributed Data Collection Enables Rapid Assessment of Shorebird Habitat Suitability

    PubMed Central

    Zeigler, Sara L.; Winslow, Luke A.; Hines, Megan K.; Read, Jordan S.; Walker, Jordan I.

    2016-01-01

    Understanding and managing dynamic coastal landscapes for beach-dependent species requires biological and geological data across the range of relevant environments and habitats. It is difficult to acquire such information; data often have limited focus due to resource constraints, are collected by non-specialists, or lack observational uniformity. We developed an open-source smartphone application called iPlover that addresses these difficulties in collecting biogeomorphic information at piping plover (Charadrius melodus) nest sites on coastal beaches. This paper describes iPlover development and evaluates data quality and utility following two years of collection (n = 1799 data points over 1500 km of coast between Maine and North Carolina, USA). We found strong agreement between field user and expert assessments and high model skill when data were used for habitat suitability prediction. Methods used here to develop and deploy a distributed data collection system have broad applicability to interdisciplinary environmental monitoring and modeling. PMID:27828974

  15. Manufacture, distribution, and handling of nitrate salts for solar-thermal applications

    NASA Astrophysics Data System (ADS)

    Fiorucci, L. C.; Goldstein, S. L.

    1982-11-01

    Molten sodium/potassium nitrate salts, owing to their low cost and attractive physical properties, have been shown to be among the most cost-effective fluids for heat absorption and thermal energy storage in Solar Central Receiver (SCR) systems. Information related to the availability, transport, handling, and utilization of these salts for commercial-size SCR applications is provided. The following items are reviewed: existing manufacturing processes for natural and synthetic nitrates; the upstream availability of raw materials; downstream existing and projected demand for these products in other sectors of the economy; and relevant handling and distribution technologies. Safety considerations and issues more directly related to the SCR facility, such as initial system charging, salt maintenance and regeneration, and disposal, are also reviewed. Options for supply, surge storage, and initial charging are discussed for solar plant sizes ranging from 1 MWt to 300 MWe.

  16. Aesthetic coatings for concrete bridge components

    NASA Astrophysics Data System (ADS)

    Kriha, Brent R.

    This thesis evaluated the durability and aesthetic performance of coating systems for use in concrete bridge applications. The principal objectives of this thesis were to: 1) identify aesthetic coating systems appropriate for concrete bridge applications; 2) evaluate the performance of the selected systems through a laboratory testing regimen; and 3) develop guidelines for coating selection, surface preparation, and application. A series of site visits to various bridges throughout the State of Wisconsin provided insight into the performance of common coating systems and allowed problematic structural details to be identified. To aid in the selection of appropriate coating systems, questionnaires were distributed to coating manufacturers, bridge contractors, and various DOT offices to identify high-performing coating systems and best practices for surface preparation and application. These efforts supplemented a literature review of recent publications on the formulation, selection, surface preparation, application, and performance evaluation of coating materials.

  17. Dynamic Load-Balancing for Distributed Heterogeneous Computing of Parallel CFD Problems

    NASA Technical Reports Server (NTRS)

    Ecer, A.; Chien, Y. P.; Boenisch, T.; Akay, H. U.

    2000-01-01

    The developed methodology is aimed at improving the efficiency of executing block-structured algorithms on parallel, distributed, heterogeneous computers. The basic approach of these algorithms is to divide the flow domain into many subdomains called blocks and solve the governing equations over these blocks. The dynamic load balancing problem is defined as the efficient distribution of the blocks among the available processors over a period of several hours of computation. In environments with computers of different architectures, operating systems, CPU speeds, memory sizes, loads, and network speeds, balancing the loads and managing the communication between processors becomes crucial. Load balancing software tools for mutually dependent parallel processes have been created to efficiently utilize an advanced computation environment and algorithms. These tools are dynamic in nature because of the changes in the computing environment during execution time. More recently, these tools were extended to a second operating system, NT; the problems associated with this extension are discussed. The developed algorithms were also combined with the load sharing capability of LSF to efficiently utilize workstation clusters for parallel computing. Finally, results are presented from running a NASA code, ADPAC, to demonstrate the developed tools for dynamic load balancing.
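
    The abstract does not give the balancing algorithm itself; as a hedged illustration, a standard greedy heuristic (longest processing time first) captures the core step of distributing blocks with unequal costs across processors, a step a dynamic balancer would repeat whenever measured costs or machine loads change.

```python
import heapq

def assign_blocks(block_costs, n_procs):
    """LPT heuristic: place each block, heaviest first, on the least-loaded processor."""
    heap = [(0.0, p) for p in range(n_procs)]   # (current load, processor id)
    heapq.heapify(heap)
    assignment = {}
    for b, cost in sorted(enumerate(block_costs), key=lambda kv: -kv[1]):
        load, p = heapq.heappop(heap)
        assignment[b] = p
        heapq.heappush(heap, (load + cost, p))
    loads = [0.0] * n_procs
    for b, p in assignment.items():
        loads[p] += block_costs[b]
    return assignment, loads

blocks = [8.0, 7.0, 6.0, 5.0, 4.0, 4.0, 3.0, 3.0]   # hypothetical per-block costs
assignment, loads = assign_blocks(blocks, 3)
print(loads)
```

    In a heterogeneous, time-varying environment, the per-block costs would come from runtime measurements (scaled by relative CPU speed), and the assignment would be recomputed periodically, which is the "dynamic" part of the methodology.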

  18. Clinical applications of PET in oncology.

    PubMed

    Rohren, Eric M; Turkington, Timothy G; Coleman, R Edward

    2004-05-01

    Positron emission tomography (PET) provides metabolic information that has been documented to be useful in patient care. The properties of positron decay permit accurate imaging of the distribution of positron-emitting radiopharmaceuticals. The wide array of positron-emitting radiopharmaceuticals has been used to characterize multiple physiologic and pathologic states. PET is used for characterizing brain disorders such as Alzheimer disease and epilepsy and cardiac disorders such as coronary artery disease and myocardial viability. The neurologic and cardiac applications of PET are not covered in this review. The major utilization of PET clinically is in oncology and consists of imaging the distribution of fluorine 18 fluorodeoxyglucose (FDG). FDG, an analogue of glucose, accumulates in most tumors in a greater amount than it does in normal tissue. FDG PET is being used in diagnosis and follow-up of several malignancies, and the list of articles supporting its use continues to grow. In this review, the physics and instrumentation aspects of PET are described. Many of the clinical applications in oncology are mature and readily covered by third-party payers. Other applications are being used clinically but have not been as carefully evaluated in the literature, and these applications may not be covered by third-party payers. The developing applications of PET are included in this review.

  19. Mobile Application for Pregnant Women: What Do Mothers Say?

    PubMed

    Sommer, Janine; Daus, Mariana; Smith, María; Luna, Daniel

    2017-01-01

    Today, health information technologies are constantly expanding and changing, allowing more and more people to use mobile applications to receive information and monitor their health condition. Motivated by the need to implement an application for pregnant women in the Personal Health Record (PHR) of Hospital Italiano de Buenos Aires (HIBA), a survey adapted from an Australian study was carried out to measure the use and utility of a pregnancy application (pregnancy app). The survey was distributed through social networks (Facebook and Twitter) during September 2016, and we obtained 235 responses from Spanish-speaking women, mostly Argentinian. Our results were broadly in agreement with the Australian reference values. In conclusion, a pregnancy app offers the possibility of closer follow-up and provides reassurance to the pregnant women who use it.

  20. NASA technology utilization survey on composite materials

    NASA Technical Reports Server (NTRS)

    Leeds, M. A.; Schwartz, S.; Holm, G. J.; Krainess, A. M.; Wykes, D. M.; Delzell, M. T.; Veazie, W. H., Jr.

    1972-01-01

    NASA and NASA-funded contractor contributions to the field of composite materials are surveyed. Existing and potential non-aerospace applications of the newer composite materials are emphasized. Economic factors favoring selection of a composite for a particular application are weight savings, performance (high strength, high elastic modulus, low coefficient of expansion, heat resistance, corrosion resistance), longer service life, and reduced maintenance. Applications for composites in agriculture, the chemical and petrochemical industries, construction, consumer goods, machinery, power generation and distribution, transportation, biomedicine, and safety are presented. With the continuing trend toward further cost reductions, composites warrant consideration in a wide range of non-aerospace applications. Composite materials discussed include filamentary reinforced materials, laminates, multiphase alloys, solid multiphase lubricants, and multiphase ceramics. New processes developed to aid in the fabrication of composites are also given.

  1. Simultaneous K-edge subtraction tomography for tracing strontium using parametric X-ray radiation

    NASA Astrophysics Data System (ADS)

    Hayakawa, Y.; Hayakawa, K.; Kaneda, T.; Nogami, K.; Sakae, T.; Sakai, T.; Sato, I.; Takahashi, Y.; Tanaka, T.

    2017-07-01

    The X-ray source based on parametric X-ray radiation (PXR) has been regularly providing a coherent X-ray beam for application studies at Nihon University. Recently, three dimensional (3D) computed tomography (CT) has become one of the most important applications of the PXR source. The methodology referred to as K-edge subtraction (KES) imaging is a particularly successful application utilizing the energy selectivity of PXR. In order to demonstrate the applicability of PXR-KES, a simultaneous KES experiment for a specimen containing strontium was performed using a PXR beam having an energy near the Sr K-edge of 16.1 keV. As a result, the 3D distribution of Sr was obtained by subtraction between the two simultaneously acquired tomographic images.
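
    The subtraction step can be illustrated on synthetic data: imaging the same object just below and just above the Sr K-edge and subtracting cancels the smoothly varying background, while the Sr signal, whose attenuation jumps at the edge, survives. All attenuation values below are made up.

```python
import numpy as np

rng = np.random.default_rng(0)
shape = (64, 64)

# Synthetic attenuation maps (arbitrary units): uniform soft tissue plus
# a strontium-bearing disc in the centre.
yy, xx = np.mgrid[0:shape[0], 0:shape[1]]
sr_mask = (yy - 32) ** 2 + (xx - 32) ** 2 < 10 ** 2

mu_tissue = 0.2                      # roughly energy-independent across the edge
mu_sr_below, mu_sr_above = 0.1, 0.9  # Sr attenuation jumps at its K-edge

img_below = mu_tissue + mu_sr_below * sr_mask + rng.normal(0.0, 0.01, shape)
img_above = mu_tissue + mu_sr_above * sr_mask + rng.normal(0.0, 0.01, shape)

# K-edge subtraction: the background cancels, the edge element remains.
kes = img_above - img_below
print(kes[sr_mask].mean(), kes[~sr_mask].mean())
```

    In the actual experiment the two tomographic images are acquired simultaneously at energies bracketing the 16.1 keV Sr K-edge, so the same cancellation isolates the 3D Sr distribution.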

  2. A geometric model of mortality and crop protection for insects feeding on discrete toxicant deposits.

    PubMed

    Ebert, Timothy; Derksen, Richard

    2004-04-01

    Current theory governing the biological effectiveness of toxicants stresses the dose-response relationship and focuses on uniform toxicant distributions in the insect's environment. However, toxicants are seldom uniformly dispersed under field conditions. Toxicant distribution affects bioavailability, but the mechanics of such interactions are not well documented. We present a geometric model of the interactions between insects and heterogeneously distributed toxicants. From the model, we conclude the following: 1) There is an optimal droplet size, and droplets both smaller and larger than this optimum decrease efficacy. 2) There is an ideal droplet distribution. Droplets should be spaced according to two criteria: (i) calculate the allowable damage, double this quantity, and place one lethal deposit in that area; and (ii) define the quantity of leaf the larva could eat before the toxicant decays below the lethal level and place one lethal deposit within that area. 3) Distributions in which deposits are sublethal will often be ineffective, but the application is wasteful if deposits contain more than a lethal dose. 4) Insect behavior, both individual and collective, influences the level of crop protection provided by an application. This conclusion has implications for both crop protection and natural plant-insect interactions. The effective utilization of new, more environmentally sensitive toxicants may depend on how well we understand how heterogeneous toxicant distributions interact with insect behavior to determine the biological outcome.

  3. Independent-Trajectory Thermodynamic Integration: a practical guide to protein-drug binding free energy calculations using distributed computing.

    PubMed

    Lawrenz, Morgan; Baron, Riccardo; Wang, Yi; McCammon, J Andrew

    2012-01-01

    The Independent-Trajectory Thermodynamic Integration (IT-TI) approach for free energy calculation with distributed computing is described. IT-TI utilizes the diverse conformational sampling obtained from multiple independent simulations to obtain more reliable free energy estimates than single TI predictions, which may significantly under- or overestimate the binding free energy due to finite sampling. We exemplify the advantages of the IT-TI approach using two distinct cases of protein-ligand binding. In both cases, IT-TI yields distributions of absolute binding free energy estimates that are remarkably centered on the target experimental values. Alternative protocols for the practical and general application of IT-TI calculations are investigated, and we highlight a protocol that maximizes predictive power and computational efficiency.
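
    The combination step, pooling independent runs into one estimate with an uncertainty taken from the spread across runs, can be sketched as follows; the run values are synthetic, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical binding free energies (kcal/mol) from 10 independent TI runs;
# each single run is noisy because of finite conformational sampling.
dg_runs = rng.normal(-8.0, 1.5, size=10)

dg_itti = dg_runs.mean()
# Bootstrap over runs to characterize the spread of the pooled estimate.
boot = rng.choice(dg_runs, size=(5000, dg_runs.size), replace=True).mean(axis=1)
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"IT-TI estimate {dg_itti:.2f} kcal/mol, 95% interval [{lo:.2f}, {hi:.2f}]")
```

    A single run corresponds to picking one entry of dg_runs, which can land far from the mean; averaging the independent trajectories is what tightens the distribution around the target value.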

  4. A benders decomposition approach to multiarea stochastic distributed utility planning

    NASA Astrophysics Data System (ADS)

    McCusker, Susan Ann

    Until recently, small, modular generation and storage options (distributed resources, or DRs) have been installed principally in areas too remote for economic power grid connection and in sensitive applications requiring backup capacity. Recent regulatory changes and DR advances, however, have led utilities to reconsider the role of DRs. To a utility facing distribution capacity bottlenecks or uncertain load growth, DRs can be particularly valuable, since they can be dispersed throughout the system and constructed relatively quickly. DR value is determined by comparing DR costs to avoided central generation expenses (i.e., marginal costs) and avoided distribution investments. This requires a comprehensive central and local planning and production model, since central system marginal costs result from system interactions over space and time. This dissertation develops and applies an iterative generalized Benders decomposition approach to coordinate models for optimal DR evaluation. Three coordinated models exchange investment, net power demand, and avoided cost information to minimize overall expansion costs. Local investment and production decisions are made by a local mixed integer linear program. Central system investment decisions are made by an LP, and production costs are estimated by a stochastic multi-area production costing model with Kirchhoff's Voltage and Current Law constraints. The nested decomposition is a new method for distributed utility planning that partitions the variables twice, separating local and central investment and production variables, and provides upper and lower bounds on expected expansion costs. Kirchhoff's Voltage Law imposes nonlinear, nonconvex constraints that preclude the use of LP when transmission capacity is available in a looped transmission system. This dissertation therefore develops KVL constraint approximations that permit the nested decomposition to consider new transmission resources while maintaining linearity in the three individual models. These constraints are presented as a heuristic for the given examples; future research will investigate conditions for convergence. A ten-year multi-area example demonstrates the decomposition approach and shows how DRs and new transmission can modify capacity additions and production costs by changing demand and power flows. Results demonstrate that DR and new transmission options may lead to greater capacity additions, but the resulting production cost savings more than offset the extra capacity costs.
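
    The Benders mechanics described above, a master problem accumulating cuts priced by subproblem duals, can be illustrated on a deliberately tiny toy problem (not the dissertation's planning model). Here y stands in for a central investment decision and x for production, and the subproblem dual is available in closed form.

```python
# Toy Benders loop for: min 2*y + 3*x  s.t.  x >= 4 - y,  x >= 0,
# with y integer in [0, 4].  For fixed y the subproblem min 3*x has value
# 3*max(0, 4 - y) and dual multiplier lam in [0, 3], yielding the
# optimality cut  theta >= lam * (4 - y).
def subproblem(y):
    slack = 4 - y
    if slack > 0:
        return 3.0 * slack, 3.0     # (cost, binding dual multiplier)
    return 0.0, 0.0

cuts = []                           # each cut is stored by its dual value lam
upper, lower = float("inf"), -float("inf")
while upper - lower > 1e-9:
    # Master problem: y's domain is tiny, so enumerate it directly.
    best = None
    for y_try in range(5):
        theta = max([lam * (4 - y_try) for lam in cuts] + [0.0])
        val = 2 * y_try + theta
        if best is None or val < best[0]:
            best = (val, y_try)
    lower, y = best                   # lower bound from the relaxed master
    cost, lam = subproblem(y)
    upper = min(upper, 2 * y + cost)  # upper bound from a feasible solution
    cuts.append(lam)
print(y, upper)
```

    The loop terminates when the master's lower bound meets the best feasible cost; the dissertation's nested version runs such an exchange at two levels, between local and central models and between central investment and production.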

  5. Distributed computing feasibility in a non-dedicated homogeneous distributed system

    NASA Technical Reports Server (NTRS)

    Leutenegger, Scott T.; Sun, Xian-He

    1993-01-01

    The low cost and availability of clusters of workstations have led researchers to re-explore distributed computing using independent workstations. This approach may provide better cost/performance than tightly coupled multiprocessors. In practice, it often utilizes wasted cycles to run parallel jobs. The feasibility of such a non-dedicated parallel processing environment, assuming workstation processes have preemptive priority over parallel tasks, is addressed. An analytical model is developed to predict parallel job response times. Our model provides insight into how significantly workstation owner interference degrades parallel program performance. A new term, the task ratio, which relates the parallel task demand to the mean service demand of nonparallel workstation processes, is introduced. We propose that the task ratio is a useful metric for determining how large the demand of a parallel application must be in order to make efficient use of a non-dedicated distributed system.

  6. Parameter estimation techniques based on optimizing goodness-of-fit statistics for structural reliability

    NASA Technical Reports Server (NTRS)

    Starlinger, Alois; Duffy, Stephen F.; Palko, Joseph L.

    1993-01-01

    New methods are presented that utilize the optimization of goodness-of-fit statistics in order to estimate Weibull parameters from failure data. It is assumed that the underlying population is characterized by a three-parameter Weibull distribution. Goodness-of-fit tests are based on the empirical distribution function (EDF). The EDF is a step function, calculated from failure data, that approximates the cumulative distribution function (CDF) of the underlying population. Statistics such as the Kolmogorov-Smirnov statistic and the Anderson-Darling statistic measure the discrepancy between the EDF and the CDF. These statistics are minimized with respect to the three Weibull parameters. Due to nonlinearities encountered in the minimization process, Powell's numerical optimization procedure is applied to obtain the optimum parameter values. Numerical examples show the applicability of these new estimation methods. The results are compared to the estimates obtained with Cooper's nonlinear regression algorithm.
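
    A minimal version of the estimation idea, choosing three-parameter Weibull values that minimize the Kolmogorov-Smirnov distance to the EDF, can be sketched with a derivative-free optimizer (Nelder-Mead here, standing in for the Powell procedure used in the paper). The data and starting values are synthetic.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import kstest, weibull_min

rng = np.random.default_rng(0)
data = weibull_min.rvs(c=2.0, loc=1.0, scale=3.0, size=500, random_state=rng)

def ks_stat(params):
    """KS distance between the EDF of the data and a 3-parameter Weibull CDF."""
    c, loc, scale = params
    if c <= 0 or scale <= 0 or loc >= data.min():
        return 1.0                 # infeasible: the threshold must lie below the data
    return kstest(data, weibull_min(c, loc=loc, scale=scale).cdf).statistic

x0 = np.array([1.5, data.min() - 0.5, data.std()])  # crude moment-style start
res = minimize(ks_stat, x0, method="Nelder-Mead",
               options={"maxiter": 5000, "maxfev": 5000})
c_hat, loc_hat, scale_hat = res.x
print(res.x, res.fun)
```

    Swapping in the Anderson-Darling statistic only changes the objective function; the guard on the location parameter is needed because the three-parameter Weibull CDF is zero below its threshold, so locations at or above the smallest observation are infeasible.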

  7. A watershed scale spatially-distributed model for streambank erosion rate driven by channel curvature

    NASA Astrophysics Data System (ADS)

    McMillan, Mitchell; Hu, Zhiyong

    2017-10-01

    Streambank erosion is a major source of fluvial sediment, but few large-scale, spatially distributed models exist to quantify streambank erosion rates. We introduce a spatially distributed model for streambank erosion applicable to sinuous, single-thread channels. We argue that such a model can adequately characterize streambank erosion rates, measured at the outsides of bends over a 2-year time period, throughout a large region. The model is based on the widely used excess-velocity equation and comprises three components: a physics-based hydrodynamic model, a large-scale 1-dimensional model of average monthly discharge, and an empirical bank erodibility parameterization. The hydrodynamic submodel requires inputs of channel centerline, slope, width, depth, friction factor, and a scour factor A; the large-scale watershed submodel utilizes watershed-averaged monthly outputs of the Noah-2.8 land surface model; bank erodibility is based on tree cover and bank height as proxies for root density. The model was calibrated with erosion rates measured in sand-bed streams throughout the northern Gulf of Mexico coastal plain. The calibrated model outperforms a purely empirical model, as well as a model based only on excess velocity, illustrating the utility of combining a physics-based hydrodynamic model with an empirical bank erodibility relationship. The model could be improved by incorporating spatial variability in channel roughness and the hydrodynamic scour factor, which are here assumed constant. A reach-scale application of the model is illustrated on ∼1 km of a medium-sized, mixed forest-pasture stream, where the model identifies streambank erosion hotspots on forested and non-forested bends.

  8. USSR and Eastern Europe Scientific Abstracts, Chemistry, Number 56

    DTIC Science & Technology

    1977-07-06

    U5.2:631.U5.4 THE EFFECT OF POTASSIUM UPON THE UTILIZATION OF AMMONIA NITROGEN IN A VEGETATION EXPERIMENT ON PEAT-PODZOLIC SOIL AND ON CHERNOZEM...upon the harvest obtained in a vegetation experiment: not a single one of the tested methods of application of the fertilizers was accompanied by a...toxic agents into the organism, their movement in it, metabolism, distribution and excretion. A literature review covering physical-chemical methods

  9. Fractional representation theory - Robustness results with applications to finite dimensional control of a class of linear distributed systems

    NASA Technical Reports Server (NTRS)

    Nett, C. N.; Jacobson, C. A.; Balas, M. J.

    1983-01-01

    This paper reviews and extends the fractional representation theory. In particular, new and powerful robustness results are presented. This new theory is utilized to develop a preliminary design methodology for finite dimensional control of a class of linear evolution equations on a Banach space. The design is for stability in an input-output sense, but particular attention is paid to internal stability as well.

  10. Utilization of Internet Protocol-Based Voice Systems in Remote Payload Operations

    NASA Technical Reports Server (NTRS)

    Best, Susan; Nichols, Kelvin; Bradford, Robert

    2003-01-01

    This viewgraph presentation provides an overview of a proposed voice communication system for use in remote payload operations performed on the International Space Station. The system, Internet Voice Distribution System (IVoDS), would make use of existing Internet protocols, and offer a number of advantages over the system currently in use. Topics covered include: system description and operation, system software and hardware, system architecture, project status, and technology transfer applications.

  11. An adaptive neural swarm approach for intrusion defense in ad hoc networks

    NASA Astrophysics Data System (ADS)

    Cannady, James

    2011-06-01

    Wireless sensor networks (WSN) and mobile ad hoc networks (MANET) are being increasingly deployed in critical applications due to the flexibility and extensibility of the technology. While these networks possess numerous advantages over traditional wireless systems in dynamic environments, they are still vulnerable to many of the same types of host-based and distributed attacks common to those systems. Unfortunately, the limited power and bandwidth available in WSNs and MANETs, combined with the dynamic connectivity that is a defining characteristic of the technology, makes it extremely difficult to utilize traditional intrusion detection techniques. This paper describes an approach to accurately and efficiently detect potentially damaging activity in WSNs and MANETs. It enables the network as a whole to recognize attacks, anomalies, and potential vulnerabilities in a distributed manner that reflects the autonomic processes of biological systems. Each component of the network recognizes activity in its local environment and then contributes to the overall situational awareness of the entire system. The approach utilizes agent-based swarm intelligence to adaptively identify potential data sources on each node and on adjacent nodes throughout the network. The swarm agents then self-organize into modular neural networks that utilize a reinforcement learning algorithm to identify relevant behavior patterns in the data without supervision. Once the modular neural networks have established interconnectivity both locally and with neighboring nodes, the analysis of events within the network can be conducted collectively in real time. The approach has been shown to be extremely effective in identifying distributed network attacks.

  12. Numerical modeling of magnetic moments for UXO applications

    USGS Publications Warehouse

    Sanchez, V.; Li, Y.; Nabighian, M.; Wright, D.

    2006-01-01

    The surface magnetic anomaly observed in UXO clearance is mainly dipolar and, consequently, the dipole is the only magnetic moment regularly recovered in UXO applications. The dipole moment contains information about intensity of magnetization but lacks information about shape. In contrast, higher-order moments, such as quadrupole and octupole, encode asymmetry properties of the magnetization distribution within the buried targets. In order to improve our understanding of magnetization distribution within UXO and non-UXO objects and its potential utility in UXO clearance, we present a 3D numerical modeling study for highly susceptible metallic objects. The basis for the modeling is the solution of a nonlinear integral equation describing magnetization within isolated objects. A solution for magnetization distribution then allows us to compute magnetic moments of the object, analyze their relationships, and provide a depiction of the surface anomaly produced by different moments within the object. Our modeling results show significant high-order moments for more asymmetric objects situated at depths typical of UXO burial, and suggest that the increased relative contribution to magnetic gradient data from these higher-order moments may provide a practical tool for improved UXO discrimination.
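    The dipolar character of the surface anomaly follows from the standard point-dipole field formula, B = (μ0/4π)(3(m·r̂)r̂ − m)/r³. The sketch below evaluates that formula along a survey line; the burial depth and moment are illustrative values, not the paper's models:

```python
import math

MU0 = 4 * math.pi * 1e-7  # vacuum permeability, T*m/A

def dipole_field(moment, src, obs):
    """Magnetic flux density (T) at `obs` from a point dipole `moment` (A*m^2)
    at `src`, using B = mu0/(4pi) * (3(m.rhat)rhat - m)/r^3."""
    r = [o - s for o, s in zip(obs, src)]
    rn = math.sqrt(sum(c * c for c in r))
    rhat = [c / rn for c in r]
    mdotr = sum(m * c for m, c in zip(moment, rhat))
    k = MU0 / (4 * math.pi * rn ** 3)
    return [k * (3 * mdotr * rh - m) for rh, m in zip(rhat, moment)]

# Vertically magnetized dipole buried 2 m below a survey line along x;
# record the field magnitude at 0.5 m station spacing (illustrative geometry).
moment = (0.0, 0.0, 5.0)              # A*m^2
profile = []
for i in range(-10, 11):
    x = i * 0.5
    b = dipole_field(moment, src=(0.0, 0.0, -2.0), obs=(x, 0.0, 0.0))
    profile.append(math.sqrt(sum(c * c for c in b)))

peak = max(profile)   # anomaly peaks directly above the source
print(peak)
```

    Higher-order (quadrupole, octupole) terms fall off faster with distance than this 1/r³ dipole term, which is why asymmetry information is subtle at typical burial depths and why the abstract looks to gradient data to recover it.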

  13. Advanced Unstructured Grid Generation for Complex Aerodynamic Applications

    NASA Technical Reports Server (NTRS)

    Pirzadeh, Shahyar Z.

    2008-01-01

    A new approach for distribution of grid points on the surface and in the volume has been developed and implemented in the NASA unstructured grid generation code VGRID. In addition to the point and line sources of prior work, the new approach utilizes surface and volume sources for automatic curvature-based grid sizing and convenient point distribution in the volume. A new exponential growth function produces smoother and more efficient grids and provides superior control over distribution of grid points in the field. All types of sources support anisotropic grid stretching, which not only improves the grid economy but also provides more accurate solutions for certain aerodynamic applications. The new approach does not require a three-dimensional background grid as in the previous methods. Instead, it makes use of an efficient bounding-box auxiliary medium for storing grid parameters defined by surface sources. The new approach is less memory-intensive and more efficient computationally. The grids generated with the new method either eliminate the need for adaptive grid refinement for a certain class of problems or provide high quality initial grids that would enhance the performance of many adaptation methods.
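    The idea of source-driven, exponentially growing spacing can be sketched in one dimension. The functional form s(d) = s0·(1+g)^d and its constants below are illustrative guesses at the family of growth functions; VGRID's actual function is not reproduced here:

```python
def spacing(d, s0=0.05, growth=0.3):
    """Grid spacing as an exponential function of distance d from a point
    source: s(d) = s0 * (1 + growth)**d (illustrative form and constants)."""
    return s0 * (1.0 + growth) ** d

def place_points(length, s0=0.05, growth=0.3):
    """March along a line of given length, stepping by the local spacing,
    so points cluster near the source (d = 0) and stretch smoothly away."""
    points, d = [0.0], 0.0
    while d < length:
        d += spacing(d, s0, growth)
        points.append(min(d, length))   # clamp the final point to the boundary
    return points

pts = place_points(5.0)
gaps = [b - a for a, b in zip(pts, pts[1:])]
print(len(pts), gaps[0], gaps[-1])
```

    The smoothness of the growth function matters because abrupt jumps in spacing degrade cell quality; an exponential form keeps the ratio of adjacent spacings nearly constant, which is the property the abstract credits for smoother grids.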

  14. Advanced Unstructured Grid Generation for Complex Aerodynamic Applications

    NASA Technical Reports Server (NTRS)

    Pirzadeh, Shahyar

    2010-01-01

    A new approach for distribution of grid points on the surface and in the volume has been developed. In addition to the point and line sources of prior work, the new approach utilizes surface and volume sources for automatic curvature-based grid sizing and convenient point distribution in the volume. A new exponential growth function produces smoother and more efficient grids and provides superior control over distribution of grid points in the field. All types of sources support anisotropic grid stretching, which not only improves the grid economy but also provides more accurate solutions for certain aerodynamic applications. The new approach does not require a three-dimensional background grid as in the previous methods. Instead, it makes use of an efficient bounding-box auxiliary medium for storing grid parameters defined by surface sources. The new approach is less memory-intensive and more efficient computationally. The grids generated with the new method either eliminate the need for adaptive grid refinement for a certain class of problems or provide high quality initial grids that would enhance the performance of many adaptation methods.

  15. Data distribution service-based interoperability framework for smart grid testbed infrastructure

    DOE PAGES

    Youssef, Tarek A.; Elsayed, Ahmed T.; Mohammed, Osama A.

    2016-03-02

    This study presents the design and implementation of a communication and control infrastructure for smart grid operation. The proposed infrastructure enhances the reliability of the measurements and control network. The advantages of utilizing the data-centric over the message-centric communication approach are discussed in the context of smart grid applications. The data distribution service (DDS) is used to implement a data-centric common data bus for the smart grid. This common data bus improves communication reliability, enabling distributed control and smart load management. These enhancements are achieved by avoiding a single point of failure while enabling peer-to-peer communication and an automatic discovery feature for dynamically participating nodes. The infrastructure and ideas presented in this paper were implemented and tested on the smart grid testbed. A toolbox and application programming interface for the testbed infrastructure are developed in order to facilitate interoperability and remote access to the testbed. This interface allows experiments to be controlled, monitored, and performed remotely. Furthermore, it could be used to integrate multidisciplinary testbeds to study complex cyber-physical systems (CPS).

  16. Stability of collapse lyophilized influenza vaccine formulations.

    PubMed

    Anamur, Cihad; Winter, Gerhard; Engert, Julia

    2015-04-10

    A clear limitation of many liquid vaccines is the obligatory cold-chain distribution system. Therefore, distribution of a dried vaccine formulation may be beneficial in terms of vaccine stability, handling and transport. Collapse freeze-drying is a process which utilizes aggressive but economical lyophilization cycles in which the formulation is dried above its glass transition temperature. In this study, we used collapse freeze-drying for a thermosensitive model influenza vaccine (Pandemrix®). The dried lyophilizates were further cryo-milled to engineer powder particles in the size range of approximately 20-80 μm, which is applicable for epidermal powder immunization. Vaccine potency and stability were neither affected by the high temperature input during collapse lyophilization nor over a storage period of six months. Furthermore, cryo-milled vaccine lyophilizates showed good storage stability of up to three months at high storage temperature (40 °C). This technique can provide a powerful tool for the worldwide distribution of vaccines and for new application technologies such as engineered powder immunization. Copyright © 2015 Elsevier B.V. All rights reserved.

  17. Towards a cyber-physical era: soft computing framework based multi-sensor array for water quality monitoring

    NASA Astrophysics Data System (ADS)

    Bhardwaj, Jyotirmoy; Gupta, Karunesh K.; Gupta, Rajiv

    2018-02-01

    New concepts and techniques are replacing traditional methods of measuring water quality parameters. This paper introduces a cyber-physical system (CPS) approach for water quality assessment in a distribution network. Cyber-physical systems with embedded sensors, processors and actuators can be designed to sense and interact with the water environment. The proposed CPS comprises a sensing framework, integrating five different water quality parameter sensor nodes, and a soft computing framework for computational modelling. The soft computing framework uses Python for the user interface and fuzzy logic for decision making. Introducing multiple sensors into a water distribution network generates a huge number of data matrices, which are often highly complex, difficult to understand and convoluted for effective decision making. Therefore, the proposed framework also aims to simplify the complexity of the sensor data matrices and to support decision making by water engineers. The goal of this research is to provide a simple and efficient method to identify and detect the presence of contamination in a water distribution network using applications of CPS.
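    The kind of fuzzy decision making described can be sketched with Mamdani-style min/max operators over sensor readings. This is not the paper's rule base; every membership breakpoint and rule below is invented for illustration:

```python
def ramp_up(x, lo, hi):
    """Membership rising linearly from 0 at `lo` to 1 at `hi`."""
    if x <= lo:
        return 0.0
    if x >= hi:
        return 1.0
    return (x - lo) / (hi - lo)

def ramp_down(x, lo, hi):
    """Membership falling linearly from 1 at `lo` to 0 at `hi`."""
    return 1.0 - ramp_up(x, lo, hi)

def contamination_score(ph, turbidity_ntu, chlorine_mg_l):
    """Degree (0..1) to which readings suggest contamination.
    Breakpoints are illustrative, not calibrated regulatory limits."""
    ph_abnormal = max(ramp_down(ph, 5.5, 6.5), ramp_up(ph, 8.5, 9.5))
    turbidity_high = ramp_up(turbidity_ntu, 1.0, 5.0)
    chlorine_low = ramp_down(chlorine_mg_l, 0.2, 0.5)
    # Rule: contamination suspected if turbidity is high AND residual chlorine
    # is low, OR pH is abnormal.  AND = min, OR = max (Mamdani operators).
    return max(min(turbidity_high, chlorine_low), ph_abnormal)

clean = contamination_score(ph=7.2, turbidity_ntu=0.4, chlorine_mg_l=0.8)
suspect = contamination_score(ph=7.0, turbidity_ntu=6.0, chlorine_mg_l=0.1)
print(clean, suspect)
```

    Collapsing a multi-sensor reading into one graded score is exactly the simplification of "complex data matrices" the abstract argues for: the engineer sees a single degree of suspicion rather than raw per-sensor values.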

  18. The feasibility of replacing or upgrading utility distribution transformers during routine maintenance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barnes, P.R.; Van Dyke, J.W.; McConnell, B.W.

    It is estimated that electric utilities use about 40 million distribution transformers in supplying electricity to customers in the United States. Although utility distribution transformers collectively have a high average efficiency, they account for approximately 61 billion kWh of the 229 billion kWh of energy lost annually in the delivery of electricity. Distribution transformers are being replaced over time by new, more efficient, lower-loss units during routine utility maintenance of power distribution systems. Maintenance is typically not performed on units in service. However, units removed from service with appreciable remaining life are often refurbished and returned to stock. Distribution transformers may be removed from service for many reasons, including failure, over- or underloading, or line upgrades such as voltage changes or rerouting. When distribution transformers are removed from service, a decision must be made whether to dispose of the transformer and purchase a lower-loss replacement or to refurbish the transformer and return it to stock for future use. This report contains findings and recommendations on replacing utility distribution transformers during routine maintenance, as required by section 124(c) of the Energy Policy Act of 1992. The objectives of the study are to evaluate the practicability, cost-effectiveness, and potential energy savings of replacing or upgrading existing transformers during routine utility maintenance and to develop recommendations on ways to achieve the potential energy savings.
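    The replace-or-refurbish decision is commonly framed with a total-owned-cost comparison that capitalizes core (no-load) and winding (load) losses into dollars per watt. The sketch below uses that standard framing; the prices, loss figures, and A/B factors are invented for illustration, not the report's numbers:

```python
def total_owned_cost(price, no_load_w, load_w, a=3.5, b=1.0):
    """Total owned cost of a distribution transformer using the common
    TOC = price + A*(no-load loss) + B*(load loss) capitalization, where
    A and B ($/W) embed energy price, load factor, and discount rate.
    The default A and B here are illustrative."""
    return price + a * no_load_w + b * load_w

# Decision sketch: refurbish the removed unit, or buy a lower-loss replacement?
refurbished = total_owned_cost(price=300.0, no_load_w=180.0, load_w=400.0)
new_low_loss = total_owned_cost(price=700.0, no_load_w=90.0, load_w=250.0)
replace = new_low_loss < refurbished
print(refurbished, new_low_loss, replace)
```

    With these made-up figures the lower-loss unit wins despite its higher purchase price, because the capitalized no-load losses dominate over the transformer's life; the report's actual evaluation rests on the same trade-off.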

  19. Clinical Utilization of Repeated Open Application Test Among American Contact Dermatitis Society Members.

    PubMed

    Brown, Gabrielle E; Botto, Nina; Butler, Daniel C; Murase, Jenny E

    2015-01-01

    The repeated open application test (ROAT) provides useful information regarding allergens in suspected cases of allergic contact dermatitis; however, standardized methodology has not been established. The aim of this study was to assess how ROAT is used in clinical and research settings. We distributed a survey regarding ROAT practice to the American Contact Dermatitis Society and conducted a literature review of ROAT utilization in research. A total of 67 American Contact Dermatitis Society members participated in the survey. Respondents most frequently recommend application of leave-on products twice daily (46.0%) and rinse-off products once daily (43.5%). The most commonly used anatomical sites include the forearm (38.7%) and antecubital fossa (32.3%). Most respondents continue ROAT for 1 (49.2%) or 2 weeks (31.7%). Literature review of 32 studies (26 leave-on, 6 rinse-off) revealed that application frequency is most common at twice daily for both leave-on (96.2%) and rinse-off (50.0%) products. The most common anatomical site is the forearm (62.5%), with an overall study duration of 3 to 4 weeks (65.6%). When comparing ROAT clinical and research practice, the majority trend was consistent for leave-on product application frequency and anatomical site, but not for rinse-off product application frequency, or overall duration. Further research is needed to determine best practice recommendations.

  20. SolarSoft Web Services

    NASA Astrophysics Data System (ADS)

    Freeland, S.; Hurlburt, N.

    2005-12-01

    The SolarSoft system (SSW) is a set of integrated software libraries, databases, and system utilities which provide a common programming and data analysis environment for solar physics. The system includes contributions from a large community base, representing the efforts of many NASA PI MO&DA teams, spanning many years and multiple NASA and international orbital and ground-based missions. The SSW general-use libraries include many hundreds of utilities which are instrument and mission independent. A large subset are also solar independent, such as time conversions, digital detector cleanup, time series analysis, mathematics, image display, WWW server communications and the like. PI teams may draw on these general-purpose libraries for analysis and application development while concentrating efforts on instrument-specific calibration issues rather than reinvention of general-use software. By the same token, PI teams are encouraged to contribute new applications or enhancements to existing utilities which may have more general interest. Recent areas of intense evolution include space weather applications, automated distributed data access and analysis, interfaces with the ongoing Virtual Solar Observatory efforts, and externalization of SolarSoft power through Web Services. We will discuss the current status of SSW web services and demonstrate how this facilitates accessing the underlying power of SolarSoft in more abstract terms. In this context, we will describe the use of SSW services within the Collaborative Sun Earth Connector environment.

  1. Addressing practical challenges in utility optimization of mobile wireless sensor networks

    NASA Astrophysics Data System (ADS)

    Eswaran, Sharanya; Misra, Archan; La Porta, Thomas; Leung, Kin

    2008-04-01

    This paper examines the practical challenges in the application of the distributed network utility maximization (NUM) framework to the problem of resource allocation and sensor device adaptation in a mission-centric wireless sensor network (WSN) environment. By providing rich (multi-modal), real-time information about a variety of (often inaccessible or hostile) operating environments, sensors such as video, acoustic and short-aperture radar enhance the situational awareness of many battlefield missions. Prior work on the applicability of the NUM framework to mission-centric WSNs has focused on tackling the challenges introduced by i) the definition of an individual mission's utility as a collective function of multiple sensor flows and ii) the dissemination of an individual sensor's data via a multicast tree to multiple consuming missions. However, the practical application and performance of this framework is influenced by several parameters internal to the framework and also by implementation-specific decisions, and is further complicated by node mobility. In this paper, we use discrete-event simulations to study the effects of these parameters on the performance of the protocol in terms of speed of convergence, packet loss, and signaling overhead, thereby addressing the challenges posed by wireless interference and node mobility in ad hoc battlefield scenarios. This study provides better understanding of the issues involved in the practical adaptation of the NUM framework. It also helps identify potential avenues of improvement within the framework and protocol.
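    The distributed character of the NUM framework can be illustrated with the textbook dual-decomposition iteration for logarithmic utilities: each source sets its rate from the prices on its route, and each link adjusts its price from purely local load. This is a toy, not the paper's protocol; the network, step size, and utilities below are invented:

```python
# Links and capacities; routes map each flow to the links it traverses.
capacity = {"l1": 1.0, "l2": 2.0}
routes = {"f1": ["l1"], "f2": ["l1", "l2"], "f3": ["l2"]}

lam = {l: 1.0 for l in capacity}      # per-link dual prices
step = 0.05                           # subgradient step size (illustrative)

for _ in range(5000):
    # Each source solves its own subproblem: for U(x) = log x the best
    # response is x = 1 / (sum of prices along the route).
    rate = {f: 1.0 / max(sum(lam[l] for l in ls), 1e-9)
            for f, ls in routes.items()}
    # Each link updates its price using only locally observable load.
    for l in capacity:
        load = sum(rate[f] for f, ls in routes.items() if l in ls)
        lam[l] = max(0.0, lam[l] + step * (load - capacity[l]))

total_l1 = rate["f1"] + rate["f2"]
total_l2 = rate["f2"] + rate["f3"]
print(rate, total_l1, total_l2)
```

    The parameters the abstract studies (step size, update frequency, signaling) map directly onto `step` and the message exchange implied by the price updates; interference and mobility perturb exactly this feedback loop.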

  2. High-power diffusing-tip fibers for interstitial photocoagulation

    NASA Astrophysics Data System (ADS)

    Sinofsky, Edward L.; Farr, Norman; Baxter, Lincoln; Weiler, William

    1997-05-01

    A line of optical-fiber-based diffusing tips capable of distributing tens of watts of cw laser power over lengths ranging from two millimeters to over 10 cm has been designed, developed, and tested. The result is a flexible non-stick diffuser capable of coagulating large volumes of tissue in reasonably short exposures of 3 - 5 minutes. Sub-millimeter-diameter devices markedly reduce the force needed to insert the applicator interstitially into tissue. Utilizing our design approach, we have produced diffusers based on 200 micrometer core fiber that have delivered over 35 watts of Nd:YAG energy over diffusion lengths as short as 4 mm. These applicators are being tested for applications in oncology, cardiology, electrophysiology, urology and gynecology.

  3. Software-defined Radio Based Measurement Platform for Wireless Networks

    PubMed Central

    Chao, I-Chun; Lee, Kang B.; Candell, Richard; Proctor, Frederick; Shen, Chien-Chung; Lin, Shinn-Yan

    2015-01-01

    End-to-end latency is critical to many distributed applications and services that are based on computer networks. There has been a dramatic push to adopt wireless networking technologies and protocols (such as WiFi, ZigBee, WirelessHART, Bluetooth, ISA100.11a, etc.) into time-critical applications. Examples of such applications include industrial automation, telecommunications, power utility, and financial services. While performance measurement of wired networks has been extensively studied, measuring and quantifying the performance of wireless networks face new challenges and demand different approaches and techniques. In this paper, we describe the design of a measurement platform based on the technologies of software-defined radio (SDR) and IEEE 1588 Precision Time Protocol (PTP) for evaluating the performance of wireless networks. PMID:27891210

  4. Electrochemical system including lamella settler crystallizer

    DOEpatents

    Maimoni, Arturo

    1988-01-01

    A crystallizer which incorporates a lamella settler and which is particularly applicable for use in batteries and power cells for electric vehicles or stationary applications. The lamella settler can be utilized for coarse particle separation or for agglomeration, and is particularly applicable to aluminum-air batteries or power cells for solving the hydrargillite (aluminum-hydroxide) removal problems from such batteries. This invention provides the advantages of very low energy consumption, turbulence, shear, cost and maintenance. Thus, due to the low shear and low turbulence of this invention, it is particularly effective in the control of aluminum hydroxide particle size distribution in the various sections of an aluminum-air system, as well as in other electrochemical systems requiring separation for phases of different densities.

  5. Lamella settler crystallizer

    DOEpatents

    Maimoni, A.

    1990-12-18

    A crystallizer is described which incorporates a lamella settler and which is particularly applicable for use in batteries and power cells for electric vehicles or stationary applications. The lamella settler can be utilized for coarse particle separation or for agglomeration, and is particularly applicable to aluminum-air batteries or power cells for solving the hydrargillite (aluminum-hydroxide) removal problems from such batteries. This invention provides the advantages of very low energy consumption, turbulence, shear, cost and maintenance. Thus, due to the low shear and low turbulence of this invention, it is particularly effective in the control of aluminum hydroxide particle size distribution in the various sections of an aluminum-air system, as well as in other electrochemical systems requiring separation for phases of different densities. 3 figs.

  6. Software-defined Radio Based Measurement Platform for Wireless Networks.

    PubMed

    Chao, I-Chun; Lee, Kang B; Candell, Richard; Proctor, Frederick; Shen, Chien-Chung; Lin, Shinn-Yan

    2015-10-01

    End-to-end latency is critical to many distributed applications and services that are based on computer networks. There has been a dramatic push to adopt wireless networking technologies and protocols (such as WiFi, ZigBee, WirelessHART, Bluetooth, ISA100.11a, etc.) into time-critical applications. Examples of such applications include industrial automation, telecommunications, power utility, and financial services. While performance measurement of wired networks has been extensively studied, measuring and quantifying the performance of wireless networks face new challenges and demand different approaches and techniques. In this paper, we describe the design of a measurement platform based on the technologies of software-defined radio (SDR) and IEEE 1588 Precision Time Protocol (PTP) for evaluating the performance of wireless networks.

  7. Growth stimulation of Brevibacterium sp. by siderophores.

    PubMed

    Noordman, W H; Reissbrodt, R; Bongers, R S; Rademaker, J L W; Bockelmann, W; Smit, G

    2006-09-01

    To assess which types of siderophores are typically produced by Brevibacterium and how siderophore production and utilization traits are distributed within this genus. During co-cultivation experiments it was found that growth of B. linens Br5 was stimulated by B. linens NIZO B1410 by two orders of magnitude. The stimulation was caused by the production of hydroxamate siderophores by B. linens NIZO B1410 that enabled the siderophore-auxotrophic strain Br5 to grow faster under the applied iron-limited growth conditions. Different patterns of siderophore production and utilization were observed within the genus Brevibacterium. These patterns did not reflect the phylogenetic relations within the group as determined by partial 16S rDNA sequencing. Most Brevibacterium strains were found to utilize hydroxamate siderophores. Brevibacteria can produce and utilize siderophores although certain strains within this genus are siderophore-auxotrophic. It is reported for the first time that brevibacteria produce and utilize siderophores. This knowledge can be utilized to stimulate growth of auxotrophic strains under certain conditions. Enhancing the growth rate of Brevibacterium is of importance for the application of this species, for example, for cheese manufacturing or for industrial production of enzymes or metabolites.

  8. Exploring Empirical Rank-Frequency Distributions Longitudinally through a Simple Stochastic Process

    PubMed Central

    Finley, Benjamin J.; Kilkki, Kalevi

    2014-01-01

    The frequent appearance of empirical rank-frequency laws, such as Zipf’s law, in a wide range of domains reinforces the importance of understanding and modeling these laws and rank-frequency distributions in general. In this spirit, we utilize a simple stochastic cascade process to simulate several empirical rank-frequency distributions longitudinally. We focus especially on limiting the process’s complexity to increase accessibility for non-experts in mathematics. The process provides a good fit for many empirical distributions because the stochastic multiplicative nature of the process leads to an often observed concave rank-frequency distribution (on a log-log scale) and the finiteness of the cascade replicates real-world finite size effects. Furthermore, we show that repeated trials of the process can roughly simulate the longitudinal variation of empirical ranks. However, we find that the empirical variation is often less than the average simulated process variation, likely due to longitudinal dependencies in the empirical datasets. Finally, we discuss the process limitations and practical applications. PMID:24755621
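    The family of processes the abstract describes can be sketched as a binary multiplicative cascade: a unit mass is repeatedly split by random fractions, and ranking the resulting leaf masses yields a concave rank-frequency curve on log-log axes. This is an illustrative guess at the process family, not the authors' exact model:

```python
import math, random

def cascade(depth, seed=1):
    """Binary multiplicative cascade: split a unit mass `depth` times, each
    split giving a random fraction to one branch.  Returns leaf masses
    ranked largest-first (a rank-frequency curve)."""
    random.seed(seed)
    masses = [1.0]
    for _ in range(depth):
        nxt = []
        for m in masses:
            p = random.uniform(0.1, 0.9)   # random split fraction
            nxt += [m * p, m * (1 - p)]
        masses = nxt
    return sorted(masses, reverse=True)

ranked = cascade(depth=12)                 # 2**12 = 4096 leaves

def slope(r1, r2):
    """Chord slope of the rank-frequency curve on log-log axes."""
    return (math.log(ranked[r2]) - math.log(ranked[r1])) / \
           (math.log(r2 + 1) - math.log(r1 + 1))

head = slope(0, 40)      # shallow slope among the top ranks
tail = slope(400, 4000)  # steeper slope in the tail => concave curve
print(len(ranked), head, tail)
```

    The steeper tail slope is the concavity the abstract attributes to the multiplicative mechanism, and the fixed depth is the finite-size cutoff; rerunning with different seeds mimics the longitudinal trials discussed.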

  9. Exploring empirical rank-frequency distributions longitudinally through a simple stochastic process.

    PubMed

    Finley, Benjamin J; Kilkki, Kalevi

    2014-01-01

    The frequent appearance of empirical rank-frequency laws, such as Zipf's law, in a wide range of domains reinforces the importance of understanding and modeling these laws and rank-frequency distributions in general. In this spirit, we utilize a simple stochastic cascade process to simulate several empirical rank-frequency distributions longitudinally. We focus especially on limiting the process's complexity to increase accessibility for non-experts in mathematics. The process provides a good fit for many empirical distributions because the stochastic multiplicative nature of the process leads to an often observed concave rank-frequency distribution (on a log-log scale) and the finiteness of the cascade replicates real-world finite size effects. Furthermore, we show that repeated trials of the process can roughly simulate the longitudinal variation of empirical ranks. However, we find that the empirical variation is often less than the average simulated process variation, likely due to longitudinal dependencies in the empirical datasets. Finally, we discuss the process limitations and practical applications.

  10. Cluster analysis based on dimensional information with applications to feature selection and classification

    NASA Technical Reports Server (NTRS)

    Eigen, D. J.; Fromm, F. R.; Northouse, R. A.

    1974-01-01

    A new clustering algorithm is presented that is based on dimensional information. The algorithm includes an inherent feature selection criterion, which is discussed. Further, a heuristic method for choosing the proper number of intervals for a frequency distribution histogram, a feature necessary for the algorithm, is presented. The algorithm, although usable as a stand-alone clustering technique, is then utilized as a global approximator. Local clustering techniques and configuration of a global-local scheme are discussed, and finally the complete global-local and feature selector configuration is shown in application to a real-time adaptive classification scheme for the analysis of remote sensed multispectral scanner data.
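    The abstract's heuristic for choosing the number of histogram intervals is not specified; a classic stand-in such as Sturges' rule (k = ⌈log2 n⌉ + 1) illustrates the idea of sizing the frequency distribution from the sample count. The histogram helper and data below are illustrative:

```python
import math

def sturges_bins(n):
    """Sturges' rule for the histogram interval count: k = ceil(log2 n) + 1.
    A classic heuristic; the paper's own method is not reproduced here."""
    return math.ceil(math.log2(n)) + 1

def histogram(data, k):
    """Frequency distribution over k equal-width intervals spanning the data."""
    lo, hi = min(data), max(data)
    width = (hi - lo) / k or 1.0               # guard degenerate all-equal data
    counts = [0] * k
    for x in data:
        i = min(int((x - lo) / width), k - 1)  # clamp the maximum into the last bin
        counts[i] += 1
    return counts

data = [0.1 * i for i in range(100)]
k = sturges_bins(len(data))
counts = histogram(data, k)
print(k, counts, sum(counts))
```

    Too few intervals blur the per-dimension structure the clustering algorithm relies on, while too many leave bins nearly empty, which is why a principled interval count is called out as a prerequisite of the method.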

  11. Physics of vascular brachytherapy.

    PubMed

    Jani, S K

    1999-08-01

    Basic physics plays an important role in understanding the clinical utility of radioisotopes in brachytherapy. Vascular brachytherapy is a very unique application of localized radiation in that dose levels very close to the source are employed to treat tissues within the arterial wall. This article covers basic physics of radioactivity and differentiates between beta and gamma radiations. Physical parameters such as activity, half-life, exposure and absorbed dose have been explained. Finally, the dose distribution around a point source and a linear source is described. The principles of basic physics are likely to play an important role in shaping the emerging technology and its application in vascular brachytherapy.

  12. Clock Agreement Among Parallel Supercomputer Nodes

    DOE Data Explorer

    Jones, Terry R.; Koenig, Gregory A.

    2014-04-30

    This dataset presents measurements that quantify the clock synchronization time-agreement characteristics among several high performance computers, including the current world's most powerful machine for open science, the U.S. Department of Energy's Titan machine sited at Oak Ridge National Laboratory. These ultra-fast machines derive much of their computational capability from extreme node counts (over 18,000 nodes in the case of the Titan machine). Time-agreement is commonly utilized by parallel programming applications and tools, distributed programming applications and tools, and system software. Our time-agreement measurements detail the degree of time variance between nodes and how that variance changes over time. The dataset includes empirical measurements and the accompanying spreadsheets.

  13. Design and implementation of a CORBA-based genome mapping system prototype.

    PubMed

    Hu, J; Mungall, C; Nicholson, D; Archibald, A L

    1998-01-01

    CORBA (Common Object Request Broker Architecture), as an open standard, is considered to be a good solution for the development and deployment of applications in distributed heterogeneous environments. This technology can be applied in the bioinformatics area to enhance utilization, management and interoperation between biological resources. This paper investigates issues in developing CORBA applications for genome mapping information systems in the Internet environment with emphasis on database connectivity and graphical user interfaces. The design and implementation of a CORBA prototype for an animal genome mapping database are described. The prototype demonstration is available via: http://www.ri.bbsrc.ac.uk/ark_corba/. jian.hu@bbsrc.ac.uk

  14. Magneto-Optical Relaxation Measurements of Functionalized Nanoparticles as a Novel Biosensor

    PubMed Central

    Aurich, Konstanze; Glöckl, Gunnar; Nagel, Stefan; Weitschies, Werner

    2009-01-01

    Measurements of magneto-optical relaxation signals of magnetic nanoparticles functionalized with biomolecules are a novel biosensing tool. Upon transmission of a laser beam through a nanoparticle suspension in a pulsed magnetic field, the properties of the laser beam change; this can be detected by optical methods. Biomolecular binding events leading to aggregation of nanoparticles are ascertainable by calculating the relaxation time and, from this, the hydrodynamic diameters of the involved particles from the optical signal. The interaction between insulin-like growth factor 1 (IGF-1) and its antibody was utilized to demonstrate the applicability of the measurement setup as an immunoassay. Furthermore, a formerly developed kinetic model was utilized in order to determine kinetic parameters of the interaction. Besides its utilization as an immunoassay, the method can be applied for the characterization of diverse magnetic nanoparticles regarding their size and size distribution. PMID:22408511

  15. An Adaptive Priority Tuning System for Optimized Local CPU Scheduling using BOINC Clients

    NASA Astrophysics Data System (ADS)

    Mnaouer, Adel B.; Ragoonath, Colin

    2010-11-01

    Volunteer Computing (VC) is a Distributed Computing model which utilizes idle CPU cycles from computing resources donated by volunteers who are connected through the Internet to form a very large-scale, loosely coupled High Performance Computing environment. Distributed volunteer computing environments such as the BOINC framework are concerned mainly with the efficient scheduling of the available resources to the applications which require them. The BOINC framework thus contains a number of scheduling policies/algorithms, both on the server side and on the client, which work together to maximize the available resources and to provide a degree of QoS in an environment which is highly volatile. This paper focuses on the BOINC client and introduces an adaptive priority tuning client-side middleware application which improves the execution times of Work Units (WUs) while maintaining an acceptable Maximum Response Time (MRT) for the end user. We have conducted extensive experimentation on the proposed system, and the results show a clear speedup of BOINC applications using our optimized middleware as opposed to runs using the original BOINC client.

  16. Efficient implementation of multidimensional fast fourier transform on a distributed-memory parallel multi-node computer

    DOEpatents

    Bhanot, Gyan V [Princeton, NJ; Chen, Dong [Croton-On-Hudson, NY; Gara, Alan G [Mount Kisco, NY; Giampapa, Mark E [Irvington, NY; Heidelberger, Philip [Cortlandt Manor, NY; Steinmacher-Burow, Burkhard D [Mount Kisco, NY; Vranas, Pavlos M [Bedford Hills, NY

    2012-01-10

    The present invention is directed to a method, system and program storage device for efficiently implementing a multidimensional Fast Fourier Transform (FFT) of a multidimensional array comprising a plurality of elements initially distributed in a multi-node computer system comprising a plurality of nodes in communication over a network, comprising: distributing the plurality of elements of the array in a first dimension across the plurality of nodes of the computer system over the network to facilitate a first one-dimensional FFT; performing the first one-dimensional FFT on the elements of the array distributed at each node in the first dimension; re-distributing the one-dimensional FFT-transformed elements at each node in a second dimension via "all-to-all" distribution in random order across other nodes of the computer system over the network; and performing a second one-dimensional FFT on elements of the array re-distributed at each node in the second dimension, wherein the random order facilitates efficient utilization of the network thereby efficiently implementing the multidimensional FFT. The "all-to-all" re-distribution of array elements is further efficiently implemented in applications other than the multidimensional FFT on the distributed-memory parallel supercomputer.
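
    The transpose-based decomposition the patent describes can be sketched in miniature: each "node" transforms its local rows, the data are exchanged all-to-all (here a plain transpose; the patent's contribution is randomizing the order of that exchange, which this sketch omits), and the second one-dimensional transform then runs locally. A naive DFT stands in for the per-node FFT:

```python
import cmath

def dft(x):
    """Naive 1-D DFT; stands in for each node's local one-dimensional FFT."""
    n = len(x)
    return [sum(x[k] * cmath.exp(-2j * cmath.pi * j * k / n) for k in range(n))
            for j in range(n)]

def all_to_all_transpose(rows):
    """Stand-in for the all-to-all redistribution: afterwards each 'node'
    holds a full line of the second dimension."""
    return [list(col) for col in zip(*rows)]

def fft2d(grid):
    stage1 = [dft(row) for row in grid]       # 1-D transform, first dimension
    stage2 = all_to_all_transpose(stage1)     # exchange between 'nodes'
    return [dft(row) for row in stage2]       # 1-D transform, second dimension
```

    Note that the output comes back transposed relative to the conventional 2-D DFT, a common convention in transpose-based parallel FFTs.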

  17. Efficient implementation of a multidimensional fast fourier transform on a distributed-memory parallel multi-node computer

    DOEpatents

    Bhanot, Gyan V [Princeton, NJ; Chen, Dong [Croton-On-Hudson, NY; Gara, Alan G [Mount Kisco, NY; Giampapa, Mark E [Irvington, NY; Heidelberger, Philip [Cortlandt Manor, NY; Steinmacher-Burow, Burkhard D [Mount Kisco, NY; Vranas, Pavlos M [Bedford Hills, NY

    2008-01-01

    The present invention is directed to a method, system and program storage device for efficiently implementing a multidimensional Fast Fourier Transform (FFT) of a multidimensional array comprising a plurality of elements initially distributed in a multi-node computer system comprising a plurality of nodes in communication over a network, comprising: distributing the plurality of elements of the array in a first dimension across the plurality of nodes of the computer system over the network to facilitate a first one-dimensional FFT; performing the first one-dimensional FFT on the elements of the array distributed at each node in the first dimension; re-distributing the one-dimensional FFT-transformed elements at each node in a second dimension via "all-to-all" distribution in random order across other nodes of the computer system over the network; and performing a second one-dimensional FFT on elements of the array re-distributed at each node in the second dimension, wherein the random order facilitates efficient utilization of the network thereby efficiently implementing the multidimensional FFT. The "all-to-all" re-distribution of array elements is further efficiently implemented in applications other than the multidimensional FFT on the distributed-memory parallel supercomputer.

  18. NASA JPL Distributed Systems Technology (DST) Object-Oriented Component Approach for Software Inter-Operability and Reuse

    NASA Technical Reports Server (NTRS)

    Hall, Laverne; Hung, Chaw-Kwei; Lin, Imin

    2000-01-01

    The purpose of this paper is to provide a description of the NASA JPL Distributed Systems Technology (DST) Section's object-oriented component approach to open inter-operable systems software development and software reuse. It will address what is meant by the term object component software, give an overview of the component-based development approach and how it relates to infrastructure support of software architectures and promotes reuse, enumerate the benefits of this approach, and give examples of application prototypes demonstrating its usage and advantages. Utilization of the object-oriented component technology approach for system development and software reuse will apply to several areas within JPL, and possibly across other NASA Centers.

  19. Survey of remote sensing applications

    USGS Publications Warehouse

    Deutsch, Morris

    1974-01-01

    Data from the first Earth Resources Technology Satellite (ERTS), as well as from NASA and other aircraft, contain much of the information indicative of the distribution of groundwater and the extent of its utilization. Thermal infrared imagery from aircraft is particularly valuable in studying groundwater discharge to the sea and other surface water bodies. Color infrared photography from aircraft and space is also used to locate areas of potential groundwater development. Anomalies in vegetation, soils, moisture, and their pattern of distribution may be indicative of underlying groundwater conditions. Remote sensing may be used directly or indirectly to identify stream reaches for test holes or production wells. Similarly, locating submarine springs increases the effectiveness of groundwater exploration in the coastal zone.

  20. Information architecture for a planetary 'exploration web'

    NASA Technical Reports Server (NTRS)

    Lamarra, N.; McVittie, T.

    2002-01-01

    'Web services' is a common way of deploying distributed applications whose software components and data sources may be in different locations, formats, languages, etc. Although such collaboration is not utilized significantly in planetary exploration, we believe there is significant benefit in developing an architecture in which missions could leverage each other's capabilities. We believe that an incremental deployment of such an architecture could significantly contribute to the evolution of increasingly capable, efficient, and even autonomous remote exploration.

  1. Wind Energy Resource Atlas of Sri Lanka and the Maldives

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Elliott, D.; Schwartz, M.; Scott, G.

    2003-08-01

    The Wind Energy Resource Atlas of Sri Lanka and the Maldives, produced by the National Renewable Energy Laboratory's (NREL's) wind resource group, identifies the wind characteristics and distribution of the wind resource in Sri Lanka and the Maldives. The detailed wind resource maps and other information contained in the atlas facilitate the identification of prospective areas for use of wind energy technologies, both for utility-scale power generation and off-grid wind energy applications.

  2. Adaptive Core Simulation Employing Discrete Inverse Theory - Part II: Numerical Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abdel-Khalik, Hany S.; Turinsky, Paul J.

    2005-07-15

    Use of adaptive simulation is intended to improve the fidelity and robustness of important core attribute predictions such as core power distribution, thermal margins, and core reactivity. Adaptive simulation utilizes a selected set of past and current reactor measurements of reactor observables, i.e., in-core instrumentation readings, to adapt the simulation in a meaningful way. The companion paper, ''Adaptive Core Simulation Employing Discrete Inverse Theory - Part I: Theory,'' describes in detail the theoretical background of the proposed adaptive techniques. This paper, Part II, demonstrates several computational experiments conducted to assess the fidelity and robustness of the proposed techniques. The intent is to check the ability of the adapted core simulator model to predict future core observables that are not included in the adaption or core observables that are recorded at core conditions that differ from those at which adaption is completed. Also, this paper demonstrates successful utilization of an efficient sensitivity analysis approach to calculate the sensitivity information required to perform the adaption for millions of input core parameters. Finally, this paper illustrates a useful application for adaptive simulation - reducing the inconsistencies between two different core simulator code systems, where the multitudes of input data to one code are adjusted to enhance the agreement between both codes for important core attributes, i.e., core reactivity and power distribution. Also demonstrated is the robustness of such an application.
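
    The adaption step is, at heart, a regularized inverse problem: given a sensitivity matrix relating input parameters to observables, adjust the parameters so the predicted observables move toward the measurements. The following is a generic damped least-squares sketch under that framing, not the authors' formulation; the two-observable restriction and the regularization weight are illustrative:

```python
def adapt_parameters(S, residual, lam=1e-3):
    """Damped least-squares update dp = S^T (S S^T + lam*I)^(-1) r for a
    two-observable system. S is 2 x n_params (sensitivities), residual r
    is measured minus predicted observables (length 2)."""
    def dot(u, v):
        return sum(a * b for a, b in zip(u, v))
    # A = S S^T + lam * I  (2x2), inverted in closed form
    A = [[dot(S[i], S[j]) + (lam if i == j else 0.0) for j in range(2)]
         for i in range(2)]
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    inv = [[A[1][1] / det, -A[0][1] / det],
           [-A[1][0] / det, A[0][0] / det]]
    w = [dot(inv[i], residual) for i in range(2)]   # (S S^T + lam I)^-1 r
    n = len(S[0])
    return [sum(S[i][k] * w[i] for i in range(2)) for k in range(n)]
```

    Adding the returned increments to the input parameters reduces the observable mismatch; the damping term keeps the update stable when the sensitivity matrix is ill-conditioned, which is the practical concern with millions of input parameters.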

  3. Multiattribute Fixed-State Utility Assessment.

    DTIC Science & Technology

    1981-03-27

    of a companion distribution, are presented in Appendix A. Because of the theory of conditional expected utility and the modelling of the utilities by ... obtain approximations to the moments, using the companion distribution discussed in Section 3. The moments of both distributions are discussed ... 1956b, 21, 207-216. Slovic, P. "From Shakespeare to Simon: Speculation -- and some evidence -- about man's ability to process information."

  4. Solid-State Fault Current Limiter Development : Design and Testing Update of a 15kV SSCL Power Stack

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dr. Ram Adapa; Mr. Dante Piccone

    2012-04-30

    ABSTRACT The Solid-State Fault Current Limiter (SSCL) is a promising technology that can be applied to utility power delivery systems to address the problem of increasing fault currents associated with load growth. As demand continues to grow, more power is added to the utility system, either by increasing generator capacity or by adding distributed generators, resulting in higher available fault currents, often beyond the capabilities of the present infrastructure. The SSCL is power-electronics-based equipment designed to work with the present utility system to address this problem. The SSCL monitors the line current and dynamically inserts additional impedance into the line in the event of a fault being detected. The SSCL is based on a modular design and can be configured for 5kV through 69kV systems at nominal current ratings of 1000A to 4000A. Results and Findings This report provides the final test results on the development of a 15kV class SSCL single-phase power stack. The scope of work included the design of the modular standard building block sub-assemblies, the design and manufacture of the power stack, and the testing of the power stack for the key functional tests of continuous current capability and fault current limiting action. Challenges and Objectives Solid-State Current Limiter technology impacts a wide spectrum of utility engineering and operating personnel. It addresses the problems associated with load growth in both transmission- and distribution-class networks. The design concept is pioneering in terms of developing the most efficient and compact power electronics equipment for utility use. The initial test results of the standard building blocks are promising. The independent laboratory tests of the power stack are promising. However, the complete 3-phase system needs rigorous testing for performance and reliability.
Applications, Values, and Use The SSCL is an intelligent power-electronics device which is modular in design and can provide current limiting or current interrupting capabilities. It can be applied to a variety of applications, from distribution-class to transmission-class power delivery grids and networks. It can also be applied to single major commercial and industrial loads and distributed generator supplies. The active switching of devices can be further utilized for protection of substation transformers. The stress on the system can be reduced substantially, improving the life of the power system. It minimizes voltage sag by speedy elimination of heavy fault currents and promises to be an important element of the utility power system. DOE Perspective This development effort is now focused on a 15kV system. This project will help mitigate the challenges of increasing available fault current. DOE has made a major contribution in providing a cost-effective SSCL designed to integrate seamlessly into the transmission and distribution networks of today and the future. Approach An SSCL development program for a 69kV SSCL was initiated which included the use of the Super GTO advanced semiconductor device, winner of the 2007 R&D100 Award. In the beginning, steps were identified to accomplish the economically viable design of a 69kV class Solid State Current Limiter that is extremely reliable, cost effective, and compact enough to be applied in urban transmission. The prime thrust in design and development was to encompass the 1000A and 3000A ratings and provide a modular design to cover the wide range of applications. The focus of the project was then shifted to a 15kV class SSCL. The specifications for the 15kV power stack are reviewed. The design changes integrated into the 15kV power stack are discussed. In this Technical Update the complete project is summarized, followed by a detailed test report.
The power stack independent high-voltage laboratory test requirements and results are presented. Keywords: Solid State Current Limiter, SSCL, Fault Current Limiter, Fault Current Controller, Power electronics controller, Intelligent power-electronics Device, IED

  5. Planning for distributed workflows: constraint-based coscheduling of computational jobs and data placement in distributed environments

    NASA Astrophysics Data System (ADS)

    Makatun, Dzmitry; Lauret, Jérôme; Rudová, Hana; Šumbera, Michal

    2015-05-01

    When running data-intensive applications on distributed computational resources, long I/O overheads may be observed as access to remotely stored data is performed. Latencies and bandwidth can become the major limiting factor for the overall computation performance and can reduce the CPU/wall-time ratio through excessive I/O wait. Reusing the knowledge of our previous research, we propose a constraint-programming-based planner that schedules computational jobs and data placements (transfers) in a distributed environment in order to optimize resource utilization and reduce the overall processing completion time. The optimization is achieved by ensuring that none of the resources (network links, data storage and CPUs) is oversaturated at any moment of time and either (a) that the data is pre-placed at the site where the job runs or (b) that the jobs are scheduled where the data is already present. Such an approach eliminates the idle CPU cycles occurring when the job is waiting for the I/O from a remote site and would have wide application in the community. Our planner was evaluated and simulated based on data extracted from log files of the batch and data management systems of the STAR experiment. The results of the evaluation and the estimation of performance improvements are discussed in this paper.
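
    The two placement rules (a) and (b) can be illustrated with a greedy sketch. This is not the authors' constraint-programming planner, which optimizes globally over time; the site names and unit capacities are illustrative:

```python
def place_jobs(jobs, cpu_free, link_free, data_at):
    """Greedy sketch of the placement rules: run each job where its input
    data already is when a CPU slot is free (rule b); otherwise transfer
    the data to a site with a free CPU, consuming link capacity (rule a).
    Capacities are decremented in place. Returns {job: (site, transferred)}."""
    plan = {}
    for job, dataset in jobs:
        home = data_at[dataset]
        if cpu_free.get(home, 0) > 0:            # rule (b): move the job to the data
            cpu_free[home] -= 1
            plan[job] = (home, False)
            continue
        for site, free in cpu_free.items():      # rule (a): pre-place the data
            if free > 0 and link_free.get((home, site), 0) > 0:
                cpu_free[site] -= 1
                link_free[(home, site)] -= 1
                plan[job] = (site, True)
                break
        else:
            plan[job] = (None, False)            # defer: every resource saturated
    return plan
```

    Because neither branch ever over-commits a CPU slot or a link, no resource is oversaturated; the deferred jobs are what a real planner would schedule into later time slots.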

  6. A Photo Storm Report Mobile Application, Processing/Distribution System, and AWIPS-II Display Concept

    NASA Astrophysics Data System (ADS)

    Longmore, S. P.; Bikos, D.; Szoke, E.; Miller, S. D.; Brummer, R.; Lindsey, D. T.; Hillger, D.

    2014-12-01

    The increasing use of mobile phones equipped with digital cameras and the ability to post images and information to the Internet in real time has significantly improved the ability to report events almost instantaneously. In the context of severe weather reports, a representative digital image conveys significantly more information than a simple text or phone-relayed report to a weather forecaster issuing severe weather warnings. It also allows the forecaster to reasonably discern the validity and quality of a storm report. Posting geo-located, time-stamped storm report photographs to NWS social media weather forecast office pages utilizing a mobile phone application has generated recent positive feedback from forecasters. Building upon this feedback, this discussion advances the concept, development, and implementation of a formalized Photo Storm Report (PSR) mobile application, processing and distribution system, and Advanced Weather Interactive Processing System II (AWIPS-II) plug-in display software. The PSR system would be composed of three core components: i) a mobile phone application, ii) a processing and distribution software and hardware system, and iii) AWIPS-II data, exchange and visualization plug-in software. i) The mobile phone application would allow web-registered users to send geo-location, view direction, and time-stamped PSRs along with severe weather type and comments to the processing and distribution servers. ii) The servers would receive PSRs, convert images and information to NWS network-bandwidth-manageable sizes in an AWIPS-II data format, distribute them on the NWS data communications network, and archive the original PSRs for possible future research datasets. iii) The AWIPS-II data and exchange plug-ins would archive PSRs, and the visualization plug-in would display PSR locations, times and directions by hour, similar to surface observations.
Hovering on individual PSRs would reveal photo thumbnails, and clicking on them would display the full-resolution photograph. Here, we present initial NWS forecaster feedback received from social-media-posted PSRs motivating the possible advantages of PSRs within AWIPS-II, the details of developing and implementing a PSR system, and possible future applications beyond severe weather reports and AWIPS-II.

  7. Energy management of a university campus utilizing short-term load forecasting with an artificial neural network

    NASA Astrophysics Data System (ADS)

    Palchak, David

    Electrical load forecasting is a tool that has been utilized by distribution designers and operators as a means for resource planning and generation dispatch. The techniques employed in these predictions are proving useful in the growing market of consumer, or end-user, participation in electrical energy consumption. These predictions are based on exogenous variables, such as weather, and time variables, such as day of week and time of day as well as prior energy consumption patterns. The participation of the end-user is a cornerstone of the Smart Grid initiative presented in the Energy Independence and Security Act of 2007, and is being made possible by the emergence of enabling technologies such as advanced metering infrastructure. The optimal application of the data provided by an advanced metering infrastructure is the primary motivation for the work done in this thesis. The methodology for using this data in an energy management scheme that utilizes a short-term load forecast is presented. The objective of this research is to quantify opportunities for a range of energy management and operation cost savings of a university campus through the use of a forecasted daily electrical load profile. The proposed algorithm for short-term load forecasting is optimized for Colorado State University's main campus, and utilizes an artificial neural network that accepts weather and time variables as inputs. The performance of the predicted daily electrical load is evaluated using a number of error measurements that seek to quantify the best application of the forecast. The energy management presented utilizes historical electrical load data from the local service provider to optimize the time of day that electrical loads are being managed. Finally, the utilization of forecasts in the presented energy management scenario is evaluated based on cost and energy savings.
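
    A minimal illustration of such a forecaster: a tiny one-hidden-layer network trained by stochastic gradient descent on scaled hour-of-day and temperature inputs. The architecture, scaling, and hyperparameters here are illustrative, not those used in the thesis:

```python
import math
import random

def train_forecaster(samples, hidden=6, epochs=300, lr=0.05, seed=1):
    """Train a tiny tanh network on ((hour_frac, temp_frac), load_frac)
    tuples, with all values scaled to roughly [0, 1]. Returns a predictor."""
    rng = random.Random(seed)
    W1 = [[rng.uniform(-0.5, 0.5) for _ in range(2)] for _ in range(hidden)]
    b1 = [0.0] * hidden
    W2 = [rng.uniform(-0.5, 0.5) for _ in range(hidden)]
    b2 = 0.0

    def forward(x):
        h = [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
             for row, b in zip(W1, b1)]
        return h, sum(w * hi for w, hi in zip(W2, h)) + b2

    for _ in range(epochs):
        for x, y in samples:
            h, pred = forward(x)
            err = pred - y                      # gradient of 0.5*err**2 w.r.t. pred
            for j in range(hidden):             # backprop through both layers
                grad_h = err * W2[j] * (1 - h[j] ** 2)
                W2[j] -= lr * err * h[j]
                b1[j] -= lr * grad_h
                for i in range(2):
                    W1[j][i] -= lr * grad_h * x[i]
            b2 -= lr * err
    return lambda x: forward(x)[1]
```

    In the thesis setting, the inputs would come from advanced-metering and weather records rather than synthetic tuples, and the predicted daily profile would then feed the load-management scheduling described above.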

  8. Feasibility Studies of Vortex Flow Impact On the Proliferation of Algae in Hydrogen Production for Fuel Cell Applications

    NASA Astrophysics Data System (ADS)

    Miskon, Azizi; A/L Thanakodi, Suresh; Shiema Moh Nazar, Nazatul; Kit Chong, Marcus Wai; Sobri Takriff, Mohd; Fakir Kamarudin, Kamrul; Aziz Norzali, Abdul; Nooraya Mohd Tawil, Siti

    2016-11-01

    With the instability of crude oil prices in the global market and the increasing sensitivity towards green energy, more research is being carried out to find alternative energy sources to replace depleting fossil fuels. A photobiological hydrogen production system using algae is one of the promising alternative energy sources. However, the hydrogen yield of the current photobioreactor (PBR) is still too low for commercial application due to restricted light penetration into the deeper regions of the reactor. Therefore, this paper studies the feasibility of vortex flow, induced by magnetic stirring, in hydrogen production for fuel cell applications. For comparison of results, a magnetic stirrer was placed under a PBR of algae to stir the culture and obtain an even distribution of sunlight, while a control PBR of algae was kept static. The hydrogen produced was measured using a hydrogen sensor circuit, and the data collected were communicated to a laptop using an Arduino Uno. The results showed more cell counts and hydrogen produced in the PBR under the influence of magnetic stirring compared to the static PBR, by an average of 8 percent over 4 days.

  9. A constrained multinomial Probit route choice model in the metro network: Formulation, estimation and application

    PubMed Central

    Zhang, Yongsheng; Wei, Heng; Zheng, Kangning

    2017-01-01

    Considering that metro network expansion brings more alternative routes, it is attractive to integrate the impacts of the route set and the interdependency among alternative routes on route choice probability into route choice modeling. Therefore, the formulation, estimation and application of a constrained multinomial probit (CMNP) route choice model in the metro network are carried out in this paper. The utility function is formulated with three components: the compensatory component is a function of influencing factors; the non-compensatory component measures the impacts of the route set on utility; and, following a multivariate normal distribution, the covariance of the error component is structured into three parts, representing the correlation among routes, the transfer variance of a route, and the unobserved variance, respectively. Because of the multidimensional integrals of the multivariate normal probability density function, the CMNP model is rewritten as a hierarchical Bayes formula, and a Markov chain Monte Carlo approach based on the Metropolis-Hastings (M-H) sampling algorithm is constructed to estimate all parameters. Based on Guangzhou Metro data, reliable estimation results are obtained. Furthermore, the proposed CMNP model also shows a good forecasting performance for the calculation of route choice probabilities and a good application performance for transfer flow volume prediction. PMID:28591188
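
    The M-H sampling step at the core of such an MCMC estimation can be sketched in one dimension. This is a generic random-walk Metropolis-Hastings kernel targeting a standard normal, not the paper's multivariate CMNP posterior:

```python
import math
import random

def metropolis_hastings(log_post, start, steps, scale, seed=0):
    """Generic random-walk Metropolis-Hastings sampler (1-D illustration).
    log_post is the log of the target density, up to an additive constant."""
    rng = random.Random(seed)
    x = start
    chain = []
    for _ in range(steps):
        prop = x + rng.gauss(0.0, scale)        # symmetric random-walk proposal
        # accept with probability min(1, post(prop) / post(x))
        if math.log(rng.random() + 1e-300) < log_post(prop) - log_post(x):
            x = prop
        chain.append(x)
    return chain

# target: a standard normal log-density (up to an additive constant)
chain = metropolis_hastings(lambda v: -0.5 * v * v, start=0.0,
                            steps=20000, scale=1.0)
```

    In the paper's setting, the target would be the hierarchical Bayes posterior over all CMNP parameters, sampled parameter block by parameter block with kernels of this form.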

  10. Rapid Analysis of Microalgal Triacylglycerols with Direct-Infusion Mass Spectrometry

    DOE PAGES

    Christensen, Earl; Sudasinghe, Nilusha; Dandamudi, Kodanda Phani Raj; ...

    2015-09-01

    Cultivation of microalgae has the potential to provide lipid-derived feedstocks for conversion to liquid transportation fuels. Lipid extracts from microalgae are significantly more complex than those of traditional seed oils, and their composition changes significantly throughout the microalgal growth period. With three acyl side chains per molecule, triglycerides (TAGs) are an important fuel precursor, and the distribution of acyl chain composition for TAGs has a significant impact on fuel properties and processing. Therefore, determination of the distribution of microalgal TAG production is needed to assess the value of algal extracts designed for fuel production and to optimize strain, cultivation, and harvesting practices. Methods utilized for TAG speciation commonly involve complicated and time-consuming chromatographic techniques. Here we present a method for TAG speciation and quantification based on direct-infusion mass spectrometry, which provides rapid characterization of TAG profiles without chromatographic separation. Specifically, we utilize Fourier transform ion cyclotron resonance mass spectrometry (FT-ICR MS) to provide a reference library of TAGs for the microalgae Nannochloropsis sp. that provides the basis for high-throughput TAG quantitation by time-of-flight mass spectrometry (TOF MS). In conclusion, we demonstrate the application of this novel approach for lipid characterization with respect to TAG compound distribution, which informs both immediate and future strain and process optimization strategies.
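
    The quantitation step reduces to matching measured peak m/z values against the reference library within a mass tolerance and reading off relative abundances. A minimal sketch of that matching (the species names and masses below are illustrative placeholders, not reference values):

```python
def assign_tags(peaks, library, ppm_tol=5.0):
    """Match measured (m/z, intensity) peaks to a {species: reference_mz}
    library within a ppm tolerance; returns summed intensity per species."""
    hits = {}
    for mz, intensity in peaks:
        for species, ref_mz in library.items():
            if abs(mz - ref_mz) / ref_mz * 1e6 <= ppm_tol:
                hits[species] = hits.get(species, 0.0) + intensity
    return hits

def relative_distribution(hits):
    """Normalize matched intensities into a relative TAG distribution."""
    total = sum(hits.values())
    return {s: i / total for s, i in hits.items()} if total else {}
```

    A tight ppm tolerance is what the high-resolution FT-ICR reference library makes possible; the faster TOF measurements are then quantified against that library.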

  11. A novel method to characterize silica bodies in grasses.

    PubMed

    Dabney, Clemon; Ostergaard, Jason; Watkins, Eric; Chen, Changbin

    2016-01-01

    The deposition of silicon into epidermal cells of grass species is thought to be an important mechanism that plants use as a defense against pests and environmental stresses. There are a number of techniques available to study the size, density and distribution pattern of silica bodies in grass leaves. However, none of those techniques can provide a high-throughput analysis, especially for a great number of samples. We developed a method utilizing the autofluorescence of silica bodies to investigate their size and distribution, along with the number of carbon inclusions within the silica bodies of the perennial grass species Koeleria macrantha. Fluorescence images were analyzed with the image software Adobe Photoshop CS5 or ImageJ, which remarkably facilitated the quantification of silica bodies in the dry ash. We observed three types of silica bodies or silica-body-related mineral structures. Silica bodies were detected on both the abaxial and adaxial epidermis of K. macrantha leaves, although their sizes, density, and distribution patterns were different. No autofluorescence was detected from carbon inclusions. The combination of fluorescence microscopy and image processing software proved efficient for the identification and quantification of silica bodies in K. macrantha leaf tissues, and should be applicable to biological, ecological and geological studies of grasses including forage, turf grasses and cereal crops.

  13. Pharmacy-based statewide naloxone distribution: A novel "top-down, bottom-up" approach.

    PubMed

    Morton, Kate J; Harrand, Brianna; Floyd, Carly Cloud; Schaefer, Craig; Acosta, Julie; Logan, Bridget Claire; Clark, Karen

    To highlight New Mexico's multifaceted approach to widespread pharmacy naloxone distribution and to share the interventions as a tool for improving pharmacy-based naloxone practices in other states. New Mexico had the second-highest drug overdose death rate in 2014; 53% of those deaths were related to prescription opioids. Opioid overdose death is preventable through the use of naloxone, a safe and effective medication that reverses the effects of prescription opioids and heroin. Pharmacists can play an important role in providing naloxone to individuals who use prescription opioids. Not applicable. Not applicable. A multifaceted approach was utilized in New Mexico from the top down, with legislative passage of provisions for a statewide standing order and New Mexico Department of Health support for pharmacy-based naloxone delivery. A bottom-up approach was also initiated with the development and implementation of a training program for pharmacists and pharmacy technicians. Naloxone Medicaid claims were used to illustrate statewide distribution and utilization of the pharmacist statewide standing order for naloxone. The percentage of pharmacies dispensing naloxone in each county was calculated. Trained pharmacy staff completed a program evaluation form; questions about the quality of instruction and the trainer's ability to meet stated objectives were rated on a Likert scale. There were 808 naloxone Medicaid claims from 100 outpatient pharmacies during the first half of 2016, a 9-fold increase over 2014. The "A Dose of Rxeality" training program evaluation indicated that participants felt the training was free from bias and met all stated objectives (4 out of 4 on the Likert scale). A multi-pronged approach coupling state and community collaboration was successful in overcoming barriers and challenges associated with pharmacy naloxone distribution and ensured its success as an effective avenue for naloxone acquisition in urban and rural communities. 
Copyright © 2017 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.

  14. Improved hydrogeophysical characterization and monitoring through parallel modeling and inversion of time-domain resistivity and induced-polarization data

    USGS Publications Warehouse

    Johnson, Timothy C.; Versteeg, Roelof J.; Ward, Andy; Day-Lewis, Frederick D.; Revil, André

    2010-01-01

    Electrical geophysical methods have found wide use in the growing discipline of hydrogeophysics for characterizing the electrical properties of the subsurface and for monitoring subsurface processes in terms of the spatiotemporal changes in subsurface conductivity, chargeability, and source currents they govern. Presently, multichannel and multielectrode data collection systems can collect large data sets in relatively short periods of time. Practitioners, however, often are unable to fully utilize these large data sets and the information they contain because of standard desktop-computer processing limitations. These limitations can be addressed by utilizing the storage and processing capabilities of parallel computing environments. We have developed a parallel distributed-memory forward and inverse modeling algorithm for analyzing resistivity and time-domain induced polarization (IP) data. The primary components of the parallel computations include distributed computation of the pole solutions in forward mode, distributed storage and computation of the Jacobian matrix in inverse mode, and parallel execution of the inverse equation solver. We have tested the corresponding parallel code in three efforts: (1) resistivity characterization of the Hanford 300 Area Integrated Field Research Challenge site in Hanford, Washington, U.S.A., (2) resistivity characterization of a volcanic island in the southern Tyrrhenian Sea in Italy, and (3) resistivity and IP monitoring of biostimulation at a Superfund site in Brandywine, Maryland, U.S.A. Inverse analysis of each of these data sets would be limited or impossible in a standard serial computing environment, which underscores the need for parallel high-performance computing to fully utilize the potential of electrical geophysical methods in hydrogeophysical applications.

  15. Workload Characterization of CFD Applications Using Partial Differential Equation Solvers

    NASA Technical Reports Server (NTRS)

    Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)

    1998-01-01

    Workload characterization is used for modeling and evaluation of computing systems at different levels of detail. We present workload characterization for a class of Computational Fluid Dynamics (CFD) applications that solve Partial Differential Equations (PDEs). This workload characterization focuses on three high-performance computing platforms: the SGI Origin2000, the IBM SP-2, and a cluster of Intel Pentium Pro-based PCs. We execute extensive measurement-based experiments on these platforms to gather statistics of system resource usage, which form the basis of the workload characterization. Our workload characterization approach yields a coarse-grain resource-utilization behavior that is being applied for performance modeling and evaluation of distributed high-performance metacomputing systems. In addition, this study enhances our understanding of interactions between PDE solver workloads and high-performance computing platforms and is useful for tuning these applications.

  16. Fluorescence Sensors for Early Detection of Nitrification in Drinking Water Distribution Systems – Interference Corrections (Abstract)

    EPA Science Inventory

    Nitrification event detection in chloraminated drinking water distribution systems (DWDSs) remains an ongoing challenge for many drinking water utilities, including Dallas Water Utilities (DWU) and the City of Houston (CoH). Each year, these utilities experience nitrification eve...

  17. Fluorescence Sensors for Early Detection of Nitrification in Drinking Water Distribution Systems – Interference Corrections (Poster)

    EPA Science Inventory

    Nitrification event detection in chloraminated drinking water distribution systems (DWDSs) remains an ongoing challenge for many drinking water utilities, including Dallas Water Utilities (DWU) and the City of Houston (CoH). Each year, these utilities experience nitrification eve...

  18. Developing Fluorescence Sensor Systems for Early Detection of Nitrification Events in Chloraminated Drinking Water Distribution Systems

    EPA Science Inventory

    Detection of nitrification events in chloraminated drinking water distribution systems remains an ongoing challenge for many drinking water utilities, including Dallas Water Utilities (DWU) and the City of Houston (CoH). Each year, these utilities experience nitrification events ...

  19. A lightweight distributed framework for computational offloading in mobile cloud computing.

    PubMed

    Shiraz, Muhammad; Gani, Abdullah; Ahmad, Raja Wasim; Adeel Ali Shah, Syed; Karim, Ahmad; Rahman, Zulkanain Abdul

    2014-01-01

    The latest developments in mobile computing technology have enabled intensive applications on modern Smartphones. However, such applications are still constrained by limitations in the processing potential, storage capacity and battery lifetime of Smart Mobile Devices (SMDs). Therefore, Mobile Cloud Computing (MCC) leverages the application processing services of computational clouds to mitigate resource limitations in SMDs. Currently, a number of computational offloading frameworks have been proposed for MCC wherein the intensive components of the application are outsourced to computational clouds. Nevertheless, such frameworks focus on runtime partitioning of the application for computational offloading, which is time consuming and resource intensive. The resource-constrained nature of SMDs requires lightweight procedures for leveraging computational clouds. Therefore, this paper presents a lightweight framework which focuses on minimizing additional resource utilization in computational offloading for MCC. The framework employs the centralized monitoring, high availability and on-demand access services of computational clouds for computational offloading. As a result, the turnaround time and execution cost of the application are reduced. The framework is evaluated by testing a prototype application in a real MCC environment. The lightweight nature of the proposed framework is validated by employing computational offloading with the proposed framework and the latest existing frameworks. Analysis shows that by employing the proposed framework for computational offloading, the size of data transmission is reduced by 91%, energy consumption cost is minimized by 81% and the turnaround time of the application is decreased by 83.5% as compared to the existing offloading frameworks. Hence, the proposed framework minimizes additional resource utilization and therefore offers a lightweight solution for computational offloading in MCC.

  20. A Lightweight Distributed Framework for Computational Offloading in Mobile Cloud Computing

    PubMed Central

    Shiraz, Muhammad; Gani, Abdullah; Ahmad, Raja Wasim; Adeel Ali Shah, Syed; Karim, Ahmad; Rahman, Zulkanain Abdul

    2014-01-01

    The latest developments in mobile computing technology have enabled intensive applications on modern Smartphones. However, such applications are still constrained by limitations in the processing potential, storage capacity and battery lifetime of Smart Mobile Devices (SMDs). Therefore, Mobile Cloud Computing (MCC) leverages the application processing services of computational clouds to mitigate resource limitations in SMDs. Currently, a number of computational offloading frameworks have been proposed for MCC wherein the intensive components of the application are outsourced to computational clouds. Nevertheless, such frameworks focus on runtime partitioning of the application for computational offloading, which is time consuming and resource intensive. The resource-constrained nature of SMDs requires lightweight procedures for leveraging computational clouds. Therefore, this paper presents a lightweight framework which focuses on minimizing additional resource utilization in computational offloading for MCC. The framework employs the centralized monitoring, high availability and on-demand access services of computational clouds for computational offloading. As a result, the turnaround time and execution cost of the application are reduced. The framework is evaluated by testing a prototype application in a real MCC environment. The lightweight nature of the proposed framework is validated by employing computational offloading with the proposed framework and the latest existing frameworks. Analysis shows that by employing the proposed framework for computational offloading, the size of data transmission is reduced by 91%, energy consumption cost is minimized by 81% and the turnaround time of the application is decreased by 83.5% as compared to the existing offloading frameworks. Hence, the proposed framework minimizes additional resource utilization and therefore offers a lightweight solution for computational offloading in MCC. PMID:25127245

  1. Understanding I/O workload characteristics of a Peta-scale storage system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Youngjae; Gunasekaran, Raghul

    2015-01-01

    Understanding workload characteristics is critical for optimizing and improving the performance of current systems and software, and for architecting new storage systems based on observed workload patterns. In this paper, we characterize the I/O workloads of scientific applications on one of the world's fastest high-performance computing (HPC) storage clusters, Spider, at the Oak Ridge Leadership Computing Facility (OLCF). OLCF's flagship petascale simulation platform, Titan, and other large HPC clusters, totaling over 250,000 compute cores, depend on Spider for their I/O needs. We characterize the system utilization, the demands of reads and writes, idle time, storage space utilization, and the distribution of read requests to write requests for this petascale storage system. From this study, we develop synthesized workloads, and we show that the read and write I/O bandwidth usage as well as the inter-arrival time of requests can be modeled as a Pareto distribution. We also study I/O load imbalance problems using I/O performance data collected from the Spider storage system.
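    The Pareto-fit finding can be illustrated with a small sketch: for samples x_i ≥ x_m, the maximum-likelihood Pareto parameters have a closed form, x_m = min(x_i) and α = n / Σ ln(x_i / x_m). This is a generic illustration of fitting a Pareto distribution to inter-arrival times, not the Spider analysis itself; the parameter values are invented.

    ```python
    # Sketch: maximum-likelihood fit of a Pareto distribution, as might be
    # applied to request inter-arrival times.
    import numpy as np

    def pareto_mle(samples):
        """Closed-form MLE: x_m = min(x_i), alpha = n / sum(ln(x_i / x_m))."""
        x = np.asarray(samples, dtype=float)
        xm = x.min()
        alpha = len(x) / np.log(x / xm).sum()
        return xm, alpha

    # Generate synthetic Pareto "inter-arrival times" by inverse-CDF sampling
    # (x = x_m * u^(-1/alpha) for u uniform in (0, 1]), then recover alpha.
    rng = np.random.default_rng(0)
    true_alpha, xm = 2.5, 1.0
    u = 1.0 - rng.random(50_000)
    samples = xm * u ** (-1.0 / true_alpha)
    est_xm, est_alpha = pareto_mle(samples)
    ```

    A heavy-tailed fit like this is what justifies modeling burstiness that a simple exponential inter-arrival assumption would miss.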

  2. In-situ sensory technique for in-service quality monitoring: measurement of the complex Young's modulus of polymers

    NASA Astrophysics Data System (ADS)

    Zhou, Shunhua; Liang, Chen; Rogers, Craig A.; Sun, Fanping P.; Vick, L.

    1993-07-01

    Applications of polymeric adhesives in joining different materials have necessitated quantitative health inspection of adhesive joints (coverage, state of cure, adhesive strength, location of voids, etc.). A new in-situ sensory method has been proposed in this paper to inspect the amount and distribution of the critical constituents of polymers and to measure the characteristic parameters (complex Young's modulus and damping). In this technique, ferromagnetic particles have been embedded in a polymeric matrix, similar to a particle-reinforced composite. The dynamic signatures extracted from the tests as a result of magnetic excitation of the embedded ferromagnetic particles are used to evaluate the complex Young's modulus of the host polymers. Moreover, the amplitude of the frequency response is utilized to identify the amount and distribution of embedded particles in polymeric materials or adhesive joints. The results predicted from the theoretical model agree well with the experimental results. The theoretical analyses and the experimental work conducted have demonstrated the utility of the sensory technique presented for in-service health interrogation.

  3. Groundwater pumping by heterogeneous users

    NASA Astrophysics Data System (ADS)

    Saak, Alexander E.; Peterson, Jeffrey M.

    2012-08-01

    Farm size is a significant determinant of both groundwater-irrigated farm acreage and groundwater-irrigation-application rates per unit land area. This paper analyzes the patterns of groundwater exploitation when resource users in the area overlying a common aquifer are heterogeneous. In the presence of user heterogeneity, the common resource problem consists of inefficient dynamic and spatial allocation of groundwater because it impacts income distribution not only across periods but also across farmers. Under competitive allocation, smaller farmers pump groundwater faster if farmers have a constant marginal periodic utility of income. However, it is possible that larger farmers pump faster if the Arrow-Pratt coefficient of relative risk-aversion is sufficiently decreasing in income. A greater farm-size inequality may either moderate or amplify income inequality among farmers. Its effect on welfare depends on the curvature properties of the agricultural output function and the farmer utility of income. Also, it is shown that a flat-rate quota policy that limits the quantity of groundwater extraction per unit land area may have unintended consequences for the income distribution among farmers.

  4. Collectives for Multiple Resource Job Scheduling Across Heterogeneous Servers

    NASA Technical Reports Server (NTRS)

    Tumer, K.; Lawson, J.

    2003-01-01

    Efficient management of large-scale, distributed data storage and processing systems is a major challenge for many computational applications. Many of these systems are characterized by multi-resource tasks processed across a heterogeneous network. Conventional approaches, such as load balancing, work well for centralized, single-resource problems, but break down in the more general case. In addition, most approaches are often based on heuristics which do not directly attempt to optimize the world utility. In this paper, we propose an agent-based control system using the theory of collectives. We configure the servers of our network with agents who make local job scheduling decisions. These decisions are based on local goals which are constructed to be aligned with the objective of optimizing the overall efficiency of the system. We demonstrate that multi-agent systems in which all the agents attempt to optimize the same global utility function (team game) only marginally outperform conventional load balancing. On the other hand, agents configured using collectives outperform both team games and load balancing (by up to four times for the latter), despite their distributed nature and their limited access to information.

  5. Optimal reactive power planning for distribution systems considering intermittent wind power using Markov model and genetic algorithm

    NASA Astrophysics Data System (ADS)

    Li, Cheng

    Wind farms, photovoltaic arrays, fuel cells, and micro-turbines are all considered to be Distributed Generation (DG). DG is defined as the generation of power which is dispersed throughout a utility's service territory and either connected to the utility's distribution system or isolated in a small grid. This thesis addresses modeling and economic issues pertaining to optimal reactive power planning for a distribution system with wind power generation (WPG) units. Wind farms are inclined to cause reverse power flows and voltage variations due to the random-like outputs of wind turbines. To deal with this kind of problem caused by widespread usage of wind power generation, this thesis investigates voltage and reactive power controls in such a distribution system. Consequently, static capacitors (SCs) and transformer taps are introduced into the system and treated as controllers. To obtain optimal voltages and realize reactive power control, the research proposes a proper coordination among controllers such as the on-load tap changer (OLTC) and feeder-switched capacitors. Moreover, to capture its uncertainty, wind power generation is modeled with a Markov model, so that the probabilities of all scenarios can be calculated. Consecutive and discrete output values are used for transitions between successive time states and for within-state wind speeds. The thesis describes the method for generating the wind-speed time series from the transition probability matrix. A genetic algorithm is then used to determine the optimal locations and sizes of the SCs and the transformer tap settings, so as to minimize cost or power loss and, more importantly, improve voltage profiles. The applicability of the proposed method is verified through simulation on a 9-bus system and a 30-bus system.
Finally, the simulation results indicate that as long as the available capacitors can sufficiently compensate the reactive power demand, DG operation no longer imposes a significant effect on voltage fluctuations in the distribution system, and the proposed approach is efficient, simple and straightforward.
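    The Markov-chain wind model lends itself to a brief sketch: discretize wind speed into bins, then walk the transition probability matrix to generate a synthetic time series. The matrix and bin values below are invented for illustration and are not taken from the thesis.

    ```python
    # Sketch: generating a wind-speed time series from a first-order Markov
    # chain. Each row of P gives the probabilities of moving from one speed
    # bin to each bin at the next time step (rows sum to 1).
    import numpy as np

    def simulate_wind(P, speeds, steps, start=0, seed=42):
        rng = np.random.default_rng(seed)
        state = start
        series = []
        for _ in range(steps):
            series.append(speeds[state])
            state = rng.choice(len(speeds), p=P[state])  # sample next state
        return series

    P = np.array([[0.7, 0.3, 0.0],   # calm     -> mostly stays calm
                  [0.2, 0.6, 0.2],   # moderate -> can move either way
                  [0.0, 0.4, 0.6]])  # windy    -> mostly stays windy
    speeds = [3.0, 7.0, 12.0]        # illustrative bin centers, m/s
    series = simulate_wind(P, speeds, steps=20_000)
    ```

    A long simulated series visits each bin with the chain's stationary probabilities (here 4/13, 6/13, 3/13), which is what makes scenario probabilities computable for the planning problem.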

  6. Foundational Report Series. Advanced Distribution management Systems for Grid Modernization (Importance of DMS for Distribution Grid Modernization)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Jianhui

    2015-09-01

    Grid modernization is transforming the operation and management of electric distribution systems from manual, paper-driven business processes to electronic, computer-assisted decision-making. At the center of this business transformation is the distribution management system (DMS), which provides a foundation from which optimal levels of performance can be achieved in an increasingly complex business and operating environment. Electric distribution utilities are facing many new challenges that are dramatically increasing the complexity of operating and managing the electric distribution system: growing customer expectations for service reliability and power quality, pressure to achieve better efficiency and utilization of existing distribution system assets, and reduction of greenhouse gas emissions by accommodating high penetration levels of distributed generating resources powered by renewable energy sources (wind, solar, etc.). Recent “storm of the century” events in the northeastern United States and the lengthy power outages and customer hardships that followed have greatly elevated the need to make power delivery systems more resilient to major storm events and to provide a more effective electric utility response during such regional power grid emergencies. Despite these newly emerging challenges for electric distribution system operators, only a small percentage of electric utilities have actually implemented a DMS. This paper discusses reasons why a DMS is needed and why the DMS may emerge as a mission-critical system that will soon be considered essential as electric utilities roll out their grid modernization strategies.

  7. Coincidence and covariance data acquisition in photoelectron and -ion spectroscopy. I. Formal theory

    NASA Astrophysics Data System (ADS)

    Mikosch, Jochen; Patchkovskii, Serguei

    2013-10-01

    We derive a formal theory of noisy Poisson processes with multiple outcomes. We obtain simple, compact expressions for the probability distribution function of arbitrarily complex composite events and its moments. We illustrate the utility of the theory by analyzing properties of coincidence and covariance photoelectron-photoion detection involving single-ionization events. The results and techniques introduced in this work are directly applicable to more general coincidence and covariance experiments, including multiple ionization and multiple-ion fragmentation pathways.
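    As a toy illustration of composite-event statistics in such Poisson processes, consider uncorrelated electron and ion counts per laser shot: the probability of the composite event "exactly one electron and exactly one ion" factorizes into a product of Poisson terms. The rates below are invented, and real coincidence data would additionally contain correlated (true-coincidence) pairs, which is exactly what the paper's formalism handles.

    ```python
    # Sketch: for independent Poisson counts with means lam_e and lam_i, the
    # probability of "exactly one electron AND exactly one ion per shot" is
    #   P = (lam_e * exp(-lam_e)) * (lam_i * exp(-lam_i)).
    # We verify this by simulating many shots.
    import numpy as np

    rng = np.random.default_rng(1)
    lam_e, lam_i = 0.1, 0.15          # mean counts per shot (illustrative)
    shots = 200_000
    n_e = rng.poisson(lam_e, shots)   # electron count in each shot
    n_i = rng.poisson(lam_i, shots)   # ion count in each shot

    simulated = np.mean((n_e == 1) & (n_i == 1))
    analytic = lam_e * np.exp(-lam_e) * lam_i * np.exp(-lam_i)
    ```

    Keeping the per-shot rates well below one, as here, is the usual operating regime that makes false coincidences rare.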

  8. Applications of remote sensing in resource management in Nebraska

    NASA Technical Reports Server (NTRS)

    Drew, J. V.

    1975-01-01

    A computer-generated graphic display of land use data was developed. The level II inventory data for Sarpy County, Nebraska, was placed on magnetic tape. This data could then be displayed in a map format for comparative analysis of amount and distribution of the various categories of land use. The presentation scale can be varied and thus utilized as a direct guide for cartographic purposes during preparation for publication. In addition, the inventory and classification system was further refined.

  9. Retrieval of Mid-tropospheric CO2 Directly from AIRS Measurements

    NASA Technical Reports Server (NTRS)

    Olsen, Edward T.; Chahine, Moustafa T.; Chen, Luke L.; Pagano, Thomas S.

    2008-01-01

    We apply the method of Vanishing Partial Derivatives (VPD) to AIRS spectra to retrieve daily the global distribution of CO2 at a nadir geospatial resolution of 90 km x 90 km without requiring a first-guess input beyond the global average. Our retrievals utilize the 15 (micro)m band radiances, a complex spectral region. This method may be of value in other applications, in which spectral signatures of multiple species are not well isolated spectrally from one another.

  10. Concept for a power system controller for large space electrical power systems

    NASA Technical Reports Server (NTRS)

    Lollar, L. F.; Lanier, J. R., Jr.; Graves, J. R.

    1981-01-01

    The development of technology for a fail-operational power system controller (PSC) utilizing microprocessor technology for managing the distribution and power processor subsystems of a large multi-kW space electrical power system is discussed. The specific functions which must be performed by the PSC, the best microprocessor available to do the job, and the feasibility, cost savings, and applications of a PSC were determined. A limited-function breadboard version of a PSC was developed to demonstrate the concept and potential cost savings.

  11. Brain tissues volume measurements from 2D MRI using parametric approach

    NASA Astrophysics Data System (ADS)

    L'vov, A. A.; Toropova, O. A.; Litovka, Yu. V.

    2018-04-01

    The purpose of the paper is to propose a fully automated method for volume assessment of structures within the human brain. Our statistical approach uses the maximum-interdependency principle in the decision-making process for measurement consistency and unequal observations. Outlier detection is performed using the maximum normalized residual test. We propose a statistical model that utilizes knowledge of tissue distribution in the human brain and applies partial data restoration to improve precision. The proposed approach is computationally efficient and independent of the segmentation algorithm used in the application.
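    The maximum normalized residual test mentioned above (often called Grubbs' test) is easy to sketch: compute each sample's deviation from the mean in units of the sample standard deviation and flag the largest one. The data below are invented, and the quoted critical value is the standard tabulated one for this sample size, not a value from the paper.

    ```python
    # Sketch of the maximum normalized residual (Grubbs) test:
    #   G = max_i |x_i - mean(x)| / s,
    # compared against a tabulated critical value for the sample size.
    import numpy as np

    def max_normalized_residual(x):
        """Return the index of the most extreme sample and its G statistic."""
        x = np.asarray(x, dtype=float)
        residuals = np.abs(x - x.mean())
        i = int(np.argmax(residuals))
        G = residuals[i] / x.std(ddof=1)   # sample standard deviation
        return i, G

    data = [9.9, 10.1, 10.0, 10.2, 9.8, 15.0]   # 15.0 is a planted outlier
    i, G = max_normalized_residual(data)
    # For n = 6 the two-sided 5% critical value is about 1.89; a G above it
    # flags data[i] as an outlier.
    ```

    In practice the test is applied iteratively: remove the flagged sample, recompute G on the remainder, and stop when G falls below the critical value.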

  12. An application of queueing theory to the design of channel requirements for special purpose communications satellites. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Hein, G. F.

    1974-01-01

    Special-purpose satellites are very cost-sensitive to the number of broadcast channels, usually have Poisson arrivals, fairly low utilization (less than 35%), and a very high availability requirement. To determine the effects of limiting C, the number of channels, the Poisson-arrival, infinite-server queueing model is modified to describe the many-server case. The model is predicated on the reproductive property of the Poisson distribution.
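    One standard way to quantify the effect of limiting the channel count C under Poisson arrivals is the Erlang B loss formula, computable with a simple recursion. This is a generic sketch of that calculation, offered as an analogue of the finite-channel analysis, not necessarily the exact model developed in the thesis.

    ```python
    # Sketch: Erlang B blocking probability for Poisson arrivals offered to C
    # channels, via the standard recursion
    #   B(0, a) = 1;   B(c, a) = a*B(c-1, a) / (c + a*B(c-1, a))
    # where a is the offered load in erlangs.
    def erlang_b(channels: int, offered_load: float) -> float:
        b = 1.0
        for c in range(1, channels + 1):
            b = offered_load * b / (c + offered_load * b)
        return b

    # With 1 erlang of offered traffic, 2 channels block 20% of arrivals;
    # adding channels drives blocking toward zero (high availability).
    block_2 = erlang_b(2, 1.0)
    block_6 = erlang_b(6, 1.0)
    ```

    The recursion makes the cost/availability trade-off explicit: each added channel buys a diminishing reduction in blocking probability.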

  13. Distribution Atlas of Proliferating Bone Marrow in Non-Small Cell Lung Cancer Patients Measured by FLT-PET/CT Imaging, With Potential Applicability in Radiation Therapy Planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Campbell, Belinda A., E-mail: Belinda.Campbell@petermac.org; Callahan, Jason; Bressel, Mathias

    Purpose: Proliferating bone marrow is exquisitely sensitive to ionizing radiation. Knowledge of its distribution could improve radiation therapy planning to minimize unnecessary marrow exposure and avoid consequential prolonged myelosuppression. [18F]-Fluoro-3-deoxy-3-L-fluorothymidine (FLT)–positron emission tomography (PET) is a novel imaging modality that provides detailed quantitative images of proliferating tissues, including bone marrow. We used FLT-PET imaging in cancer patients to produce an atlas of marrow distribution with potential clinical utility. Methods and Materials: The FLT-PET and fused CT scans of eligible patients with non-small cell lung cancer (no distant metastases, no prior cytotoxic exposure, no hematologic disorders) were reviewed. The proportions of skeletal FLT activity in 10 predefined bony regions were determined and compared according to age, sex, and recent smoking status. Results: Fifty-one patients were studied: 67% male; median age 68 (range, 31-87) years; 8% never smokers; 70% no smoking in the preceding 3 months. Significant differences in marrow distribution occurred between sex and age groups. No effect was detected from smoking in the preceding 3 months. Using the mean percentages of FLT uptake per body region, we created an atlas of the distribution of functional bone marrow in 4 subgroups defined by sex and age. Conclusions: This atlas has potential utility for estimating the distribution of active marrow in adult cancer patients to guide radiation therapy planning. However, because of interindividual variation it should be used with caution when radiation therapy risks ablating large proportions of active marrow; in such cases, individual FLT-PET scans may be required.

  14. Monomer functionalized silica coated with Ag nanoparticles for enhanced SERS hotspots

    NASA Astrophysics Data System (ADS)

    Newmai, M. Boazbou; Verma, Manoj; Kumar, P. Senthil

    2018-05-01

    Mesoporous silica (SiO2) spheres are well-known for their excellent chromatographic properties, such as relatively high specific surface area, large pore volume, uniform particle size, and narrow pore size distribution with favorable pore connectivity, whereas noble-metal Ag nanoparticles have a unique size/shape-dependent surface plasmon resonance with wide-ranging applications. Thus, the desire to synchronize both sets of properties for specific applications has naturally prompted research in the design and synthesis of core-shell type nanoAg@mesoSiO2 nanocomposites, which display potential utility in applications such as photothermal therapy, photocatalysis, molecular sensing, and photovoltaics. In the present work, SiO2 spheres were carefully functionalized with the monomer N-vinyl pyrrolidone (NVP), which cohesively controls the uniform mass transfer of Ag+ metal ions, thereby enabling their sequential reduction to zerovalent Ag (in the presence of slightly excess NaOH) by electron transfer from nucleophilic attack of the NVP vinyl group by water molecules, even under ambient conditions. Complete metal nanoshell coverage of the silica surface was obtained after multiple Ag deposition cycles, as systematically confirmed by BET, TEM, optical and FTIR characterization. Our Ag-coated silica spheres were directly utilized as viable SERS substrates with high sensitivity, in contrast with other long-chain polymer/surfactant-coated silica spheres, owing to the presence of a significant number of nanogap-enhanced SERS 'hotspots', which were methodically analyzed utilizing two example analytes, crystal violet (CV) and Calendula officinalis (CaF).

  15. VASA: Interactive Computational Steering of Large Asynchronous Simulation Pipelines for Societal Infrastructure.

    PubMed

    Ko, Sungahn; Zhao, Jieqiong; Xia, Jing; Afzal, Shehzad; Wang, Xiaoyu; Abram, Greg; Elmqvist, Niklas; Kne, Len; Van Riper, David; Gaither, Kelly; Kennedy, Shaun; Tolone, William; Ribarsky, William; Ebert, David S

    2014-12-01

    We present VASA, a visual analytics platform consisting of a desktop application, a component model, and a suite of distributed simulation components for modeling the impact of societal threats such as weather, food contamination, and traffic on critical infrastructure such as supply chains, road networks, and power grids. Each component encapsulates a high-fidelity simulation model that together form an asynchronous simulation pipeline: a system of systems of individual simulations with a common data and parameter exchange format. At the heart of VASA is the Workbench, a visual analytics application providing three distinct features: (1) low-fidelity approximations of the distributed simulation components using local simulation proxies to enable analysts to interactively configure a simulation run; (2) computational steering mechanisms to manage the execution of individual simulation components; and (3) spatiotemporal and interactive methods to explore the combined results of a simulation run. We showcase the utility of the platform using examples involving supply chains during a hurricane as well as food contamination in a fast food restaurant chain.

  16. A Hyaluronan-Based Injectable Hydrogel Improves the Survival and Integration of Stem Cell Progeny following Transplantation.

    PubMed

    Ballios, Brian G; Cooke, Michael J; Donaldson, Laura; Coles, Brenda L K; Morshead, Cindi M; van der Kooy, Derek; Shoichet, Molly S

    2015-06-09

    The utility of stem cells and their progeny in adult transplantation models has been limited by poor survival and integration. We designed an injectable and bioresorbable hydrogel blend of hyaluronan and methylcellulose (HAMC) and tested it with two cell types in two animal models, thereby gaining an understanding of its general applicability for enhanced cell distribution, survival, integration, and functional repair relative to conventional cell delivery in saline. HAMC improves cell survival and integration of retinal stem cell (RSC)-derived rods in the retina. The pro-survival mechanism of HAMC is ascribed to the interaction of the CD44 receptor with HA. Transient disruption of the retinal outer limiting membrane, combined with HAMC delivery, results in significantly improved rod survival and visual function. HAMC also improves the distribution, viability, and functional repair of neural stem and progenitor cells (NSCs). The HAMC delivery system improves cell transplantation efficacy in two CNS models, suggesting broad applicability. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  17. Microprocessor control and networking for the AMPS breadboard

    NASA Technical Reports Server (NTRS)

    Floyd, Stephen A.

    1987-01-01

    Future space missions will require more sophisticated power systems, implying higher costs and more extensive crew and ground support involvement. To decrease this human involvement, as well as to protect and most efficiently utilize this important resource, NASA has undertaken major efforts to promote progress in the design and development of autonomously managed power systems. Two areas being actively pursued are autonomous power system (APS) breadboards and knowledge-based expert system (KBES) applications. The former are viewed as a requirement for the timely development of the latter. Not only will they serve as final testbeds for the various KBES applications, but will play a major role in the knowledge engineering phase of their development. The current power system breadboard designs are of a distributed microprocessor nature. The distributed nature, plus the need to connect various external computer capabilities (i.e., conventional host computers and symbolic processors), places major emphasis on effective networking. The communications and networking technologies for the first power system breadboard/test facility are described.

  18. Some Examples of the Applications of the Transonic and Supersonic Area Rules to the Prediction of Wave Drag

    NASA Technical Reports Server (NTRS)

    Nelson, Robert L.; Welsh, Clement J.

    1960-01-01

    The experimental wave drags of bodies and wing-body combinations over a wide range of Mach numbers are compared with the computed drags utilizing a 24-term Fourier series application of the supersonic area rule and with the results of equivalent-body tests. The results indicate that the equivalent-body technique provides a good method for predicting the wave drag of certain wing-body combinations at and below a Mach number of 1. At Mach numbers greater than 1, the equivalent-body wave drags can be misleading. The wave drags computed using the supersonic area rule are shown to be in best agreement with the experimental results for configurations employing the thinnest wings. The wave drags for the bodies of revolution presented in this report are predicted to a greater degree of accuracy by using the frontal projections of oblique areas than by using normal areas. A rapid method of computing wing area distributions and area-distribution slopes is given in an appendix.
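The supersonic area rule reduces wave-drag estimation to Fourier analysis of the area-distribution slope: with the substitution x = (1 - cos θ)/2 on a unit-length body, dS/dx is expanded as Σ A_n sin(nθ), and classical slender-body theory makes the wave drag proportional to Σ n·A_n² (the exact constant varies between texts, so it is not asserted here). A minimal numerical sketch of extracting such coefficients, using a Sears-Haack-like test body as an illustration (names and the test body are assumptions, not taken from the report):

```python
import math

def fourier_slope_coefficients(area, n_terms=24, n_points=2000):
    """Expand the slope dS/dx of a unit-length body's area distribution in
    a sine series via the substitution x = (1 - cos(theta))/2:
        dS/dx = sum_n A_n sin(n*theta),
        A_n = (2/pi) * integral_0^pi (dS/dx) sin(n*theta) dtheta."""
    S = lambda th: area((1 - math.cos(th)) / 2)
    dtheta = math.pi / n_points
    coeffs = []
    for n in range(1, n_terms + 1):
        total = 0.0
        for k in range(n_points):
            th = (k + 0.5) * dtheta            # midpoint rule, avoids endpoints
            h = 1e-5
            dS_dth = (S(th + h) - S(th - h)) / (2 * h)
            slope = dS_dth / (math.sin(th) / 2)  # chain rule: dx/dtheta = sin(theta)/2
            total += slope * math.sin(n * th) * dtheta
        coeffs.append(2.0 / math.pi * total)
    return coeffs

# Sears-Haack-like body: S proportional to sin^3(theta); its slope series
# is exactly 3*sin(2*theta), i.e. a single n = 2 term.
def sears_haack(x):
    th = math.acos(max(-1.0, min(1.0, 1 - 2 * x)))
    return math.sin(th) ** 3

A = fourier_slope_coefficients(sears_haack, n_terms=4)
```

For this body only the n = 2 coefficient is nonzero, which is one way the area rule explains why smooth area distributions give low wave drag.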

  19. Evaluation of the maximum-likelihood adaptive neural system (MLANS) applications to noncooperative IFF

    NASA Astrophysics Data System (ADS)

    Chernick, Julian A.; Perlovsky, Leonid I.; Tye, David M.

    1994-06-01

    This paper describes applications of the maximum likelihood adaptive neural system (MLANS) to the characterization of clutter in IR images and to the identification of targets. The characterization of image clutter is needed to improve target detection and to enhance the ability to compare the performance of different algorithms using diverse imagery data. Enhanced unambiguous IFF is important for fratricide reduction, while automatic cueing and targeting is becoming an ever-increasing part of operations. We utilized MLANS, a parametric neural network that combines optimal statistical techniques with a model-based approach. This paper shows that MLANS outperforms classical classifiers such as the quadratic classifier and the nearest neighbor classifier: on the one hand, it is not limited to the usual Gaussian distribution assumption and can adapt in real time to the image clutter distribution; on the other hand, MLANS learns from fewer samples and is more robust than the nearest neighbor classifier. Future research will address noncooperative IFF using fused IR and MMW data.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Youssef, Tarek A.; Elsayed, Ahmed T.; Mohammed, Osama A.

    This study presents the design and implementation of a communication and control infrastructure for smart grid operation. The proposed infrastructure enhances the reliability of the measurements and control network. The advantages of utilizing the data-centric over the message-centric communication approach are discussed in the context of smart grid applications. The data distribution service (DDS) is used to implement a data-centric common data bus for the smart grid. This common data bus improves communication reliability, enabling distributed control and smart load management. These enhancements are achieved by avoiding a single point of failure while enabling peer-to-peer communication and an automatic discovery feature for dynamic participating nodes. The infrastructure and ideas presented in this paper were implemented and tested on the smart grid testbed. A toolbox and application programming interface for the testbed infrastructure are developed in order to facilitate interoperability and remote access to the testbed. This interface allows control, monitoring, and performing of experiments remotely. Furthermore, it could be used to integrate multidisciplinary testbeds to study complex cyber-physical systems (CPS).
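The data-centric idea behind DDS is that shared state lives on a common bus keyed by topic, so dynamically joining nodes can discover the current value without a coordinating master. A toy publish/subscribe bus illustrating only this concept (this is not the actual DDS API; all names and values are invented):

```python
from collections import defaultdict

class DataBus:
    """Toy data-centric bus: subscribers register interest in topics,
    publishers write samples; there is no single coordinating master."""
    def __init__(self):
        self._subs = defaultdict(list)   # topic -> list of callbacks
        self._last = {}                  # topic -> most recent sample

    def subscribe(self, topic, callback):
        self._subs[topic].append(callback)
        if topic in self._last:          # late joiners still see current state
            callback(self._last[topic])

    def publish(self, topic, sample):
        self._last[topic] = sample
        for cb in self._subs[topic]:
            cb(sample)

bus = DataBus()
readings = []
bus.subscribe("feeder1/voltage", readings.append)
bus.publish("feeder1/voltage", 1.02)   # per-unit voltage, illustrative
bus.publish("feeder1/voltage", 0.98)
```

A subscriber that joins after these publishes immediately receives 0.98, the bus's current state for that topic, which is the property that avoids a single point of failure for state distribution.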

  1. Enrichment and characterization of ferritin for nanomaterial applications

    NASA Astrophysics Data System (ADS)

    Ghirlando, Rodolfo; Mutskova, Radina; Schwartz, Chad

    2016-01-01

    Ferritin is a ubiquitous iron storage protein utilized as a nanomaterial for labeling biomolecules and nanoparticle construction. Commercially available preparations of horse spleen ferritin, widely used as a starting material, contain a distribution of ferritins with different iron loads. We describe a detailed approach to the enrichment of differentially loaded ferritin molecules by common biophysical techniques such as size exclusion chromatography and preparative ultracentrifugation, and characterize these preparations by dynamic light scattering and analytical ultracentrifugation. We demonstrate a combination of methods to standardize an approach for determining the chemical load of nearly any particle, including nanoparticles and metal colloids. Purification and characterization of iron content in monodisperse ferritin species is particularly critical for several applications in nanomaterial science.

  2. Application of a liquid crystal spatial light modulator to laser marking.

    PubMed

    Parry, Jonathan P; Beck, Rainer J; Shephard, Jonathan D; Hand, Duncan P

    2011-04-20

    Laser marking is demonstrated using a nanosecond (ns) pulse duration laser in combination with a liquid crystal spatial light modulator to generate two-dimensional patterns directly onto thin films and bulk metal surfaces. Previous demonstrations of laser marking with such devices have been limited to low average power lasers. Application in the ns regime enables more complex, larger scale marks to be generated with more widely available and industrially proven laser systems. The dynamic nature of the device is utilized to improve mark quality by reducing the impact of the inherently speckled intensity distribution across the generated image and reduce thermal effects in the marked surface. © 2011 Optical Society of America

  3. Reliability and cost/worth evaluation of generating systems utilizing wind and solar energy

    NASA Astrophysics Data System (ADS)

    Bagen

    The utilization of renewable energy resources such as wind and solar energy for electric power supply has received considerable attention in recent years due to adverse environmental impacts and fuel cost escalation associated with conventional generation. At the present time, wind and/or solar energy sources are utilized to generate electric power in many applications. Wind and solar energy will become important sources for power generation in the future because of their environmental, social and economic benefits, together with public support and government incentives. Wind and sunlight are, however, unstable and variable energy sources, and behave far differently from conventional sources. Energy storage systems are, therefore, often required to smooth the fluctuating nature of the energy conversion system, especially in small isolated applications. The research work presented in this thesis is focused on the development and application of reliability and economic benefit assessments associated with incorporating wind energy, solar energy and energy storage in power generating systems. A probabilistic approach using sequential Monte Carlo simulation was employed in this research, and a number of analyses were conducted with regard to the adequacy and economic assessment of generation systems containing wind energy, solar energy and energy storage. The evaluation models and techniques incorporate risk index distributions and different operating strategies associated with diesel generation in small isolated systems. Deterministic and probabilistic techniques are combined in this thesis using a system well-being approach to provide useful adequacy indices for small isolated systems that include renewable energy and energy storage. The concepts presented and examples illustrated in this thesis will help power system planners and utility managers to assess the reliability and economic benefits of utilizing wind energy conversion systems, solar energy conversion systems and energy storage in electric power systems and provide useful input to the managerial decision process.
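Sequential Monte Carlo adequacy assessment of the kind described steps chronologically through system states, sampling resource availability each hour and accumulating indices such as loss-of-load hours. A deliberately simplified sketch (single load level, crude uniform wind model, one diesel unit with a 5% forced outage rate; all figures are illustrative assumptions, not from the thesis):

```python
import random

def simulate_lole(hours=8760, seed=1):
    """Toy sequential Monte Carlo: step through each hour, sample wind
    output and diesel availability, and count hours when supply < load."""
    rng = random.Random(seed)
    diesel_cap, wind_cap, load = 40.0, 20.0, 50.0   # kW, illustrative only
    loss_of_load_hours = 0
    for _ in range(hours):
        wind = wind_cap * rng.random()                       # crude wind proxy
        diesel = diesel_cap if rng.random() > 0.05 else 0.0  # 5% forced outage
        if wind + diesel < load:
            loss_of_load_hours += 1
    return loss_of_load_hours

lole = simulate_lole()   # loss-of-load hours over one simulated year
```

A real study would repeat this over many sample years to build the risk index distributions mentioned above, and would add storage dispatch and correlated wind-speed models.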

  4. Assessment of Nitrification in Distribution Systems of Waters with Elevated Ammonia Levels

    EPA Science Inventory

    The objective of this work is to monitor ammonia, nitrite, and nitrate in drinking water from the distribution systems of four drinking water utilities in Illinois. A monthly drinking water distribution system water quality monitoring protocol for each water utility in Illinois h...

  5. Rules implementing Sections 201 and 210 of the Public Utility Regulatory Policies Act of 1978: a regulatory history

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Danziger, R.N.; Caples, P.W.; Huning, J.R.

    1980-09-15

    An analysis is made of the rules implementing Sections 201 and 210 of the Public Utility Regulatory Policies Act of 1978 (PURPA). The act provides that utilities must purchase power from qualifying producers of electricity at nondiscriminatory rates, and it exempts private generators from virtually all state and Federal utility regulations. Most of the analysis presented is taken from the perspective of photovoltaics (PV) and solar thermal electric point-focusing distributed receivers (pfdr). It is felt, however, that the analysis is applicable both to cogeneration and other emerging technologies. Chapters presented are: The FERC Response to Oral Comments on the Proposed Rules Implementing Sections 201 and 210 of PURPA; Additional Changes Made or Not Made That Were Addressed in Other Than Oral Testimony; View on the Proposed Rules Implementing Sections 201 and 210 of PURPA; Response to Comments on the Proposed 201 and 210 Rules; and Summary Analysis of the Environmental Assessment of the Rules. Pertinent reference material is provided in the Appendices, including the text of the rules. (MCW)

  6. Self-organized criticality in complex systems: Applicability to the interoccurrent and recurrent statistical behavior of earthquakes

    NASA Astrophysics Data System (ADS)

    Abaimov, Sergey G.

    The concept of self-organized criticality is associated with scale-invariant, fractal behavior; this concept is also applicable to earthquake systems. It is known that the interoccurrent frequency-size distribution of earthquakes in a region is scale-invariant and obeys the Gutenberg-Richter power-law dependence. Also, the interoccurrent time-interval distribution is known to obey Poissonian statistics when aftershocks are excluded. However, to estimate the hazard risk for a region it is also necessary to know the recurrent behavior of earthquakes at a given point on a fault. This behavior has been investigated in the literature; however, major questions remain unresolved because of the small number of earthquakes in observed sequences. To overcome this difficulty, this research utilizes numerical simulations of a slider-block model and a sand-pile model. Also, experimental observations of creep events on the creeping section of the San Andreas fault are processed, and sequences of up to 100 events are studied. Then the recurrent behavior of earthquakes at a given point on a fault or at a given fault is investigated. It is shown that both the recurrent frequency-size and time-interval behaviors of earthquakes obey the Weibull distribution.
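Fitting a Weibull distribution to recurrence intervals, as concluded above, can be sketched with the standard rank-regression method on the linearized CDF; whether this matches the author's actual estimation procedure is an assumption, and the synthetic data below are purely illustrative:

```python
import math, random

def weibull_fit(samples):
    """Estimate Weibull shape (beta) and scale (eta) by least squares on
    the linearized CDF:  ln(-ln(1-F)) = beta*ln(t) - beta*ln(eta)."""
    xs = sorted(samples)
    n = len(xs)
    X, Y = [], []
    for i, t in enumerate(xs, 1):
        F = (i - 0.3) / (n + 0.4)          # median-rank plotting position
        X.append(math.log(t))
        Y.append(math.log(-math.log(1.0 - F)))
    mx, my = sum(X) / n, sum(Y) / n
    beta = (sum((x - mx) * (y - my) for x, y in zip(X, Y))
            / sum((x - mx) ** 2 for x in X))
    eta = math.exp(mx - my / beta)
    return beta, eta

rng = random.Random(0)
# synthetic interevent times: true scale 100, true shape 2.5
data = [rng.weibullvariate(100.0, 2.5) for _ in range(500)]
beta, eta = weibull_fit(data)
```

A shape estimate above 1 indicates quasi-periodic recurrence (hazard increasing with elapsed time), which is the physically interesting contrast with the Poissonian interoccurrent statistics.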

  7. Concurrent Image Processing Executive (CIPE). Volume 2: Programmer's guide

    NASA Technical Reports Server (NTRS)

    Williams, Winifred I.

    1990-01-01

    This manual is intended as a guide for application programmers using the Concurrent Image Processing Executive (CIPE). CIPE is intended to become the support system software for a prototype high performance science analysis workstation. In its current configuration CIPE utilizes a JPL/Caltech Mark 3fp Hypercube with a Sun-4 host. CIPE's design is capable of incorporating other concurrent architectures as well. CIPE provides a programming environment that shields application programmers from various user interfaces, file transactions, and architectural complexities. A programmer may choose to write applications to use only the Sun-4 or to use the Sun-4 with the hypercube. A hypercube program will use the hypercube's data processors and optionally the Weitek floating point accelerators. The CIPE programming environment provides a simple set of subroutines to activate user interface functions, specify data distributions, activate hypercube resident applications, and communicate parameters to and from the hypercube.

  8. CloVR: a virtual machine for automated and portable sequence analysis from the desktop using cloud computing.

    PubMed

    Angiuoli, Samuel V; Matalka, Malcolm; Gussman, Aaron; Galens, Kevin; Vangala, Mahesh; Riley, David R; Arze, Cesar; White, James R; White, Owen; Fricke, W Florian

    2011-08-30

    Next-generation sequencing technologies have decentralized sequence acquisition, increasing the demand for new bioinformatics tools that are easy to use, portable across multiple platforms, and scalable for high-throughput applications. Cloud computing platforms provide on-demand access to computing infrastructure over the Internet and can be used in combination with custom-built virtual machines to distribute pre-packaged, pre-configured software. We describe the Cloud Virtual Resource, CloVR, a new desktop application for push-button automated sequence analysis that can utilize cloud computing resources. CloVR is implemented as a single portable virtual machine (VM) that provides several automated analysis pipelines for microbial genomics, including 16S, whole genome, and metagenome sequence analysis. The CloVR VM runs on a personal computer, utilizes local computer resources, and requires minimal installation, addressing key challenges in deploying bioinformatics workflows. In addition, CloVR supports the use of remote cloud computing resources to improve performance for large-scale sequence processing. In a case study, we demonstrate the use of CloVR to automatically process next-generation sequencing data on multiple cloud computing platforms. The CloVR VM and associated architecture lower the barrier to entry for utilizing complex analysis protocols on both local single- and multi-core computers and cloud systems for high-throughput data processing.

  9. Design of Distributed Controllers Seeking Optimal Power Flow Solutions Under Communication Constraints

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dall'Anese, Emiliano; Simonetto, Andrea; Dhople, Sairaj

    This paper focuses on power distribution networks featuring inverter-interfaced distributed energy resources (DERs), and develops feedback controllers that drive the DER output powers to solutions of time-varying AC optimal power flow (OPF) problems. Control synthesis is grounded on primal-dual-type methods for regularized Lagrangian functions, as well as linear approximations of the AC power-flow equations. Convergence and OPF-solution-tracking capabilities are established while acknowledging: i) communication-packet losses, and ii) partial updates of control signals. The latter case is particularly relevant since it enables asynchronous operation of the controllers where DER setpoints are updated at a fast time scale based on local voltage measurements, and information on the network state is utilized if and when available, based on communication constraints. As an application, the paper considers distribution systems with high photovoltaic integration, and demonstrates that the proposed framework provides fast voltage-regulation capabilities, while enabling the near real-time pursuit of solutions of AC OPF problems.
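A primal-dual controller of this general kind can be sketched on a toy problem: track reference DER setpoints while enforcing a single linearized network constraint a·p ≤ b (standing in for a voltage cap). This is a generic primal-dual gradient sketch under invented numbers, not the paper's algorithm or its regularization:

```python
import numpy as np

def primal_dual(p_ref, a, b, p_max, alpha=0.05, iters=2000):
    """Toy primal-dual iteration: track reference setpoints p_ref while
    enforcing a linearized network constraint a.p <= b.
    Primal step: projected gradient descent on the Lagrangian in p.
    Dual step: projected subgradient ascent on the multiplier lam."""
    p = np.zeros_like(p_ref)
    lam = 0.0
    for _ in range(iters):
        grad = 2 * (p - p_ref) + lam * a       # d/dp [ ||p-p_ref||^2 + lam*(a.p-b) ]
        p = np.clip(p - alpha * grad, 0.0, p_max)
        lam = max(0.0, lam + alpha * (a @ p - b))
    return p, lam

p_ref = np.array([1.0, 1.0, 1.0])   # desired DER outputs (illustrative units)
a = np.array([1.0, 1.0, 1.0])       # sensitivity of the constrained quantity
p, lam = primal_dual(p_ref, a, b=2.0, p_max=np.ones(3))
```

By symmetry each setpoint settles near 2/3, splitting the curtailment equally; the positive multiplier signals that the network constraint is active, which is the information the feedback design exploits.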

  10. Design of Distributed Controllers Seeking Optimal Power Flow Solutions under Communication Constraints: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dall'Anese, Emiliano; Simonetto, Andrea; Dhople, Sairaj

    This paper focuses on power distribution networks featuring inverter-interfaced distributed energy resources (DERs), and develops feedback controllers that drive the DER output powers to solutions of time-varying AC optimal power flow (OPF) problems. Control synthesis is grounded on primal-dual-type methods for regularized Lagrangian functions, as well as linear approximations of the AC power-flow equations. Convergence and OPF-solution-tracking capabilities are established while acknowledging: i) communication-packet losses, and ii) partial updates of control signals. The latter case is particularly relevant since it enables asynchronous operation of the controllers where DER setpoints are updated at a fast time scale based on local voltage measurements, and information on the network state is utilized if and when available, based on communication constraints. As an application, the paper considers distribution systems with high photovoltaic integration, and demonstrates that the proposed framework provides fast voltage-regulation capabilities, while enabling the near real-time pursuit of solutions of AC OPF problems.

  11. A series approximation model for optical light transport and output intensity field distribution in large aspect ratio cylindrical scintillation crystals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tobias, Benjamin John

    A series approximation has been derived for the transport of optical photons within a cylindrically symmetric light pipe and applied to the task of evaluating both the origin and angular distribution of light reaching the output plane. This analytic expression finds particular utility in first-pass photonic design applications since it may be evaluated at a very modest computational cost and is readily parameterized for relevant design constraints. It has been applied toward quantitative exploration of various scintillation crystal preparations and their impact on both quantum efficiency and noise, reproducing sensible dependencies and providing physical justification for certain gamma ray camera design choices.

  12. Evaluating the risk of water distribution system failure: A shared frailty model

    NASA Astrophysics Data System (ADS)

    Clark, Robert M.; Thurnau, Robert C.

    2011-12-01

    Condition assessment (CA) modeling is drawing increasing interest as a technique that can assist in managing drinking water infrastructure. This paper develops a model based on the application of a Cox proportional hazard (PH)/shared frailty model and applies it to evaluating the risk of failure in drinking water networks, using data from the Laramie Water Utility (Laramie, Wyoming, USA). Using the risk model, a cost/benefit analysis incorporating the inspection value method (IVM) is used to assist in making improved repair, replacement, and rehabilitation decisions for selected drinking water distribution system pipes. A separate model is developed to predict failures in prestressed concrete cylinder pipe (PCCP). Various currently available inspection technologies are presented and discussed.
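A Cox proportional-hazards fit for a single binary covariate (say, a pipe-material indicator) can be sketched with Newton's method on the partial likelihood; the shared-frailty extension adds a group-level random effect and is omitted here. The data, variable names, and Breslow-style risk-set handling below are illustrative assumptions, not the paper's dataset or code:

```python
import math

def cox_beta(records, iters=25):
    """Fit the log hazard ratio beta for one binary covariate by Newton's
    method on the Cox partial likelihood.
    records: list of (time, event_indicator, covariate)."""
    beta = 0.0
    for _ in range(iters):
        score, info = 0.0, 0.0
        for t_i, d_i, x_i in records:
            if not d_i:                 # censored observations contribute
                continue                # only through risk sets below
            # risk set: everyone still under observation at time t_i
            risk = [(x, math.exp(beta * x)) for t, _, x in records if t >= t_i]
            w = sum(e for _, e in risk)
            xbar = sum(x * e for x, e in risk) / w
            x2bar = sum(x * x * e for x, e in risk) / w
            score += x_i - xbar         # gradient of log partial likelihood
            info += x2bar - xbar ** 2   # observed information
        beta += score / info            # Newton step
    return beta

# illustrative data: exposed pipes (x=1) tend to fail earlier
data = [(1, 1, 1), (2, 1, 1), (3, 1, 0), (4, 1, 1),
        (5, 1, 0), (6, 1, 0), (7, 0, 0), (8, 0, 1)]
beta = cox_beta(data)
```

A positive fitted beta means the exposed group has a proportionally higher hazard at every age, exp(beta) being the estimated hazard ratio used in repair/replace prioritization.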

  13. Observation of FeGe skyrmions by electron phase microscopy with hole-free phase plate

    NASA Astrophysics Data System (ADS)

    Kotani, Atsuhiro; Harada, Ken; Malac, Marek; Salomons, Mark; Hayashida, Misa; Mori, Shigeo

    2018-05-01

    We report the application of a hole-free phase plate (HFPP) to imaging of magnetic skyrmion lattices. Using HFPP imaging, we observed skyrmions in FeGe and obtained phase contrast images that reflect the sample magnetization distribution. According to the Aharonov-Bohm effect, the electron phase is shifted by the magnetic flux due to sample magnetization. Differential processing of the intensity in an HFPP image allows us to reconstruct the magnetization map of the skyrmion lattice. Furthermore, the calculated phase shift due to the magnetization of the thin film was consistent with that measured by an electron holography experiment, which demonstrates that HFPP imaging can be utilized for analysis of magnetic fields and electrostatic potential distributions at the nanoscale.

  14. One-sided measurement-device-independent quantum key distribution

    NASA Astrophysics Data System (ADS)

    Cao, Wen-Fei; Zhen, Yi-Zheng; Zheng, Yu-Lin; Li, Li; Chen, Zeng-Bing; Liu, Nai-Le; Chen, Kai

    2018-01-01

    The measurement-device-independent quantum key distribution (MDI-QKD) protocol was proposed to remove all detector side-channel attacks, while its security relies on trusted encoding systems. Here we propose a one-sided MDI-QKD (1SMDI-QKD) protocol, which retains the detection-loophole-free advantage while weakening the state-preparation assumption of MDI-QKD. The 1SMDI-QKD protocol can be regarded as a modified MDI-QKD in which Bob's encoding system is trusted while Alice's is uncharacterized. For practical implementation, we also provide a scheme utilizing a coherent light source with an analytical two-decoy-state estimation method. Simulation with realistic experimental parameters shows that the protocol has a promising performance and thus can be applied to practical QKD applications.

  15. Potential utilization of the absolute point cumulative semivariogram technique for the evaluation of distribution coefficient.

    PubMed

    Külahci, Fatih; Sen, Zekâi

    2009-09-15

    The classical solid/liquid distribution coefficient, K(d), for radionuclides in water-sediment systems depends on many regional parameters such as flow, geology, pH, acidity, alkalinity, total hardness, and radioactivity concentration. Accounting for all these effects requires a regional analysis with an effective methodology, which in this paper is based on the cumulative semivariogram concept. Classical K(d) calculations are pointwise and cannot represent regional patterns, so a regional calculation methodology is suggested through the use of the Absolute Point Cumulative SemiVariogram (APCSV) technique. The application of the methodology is presented for (137)Cs and (90)Sr measurements at a set of points in the Keban Dam reservoir, Turkey.
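One common construction of a point cumulative semivariogram, on which the APCSV builds, sorts all other measurement sites by distance from a reference site and cumulatively sums half the squared value differences; whether this matches the authors' exact APCSV definition is an assumption, and the coordinates below are invented:

```python
import math

def point_cumulative_semivariogram(points, values, site):
    """For one measurement site, sort all other sites by distance and
    cumulatively sum half the squared value differences.
    Returns a list of (distance, cumulative semivariance) pairs."""
    (x0, y0), v0 = points[site], values[site]
    others = [(math.hypot(x - x0, y - y0), 0.5 * (v - v0) ** 2)
              for i, ((x, y), v) in enumerate(zip(points, values)) if i != site]
    others.sort()                        # order by increasing distance
    curve, total = [], 0.0
    for d, g in others:
        total += g
        curve.append((d, total))
    return curve

pts = [(0, 0), (1, 0), (0, 2), (3, 0)]   # site coordinates (illustrative)
vals = [1.0, 2.0, 4.0, 1.0]              # e.g. measured concentrations
curve = point_cumulative_semivariogram(pts, vals, site=0)
```

The resulting nondecreasing curve summarizes how dissimilarity accumulates with distance around that site, which is what lets a regional pattern replace a single pointwise K(d) value.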

  16. Tips and Tricks for Successful Application of Statistical Methods to Biological Data.

    PubMed

    Schlenker, Evelyn

    2016-01-01

    This chapter discusses experimental design and the use of statistics to describe characteristics of data (descriptive statistics) and inferential statistics that test the hypothesis posed by the investigator. Inferential statistics, based on probability distributions, depend upon the type and distribution of the data. For data that are continuous, randomly and independently selected, and normally distributed, more powerful parametric tests such as Student's t test and analysis of variance (ANOVA) can be used. For non-normally distributed or skewed data, transformation of the data (using logarithms) may normalize the data, allowing use of parametric tests. Alternatively, with skewed data nonparametric tests can be utilized, some of which rely on data that are ranked prior to statistical analysis. Experimental designs and analyses need to balance between committing type 1 errors (false positives) and type 2 errors (false negatives). For a variety of clinical studies that determine risk or benefit, relative risk ratios (randomized clinical trials and cohort studies) or odds ratios (case-control studies) are utilized. Although both use 2 × 2 tables, their premises and calculations differ. Finally, special statistical methods are applied to microarray and proteomics data, since the large number of genes or proteins evaluated increases the likelihood of false discoveries. Additional studies in separate samples are used to verify microarray and proteomic data. Examples in this chapter and references are available to help continued investigation of experimental designs and appropriate data analysis.
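The distinction drawn above between relative risk (cohort studies, randomized trials) and odds ratio (case-control studies) is easy to make concrete from a 2 × 2 table; the counts below are invented for illustration:

```python
def risk_measures(a, b, c, d):
    """2x2 table: rows = exposed/unexposed, columns = event/no event.
        a = exposed with event,    b = exposed without event
        c = unexposed with event,  d = unexposed without event
    Relative risk compares event *probabilities*; the odds ratio compares
    event *odds*, which is what a case-control design can estimate."""
    relative_risk = (a / (a + b)) / (c / (c + d))
    odds_ratio = (a * d) / (b * c)
    return relative_risk, odds_ratio

# illustrative counts: 30% event rate in exposed vs 10% in unexposed
rr, orr = risk_measures(30, 70, 10, 90)
```

Here the relative risk is exactly 3.0 while the odds ratio is about 3.86; the two agree closely only when the event is rare, which is why reporting one as the other can mislead.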

  17. Prestressing force monitoring method for a box girder through distributed long-gauge FBG sensors

    NASA Astrophysics Data System (ADS)

    Chen, Shi-Zhi; Wu, Gang; Xing, Tuo; Feng, De-Cheng

    2018-01-01

    Monitoring prestressing forces is essential for prestressed concrete box girder bridges. However, current monitoring methods for prestressing force are not applicable to a box girder, either because the sensor setup is constrained or because the shear lag effect is not properly considered. Building on a previous analysis model of the shear lag effect in box girders, this paper proposes an indirect method for on-site determination of the prestressing force in a concrete box girder utilizing distributed long-gauge fiber Bragg grating sensors. The performance of this method was initially verified using numerical simulation for three different distribution forms of prestressing tendons. Then, an experiment involving two concrete box girders was conducted to preliminarily study the feasibility of this method under different prestressing levels. The results of both the numerical simulation and the lab experiment validated the method's practicability in a box girder.

  18. Distributed reservation control protocols for random access broadcasting channels

    NASA Technical Reports Server (NTRS)

    Greene, E. P.; Ephremides, A.

    1981-01-01

    Attention is given to a communication network consisting of an arbitrary number of nodes which can communicate with each other via a time-division multiple access (TDMA) broadcast channel. The reported investigation is concerned with the development of efficient distributed multiple access protocols for traffic consisting primarily of single packet messages in a datagram mode of operation. The motivation for the design of the protocols came from the consideration of efficient multiple access utilization of moderate to high bandwidth (4-40 Mbit/s capacity) communication satellite channels used for the transmission of short (1000-10,000 bits) fixed length packets. Under these circumstances, the ratio of roundtrip propagation time to packet transmission time is between 100 to 10,000. It is shown how a TDMA channel can be adaptively shared by datagram traffic and constant bandwidth users such as in digital voice applications. The distributed reservation control protocols described are a hybrid between contention and reservation protocols.

  19. Hierarchical control framework for integrated coordination between distributed energy resources and demand response

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Di; Lian, Jianming; Sun, Yannan

    Demand response represents a significant but largely untapped resource that can greatly enhance the flexibility and reliability of power systems. In this paper, a hierarchical control framework is proposed to facilitate the integrated coordination between distributed energy resources and demand response. The proposed framework consists of coordination and device layers. In the coordination layer, various resource aggregations are optimally coordinated in a distributed manner to achieve the system-level objectives. In the device layer, individual resources are controlled in real time to follow the optimal power generation or consumption dispatched from the coordination layer. For the purpose of practical applications, a method is presented to determine the utility functions of controllable loads by taking into account the real-time load dynamics and the preferences of individual customers. The effectiveness of the proposed framework is validated by detailed simulation studies.

  20. Materials for low temperature SOFCs.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krumpelt, M.; Ralph, J.; Cruse, T.

    2002-08-02

    Solid oxide fuel cells (SOFCs) are one of the potentially most efficient and clean energy conversion technologies for electric utility applications. Laboratory cells have shown extraordinary durability, and actual utility-scale prototypes have worked very well. The main obstacle to commercialization has been the relatively high manufacturing cost. To reduce these costs, efforts have been underway for several years to adapt manufacturing technology from the semiconductor industry to the SOFCs; however, tape casting, screen printing and similar methods are more applicable to planar configurations than to the more proven tubular ones. In planar cells the bipolar plate and edge seals become more critical elements, and material selection may have repercussions for the other fuel cell components. Ferritic stainless steel bipolar plates may be a good choice for reducing the cost of the stacks, but ferritic steels oxidize rapidly at temperatures above 800 C. Inexorably, one is led to the conclusion that anodes, cathodes and electrolytes operating below 800 C need to be found. Another motivation for developing planar SOFCs operating at reduced temperature is the prospect of new non-utility applications. The U.S. Department of Energy has initiated the Solid State Energy Conversion Alliance (SECA) program for developing small modular stacks ranging in capacity from 5 to 10 kW. This size range meets the power requirements of auxiliary power units for heavy and perhaps even light-duty vehicles, and also for remote stationary applications. In terms of electric capacity, the distributed electric utility market may well exceed the potential market for APUs, but the number of units produced could be higher for the latter, yielding cost benefits related to mass production. On the other hand, the fuel for use in transportation or remote stationary applications will consist of gasoline, diesel or propane, which contain higher sulfur levels than natural gas. Anodes with some resistance to sulfur poisoning would be desirable. Also, during the more frequent shutdowns and startups in these applications, the anodes may get exposed to air. Typical nickel-based SOFC anodes may not tolerate air exposure very well and may need to be modified. Argonne National Laboratory is engaged in developing new materials options for SECA applications, as discussed here.

  1. CallWall: tracking resident calls to improve clinical utilization of pathology laboratories.

    PubMed

    Buck, Thomas P; Connor, Ian M; Horowitz, Gary L; Arnaout, Ramy A

    2011-07-01

    Clinical pathology (CP) laboratories are used for millions of tests each year. These lead to thousands of calls to CP residents. However, although laboratory utilization is a frequent topic of study, clinical utilization--the content of the interactions between clinicians and CP residents--is not. Because it reflects questions about laboratory utilization, clinical utilization could suggest ways to improve both training and care by reducing diagnostic error. To build and implement a secure, scalable Web-based system to allow CP residents at any hospital to track the calls they receive, the interaction's context, and the action taken as a result, with evidence where applicable, and to use this system to report on clinical utilization at a major academic hospital. Entries were analyzed from a nearly year-long period to describe the clinical utilization of CP at a large academic teaching hospital. Sixteen residents logged 847 calls during 10 months, roughly evenly distributed among transfusion medicine, chemistry, microbiology, and hematopathology. Calls covered 94 different analytes in chemistry and 71 different organisms or tests in microbiology. Analysis revealed areas where CP can improve clinical care through educating the clinical services, for example, about ordering Rh immune globulin, testosterone testing, and diagnosis of tick-borne diseases. Documenting calls also highlighted patterns among residents. Clinical utilization is a potentially rich knowledge base for improving patient care and resident training. Our resident call-tracking system is a useful way for measuring clinical utilization and mining it for actionable information.

  2. Distribution Feeder Modeling for Time-Series Simulation of Voltage Management Strategies: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Giraldez Miner, Julieta I; Gotseff, Peter; Nagarajan, Adarsh

    This paper presents techniques to create baseline distribution models using a utility feeder from Hawaiian Electric Company. It describes the software-to-software conversion, steady-state, and time-series validations of a utility feeder model. It also presents a methodology to add secondary low-voltage circuit models to accurately capture the voltage at the customer meter level. This enables preparing models to perform studies that simulate how customer-sited resources integrate into legacy utility distribution system operations.

  3. Allocating health care: cost-utility analysis, informed democratic decision making, or the veil of ignorance?

    PubMed

    Goold, S D

    1996-01-01

    Assuming that rationing health care is unavoidable, and that it requires moral reasoning, how should we allocate limited health care resources? This question is difficult because our pluralistic, liberal society has no consensus on a conception of distributive justice. In this article I focus on an alternative: Who shall decide how to ration health care, and how shall this be done to respect autonomy, pluralism, liberalism, and fairness? I explore three processes for making rationing decisions: cost-utility analysis, informed democratic decision making, and applications of the veil of ignorance. I evaluate these processes as examples of procedural justice, assuming that there is no outcome considered the most just. I use consent as a criterion to judge competing processes so that rationing decisions are, to some extent, self-imposed. I also examine the processes' feasibility in our current health care system. Cost-utility analysis does not meet criteria for actual or presumed consent, even if costs and health-related utility could be measured perfectly. Existing structures of government cannot creditably assimilate the information required for sound rationing decisions, and grassroots efforts are not representative. Applications of the veil of ignorance are more useful for identifying principles relevant to health care rationing than for making concrete rationing decisions. I outline a process of decision making, specifically for health care, that relies on substantive, selected representation, respects pluralism, liberalism, and deliberative democracy, and could be implemented at the community or organizational level.
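The cost-utility analysis the article critiques ranks interventions by cost per unit of health-related utility. A minimal sketch in Python; the interventions and all figures below are invented for illustration, not from the article:

```python
# Hypothetical cost-utility comparison: cost per quality-adjusted life year
# (QALY). Interventions and numbers are illustrative only.

def cost_per_qaly(cost, qalys_gained):
    """Cost-utility ratio for a single intervention."""
    return cost / qalys_gained

interventions = {
    "screening_program": (50_000.0, 2.5),   # (cost in $, QALYs gained)
    "new_drug": (120_000.0, 4.0),
}

# Rank interventions from most to least cost-effective.
ranked = sorted(interventions.items(), key=lambda kv: cost_per_qaly(*kv[1]))
for name, (cost, qalys) in ranked:
    print(f"{name}: ${cost_per_qaly(cost, qalys):,.0f} per QALY")
```

The article's point is precisely that such a ranking, however tidy, does not by itself satisfy criteria of actual or presumed consent.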

  4. Advanced batteries for load-leveling - The utility perspective on system integration

    NASA Astrophysics Data System (ADS)

    Delmonaco, J. L.; Lewis, P. A.; Roman, H. T.; Zemkoski, J.

    1982-09-01

    Rechargeable battery systems for application as utility load-leveling units, particularly in urban areas, are discussed. Particular attention is given to advanced lead-acid, zinc-halogen, sodium-sulfur, and lithium-iron sulfide battery systems, noting that battery charging can proceed during light-load hours and requires no on-site fuel. Each battery site will have a master site controller and related subsystems necessary for ensuring grid-quality power output from the batteries and charging when feasible. The actual interconnection with the grid is envisioned at the transmission, subtransmission, or distribution level, similar to cogeneration or wind-derived energy interconnections. Analyses are presented of factors influencing planning economics, impacts on existing grids through solid-state converters, and operational and maintenance considerations. Finally, research directions towards large-scale battery implementation are outlined.
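The strategy described above (charge during light-load hours, discharge at peak) can be sketched as a simple threshold dispatch. The load profile, battery parameters, and threshold are invented for the sketch, and round-trip losses are ignored:

```python
# Illustrative load-leveling dispatch: charge the battery when grid load is
# below a threshold, discharge when above, limited by power and capacity.

def dispatch(load_mw, capacity_mwh, power_mw, threshold_mw):
    soc = 0.0          # state of charge in MWh
    net = []           # net grid load after battery action
    for load in load_mw:
        if load < threshold_mw and soc < capacity_mwh:
            charge = min(power_mw, capacity_mwh - soc, threshold_mw - load)
            soc += charge
            net.append(load + charge)
        elif load > threshold_mw and soc > 0.0:
            discharge = min(power_mw, soc, load - threshold_mw)
            soc -= discharge
            net.append(load - discharge)
        else:
            net.append(load)
    return net

hourly_load = [60, 55, 50, 52, 70, 95, 100, 90, 75, 65]   # MW, invented
print(dispatch(hourly_load, capacity_mwh=40, power_mw=20, threshold_mw=80))
```

With these numbers the battery fills overnight and shaves the 100 MW peak down, flattening the profile toward the threshold.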

  5. Development of on-line monitoring system for Nuclear Power Plant (NPP) using neuro-expert, noise analysis, and modified neural networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Subekti, M. (Center for Development of Reactor Safety Technology, National Nuclear Energy Agency of Indonesia, Puspiptek Complex, Serpong-Tangerang 15340); Ohno, T.

    2006-07-01

    The neuro-expert has been utilized in previous monitoring-system research on a Pressurized Water Reactor (PWR). This research improved the monitoring system by utilizing the neuro-expert, conventional noise analysis, and modified neural networks to extend its capability. Applying these methods in parallel required a distributed computer-network architecture to perform real-time tasks. The research aimed to improve the previous monitoring system, which could detect sensor degradation, and to demonstrate monitoring in the High Temperature Engineering Test Reactor (HTTR). The monitoring system under development, based on methods that have been tested using data from an online PWR simulator as well as RSG-GAS (a 30 MW research reactor in Indonesia), will be applied in the HTTR for more complex monitoring. (authors)

  6. Smart distribution systems

    DOE PAGES

    Jiang, Yazhou; Liu, Chen -Ching; Xu, Yin

    2016-04-19

    The increasing importance of system reliability and resilience is changing the way distribution systems are planned and operated. To achieve a distribution system self-healing against power outages, emerging technologies and devices, such as remote-controlled switches (RCSs) and smart meters, are being deployed. The higher level of automation is transforming traditional distribution systems into the smart distribution systems (SDSs) of the future. The availability of data and remote control capability in SDSs provides distribution operators with an opportunity to optimize system operation and control. In this paper, the development of SDSs and resulting benefits of enhanced system capabilities are discussed. A comprehensive survey is conducted on the state-of-the-art applications of RCSs and smart meters in SDSs. Specifically, a new method, called Temporal Causal Diagram (TCD), is used to incorporate outage notifications from smart meters for enhanced outage management. To fully utilize the fast operation of RCSs, the spanning tree search algorithm is used to develop service restoration strategies. Optimal placement of RCSs and the resulting enhancement of system reliability are discussed. Distribution system resilience with respect to extreme events is presented. Furthermore, test cases are used to demonstrate the benefit of SDSs. Active management of distributed generators (DGs) is introduced. Future research in a smart distribution environment is proposed.
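The spanning-tree search used for service restoration can be illustrated with a toy sketch: breadth-first search from a backup source yields a radial (tree) path that re-energizes de-energized buses by closing remote-controlled switches. The bus and switch names are hypothetical, and this is only the graph-search skeleton, not the paper's full strategy:

```python
# Sketch of service restoration via spanning-tree search: after a fault
# isolates part of a feeder, BFS from a backup source finds a radial path
# to de-energized buses; closing each bus's parent edge keeps the network
# radial, as distribution operation requires.

from collections import deque

def restoration_tree(switches, source, deenergized):
    """Return the set of switch edges to close, as (parent, bus) pairs."""
    adj = {}
    for a, b in switches:                      # each switch links two buses
        adj.setdefault(a, []).append(b)
        adj.setdefault(b, []).append(a)
    parent, seen = {}, {source}
    queue = deque([source])
    while queue:                               # breadth-first spanning tree
        bus = queue.popleft()
        for nxt in adj.get(bus, []):
            if nxt not in seen:
                seen.add(nxt)
                parent[nxt] = bus
                queue.append(nxt)
    return {(parent[b], b) for b in deenergized if b in parent}

switches = [("S1", "B1"), ("B1", "B2"), ("B2", "B3"), ("S2", "B3")]
print(restoration_tree(switches, source="S2", deenergized={"B2", "B3"}))
```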

  7. Distributed controller clustering in software defined networks.

    PubMed

    Abdelaziz, Ahmed; Fong, Ang Tan; Gani, Abdullah; Garba, Usman; Khan, Suleman; Akhunzada, Adnan; Talebian, Hamid; Choo, Kim-Kwang Raymond

    2017-01-01

    Software Defined Networking (SDN) is an emerging and promising paradigm for network management because of its centralized network intelligence. However, the centralized control architecture of software-defined networks (SDNs) brings novel challenges of reliability, scalability, fault tolerance, and interoperability. In this paper, we propose a novel clustered distributed controller architecture in a real SDN setting. The distributed cluster implementation comprises multiple popular SDN controllers. The proposed mechanism is evaluated using a real-world network topology running on top of an emulated SDN environment. The results show that the proposed distributed controller clustering mechanism is able to significantly reduce the average latency from 8.1% to 1.6% and the packet loss from 5.22% to 4.15%, compared to a distributed controller without clustering running on HP Virtual Application Network (VAN) SDN and Open Network Operating System (ONOS) controllers, respectively. Moreover, the proposed method also shows reasonable CPU utilization. Furthermore, the proposed mechanism makes it possible to handle unexpected load fluctuations while maintaining continuous network operation, even when a controller fails. The paper is a step towards addressing the issues of reliability, scalability, fault tolerance, and interoperability.
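A minimal sketch of the clustering idea, assuming each switch simply attaches to the lowest-latency live controller and re-attaches on failure; the controller and switch names and latencies are invented, and the paper's actual mechanism runs on HP VAN SDN and ONOS controllers:

```python
# Toy controller-clustering sketch: map each switch to the cluster
# controller with the lowest measured latency, skipping failed controllers
# so the network keeps operating through a controller failure.

def assign_switches(latency_ms, failed=frozenset()):
    """latency_ms: {switch: {controller: latency}} -> {switch: controller}"""
    mapping = {}
    for switch, lat in latency_ms.items():
        alive = {c: v for c, v in lat.items() if c not in failed}
        mapping[switch] = min(alive, key=alive.get)
    return mapping

latency = {
    "sw1": {"ctrl_a": 2.0, "ctrl_b": 7.5},
    "sw2": {"ctrl_a": 6.0, "ctrl_b": 3.0},
}
print(assign_switches(latency))                    # normal operation
print(assign_switches(latency, failed={"ctrl_a"})) # failover path
```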

  8. Exploiting on-node heterogeneity for in-situ analytics of climate simulations via a functional partitioning framework

    NASA Astrophysics Data System (ADS)

    Sapra, Karan; Gupta, Saurabh; Atchley, Scott; Anantharaj, Valentine; Miller, Ross; Vazhkudai, Sudharshan

    2016-04-01

    Efficient resource utilization is critical for improved end-to-end computing and workflow of scientific applications. Heterogeneous node architectures, such as the GPU-enabled Titan supercomputer at the Oak Ridge Leadership Computing Facility (OLCF), present us with further challenges. In many HPC applications on Titan, the accelerators are the primary compute engines while the CPUs orchestrate the offloading of work onto the accelerators and move the output back to main memory. In applications that do not exploit GPUs, on the other hand, CPU usage is dominant while the GPUs idle. We utilized the Heterogeneous Functional Partitioning (HFP) runtime framework, which can optimize usage of resources on a compute node to expedite an application's end-to-end workflow. This approach differs from existing techniques for in-situ analysis in that it provides a framework for on-the-fly, on-node analysis by dynamically exploiting under-utilized resources. We have implemented in the Community Earth System Model (CESM) a new concurrent diagnostic processing capability enabled by the HFP framework. Various single-variate statistics, such as means and distributions, are computed in-situ by launching HFP tasks on the GPU via the node-local HFP daemon. Since our current configuration of CESM does not use GPU resources heavily, we can move these tasks to the GPU using the HFP framework. Each rank running the atmospheric model in CESM pushes the variables of interest via HFP function calls to the HFP daemon. This node-local daemon is responsible for receiving the data from the main program and launching the designated analytics tasks on the GPU. We have implemented these analytics tasks in C and use OpenACC directives to enable GPU acceleration. This methodology is also advantageous when executing GPU-enabled configurations of CESM, in which the CPUs would otherwise be idle during portions of the runtime. In our results, we demonstrate that it is more efficient to use the HFP framework to offload these tasks to GPUs than to perform them in the main application, and we observe increased resource utilization and overall productivity with this end-to-end workflow approach.
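The kind of single-variate in-situ statistics described above can be sketched in a few lines. This pure-Python stand-in accumulates a running mean and histogram as simulation ranks push field values; the paper's actual implementation runs such tasks in C with OpenACC on the GPU, and the class, bin layout, and values here are invented:

```python
# Minimal stand-in for the diagnostics an HFP-style daemon would run:
# accumulate a mean and a fixed-range histogram as each rank pushes data.

class RunningStats:
    def __init__(self, bins, lo, hi):
        self.n, self.total = 0, 0.0
        self.bins, self.lo, self.hi = bins, lo, hi
        self.hist = [0] * bins

    def push(self, values):                 # called per rank, per time step
        for v in values:
            self.n += 1
            self.total += v
            i = int((v - self.lo) / (self.hi - self.lo) * self.bins)
            self.hist[min(max(i, 0), self.bins - 1)] += 1

    @property
    def mean(self):
        return self.total / self.n

stats = RunningStats(bins=4, lo=0.0, hi=40.0)
stats.push([5.0, 15.0, 25.0])    # e.g. one rank's field values (invented)
stats.push([35.0, 39.0])
print(stats.mean, stats.hist)
```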

  9. Multiscale Characterization of the Probability Density Functions of Velocity and Temperature Increment Fields

    NASA Astrophysics Data System (ADS)

    DeMarco, Adam Ward

    Turbulent motions within the atmospheric boundary layer exist over a wide range of spatial and temporal scales and are very difficult to characterize. Thus, to explore the behavior of such complex flow environments, it is customary to examine their properties from a statistical perspective. Utilizing the probability density functions of velocity and temperature increments, Δu and ΔT, respectively, this work investigates their multiscale behavior to uncover unique traits that have yet to be thoroughly studied. Drawing on diverse datasets, including idealized wind tunnel experiments, atmospheric turbulence field measurements, multi-year ABL tower observations, and mesoscale model simulations, this study reveals remarkable similarities (and some differences) between the small- and larger-scale components of the increment probability density functions. This comprehensive analysis also employs a set of statistical distributions to showcase their ability to capture features of the velocity and temperature increments' probability density functions (pdfs) across multiscale atmospheric motions. An approach is proposed for estimating the pdfs using the maximum likelihood estimation (MLE) technique, which has not previously been applied to atmospheric data of this kind. Using this technique, we show that higher-order moments can be estimated accurately with a limited sample size, which has been a persistent concern for atmospheric turbulence research. With robust goodness-of-fit (GoF) metrics, we quantitatively assess the accuracy of the distributions against the diverse datasets. Through this analysis, it is shown that the normal inverse Gaussian (NIG) distribution is a prime candidate for estimating the increment pdf fields. Using the NIG model and its parameters, we display the variations in the increments over a range of scales, revealing unique scale-dependent qualities under various stability and flow conditions. This approach can characterize increment fields with only four pdf parameters. We also investigate the capability of current state-of-the-art mesoscale atmospheric models to predict these features and highlight the potential for use in future model development. A number of applications can benefit from this methodology, including wind energy and optical wave propagation.
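To illustrate the MLE idea on increment data: the study fits the four-parameter NIG model, but a Gaussian stand-in keeps this sketch dependency-free, since its MLE has a closed form; an excess-kurtosis estimate then hints at the heavy tails the NIG captures and a Gaussian misses. The toy velocity record is invented:

```python
# Increment pdf fitting sketch: form lag-1 increments, fit a Gaussian by
# closed-form MLE, and estimate excess kurtosis (0 for a true Gaussian;
# heavy-tailed increment pdfs deviate, motivating the NIG model).

import math

def increments(series, lag=1):
    return [series[i + lag] - series[i] for i in range(len(series) - lag)]

def gaussian_mle(x):
    mu = sum(x) / len(x)
    sigma2 = sum((v - mu) ** 2 for v in x) / len(x)   # MLE uses 1/n
    return mu, math.sqrt(sigma2)

def excess_kurtosis(x):
    mu, sigma = gaussian_mle(x)
    m4 = sum((v - mu) ** 4 for v in x) / len(x)
    return m4 / sigma**4 - 3.0

u = [0.0, 0.4, 0.1, 0.9, 0.5, 0.6]      # toy velocity record
du = increments(u)
print(gaussian_mle(du), excess_kurtosis(du))
```

SciPy users could swap in `scipy.stats.norminvgauss.fit` for an actual NIG fit.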

  10. Analysis of interferograms of refractive index inhomogeneities produced in optical materials

    NASA Astrophysics Data System (ADS)

    Tarjányi, N.

    2014-12-01

    Optical homogeneity is one of the criteria that determine whether a material is appropriate for a given optical application. The existence of a refractive index inhomogeneity inside a material may disqualify it from use or, on the contrary, provide an advantage. For observing a refractive index inhomogeneity, even a weak one, it is convenient to use interferometric methods: they are very sensitive and immediately provide information on the spatial distribution of the refractive index. They can also be used when the inhomogeneity evolves in time, usually due to the action of external fields; the stream of interferograms then captures the dynamic evolution of the spatial distribution of the inhomogeneity. This contribution presents results of the analysis of interferograms obtained by observing the creation of a refractive index inhomogeneity due to illumination of thin layers of a polyvinyl-alcohol/acrylamide photopolymer and a plate of photorefractive crystal (lithium niobate) by light, and a refractive index inhomogeneity originating at the boundary of two layers of polydimethylsiloxane. The obtained dependences can be used for studying the mechanisms responsible for the creation of the inhomogeneity, for designing various technical applications, or for diagnostics of fabricated components.

  11. In-flight fiber optic acoustic emission sensor (FAESense) system for the real time detection, localization, and classification of damage in composite aircraft structures

    NASA Astrophysics Data System (ADS)

    Mendoza, Edgar; Prohaska, John; Kempen, Connie; Esterkin, Yan; Sun, Sunjian

    2013-05-01

    Acoustic emission (AE) sensing is a leading structural health monitoring technique used for the early-warning detection of structural damage associated with impacts, cracks, fracture, and delamination in advanced materials. Current AE systems based on electronic PZT transducers suffer from limitations that prevent their wide use in practical avionics and aerospace applications, where weight, size, and power are critical for operation. This paper describes progress towards the development of a wireless in-flight distributed fiber optic acoustic emission monitoring system (FAESense™) suitable for the onboard, unattended detection, localization, and classification of damage in avionics and aerospace structures. Fiber optic AE sensors offer significant advantages over their electronic counterparts by using a high-density array of micron-size AE transducers distributed and multiplexed over long lengths of a standard single-mode optical fiber. Immediate SHM applications are found in commercial and military aircraft, helicopters, spacecraft, wind turbine blades, and next-generation weapon systems, as well as in the petrochemical and aerospace industries, civil structures, power utilities, and a wide spectrum of other applications.

  12. A Software Architecture for Intelligent Synthesis Environments

    NASA Technical Reports Server (NTRS)

    Filman, Robert E.; Norvig, Peter (Technical Monitor)

    2001-01-01

    NASA's Intelligent Synthesis Environment (ISE) program is a grand attempt to develop a system to transform the way complex artifacts are engineered. This paper discusses a "middleware" architecture for enabling the development of ISE. Desirable elements of such an Intelligent Synthesis Architecture (ISA) include remote invocation; plug-and-play applications; scripting of applications; management of design artifacts, tools, and artifact and tool attributes; common system services; system management; and systematic enforcement of policies. This paper argues that the ISA should extend conventional distributed object technology (DOT), such as CORBA and Product Data Managers, with flexible repositories of product and tool annotations and "plug-and-play" mechanisms for inserting "ility" or orthogonal concerns into the system. I describe the Object Infrastructure Framework, an Aspect Oriented Programming (AOP) environment for developing distributed systems that provides utility insertion and enables consistent annotation maintenance. This technology can be used to enforce policies such as maintaining the annotations of artifacts, particularly their provenance and access control rules; performing automatic datatype transformations between representations; supplying alternative servers of the same service; reporting on the status of jobs and the system; conveying privileges throughout an application; supporting long-lived transactions; maintaining version consistency; and providing software redundancy and mobility.
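The "ility insertion" idea can be illustrated with a hedged Python sketch: a cross-cutting concern (here, provenance annotation) is wrapped around a service call without touching the service code, in the spirit of AOP. The decorator and service names below are invented for illustration, not taken from the Object Infrastructure Framework:

```python
# AOP-style sketch: inject a provenance record into every result a
# service returns, without modifying the service itself.

import functools, time

def with_provenance(func):
    """Wrap a service so each result carries a provenance annotation."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        result = func(*args, **kwargs)
        result["provenance"] = {"tool": func.__name__, "time": time.time()}
        return result
    return wrapper

@with_provenance
def mesh_artifact(design_id):           # hypothetical design-tool service
    return {"design": design_id, "status": "meshed"}

artifact = mesh_artifact("wing-42")
print(artifact["design"], "provenance" in artifact)
```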

  13. Dynamic Distribution and Layouting of Model-Based User Interfaces in Smart Environments

    NASA Astrophysics Data System (ADS)

    Roscher, Dirk; Lehmann, Grzegorz; Schwartze, Veit; Blumendorf, Marco; Albayrak, Sahin

    The developments in computer technology in the last decade have changed the ways computers are utilized. Emerging smart environments make it possible to build ubiquitous applications that assist users during their everyday life, at any time, in any context. But the variety of contexts-of-use (user, platform, and environment) makes the development of such ubiquitous applications for smart environments, and especially of their user interfaces, a challenging and time-consuming task. We propose a model-based approach that allows adapting the user interface at runtime to numerous (including unknown) contexts-of-use. Based on a user interface modelling language defining the fundamentals and constraints of the user interface, a runtime architecture exploits the description to adapt the user interface to the current context-of-use. The architecture provides automatic distribution and layout algorithms for adapting applications even to contexts unforeseen at design time. Designers do not specify predefined adaptations for each specific situation, but adaptation constraints and guidelines. Furthermore, users are provided with a meta user interface to influence the adaptations according to their needs. A smart home energy management system serves as a running example to illustrate the approach.

  14. Intelligent Systems Technologies and Utilization of Earth Observation Data

    NASA Technical Reports Server (NTRS)

    Ramapriyan, H. K.; McConaughy, G. R.; Morse, H. S.

    2004-01-01

    The addition of raw data and derived geophysical parameters from several Earth observing satellites over the last decade to the data held by NASA data centers has created a data-rich environment for the Earth science research and applications communities. The data products are being distributed to a large and diverse community of users. Due to advances in computational hardware, networks and communications, information management, and software technologies, significant progress has been made in the last decade in archiving and providing data to users. However, to realize the full potential of the growing data archives, further progress is necessary in the transformation of data into information, and information into knowledge that can be used in particular applications. Sponsored by NASA's Intelligent Systems Project within the Computing, Information and Communication Technology (CICT) Program, a conceptual architecture study has been conducted to examine ideas to improve data utilization through the addition of intelligence into the archives in the context of an overall knowledge building system (KBS). Potential Intelligent Archive concepts include: 1) Mining archived data holdings to improve metadata to facilitate data access and usability; 2) Building intelligence about transformations on data, information, knowledge, and accompanying services; 3) Recognizing the value of results, indexing and formatting them for easy access; 4) Interacting as a cooperative node in a web of distributed systems to perform knowledge building; and 5) Being aware of other nodes in the KBS, participating in open systems interfaces and protocols for virtualization, and achieving collaborative interoperability.

  15. Effects of Home Energy Management Systems on Distribution Utilities and Feeders Under Various Market Structures: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruth, Mark; Pratt, Annabelle; Lunacek, Monte

    2015-07-17

    The combination of distributed energy resources (DER) and retail tariff structures that provides benefits to both utility consumers and utilities is poorly understood. To improve understanding, an Integrated Energy System Model (IESM) is being developed to simulate the physical and economic aspects of DER technologies, the buildings where they reside, and the feeders servicing them. The IESM was used to simulate 20 houses with home energy management systems (HEMS) on a single feeder under a time-of-use tariff to estimate the economic and physical impacts on both the households and the distribution utility. HEMS reduce consumers' electric bills by precooling houses in the hours before peak electricity pricing. Household savings are greater than the reduction in utility net revenue, indicating that HEMS can provide a societal benefit provided tariffs are structured so that utilities remain solvent. Utilization of HEMS reduces peak loads during high-price hours but shifts load to hours with off-peak and shoulder prices, resulting in a higher peak load.
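The precooling effect can be sketched with a toy time-of-use bill calculation. The prices and hourly loads below are invented; note that the shifted profile keeps total energy constant, lowers the bill, and raises the peak into the shoulder-priced hour, matching the study's observation:

```python
# Time-of-use bill sketch: a HEMS precools by moving cooling load from
# peak-priced hours into an earlier shoulder-priced hour.

def electricity_bill(load_kwh, price_per_kwh):
    return sum(l * p for l, p in zip(load_kwh, price_per_kwh))

prices    = [0.10, 0.10, 0.20, 0.45, 0.45, 0.20]   # $/kWh: off-peak/shoulder/peak
baseline  = [1.0, 1.0, 1.5, 3.0, 3.0, 1.5]         # kWh per hour, no HEMS
precooled = [1.0, 1.0, 4.5, 1.5, 1.5, 1.5]         # HEMS shifts cooling earlier

print(electricity_bill(baseline, prices))   # higher bill
print(electricity_bill(precooled, prices))  # lower bill, higher shoulder peak
```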

  16. Assumption- versus data-based approaches to summarizing species' ranges.

    PubMed

    Peterson, A Townsend; Navarro-Sigüenza, Adolfo G; Gordillo, Alejandro

    2018-06-01

    For conservation decision making, species' geographic distributions are mapped using various approaches. Some such efforts have downscaled versions of coarse-resolution extent-of-occurrence maps to fine resolutions for conservation planning. We examined the quality of the extent-of-occurrence maps as range summaries and the utility of refining those maps into fine-resolution distributional hypotheses. Extent-of-occurrence maps tend to be overly simple, omit many known and well-documented populations, and likely frequently include many areas not holding populations. Refinement steps involve typological assumptions about habitat preferences and elevational ranges of species, which can introduce substantial error in estimates of species' true areas of distribution. However, no model-evaluation steps are taken to assess the predictive ability of these models, so model inaccuracies are not noticed. Whereas range summaries derived by these methods may be useful in coarse-grained, global-extent studies, their continued use in on-the-ground conservation applications at fine spatial resolutions is not advisable in light of reliance on assumptions, lack of real spatial resolution, and lack of testing. In contrast, data-driven techniques that integrate primary data on biodiversity occurrence with remotely sensed data that summarize environmental dimensions (i.e., ecological niche modeling or species distribution modeling) offer data-driven solutions based on a minimum of assumptions that can be evaluated and validated quantitatively to offer a well-founded, widely accepted method for summarizing species' distributional patterns for conservation applications. © 2016 Society for Conservation Biology.

  17. Optimization of cell seeding in a 2D bio-scaffold system using computational models.

    PubMed

    Ho, Nicholas; Chua, Matthew; Chui, Chee-Kong

    2017-05-01

    The cell expansion process is a crucial part of generating cells on a large-scale level in a bioreactor system. Hence, it is important to set operating conditions (e.g. initial cell seeding distribution, culture medium flow rate) to an optimal level. Often, the initial cell seeding distribution factor is neglected and/or overlooked in the design of a bioreactor using conventional seeding distribution methods. This paper proposes a novel seeding distribution method that aims to maximize cell growth and minimize production time/cost. The proposed method utilizes two computational models; the first model represents cell growth patterns whereas the second model determines optimal initial cell seeding positions for adherent cell expansions. Cell growth simulation from the first model demonstrates that the model can be a representation of various cell types with known probabilities. The second model involves a combination of combinatorial optimization, Monte Carlo and concepts of the first model, and is used to design a multi-layer 2D bio-scaffold system that increases cell production efficiency in bioreactor applications. Simulation results have shown that the recommended input configurations obtained from the proposed optimization method are the most optimal configurations. The results have also illustrated the effectiveness of the proposed optimization method. The potential of the proposed seeding distribution method as a useful tool to optimize the cell expansion process in modern bioreactor system applications is highlighted. Copyright © 2017 Elsevier Ltd. All rights reserved.
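A hedged sketch of the Monte Carlo part of the method: random seeding layouts on a small 2D scaffold grid are scored with a toy growth rule (cells spread to empty neighbors) and the best-scoring layout is kept. The grid size, growth rule, and trial count are invented and much simpler than the paper's coupled computational models:

```python
# Monte Carlo seeding-layout search: sample random initial seed positions,
# simulate a few steps of neighbor-spreading growth, keep the best layout.

import random

def grow(seeds, size, steps):
    """Return the number of occupied grid cells after `steps` growth steps."""
    occupied = set(seeds)
    for _ in range(steps):
        frontier = set()
        for (x, y) in occupied:
            for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nx, ny = x + dx, y + dy
                if 0 <= nx < size and 0 <= ny < size:
                    frontier.add((nx, ny))
        occupied |= frontier
    return len(occupied)

def best_seeding(size=8, n_seeds=3, steps=2, trials=200, seed=0):
    rng = random.Random(seed)                     # reproducible sampling
    cells = [(x, y) for x in range(size) for y in range(size)]
    best = max((rng.sample(cells, n_seeds) for _ in range(trials)),
               key=lambda s: grow(s, size, steps))
    return best, grow(best, size, steps)

layout, coverage = best_seeding()
print(layout, coverage)
```

Well-separated interior seeds score highest, since their growth diamonds do not overlap or clip at the scaffold edge.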

  18. The Utility of Free Software for Gravity and Magnetic Advanced Data Processing

    NASA Astrophysics Data System (ADS)

    Grandis, Hendra; Dahrin, Darharta

    2017-04-01

    The lack of computational tools, i.e., software, often hinders the proper teaching and application of geophysical data processing in academic institutions in Indonesia. Although there are academic licensing options for commercial software, such options are still beyond the financial capability of some academic institutions. Academic community members (both lecturers and students) are expected to be creative and resourceful in overcoming this situation, so the capability to write computer programs or codes is a necessity. However, many computer programs and even software packages are freely available on the internet. Generally, the utility of freely distributed software is limited to demonstration, or to visualizing and exchanging data. This paper discusses the utility of Geosoft's Oasis Montaj Viewer along with the USGS GX programs that are available for free. Useful gravity and magnetic advanced data processing (i.e., gradient calculation, spectral analysis, etc.) can be performed "correctly," without approximations that sometimes lead to dubious results and interpretation.

  19. A library of protein cage architectures as nanomaterials.

    PubMed

    Flenniken, M L; Uchida, M; Liepold, L O; Kang, S; Young, M J; Douglas, T

    2009-01-01

    Virus capsids and other structurally related cage-like proteins such as ferritins, Dps, and heat shock proteins have three distinct surfaces (inside, outside, interface) that can be exploited to generate nanomaterials with multiple functionality by design. Protein cages are biological in origin, and each cage exhibits an extremely homogeneous size distribution. This homogeneity can be used to attain a high degree of homogeneity in the templated material and its associated properties. A series of protein cages exhibiting diversity in size, functionality, and chemical and thermal stability can be utilized for materials synthesis under a variety of conditions. Since synthetic approaches to materials science often involve harsh temperature and pH, it is an advantage to utilize protein cages from extreme environments. In this chapter, we review recent studies on discovering novel protein cages from harsh natural environments, such as the acidic thermal hot springs at Yellowstone National Park (YNP), and on utilizing protein cages as nanoscale platforms for developing nanomaterials with a wide range of applications from electronics to biomedicine.

  20. Foundational Report Series: Advanced Distribution Management Systems for Grid Modernization, DMS Integration of Distributed Energy Resources and Microgrids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singh, Ravindra; Reilly, James T.; Wang, Jianhui

    Deregulation of the electric utility industry, environmental concerns associated with traditional fossil fuel-based power plants, volatility of electric energy costs, Federal and State regulatory support of "green" energy, and rapid technological developments all support the growth of Distributed Energy Resources (DERs) in electric utility systems and ensure an important role for DERs in the smart grid and other aspects of modern utilities. DERs include distributed generation (DG) systems, such as renewables; controllable loads (also known as demand response); and energy storage systems. This report describes the role of aggregators of DERs in providing optimal services to distribution networks, through DER monitoring and control systems—collectively referred to as a Distributed Energy Resource Management System (DERMS)—and microgrids in various configurations.

  1. Programmable dispersion on a photonic integrated circuit for classical and quantum applications.

    PubMed

    Notaros, Jelena; Mower, Jacob; Heuck, Mikkel; Lupo, Cosmo; Harris, Nicholas C; Steinbrecher, Gregory R; Bunandar, Darius; Baehr-Jones, Tom; Hochberg, Michael; Lloyd, Seth; Englund, Dirk

    2017-09-04

    We demonstrate a large-scale tunable-coupling ring resonator array, suitable for high-dimensional classical and quantum transforms, in a CMOS-compatible silicon photonics platform. The device consists of a waveguide coupled to 15 ring-based dispersive elements with programmable linewidths and resonance frequencies. The ability to control both quality factor and frequency of each ring provides an unprecedented 30 degrees of freedom in dispersion control on a single spatial channel. This programmable dispersion control system has a range of applications, including mode-locked lasers, quantum key distribution, and photon-pair generation. We also propose a novel application enabled by this circuit - high-speed quantum communications using temporal-mode-based quantum data locking - and discuss the utility of the system for performing the high-dimensional unitary optical transformations necessary for a quantum data locking demonstration.

  2. Distributed usability evaluation of the Pennsylvania Cancer Atlas

    PubMed Central

    Bhowmick, Tanuka; Robinson, Anthony C; Gruver, Adrienne; MacEachren, Alan M; Lengerich, Eugene J

    2008-01-01

    Background The Pennsylvania Cancer Atlas (PA-CA) is an interactive online atlas to help policy-makers, program managers, and epidemiologists with tasks related to cancer prevention and control. The PA-CA includes maps, graphs, and tables that are dynamically linked to support data exploration and decision-making with spatio-temporal cancer data. Our Atlas development process follows a user-centered design approach. To assess the usability of the initial versions of the PA-CA, we developed and applied a novel strategy for soliciting user feedback through multiple distributed focus groups and surveys. Our process of acquiring user feedback leverages an online web application (e-Delphi). In this paper we describe the PA-CA, detail how we have adapted the e-Delphi web application to support usability and utility evaluation of the PA-CA, and present the results of our evaluation. Results We report results from four sets of users. Each group provided structured individual and group assessments of the PA-CA as well as input on the kinds of users and applications for which it is best suited. Overall reactions to the PA-CA are quite positive. Participants did, however, provide a range of useful suggestions. Key suggestions focused on improving interaction functions, enhancing methods of temporal analysis, addressing data issues, and providing additional data displays and help functions. These suggestions were incorporated in each design and implementation iteration for the PA-CA and used to inform a set of web-atlas design principles. Conclusion For the Atlas, we find that a design utilizing linked map, graph, and table views is understandable to, and perceived to be useful by, the target audience of cancer prevention and control professionals. However, it is clear that considerable variation exists in experience using maps and graphics, and for those with less experience, integrated tutorials and help features are needed.
In relation to our usability assessment strategy, we find that our distributed, web-based method for soliciting user input is generally effective. Advantages include the ability to gather information from users distributed in time and space and the relative anonymity of the participants while disadvantages include less control over when and how often participants provide input and challenges for obtaining rich input. PMID:18620565

  3. The status of food irradiation technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sivinski, J.S.

    1989-01-01

    Irradiation is a mature technology for many uses, such as medical product sterilization, crosslinking of plastics, application of coatings, stabilization of natural and synthetic rubbers prior to vulcanization, and in plant genetics. It also has many potential applications in the food and agriculture industries, especially in the postharvest activities associated with processing, storing, and distribution and in utilization and consumption. The safety of food irradiation has been thoroughly studied and established by distinguished scientists of international stature and unimpeachable credentials. Approximately 30 countries permit food irradiation and it is commercially used in 21. Parasites are of serious concern since their impact on human health and economic productivity is significant, especially in developing countries with sanitation and food control problems. Parasites in meat and fish can be rendered sterile or inactivated with irradiation, and the potential for improved human health is significant. The second area for immediate use of irradiation is in meeting plant quarantine requirements. The benefits described above and the approval of the scientific community are moving the technology toward greater utilization.

  4. Directed Thermal Diffusions through Metamaterial Source Illusion with Homogeneous Natural Media

    PubMed Central

    Xu, Guoqiang; Zhang, Haochun; Jin, Liang

    2018-01-01

    Owing to the utilization of transformation optics, many significant research and development achievements have expanded the applications of illusion devices into thermal fields. However, most current studies on thermal illusions used to reshape thermal fields depend on certain pre-designed geometric profiles with complicated conductivity configurations. In this paper, we propose a methodology for designing a new class of thermal source illusion devices that achieve directed thermal diffusion with natural homogeneous media. The employment of space rotations in the linear transformation process allows the directed thermal diffusion to be independent of the geometric profiles, and the utilization of natural homogeneous media improves feasibility. Four schemes, with fewer types of homogeneous media filling the functional regions, are demonstrated in transient states. The expected performance is observed in each scheme. The related performances are analyzed by comparing the thermal distribution characteristics and the illusion effectiveness along the measured lines. The findings obtained in this paper suggest applications in the development of directed diffusion with minimal thermal loss, useful in novel “multi-beam” thermal generation, thermal lenses, solar receivers, and waveguides. PMID:29671833

  5. Towards Improved Satellite-In Situ Oceanographic Data Interoperability and Associated Value Added Services at the Podaac

    NASA Astrophysics Data System (ADS)

    Tsontos, V. M.; Huang, T.; Holt, B.

    2015-12-01

    The earth science enterprise increasingly relies on the integration and synthesis of multivariate datasets from diverse observational platforms. NASA's ocean salinity missions, which include Aquarius/SAC-D and the SPURS (Salinity Processes in the Upper Ocean Regional Study) field campaign, illustrate the value of integrated observations in support of studies on ocean circulation, the water cycle, and climate. However, the inherent heterogeneity of the resulting data and the disparate, distributed systems that serve them complicate their effective utilization for both earth science research and applications. Key technical interoperability challenges include adherence to metadata and data format standards, which is particularly acute for in-situ data, and the lack of a unified metadata model facilitating archival and integration of both satellite and oceanographic field datasets. Here we report on efforts at the PO.DAAC, NASA's physical oceanography data center, to extend our data management and distribution support capabilities for field campaign datasets such as those from SPURS. We also discuss value-added services, based on the integration of satellite and in-situ datasets, which are under development, with a particular focus on the Distributed Oceanographic Matchup Service (DOMS). DOMS implements a portable technical infrastructure and associated web services that will be broadly accessible via the PO.DAAC for the dynamic collocation of satellite and in-situ data, hosted by distributed data providers, in support of mission cal/val, science, and operational applications.
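
The core operation of a matchup service such as DOMS is collocating each in-situ observation with satellite observations that fall inside a space-time tolerance. A minimal brute-force sketch of that idea follows; the record fields, tolerances, and sample values are illustrative assumptions, not the DOMS API or its data model.

```python
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

@dataclass
class Obs:
    lat: float      # degrees north
    lon: float      # degrees east
    time: float     # hours since some epoch
    value: float    # measured quantity, e.g. salinity

def haversine_km(a: Obs, b: Obs) -> float:
    """Great-circle distance between two observations, in kilometers."""
    dlat, dlon = radians(b.lat - a.lat), radians(b.lon - a.lon)
    h = sin(dlat / 2) ** 2 + cos(radians(a.lat)) * cos(radians(b.lat)) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(h))

def matchup(in_situ, satellite, max_km=25.0, max_hours=12.0):
    """Pair each in-situ record with every satellite record inside both tolerances."""
    return [(p, s)
            for p in in_situ
            for s in satellite
            if abs(p.time - s.time) <= max_hours and haversine_km(p, s) <= max_km]

buoys = [Obs(35.0, -120.0, 0.0, 33.1)]
swaths = [Obs(35.1, -120.1, 3.0, 33.3),   # ~14 km away: matches
          Obs(40.0, -120.0, 3.0, 34.0)]   # ~555 km away: does not
pairs = matchup(buoys, swaths)
```

A production service would replace the double loop with a spatial index, but the tolerance test is the same.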

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seal, Brian; Huque, Aminul; Rogers, Lindsey

    In 2011, EPRI began a four-year effort under the Department of Energy (DOE) SunShot Initiative Solar Energy Grid Integration Systems - Advanced Concepts (SEGIS-AC) to demonstrate smart grid ready inverters with utility communication. The objective of the project was to successfully implement and demonstrate effective utilization of inverters with grid support functionality to capture the full value of distributed photovoltaic (PV). The project leveraged ongoing investments and expanded PV inverter capabilities, to enable grid operators to better utilize these grid assets. Developing and implementing key elements of PV inverter grid support capabilities will increase the distribution system’s capacity for higher penetration levels of PV, while reducing the cost. The project team included EPRI, Yaskawa-Solectria Solar, Spirae, BPL Global, DTE Energy, National Grid, Pepco, EDD, NPPT and NREL. The project was divided into three phases: development, deployment, and demonstration. Within each phase, the key areas included: head-end communications for Distributed Energy Resources (DER) at the utility operations center; methods for coordinating DER with existing distribution equipment; back-end PV plant master controller; and inverters with smart-grid functionality. Four demonstration sites were chosen in three regions of the United States with different types of utility operating systems and implementations of utility-scale PV inverters. This report summarizes the project and findings from field demonstration at three utility sites.

  7. AGIS: Integration of new technologies used in ATLAS Distributed Computing

    NASA Astrophysics Data System (ADS)

    Anisenkov, Alexey; Di Girolamo, Alessandro; Alandes Pradillo, Maria

    2017-10-01

    The variety of the ATLAS Distributed Computing infrastructure requires a central information system to define the topology of computing resources and to store the different parameters and configuration data needed by various ATLAS software components. The ATLAS Grid Information System (AGIS) is the system designed to integrate configuration and status information about resources, services and topology of the computing infrastructure used by ATLAS Distributed Computing applications and services. As an intermediate middleware system between clients and external information sources (such as the central BDII, GOCDB, and MyOSG), AGIS defines the relations between experiment-specific resources and physical distributed computing capabilities. Having been in production during LHC Run 1, AGIS became the central information system for Distributed Computing in ATLAS, and it is continuously evolving to fulfil new user requests, enable enhanced operations and follow the extension of the ATLAS Computing model. The ATLAS Computing model and the data structures used by Distributed Computing applications and services are continuously evolving and tend to fit newer requirements from the ADC community. In this note, we describe the evolution and the recent developments of AGIS functionalities related to the integration of new technologies that have recently become widely used in ATLAS Computing, such as flexible utilization of opportunistic Cloud and HPC resources, integration of ObjectStore services for the Distributed Data Management (Rucio) and ATLAS workload management (PanDA) systems, and the unified storage protocol declarations required for PanDA Pilot site movers. The improvements of the information model and general updates are also shown; in particular, we explain how other collaborations outside ATLAS could benefit from the system as a computing resources information catalogue. AGIS is evolving towards a common information system, not coupled to a specific experiment.

  8. 41 CFR 101-27.209 - Utilization and distribution of shelf-life items.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... distribution of shelf-life items. 101-27.209 Section 101-27.209 Public Contracts and Property Management... PROCUREMENT 27-INVENTORY MANAGEMENT 27.2-Management of Shelf-Life Materials § 101-27.209 Utilization and distribution of shelf-life items. Where it is determined that specified quantities of both Type I and Type II...

  9. A practical method to test the validity of the standard Gumbel distribution in logit-based multinomial choice models of travel behavior

    DOE PAGES

    Ye, Xin; Garikapati, Venu M.; You, Daehyun; ...

    2017-11-08

    Most multinomial choice models (e.g., the multinomial logit model) adopted in practice assume an extreme-value Gumbel distribution for the random components (error terms) of utility functions. This distributional assumption offers a closed-form likelihood expression when the utility maximization principle is applied to model choice behaviors. As a result, model coefficients can be easily estimated using the standard maximum likelihood estimation method. However, maximum likelihood estimators are consistent and efficient only if distributional assumptions on the random error terms are valid. It is therefore critical to test the validity of underlying distributional assumptions on the error terms that form the basis of parameter estimation and policy evaluation. In this paper, a practical yet statistically rigorous method is proposed to test the validity of the distributional assumption on the random components of utility functions in both the multinomial logit (MNL) model and multiple discrete-continuous extreme value (MDCEV) model. Based on a semi-nonparametric approach, a closed-form likelihood function that nests the MNL or MDCEV model being tested is derived. The proposed method allows traditional likelihood ratio tests to be used to test violations of the standard Gumbel distribution assumption. Simulation experiments are conducted to demonstrate that the proposed test yields acceptable Type-I and Type-II error probabilities at commonly available sample sizes. The test is then applied to three real-world discrete and discrete-continuous choice models. For all three models, the proposed test rejects the validity of the standard Gumbel distribution in most utility functions, calling for the development of robust choice models that overcome adverse effects of violations of distributional assumptions on the error terms in random utility functions.
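
The Gumbel assumption at the heart of this test is precisely what makes the logit choice probabilities closed-form: if the error terms are i.i.d. standard Gumbel, utility maximization reproduces the multinomial-logit formula. A quick simulation illustrating that equivalence (the systematic utilities below are arbitrary illustrative values, not estimates from any of the paper's models):

```python
import numpy as np

rng = np.random.default_rng(0)
V = np.array([0.5, 0.0, -0.3])    # systematic utilities of 3 alternatives
n = 200_000                       # simulated decision-makers

# Draw i.i.d. standard Gumbel errors and pick the utility-maximizing alternative.
eps = rng.gumbel(size=(n, 3))
choices = np.argmax(V + eps, axis=1)
simulated = np.bincount(choices, minlength=3) / n

# Closed-form multinomial-logit probabilities implied by the Gumbel assumption.
logit = np.exp(V) / np.exp(V).sum()
```

If the errors were drawn from some other distribution, the simulated shares would drift away from the logit formula, which is what the paper's likelihood ratio test is designed to detect.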

  11. Numerical modeling of exciton-polariton Bose-Einstein condensate in a microcavity

    NASA Astrophysics Data System (ADS)

    Voronych, Oksana; Buraczewski, Adam; Matuszewski, Michał; Stobińska, Magdalena

    2017-06-01

    A novel, optimized numerical method for modeling an exciton-polariton superfluid in a semiconductor microcavity is proposed. Exciton-polaritons are spin-carrying quasiparticles formed from photons strongly coupled to excitons. They possess unique properties, interesting from the point of view of fundamental research as well as numerous potential applications. However, their numerical modeling is challenging due to the structure of the nonlinear differential equations describing their evolution. In this paper, we propose to solve the equations with a modified Runge-Kutta method of 4th order, further optimized for efficient computations. The algorithms were implemented in the form of C++ programs designed for parallel environments and utilizing vector instructions. The programs form the EPCGP suite, which has been used for theoretical investigation of exciton-polaritons. Catalogue identifier: AFBQ_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AFBQ_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: BSD-3 No. of lines in distributed program, including test data, etc.: 2157 No. of bytes in distributed program, including test data, etc.: 498994 Distribution format: tar.gz Programming language: C++ with OpenMP extensions (main numerical program), Python (helper scripts). Computer: Modern PC (tested on AMD and Intel processors), HP BL2x220. Operating system: Unix/Linux and Windows. Has the code been vectorized or parallelized?: Yes (OpenMP) RAM: 200 MB for a single run Classification: 7, 7.7. Nature of problem: An exciton-polariton superfluid is a novel, interesting physical system allowing investigation of high-temperature Bose-Einstein condensation of exciton-polaritons, quasiparticles carrying spin. They have attracted a lot of attention due to their unique properties and potential applications in polariton-based optoelectronic integrated circuits. 
This is an out-of-equilibrium quantum system confined within a semiconductor microcavity. It is described by a set of nonlinear differential equations similar in spirit to the Gross-Pitaevskii (GP) equation, but their unique properties do not allow standard GP solving frameworks to be utilized. Finding an accurate and efficient numerical algorithm, as well as developing optimized numerical software, is necessary for effective theoretical investigation of exciton-polaritons. Solution method: A Runge-Kutta method of 4th order was employed to solve the set of differential equations describing exciton-polariton superfluids. The method was adapted to the exciton-polariton equations and further optimized. The C++ programs utilize OpenMP extensions and vector operations in order to fully utilize the computer hardware. Running time: 6 h for 100 ps of evolution, depending on the values of parameters
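
The classical 4th-order Runge-Kutta step that the EPCGP suite builds on has a standard textbook form. The scalar Python sketch below shows only that skeleton (the actual suite is C++/OpenMP and integrates coupled nonlinear polariton fields, not the toy equation used here for verification):

```python
import math

def rk4_step(f, t, y, h):
    """One classical Runge-Kutta 4th-order step for dy/dt = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + (h / 6) * (k1 + 2 * k2 + 2 * k3 + k4)

# Verify on dy/dt = -y with y(0) = 1; the exact solution is exp(-t).
t, y, h = 0.0, 1.0, 0.01
for _ in range(100):                      # advance to t = 1
    y = rk4_step(lambda t, y: -y, t, y, h)
    t += h
```

The method's O(h^4) global accuracy is what makes it attractive for long GP-type evolutions; the EPCGP modifications concern efficiency, not the order of the scheme.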

  12. 18 CFR 141.1 - FERC Form No. 1, Annual report of Major electric utilities, licensees and others.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... instrumentality engaged in generation, transmission, distribution, or sale of electric energy, however produced... the business of developing, transmitting, utilizing, or distributing power). (2) When to file and what...

  13. 18 CFR 141.1 - FERC Form No. 1, Annual report of Major electric utilities, licensees and others.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... instrumentality engaged in generation, transmission, distribution, or sale of electric energy, however produced... the business of developing, transmitting, utilizing, or distributing power). (2) When to file and what...

  14. 18 CFR 141.1 - FERC Form No. 1, Annual report of Major electric utilities, licensees and others.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... instrumentality engaged in generation, transmission, distribution, or sale of electric energy, however produced... the business of developing, transmitting, utilizing, or distributing power). (2) When to file and what...

  15. 18 CFR 141.1 - FERC Form No. 1, Annual report of Major electric utilities, licensees and others.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... instrumentality engaged in generation, transmission, distribution, or sale of electric energy, however produced... the business of developing, transmitting, utilizing, or distributing power). (2) When to file and what...

  16. Concurrent Image Processing Executive (CIPE). Volume 1: Design overview

    NASA Technical Reports Server (NTRS)

    Lee, Meemong; Groom, Steven L.; Mazer, Alan S.; Williams, Winifred I.

    1990-01-01

    The design and implementation of a Concurrent Image Processing Executive (CIPE), which is intended to become the support system software for a prototype high performance science analysis workstation, are described. The target machine for this software is a JPL/Caltech Mark 3fp Hypercube hosted by either a MASSCOMP 5600 or a Sun-3 or Sun-4 workstation; however, the design will accommodate other concurrent machines of similar architecture, i.e., local memory, multiple-instruction-multiple-data (MIMD) machines. The CIPE system provides both a multimode user interface and an applications programmer interface, and has been designed around four loosely coupled modules: user interface, host-resident executive, hypercube-resident executive, and application functions. The loose coupling between modules allows modification of a particular module without significantly affecting the other modules in the system. In order to enhance hypercube memory utilization and to allow expansion of image processing capabilities, a specialized program management method, incremental loading, was devised. To minimize data transfer between host and hypercube, a data management method which distributes, redistributes, and tracks data set information was implemented. The data management also allows data sharing among application programs. The CIPE software architecture provides a flexible environment for scientific analysis of complex remote sensing image data, such as planetary data and imaging spectrometry, utilizing state-of-the-art concurrent computation capabilities.

  17. Impact of Utility-Scale Distributed Wind on Transmission-Level System Operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brancucci Martinez-Anido, C.; Hodge, B. M.

    2014-09-01

    This report presents a new renewable integration study that aims to assess the potential for adding distributed wind to the current power system with minimal or no upgrades to the distribution or transmission electricity systems. It investigates the impacts of integrating large amounts of utility-scale distributed wind power on bulk system operations by performing a case study on the power system of the Independent System Operator-New England (ISO-NE).

  18. High-speed digital wireless battlefield network

    NASA Astrophysics Data System (ADS)

    Dao, Son K.; Zhang, Yongguang; Shek, Eddie C.; van Buer, Darrel

    1999-07-01

    In the past two years, the Digital Wireless Battlefield Network consortium, which consists of HRL Laboratories, Hughes Network Systems, Raytheon, and Stanford University, has participated in the DARPA TRP program to leverage the development of commercial digital wireless products for use in the 21st century battlefield. The consortium has developed an infrastructure and application testbed to support the digitized battlefield, and has implemented and demonstrated this network system. Each member is currently utilizing many of the technologies developed in this program in commercial products and offerings. This new communication hardware and software, together with the demonstrated networking features, will benefit military systems and will be applicable to the commercial communication marketplace for high-speed voice/data multimedia distribution services.

  19. Research for applications of remote sensing to state and local governments (ARSIG)

    NASA Technical Reports Server (NTRS)

    Foster, K. E.; Johnson, J. D.

    1973-01-01

    Remote sensing and its application to problems confronted by local and state planners are reported. The added dimension of remote sensing as a data gathering tool has been explored, identifying pertinent land use factors associated with urban growth such as soil associations, soil capability, vegetation distribution, and flood prone areas. Remote sensing within a rural agricultural setting has also been utilized to determine irrigation runoff volumes, cropping patterns, and land use. A variety of data sources, including U-2 70 mm multispectral black and white photography, RB-57 9-inch color IR, HyAC panoramic color IR, and ERTS-1 imagery, have been used over selected areas of Arizona including Tucson, Arizona (NASA Test Site #30) and the Sulphur Springs Valley.

  20. Imaging MALDI MS of Dosed Brain Tissues Utilizing an Alternative Analyte Pre-extraction Approach

    NASA Astrophysics Data System (ADS)

    Quiason, Cristine M.; Shahidi-Latham, Sheerin K.

    2015-06-01

    Matrix-assisted laser desorption ionization (MALDI) imaging mass spectrometry has been adopted in the pharmaceutical industry as a useful tool to detect xenobiotic distribution within tissues. A unique sample preparation approach for MALDI imaging has been described here for the extraction and detection of cobimetinib and clozapine, which were previously undetectable in mouse and rat brain using a single matrix application step. Employing a combination of a buffer wash and a cyclohexane pre-extraction step prior to standard matrix application, the xenobiotics were successfully extracted and detected with an 8 to 20-fold gain in sensitivity. This alternative approach for sample preparation could serve as an advantageous option when encountering difficult to detect analytes.

  1. A generalized Kruskal-Wallis test incorporating group uncertainty with application to genetic association studies.

    PubMed

    Acar, Elif F; Sun, Lei

    2013-06-01

    Motivated by genetic association studies of SNPs with genotype uncertainty, we propose a generalization of the Kruskal-Wallis test that incorporates group uncertainty when comparing k samples. The extended test statistic is based on probability-weighted rank-sums and follows an asymptotic chi-square distribution with k - 1 degrees of freedom under the null hypothesis. Simulation studies confirm the validity and robustness of the proposed test in finite samples. Application to a genome-wide association study of type 1 diabetic complications further demonstrates the utility of this generalized Kruskal-Wallis test for studies with group uncertainty. The method has been implemented as an open-source R program, GKW. © 2013, The International Biometric Society.
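
The probability-weighted rank-sum idea can be sketched directly: replace hard group indicators with membership probabilities when accumulating rank sums. The function below is a simplified illustration of that construction (assuming no ties, and not reproducing the GKW package's implementation); with hard 0/1 memberships it reduces to the classic Kruskal-Wallis H statistic.

```python
import numpy as np

def weighted_kruskal_wallis(x, probs):
    """Kruskal-Wallis-type statistic with soft group membership.
    x: (N,) observations; probs: (N, k) row-stochastic membership probabilities."""
    N, k = probs.shape
    ranks = np.argsort(np.argsort(x)) + 1.0           # ranks 1..N (assumes no ties)
    n = probs.sum(axis=0)                             # expected group sizes
    rbar = (probs * ranks[:, None]).sum(axis=0) / n   # probability-weighted mean ranks
    return 12.0 / (N * (N + 1)) * np.sum(n * (rbar - (N + 1) / 2) ** 2)

x = np.array([2.9, 3.0, 2.5, 2.6, 3.2, 3.8, 2.8, 3.4])
hard = np.zeros((8, 3))
hard[:3, 0] = 1.0; hard[3:6, 1] = 1.0; hard[6:, 2] = 1.0
H_hard = weighted_kruskal_wallis(x, hard)             # classic KW statistic

soft = 0.8 * hard + 0.2 / 3                           # blur memberships (rows still sum to 1)
H_soft = weighted_kruskal_wallis(x, soft)             # uncertainty shrinks the statistic
```

Under the null, the statistic is compared against a chi-square distribution with k - 1 degrees of freedom, as in the standard test.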

  2. A condition-specific codon optimization approach for improved heterologous gene expression in Saccharomyces cerevisiae

    PubMed Central

    2014-01-01

    Background Heterologous gene expression is an important tool for synthetic biology that enables metabolic engineering and the production of non-natural biologics in a variety of host organisms. The translational efficiency of heterologous genes can often be improved by optimizing synonymous codon usage to better match the host organism. However, traditional approaches for optimization neglect to take into account many factors known to influence synonymous codon distributions. Results Here we define an alternative approach for codon optimization that utilizes systems-level information and codon context for the condition under which heterologous genes are being expressed. Furthermore, we utilize a probabilistic algorithm to generate multiple variants of a given gene. We demonstrate improved translational efficiency using this condition-specific codon optimization approach with two heterologous genes, the fluorescent protein-encoding eGFP and the catechol 1,2-dioxygenase gene CatA, expressed in S. cerevisiae. For the latter case, optimization for stationary phase production resulted in nearly 2.9-fold improvements over commercial gene optimization algorithms. Conclusions Codon optimization is now often a standard tool for protein expression, and while a variety of tools and approaches have been developed, they do not guarantee improved performance for all hosts or applications. Here, we suggest an alternative method for condition-specific codon optimization and demonstrate its utility in Saccharomyces cerevisiae as a proof of concept. However, this technique should be applicable to any organism for which gene expression data can be generated and is thus of potential interest for a variety of applications in metabolic and cellular engineering. PMID:24636000
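
The probabilistic sampling step described above can be sketched simply: for each amino acid, draw a synonymous codon according to condition-specific usage weights, yielding multiple gene variants that all encode the same protein. The codon table fragment and weights below are illustrative placeholders, not real S. cerevisiae condition-specific data.

```python
import random

# Illustrative synonymous-codon weights (NOT real condition-specific usage data).
CODON_WEIGHTS = {
    "M": {"ATG": 1.0},
    "K": {"AAA": 0.6, "AAG": 0.4},
    "F": {"TTT": 0.3, "TTC": 0.7},
    "*": {"TAA": 1.0},
}

def sample_variant(protein, weights, rng):
    """Sample one codon-optimized DNA variant encoding the given protein sequence."""
    codons = []
    for aa in protein:
        table = weights[aa]
        codons.append(rng.choices(list(table), weights=list(table.values()))[0])
    return "".join(codons)

rng = random.Random(42)
variants = [sample_variant("MKF*", CODON_WEIGHTS, rng) for _ in range(5)]
```

Because sampling is per-position and synonymous, every variant translates back to the same protein while spreading codon usage according to the weights.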

  3. An investigation into the utilization of HCMM thermal data for the discrimination of volcanic and Eolian geological units. [Newberry Volcano, Oregon

    NASA Technical Reports Server (NTRS)

    Head, J. W., III (Principal Investigator)

    1982-01-01

    The nature of the HCMM data set and the general geologic application of thermal inertia imaging using HCMM data were studied. The CCTs of five sites of interest were obtained and displayed. The upgrading of the image display system was investigated. Fragment/block size distributions in various terrain types were characterized, with emphasis on volcanic terrain. The SEASAT L-band radar images of volcanic landforms at Newberry Volcano, Oregon, were analyzed and compared to imagery from LANDSAT band 7.

  4. Enterprise Cloud Architecture for Chinese Ministry of Railway

    NASA Astrophysics Data System (ADS)

    Shan, Xumei; Liu, Hefeng

    Enterprises like the PRC Ministry of Railways (MOR) face various challenges, ranging from a highly distributed computing environment to low legacy-system utilization; cloud computing is increasingly regarded as a workable solution to address these. This article describes a full-scale cloud solution with Intel Tashi as the virtual machine infrastructure layer, Hadoop HDFS as the computing platform, and a self-developed SaaS interface, gluing virtual machines and HDFS together with the Xen hypervisor. As a result, on-demand computing-task application and deployment have been addressed for MOR's real working scenarios, as described at the end of the article.

  5. A switchable spin-wave signal splitter for magnonic networks

    NASA Astrophysics Data System (ADS)

    Heussner, F.; Serga, A. A.; Brächer, T.; Hillebrands, B.; Pirro, P.

    2017-09-01

    The influence of an inhomogeneous magnetization distribution on the propagation of caustic-like spin-wave beams in unpatterned magnetic films has been investigated by utilizing micromagnetic simulations. Our study reveals locally controllable and reconfigurable steering of the beam directions. This feature is used to design a device combining split and switch functionalities for spin-wave signals on the micrometer scale. A coherent transmission of spin-wave signals through the device is verified. This attests to the device's applicability in magnonic networks, where the information is encoded in the phase of the spin waves.

  6. Atlas de Recursos Eólicos del Estado de Oaxaca (The Spanish version of Wind Energy Resource Atlas of Oaxaca) (in Spanish)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Elliott, D.; Schwartz, M.; Scott, G.

    The Oaxaca Wind Resource Atlas, produced by the National Renewable Energy Laboratory's (NREL's) wind resource group, is the result of an extensive mapping study for the Mexican State of Oaxaca. This atlas identifies the wind characteristics and distribution of the wind resource in Oaxaca. The detailed wind resource maps and other information contained in the atlas facilitate the identification of prospective areas for use of wind energy technologies, both for utility-scale power generation and off-grid wind energy applications.

  7. A criterion for establishing life limits. [for Space Shuttle Main Engine service

    NASA Technical Reports Server (NTRS)

    Skopp, G. H.; Porter, A. A.

    1990-01-01

    The development of a rigorous statistical method that would utilize hardware-demonstrated reliability to evaluate hardware capability and provide ground rules for safe flight margin is discussed. A statistical-based method using the Weibull/Weibayes cumulative distribution function is described. Its advantages and inadequacies are pointed out. Another, more advanced procedure, Single Flight Reliability (SFR), determines a life limit which ensures that the reliability of any single flight is never less than a stipulated value at a stipulated confidence level. Application of the SFR method is illustrated.
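
    The single-flight-reliability idea lends itself to a short numerical sketch. The snippet below is an illustrative assumption, not the SSME procedure or its data: given Weibull parameters demonstrated by hardware testing, it finds the largest number of flights for which the conditional reliability of the next flight still meets a stipulated target (the confidence-level machinery of the Weibayes treatment is omitted).

```python
import math

def weibull_survival(t, beta, eta):
    """Weibull survival (reliability) function S(t) = exp(-(t/eta)^beta)."""
    return math.exp(-((t / eta) ** beta))

def single_flight_reliability(t, beta, eta):
    """Conditional reliability of one more flight at age t (in flights):
    P(survive flight t+1 | survived t flights) = S(t+1) / S(t)."""
    return weibull_survival(t + 1, beta, eta) / weibull_survival(t, beta, eta)

def life_limit(beta, eta, target):
    """First flight count at which the next flight would fall below the
    stipulated single-flight reliability target; flights 0..t-1 all meet it."""
    t = 0
    while single_flight_reliability(t, beta, eta) >= target:
        t += 1
    return t
```

    With an increasing hazard rate (beta > 1), the conditional single-flight reliability falls as flights accumulate, so the loop terminates; tightening the target shortens the allowable life.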

  8. Photothermal Nanocomposite Hydrogel Actuator with Electric-Field-Induced Gradient and Oriented Structure.

    PubMed

    Yang, Yang; Tan, Yun; Wang, Xionglei; An, Wenli; Xu, Shimei; Liao, Wang; Wang, Yuzhong

    2018-03-07

    Recent research on hydrogel actuators is still not sophisticated enough to meet the requirements of fast, reversible, complex, and robust reconfiguration. Here, we present a new kind of poly(N-isopropylacrylamide)/graphene oxide gradient hydrogel, made by utilizing a direct-current electric field to induce a gradient and oriented distribution of graphene oxide within the poly(N-isopropylacrylamide) hydrogel. Upon near-infrared light irradiation, the hydrogel exhibited excellent overall actuation performance through directional bending deformation, promising great potential for application in soft actuators and optomechanical systems.

  9. Building a generalized distributed system model

    NASA Technical Reports Server (NTRS)

    Mukkamala, R.

    1993-01-01

    The key elements in the 1992-93 period of the project are the following: (1) extensive use of the simulator to implement and test concurrency control algorithms, an interactive user interface, and replica control algorithms; and (2) investigations into the applicability of data and process replication in real-time systems. In the 1993-94 period of the project, we intend to accomplish the following: (1) concentrate on efforts to investigate the effects of data and process replication on hard and soft real-time systems; in particular, we will concentrate on the impact of semantic-based consistency control schemes on a distributed real-time system in terms of improved reliability, improved availability, better resource utilization, and reduced missed task deadlines; and (2) use the prototype to verify the theoretically predicted performance of locking protocols, etc.

  10. Distributed intelligent control and status networking

    NASA Technical Reports Server (NTRS)

    Fortin, Andre; Patel, Manoj

    1993-01-01

    Over the past two years, the Network Control Systems Branch (Code 532) has been investigating control and status networking technologies. These emerging technologies use distributed processing over a network to accomplish a particular custom task. These networks consist of small intelligent 'nodes' that perform simple tasks. Containing simple, inexpensive hardware and software, these nodes can be easily developed and maintained. Once networked, the nodes can perform a complex operation without a central host. This type of system provides an alternative to more complex control and status systems which require a central computer. This paper will provide some background and discuss some applications of this technology. It will also demonstrate the suitability of one particular technology for the Space Network (SN) and discuss the prototyping activities of Code 532 utilizing this technology.

  11. Responses of human sensory characteristics to 532 nm pulse laser stimuli.

    PubMed

    Kim, Ji-Sun; Oh, Han-Byeol; Kim, A-Hee; Kim, Jun-Sik; Lee, Eun-Suk; Goh, Bong-Jun; Kim, Jae-Young; Jang, Kyungmin; Park, Jong-Rak; Chung, Soon-Cheol; Jun, Jae-Hoon

    2016-04-29

    Lasers are advantageous for stimulating a small target area and are used in fields such as optogenetics, photoimmunology, and neurophysiology. This study aims to implement a non-contact sense of touch, without damaging biological tissue, using a laser. Various laser parameters within the safety range were used to induce a sense of touch and investigate the human responses. Using a heat-distribution simulation, the temperature changes and the dependence of sensory stimulation on the laser parameters were analyzed. The results identified tactile responses within the safety range for various laser parameters, and the temperature distribution for the laser stimulus was obtained through the simulation. This study can be applied to sensory receptor stimulation, neurophysiology, and clinical medicine.

  12. Super-resolution three-dimensional fluorescence and optical diffraction tomography of live cells using structured illumination generated by a digital micromirror device.

    PubMed

    Shin, Seungwoo; Kim, Doyeon; Kim, Kyoohyun; Park, YongKeun

    2018-06-15

    We present a multimodal approach for measuring the three-dimensional (3D) refractive index (RI) and fluorescence distributions of live cells by combining optical diffraction tomography (ODT) and 3D structured illumination microscopy (SIM). A digital micromirror device is utilized to generate structured illumination patterns for both ODT and SIM, which enables fast and stable measurements. To verify its feasibility and applicability, the proposed method is used to measure the 3D RI distribution and 3D fluorescence image of various samples, including a cluster of fluorescent beads, and the time-lapse 3D RI dynamics of fluorescent beads inside a HeLa cell, from which the trajectory of the beads in the HeLa cell is analyzed using spatiotemporal correlations.

  13. HWDA: A coherence recognition and resolution algorithm for hybrid web data aggregation

    NASA Astrophysics Data System (ADS)

    Guo, Shuhang; Wang, Jian; Wang, Tong

    2017-09-01

    Aiming at the object-conflict recognition and resolution problem in hybrid distributed data stream aggregation, a distributed data stream object coherence solution is proposed. First, a framework, named HWDA, was defined for object coherence conflict recognition and resolution. Second, an object coherence recognition technique was proposed based on formal-language description logic and the hierarchical dependency relationships between logic rules. Third, a conflict traversal recognition algorithm was proposed based on the defined dependency graph. Next, a conflict resolution technique was proposed based on resolution pattern matching, including definitions of the three types of conflict, the conflict-resolution matching patterns, and the arbitration resolution method. Finally, experiments on two kinds of web test data sets validate the effectiveness of applying the conflict recognition and resolution technology of HWDA.

  14. Research on illumination uniformity of high-power LED array light source

    NASA Astrophysics Data System (ADS)

    Yu, Xiaolong; Wei, Xueye; Zhang, Ou; Zhang, Xinwei

    2018-06-01

    Uniform illumination is one of the most important problems that must be solved in the application of high-power LED arrays. A numerical optimization algorithm is applied to obtain the LED array layout for which the light intensity on the target surface is most evenly distributed. An evaluation function is set up from the standard deviation of the illuminance function, and the particle swarm optimization algorithm is then utilized to optimize different arrays. Furthermore, the light intensity distribution is obtained by the optical ray-tracing method. Finally, a hybrid array is designed and the optical ray-tracing method is applied to simulate it. The simulation results, which are consistent with the traditional theoretical calculation, show that the algorithm introduced in this paper is reasonable and effective.
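
    The evaluation-function-plus-PSO loop described above can be sketched as follows. The Lambertian source model, the 2x2 square array, and all parameter values are illustrative assumptions, not the paper's configuration; the cost is the relative standard deviation of illuminance over a grid on the target plane.

```python
import random

M = 30.0   # Lambertian mode number (assumed; sets the beam width)
H = 0.1    # LED plane to target plane distance in meters (assumed)

def illuminance(x, y, leds):
    """Illuminance at target point (x, y) from Lambertian LEDs at height H:
    E = H^(M+1) / r^(M+3) per unit intensity (constant factors dropped)."""
    e = 0.0
    for (lx, ly) in leds:
        r2 = (x - lx) ** 2 + (y - ly) ** 2 + H ** 2
        e += H ** (M + 1) / r2 ** ((M + 3) / 2.0)
    return e

def uniformity_cost(half_spacing):
    """Evaluation function: relative standard deviation of illuminance
    over a grid on the target plane for a 2x2 LED square array."""
    d = half_spacing
    leds = [(-d, -d), (-d, d), (d, -d), (d, d)]
    pts = [illuminance(x / 100.0, y / 100.0, leds)
           for x in range(-10, 11, 2) for y in range(-10, 11, 2)]
    mean = sum(pts) / len(pts)
    var = sum((p - mean) ** 2 for p in pts) / len(pts)
    return var ** 0.5 / mean

def pso(cost, lo, hi, n=15, iters=40, seed=1):
    """Minimal particle swarm optimization over a single parameter."""
    rng = random.Random(seed)
    xs = [rng.uniform(lo, hi) for _ in range(n)]
    vs = [0.0] * n
    pbest = xs[:]                       # personal best positions
    pcost = [cost(x) for x in xs]
    g = pbest[pcost.index(min(pcost))]  # global best position
    for _ in range(iters):
        for i in range(n):
            vs[i] = (0.7 * vs[i]
                     + 1.5 * rng.random() * (pbest[i] - xs[i])
                     + 1.5 * rng.random() * (g - xs[i]))
            xs[i] = min(hi, max(lo, xs[i] + vs[i]))
            c = cost(xs[i])
            if c < pcost[i]:
                pbest[i], pcost[i] = xs[i], c
                if c < cost(g):
                    g = xs[i]
    return g
```

    Running `pso(uniformity_cost, 0.01, 0.15)` returns a half-spacing that trades off the center hotspot of a tight cluster against the dark middle of a wide one; a ray-tracing check of the optimized layout would follow in the paper's workflow.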

  15. Assessing corporate restructurings in the electric utility industry: A framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Malko, J.R.

    1996-12-31

    Corporate restructurings of electric utilities in the United States have become an important and controversial issue during the 1980s. Regulators and electric utility executives have different perspectives concerning corporate restructurings associated with diversification, mergers, and functional separation of generation, transmission, and distribution. Regulators attempt to regulate electric utilities effectively in order to assure that adequate electricity services are provided at reasonable cost and to protect the public interest, which includes considering choices and risks to customers. Regulators are considering and developing new regulatory approaches in order to address corporate restructurings and balance regulation and competitive pressures. Electric utility executives typically view corporate restructurings as a potential partial solution to financial challenges and problems and are analyzing corporate restructuring activities within the framework of the corporate strategic planning process. Executives attempt to find new sources of economic value and consider risks and potential returns to investors in an increasingly competitive environment. The parent holding company is generally used as the basic corporate form for restructuring activities in the electric utility industry. However, the wholly-owned utility subsidiary structure remains in use for some restructurings. The primary purpose of this paper is to propose a framework to assess corporate restructurings in the electric utility industry from a public policy perspective. This paper is organized in the following manner. First, different types of corporate restructurings in the electric utility industry are examined. Second, reasons for corporate restructuring activities are presented. Third, a framework for assessing corporate restructuring activities is proposed. Fourth, the application of the framework is discussed.

  16. Alternative Fuels Data Center

    Science.gov Websites

    Utility Company Electric Vehicle (EV) Charging Load Projection Requirement The Public Utilities Regulatory Authority requires electric distribution companies to integrate EV charging load projections into the company's distribution planning. (Reference Connecticut

  17. Evaluation of COTS Rad Detection Apps

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wagner, Eric

    2014-02-01

    Mobile applications are currently being distributed to smart phones that utilize the built-in charge-coupled device (CCD) camera as a radiation detector. The CCD detector has a very low but measurable gamma interaction cross section, so the mechanism is feasible, especially for higher dose rate environments. Given that in a large release of radioactive material these ‘crowd sourced’ measurements will be put forth for consideration, testing and evaluating the accuracy and uncertainty of the apps is a critical endeavor. Not only is the accuracy of the reported measurement of concern to the immediate user’s safety, a quantitative uncertainty is required for a government response such as the Federal Radiological Monitoring and Assessment Center (FRMAC) to accept the values for consideration in the determination of regions exceeding protective action guidelines. Already, prompted by the Fukushima nuclear material releases, several repositories of this crowd-sourced data have been created (http://japan.failedrobot.com, http://www.stubbytour.com/nuc/index_en.asp, and http://www.rdtn.org), although the question remains as to the reliability of measurements incorporated into these repositories. In cases of conflict between the real-time published crowd-sourced data and governmental protective actions, prepared literature should be on hand documenting why the difference, if any, exists. Four applications for iOS devices were obtained along with hardware to benchmark their performance. Gamma/X-Ray Detector by Stephan Hotto, Geiger Camera by Senscare, and RadioactivityCounter App by Hotray LTD are all the applications available for distribution within the US that utilize the CCD camera sensor for detection of radiation levels. The CellRad app under development by Idaho National Laboratory for the Android platform was also evaluated. In addition, iRad Geiger with the associated hardware accessory was benchmarked.
Radiation fields were generated in a Cs-137 JL Shepherd Model 89 shielded calibration range. The accuracy of the exposure rate within the box calibrator is +/- 5% of the system settings and NIST traceable. This is the same calibration unit utilized for calibration of US DOE exposure rate meters. Measurements were performed from 0.2 to 40,000 mR/hr. The following sections include discussions of each of the evaluated applications and their performance in reporting radiation field measurements. Unfortunately, the applications do not provide a readily identifiable measure of measurement quality from which to produce error bars or even out-of-range conditions. In general, all the CCD-based applications had trouble reporting consistent measurements in radiation fields of less than 5 mR/hr. This is most likely attributable to the electronic noise level on the CCDs becoming comparable to the signal level due to the ionizing radiation photons.

  18. Monitoring and control requirement definition study for Dispersed Storage and Generation (DSG). Volume 4, appendix C: Identification from utility visits of present and future approaches to integration of DSG into distribution networks

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Visits to four utilities concerned with the use of DSG power sources on their distribution networks yielded useful impressions of present and future approaches to the integration of DSGs into electrical distribution networks. Different approaches to future utility systems with DSG are beginning to take shape. The new DSG sources will be in decentralized locations with some measure of centralized control. The utilities have yet to firmly establish the communication and control means or their organization. For the present, the means for integrating the DSGs and their associated monitoring and control equipment into a unified system have not been decided.

  19. Fabrication of PDMS-Based Microfluidic Devices: Application for Synthesis of Magnetic Nanoparticles

    NASA Astrophysics Data System (ADS)

    Thu, Vu Thi; Mai, An Ngoc; Le The Tam; Van Trung, Hoang; Thu, Phung Thi; Tien, Bui Quang; Thuat, Nguyen Tran; Lam, Tran Dai

    2016-05-01

    In this work, we have developed a convenient approach to synthesize magnetic nanoparticles with relatively high magnetization and controllable sizes. This was realized by combining the traditional co-precipitation method with microfluidic techniques inside microfluidic devices. The device was first designed and then fabricated using simplified soft-lithography techniques, and was then utilized to synthesize magnetite nanoparticles. The synthesized nanomaterials were thoroughly characterized using field emission scanning electron microscopy and a vibrating sample magnetometer. The results demonstrated that the as-prepared device can be utilized as a simple and effective tool to synthesize magnetic nanoparticles with sizes of less than 10 nm and magnetization of more than 50 emu/g. The development of these devices opens new strategies for synthesizing nanomaterials with more precise dimensions, narrow size distributions, and controllable behaviors.

  20. GPS-based tracking system for TOPEX orbit determination

    NASA Technical Reports Server (NTRS)

    Melbourne, W. G.

    1984-01-01

    A tracking system concept is discussed that is based on the utilization of the constellation of Navstar satellites in the Global Positioning System (GPS). The concept involves simultaneous and continuous metric tracking of the signals from all visible Navstar satellites by approximately six globally distributed ground terminals and by the TOPEX spacecraft at 1300-km altitude. Error studies indicate that this system could be capable of obtaining decimeter position accuracies and, most importantly, around 5 cm in the radial component, which is key to exploiting the full accuracy potential of the altimetric measurements for ocean topography. Topics covered include: background of the GPS, the precision mode for utilization of the system, past JPL research for using the GPS in precision applications, the present tracking system concept for high accuracy satellite positioning, and results from a proof-of-concept demonstration.

  1. Advanced Image Processing for NASA Applications

    NASA Technical Reports Server (NTRS)

    LeMoign, Jacqueline

    2007-01-01

    The future of space exploration will involve cooperating fleets of spacecraft or sensor webs geared towards coordinated and optimal observation of Earth Science phenomena. The main advantage of such systems is to utilize multiple viewing angles as well as multiple spatial and spectral resolutions of sensors carried on multiple spacecraft but acting collaboratively as a single system. Within this framework, our research focuses on all areas related to sensing in collaborative environments, which means systems utilizing intracommunicating spatially distributed sensor pods or crafts being deployed to monitor or explore different environments. This talk will describe the general concept of sensing in collaborative environments, will give a brief overview of several technologies developed at NASA Goddard Space Flight Center in this area, and then will concentrate on specific image processing research related to that domain, specifically image registration and image fusion.

  2. High-Performance Monitoring Architecture for Large-Scale Distributed Systems Using Event Filtering

    NASA Technical Reports Server (NTRS)

    Maly, K.

    1998-01-01

    Monitoring is an essential process for observing and improving the reliability and performance of large-scale distributed (LSD) systems. In an LSD environment, a large number of events is generated by system components during their execution or interaction with external objects (e.g. users or processes). Monitoring such events is necessary for observing the run-time behavior of LSD systems and providing the status information required for debugging, tuning and managing such applications. However, correlated events are generated concurrently and can be distributed across various locations in the application environment, which complicates the management decision process and thereby makes monitoring LSD systems an intricate task. We propose a scalable high-performance monitoring architecture for LSD systems to detect and classify interesting local and global events and disseminate the monitoring information to the corresponding endpoint management applications, such as debugging and reactive control tools, to improve application performance and reliability. A large volume of events may be generated due to the extensive demands of the monitoring applications and the high interaction of LSD systems. The monitoring architecture employs a high-performance event filtering mechanism to efficiently process the large volume of event traffic generated by LSD systems and to minimize the intrusiveness of the monitoring process by reducing the event traffic flow in the system and distributing the monitoring computation. Our architecture also supports dynamic and flexible reconfiguration of the monitoring mechanism via its instrumentation and subscription components. As a case study, we show how our monitoring architecture can be utilized to improve the reliability and performance of the Interactive Remote Instruction (IRI) system, which is a large-scale distributed system for collaborative distance learning.
The filtering mechanism is an intrinsic component integrated with the monitoring architecture to reduce the volume of event traffic flow in the system and thereby reduce the intrusiveness of the monitoring process. We are developing an event filtering architecture to efficiently process the large volume of event traffic generated by LSD systems (such as distributed interactive applications). This filtering architecture is used to monitor a collaborative distance learning application to obtain debugging and feedback information. Our architecture supports the dynamic (re)configuration and optimization of event filters in large-scale distributed systems. Our work makes a major contribution by (1) surveying and evaluating existing event filtering mechanisms for monitoring LSD systems and (2) devising an integrated, scalable, high-performance event filtering architecture that spans several key application domains, presenting techniques to improve functionality, performance and scalability. This paper describes the primary characteristics and challenges of developing high-performance event filtering for monitoring LSD systems. We survey existing event filtering mechanisms and explain the key characteristics of each technique. In addition, we discuss limitations of existing event filtering mechanisms and outline how our architecture will improve key aspects of event filtering.
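
    The subscription-based filtering idea can be illustrated with a minimal sketch: monitoring endpoints register predicates, and only matching events are forwarded, so the suppressed traffic never reaches the management applications. The event schema and subscriber names below are hypothetical.

```python
from collections import defaultdict

class EventFilter:
    """Minimal sketch of subscription-based event filtering: endpoints
    register predicates, and only matching events are forwarded,
    reducing event traffic and hence monitoring intrusiveness."""
    def __init__(self):
        self.subs = []                 # (name, predicate, sink) triples
        self.stats = defaultdict(int)  # forwarded vs. suppressed counts

    def subscribe(self, name, predicate, sink):
        self.subs.append((name, predicate, sink))

    def publish(self, event):
        forwarded = 0
        for name, pred, sink in self.subs:
            if pred(event):
                sink(event)
                forwarded += 1
        self.stats["forwarded" if forwarded else "suppressed"] += 1

debug_log = []
f = EventFilter()
# A debugging tool that only cares about error events from node 7
# (hypothetical event schema).
f.subscribe("debugger",
            lambda e: e["severity"] == "error" and e["node"] == 7,
            debug_log.append)
for ev in [{"severity": "info", "node": 7},
           {"severity": "error", "node": 7},
           {"severity": "error", "node": 3}]:
    f.publish(ev)
```

    A real architecture would distribute the filters and support (re)configuring predicates at run time; the ratio of suppressed to forwarded events in `stats` is a crude proxy for the traffic reduction the abstract describes.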

  3. Utility photovoltaic group: Status report

    NASA Astrophysics Data System (ADS)

    Serfass, Jeffrey A.; Hester, Stephen L.; Wills, Bethany N.

    1996-01-01

    The Utility PhotoVoltaic Group (UPVG) was formed in October of 1992 with a mission to accelerate the use of cost-effective small-scale and emerging grid-connected applications of photovoltaics for the benefit of electric utilities and their customers. The UPVG is now implementing a program to install up to 50 megawatts of photovoltaics in small-scale and grid-connected applications. This program, called TEAM-UP, is a partnership of the U.S. electric utility industry and the U.S. Department of Energy to help develop utility PV markets. TEAM-UP is a utility-directed program to significantly increase utility PV experience by promoting installations of utility PV systems. Two primary program areas are proposed for TEAM-UP: (1) Small-Scale Applications (SSA)—an initiative to aggregate utility purchases of small-scale, grid-independent applications; and (2) Grid-Connected Applications (GCA)—an initiative to identify and competitively award cost-sharing contracts for grid-connected PV systems with high market growth potential, or collective purchase programs involving multiple buyers. This paper describes these programs and outlines the schedule, the procurement status, and the results of the TEAM-UP process.

  4. Hierarchical Data Distribution Scheme for Peer-to-Peer Networks

    NASA Astrophysics Data System (ADS)

    Bhushan, Shashi; Dave, M.; Patel, R. B.

    2010-11-01

    In the past few years, peer-to-peer (P2P) networks have become an extremely popular mechanism for large-scale content sharing. P2P systems have focused on specific application domains (e.g. music files, video files) or on providing file-system-like capabilities. P2P is a powerful paradigm, which provides a large-scale and cost-effective mechanism for data sharing, and a P2P system may be used for storing data globally. Can a conventional database be implemented on a P2P system? Successful implementations of conventional databases on P2P systems are yet to be reported. In this paper we present a mathematical model for the replication of partitions and a hierarchical data distribution scheme for P2P networks. We also analyze the resource utilization and throughput of a P2P system with respect to availability when a conventional database is implemented over the P2P system with a variable query rate. Simulation results show that database partitions placed on peers with a higher availability factor perform better. Degradation index, throughput, and resource utilization are evaluated with respect to the availability factor.
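
    The availability argument can be made concrete: a partition is reachable unless every peer holding a replica is down, and, in the worst case where a query touches every partition, the query succeeds only if all partitions are reachable. The functions below are an illustrative sketch of that reasoning, not the paper's model:

```python
def partition_availability(peer_avail):
    """Availability of one partition replicated on peers with the given
    individual availability factors: the partition is unreachable only
    if every replica-holding peer is down simultaneously."""
    unavail = 1.0
    for p in peer_avail:
        unavail *= (1.0 - p)
    return 1.0 - unavail

def database_availability(placement):
    """Worst-case query availability, assuming each query touches all
    partitions: every partition must be reachable. `placement` is a list
    of per-partition lists of peer availability factors."""
    a = 1.0
    for peers in placement:
        a *= partition_availability(peers)
    return a
```

    This is why placing partitions on peers with a higher availability factor performs better: two replicas on 0.9-available peers give a 0.99-available partition, while two replicas on 0.5-available peers give only 0.75.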

  5. The SIMRAND methodology: Theory and application for the simulation of research and development projects

    NASA Technical Reports Server (NTRS)

    Miles, R. F., Jr.

    1986-01-01

    A research and development (R&D) project often involves a number of decisions that must be made concerning which subset of systems or tasks is to be undertaken to achieve the goal of the R&D project. To help in this decision making, SIMRAND (SIMulation of Research ANd Development Projects) is a methodology for selecting the optimal subset of systems or tasks to be undertaken on an R&D project. Using alternative networks, the SIMRAND methodology models the alternative subsets of systems or tasks under consideration. Each path through an alternative network represents one way of satisfying the project goals. Equations are developed that relate the system or task variables to the measure of preference. Uncertainty is incorporated by treating the variables of the equations probabilistically as random variables, with cumulative distribution functions assessed by technical experts. Analytical techniques of probability theory are used to reduce the complexity of the alternative networks. Cardinal utility functions over the measure of preference are assessed for the decision makers. A run of the SIMRAND I Computer Program combines, in a Monte Carlo simulation model, the network structure, the equations, the cumulative distribution functions, and the utility functions.
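
    The Monte Carlo core of the methodology can be sketched in a few lines. The two-path network, the triangular task distributions, and the exponential utility function below are illustrative assumptions standing in for the expert-assessed CDFs and elicited utility functions the abstract describes:

```python
import math
import random

rng = random.Random(42)

# Hypothetical alternative network: each path is a list of tasks, and each
# task's contribution to the measure of preference is a random variable,
# here given as (low, mode, high) for a triangular distribution.
PATHS = {
    "path_A": [(2.0, 4.0, 9.0), (1.0, 2.0, 3.0)],
    "path_B": [(1.0, 5.0, 6.0)],
}

def utility(x):
    """Risk-averse cardinal utility over the measure of preference."""
    return 1.0 - math.exp(-x / 5.0)

def expected_utility(tasks, trials=20000):
    """Monte Carlo: sample each task variable, sum along the path, and
    average the decision maker's utility of the outcome."""
    total = 0.0
    for _ in range(trials):
        outcome = sum(rng.triangular(lo, hi, mode) for lo, mode, hi in tasks)
        total += utility(outcome)
    return total / trials

# Rank the alternative paths by expected utility, best first.
ranked = sorted(PATHS, key=lambda p: expected_utility(PATHS[p]), reverse=True)
```

    The analytical network-reduction step of SIMRAND would shrink `PATHS` before simulation; the ranking by expected utility is the decision output.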

  6. Mars Entry Atmospheric Data System Trajectory Reconstruction Algorithms and Flight Results

    NASA Technical Reports Server (NTRS)

    Karlgaard, Christopher D.; Kutty, Prasad; Schoenenberger, Mark; Shidner, Jeremy; Munk, Michelle

    2013-01-01

    The Mars Entry Atmospheric Data System is a part of the Mars Science Laboratory, Entry, Descent, and Landing Instrumentation project. These sensors are a system of seven pressure transducers linked to ports on the entry vehicle forebody to record the pressure distribution during atmospheric entry. These measured surface pressures are used to generate estimates of atmospheric quantities based on modeled surface pressure distributions. Specifically, angle of attack, angle of sideslip, dynamic pressure, Mach number, and freestream atmospheric properties are reconstructed from the measured pressures. Such data allows for the aerodynamics to become decoupled from the assumed atmospheric properties, allowing for enhanced trajectory reconstruction and performance analysis as well as an aerodynamic reconstruction, which has not been possible in past Mars entry reconstructions. This paper provides details of the data processing algorithms that are utilized for this purpose. The data processing algorithms include two approaches that have commonly been utilized in past planetary entry trajectory reconstruction, and a new approach for this application that makes use of the pressure measurements. The paper describes assessments of data quality and preprocessing, and results of the flight data reduction from atmospheric entry, which occurred on August 5th, 2012.

  7. Carboxylesterase inhibitors

    PubMed Central

    Hatfield, M. Jason; Potter, Philip M.

    2011-01-01

    Introduction Carboxylesterases play major roles in the hydrolysis of numerous therapeutically active compounds. This is, in part, due to the prevalence of the ester moiety in these small molecules. However, the impact these enzymes may play on drug stability and pharmacokinetics is rarely considered prior to molecule development. Therefore, the application of selective inhibitors of this class of proteins may have utility in modulating the metabolism, distribution and toxicity of agents that are subjected to enzyme hydrolysis. Areas covered This review details the development of all such compounds dating back to 1986, but principally focuses on the very recent identification of selective human carboxylesterase inhibitors. Expert opinion The implementation of carboxylesterase inhibitors may significantly revolutionize drug discovery. Such molecules may allow for improved efficacy of compounds inactivated by this class of enzymes and/or reduce the toxicity of agents that are activated by these proteins. Furthermore, since lack of carboxylesterase activity appears to have no obvious biological consequence, these compounds could be applied in combination with virtually any esterified drug. Therefore, inhibitors of these proteins may have utility in altering drug hydrolysis and distribution in vivo. The characteristics, chemical and biological properties, and potential uses of such agents are discussed here. PMID:21609191

  8. Absolute nuclear material assay using count distribution (LAMBDA) space

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prasad, Mano K.; Snyderman, Neal J.; Rowland, Mark S.

    A method of absolute nuclear material assay of an unknown source comprising counting neutrons from the unknown source and providing an absolute nuclear material assay utilizing a model to optimally compare to the measured count distributions. In one embodiment, the step of providing an absolute nuclear material assay comprises utilizing a random sampling of analytically computed fission chain distributions to generate a continuous time-evolving sequence of event-counts by spreading the fission chain distribution in time.
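
    The "spread the fission chain distribution in time" step can be illustrated with a toy sketch. The chain multiplicity table, the Poisson chain-initiation model, and the exponential die-away delay below are invented for illustration; they stand in for the analytically computed fission chain distributions of the actual method.

```python
import random

rng = random.Random(0)

# Hypothetical distribution of detected counts per fission chain; a real
# assay computes this analytically from the source multiplication.
CHAIN_PMF = [(1, 0.6), (2, 0.25), (3, 0.1), (4, 0.05)]

def sample_chain():
    """Randomly sample a chain size from the count-per-chain distribution."""
    u = rng.random()
    for n, p in CHAIN_PMF:
        if u < p:
            return n
        u -= p
    return CHAIN_PMF[-1][0]

def simulate_counts(rate, die_away, duration):
    """Spread sampled fission chains in time: chains start at Poisson-
    distributed instants, and each neutron in a chain is detected after
    an exponential die-away delay, giving a time-tagged count sequence."""
    times, t = [], 0.0
    while True:
        t += rng.expovariate(rate)  # next chain initiation
        if t > duration:
            break
        times.extend(t + rng.expovariate(1.0 / die_away)
                     for _ in range(sample_chain()))
    return sorted(times)

def count_distribution(times, gate, duration):
    """Histogram of counts per gate width: the measured count distribution
    that the assay model is optimally compared against."""
    bins = [0] * int(duration / gate)
    for t in times:
        i = int(t / gate)
        if i < len(bins):
            bins[i] += 1
    hist = {}
    for c in bins:
        hist[c] = hist.get(c, 0) + 1
    return hist
```

    Correlated chains make the resulting count distribution wider than Poisson; comparing a modeled distribution against the measured one is what yields the absolute assay.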

  9. Absolute nuclear material assay using count distribution (LAMBDA) space

    DOEpatents

    Prasad, Manoj K [Pleasanton, CA; Snyderman, Neal J [Berkeley, CA; Rowland, Mark S [Alamo, CA

    2012-06-05

    A method of absolute nuclear material assay of an unknown source comprising counting neutrons from the unknown source and providing an absolute nuclear material assay utilizing a model to optimally compare to the measured count distributions. In one embodiment, the step of providing an absolute nuclear material assay comprises utilizing a random sampling of analytically computed fission chain distributions to generate a continuous time-evolving sequence of event-counts by spreading the fission chain distribution in time.

  10. A Multi Agent-Based Framework for Simulating Household PHEV Distribution and Electric Distribution Network Impact

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cui, Xiaohui; Liu, Cheng; Kim, Hoe Kyoung

    2011-01-01

    The variation of household attributes such as income, travel distance, age, household members, and education across different residential areas may generate different market penetration rates for the plug-in hybrid electric vehicle (PHEV). Residential areas with higher PHEV ownership could increase peak electric demand locally and require utilities to upgrade the electric distribution infrastructure even though the capacity of the regional power grid is under-utilized. Estimating the future PHEV ownership distribution at the residential household level can help us understand the impact of the PHEV fleet on power line congestion, transformer overload and other unforeseen problems at the local residential distribution network level. It can also help utilities manage the timing of recharging demand to maximize load factors and utilization of existing distribution resources. This paper presents a multi agent-based simulation framework for 1) modeling the spatial distribution of PHEV ownership at the local residential household level, 2) discovering PHEV hot zones where PHEV ownership may quickly increase in the near future, and 3) estimating the impacts of increasing PHEV ownership on the local electric distribution network with different charging strategies. In this paper, we use Knox County, TN as a case study to show the simulation results of the agent-based model (ABM) framework. However, the framework can be easily applied to other local areas in the US.

  11. Increasing the realism of projected tree species ranges by incorporating migration potential: an eastern US case study

    NASA Astrophysics Data System (ADS)

    Rogers, B. M.; Jantz, P.; Goetz, S. J.

    2015-12-01

Models of vegetation distributions are used for a wide variety of purposes, from global assessments of biome shifts and biogeochemical feedbacks to local management planning. Dynamic vegetation models, mostly mechanistic in origin, are valuable for regional to global studies but remain limited for more local-scale applications, especially those that require species-specific responses to climate change. Species distribution models (SDMs) are broadly used for such applications, but these too have several outstanding limitations, one of the most prominent being a lack of dispersal and migration. Several hybrid models have recently been developed, but these generally require detailed parameterization of species-level attributes that may not be known. Here we present an approach to couple migration potential with SDM output for a large number of species in order to more realistically project future range shifts. We focus on 40 tree species in the eastern US of potential management concern, either because of their canopy dominance, ecosystem functions, or potential for utilizing future climates. Future climates were taken from a CMIP5 model ensemble average using RCP 4.5 and 8.5 scenarios. We used Random Forests to characterize current and future environmental suitability, and modeled migration as a negative exponential kernel that is affected by forest fragmentation and the density of current seed sources. We present results in a vulnerability framework relevant for a number of ongoing management activities in the region. We find an overarching pattern of northward and eastward range shifts, with high-elevation and northern species being the most adversely impacted. Because of limitations to migration, many newly suitable areas could not be utilized without active intervention. Only a few areas exhibited consistently favorable conditions that could be utilized by the relevant species, including the central Appalachian foothills and the Florida panhandle. We suggest that a continued effort to include migration potential into vegetation models can lead to more realistic results and management-relevant products.
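The coupling of SDM suitability output with a negative exponential dispersal kernel might look like the following 1-D sketch; the kernel scale and establishment threshold are assumptions for illustration, not values from the study.

```python
import math

# Illustrative 1-D coupling of a suitability map with a negative
# exponential dispersal kernel; alpha (mean dispersal distance, in
# cells) and the 0.5 establishment threshold are made-up parameters.
def colonization(occupied, suitable, alpha=2.0):
    """Return suitable, unoccupied cells with enough seed pressure."""
    n = len(suitable)
    newly = []
    for i in range(n):
        if not suitable[i] or occupied[i]:
            continue
        # Seed pressure: sum of exp(-d/alpha) over occupied source cells.
        pressure = sum(math.exp(-abs(i - j) / alpha)
                       for j in range(n) if occupied[j])
        if pressure > 0.5:
            newly.append(i)
    return newly

occupied = [True, False, False, False, False, False]
suitable = [True, True, True, False, True, True]
print(colonization(occupied, suitable))   # only cells near seed sources
```

Distant suitable cells get negligible seed pressure, which is exactly the mechanism behind the paper's finding that many newly suitable areas go unused without active intervention.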

  12. CloVR: A virtual machine for automated and portable sequence analysis from the desktop using cloud computing

    PubMed Central

    2011-01-01

Background Next-generation sequencing technologies have decentralized sequence acquisition, increasing the demand for new bioinformatics tools that are easy to use, portable across multiple platforms, and scalable for high-throughput applications. Cloud computing platforms provide on-demand access to computing infrastructure over the Internet and can be used in combination with custom-built virtual machines to distribute pre-packaged, pre-configured software. Results We describe the Cloud Virtual Resource, CloVR, a new desktop application for push-button automated sequence analysis that can utilize cloud computing resources. CloVR is implemented as a single portable virtual machine (VM) that provides several automated analysis pipelines for microbial genomics, including 16S, whole-genome and metagenome sequence analysis. The CloVR VM runs on a personal computer, utilizes local computer resources and requires minimal installation, addressing key challenges in deploying bioinformatics workflows. In addition, CloVR supports use of remote cloud computing resources to improve performance for large-scale sequence processing. In a case study, we demonstrate the use of CloVR to automatically process next-generation sequencing data on multiple cloud computing platforms. Conclusion The CloVR VM and associated architecture lower the barrier to entry for utilizing complex analysis protocols on both local single- and multi-core computers and cloud systems for high-throughput data processing. PMID:21878105

  13. Hydrological Modeling in the Bull Run Watershed in Support of a Piloting Utility Modeling Applications (PUMA) Project

    NASA Astrophysics Data System (ADS)

    Nijssen, B.; Chiao, T. H.; Lettenmaier, D. P.; Vano, J. A.

    2016-12-01

Hydrologic models with varying complexities and structures are commonly used to evaluate the impact of climate change on future hydrology. While the uncertainties in future climate projections are well documented, uncertainties in streamflow projections associated with hydrologic model structure and parameter estimation have received less attention. In this study, we implemented and calibrated three hydrologic models (the Distributed Hydrology Soil Vegetation Model (DHSVM), the Precipitation-Runoff Modeling System (PRMS), and the Variable Infiltration Capacity model (VIC)) for the Bull Run watershed in northern Oregon using consistent data sources and best-practice calibration protocols. The project was part of a Piloting Utility Modeling Applications (PUMA) project with the Portland Water Bureau (PWB) under the umbrella of the Water Utility Climate Alliance (WUCA). Ultimately, PWB will use the model evaluation to select a model for in-house climate change analysis of the Bull Run watershed. This presentation focuses on the experimental design of the comparison project, the project findings, and the collaboration between the teams at the University of Washington and PWB. After calibration, the three models showed similar capability to reproduce seasonal and inter-annual variations in streamflow, but differed in their ability to capture extreme events. Furthermore, the annual and seasonal hydrologic sensitivities to changes in climate forcings differed among models, potentially attributable to different model representations of snow and vegetation processes.
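One common way such streamflow calibrations are scored is the Nash-Sutcliffe efficiency (NSE); the abstract does not name its metrics, so this is a generic illustration with invented flow data.

```python
# Nash-Sutcliffe efficiency: 1.0 is a perfect fit, 0.0 means the model
# is no better than predicting the observed mean. Flows are invented.
def nse(observed, simulated):
    mean_obs = sum(observed) / len(observed)
    err = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    var = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - err / var

obs = [10.0, 12.0, 30.0, 25.0, 14.0]   # observed daily flows, m^3/s
sim = [11.0, 11.5, 28.0, 26.0, 15.0]   # one model's simulated flows
print(round(nse(obs, sim), 3))
```

Because the squared-error term is dominated by high flows, two models can score nearly identical NSE values yet still differ on extreme events, which matches the comparison's finding.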

  14. Net current control device. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fugate, D.; Cooper, J.H.

    1998-11-01

Net currents generally result in elevated magnetic fields because the alternate paths are distant from the circuit conductors. Investigations have shown that one of the primary sources of power frequency magnetic fields in residential buildings is currents that return to their source via paths other than the neutral conductors. As part of EPRI's Magnetic Field Shielding Project, ferromagnetic devices, called net current control (NCC) devices, were developed and tested for use in reducing net currents on electric power cables and the resulting magnetic fields. Applied to a residential service drop, an NCC device reduces net current by forcing current off local non-utility ground paths and back onto the neutral conductor. Circuit models and basic design equations for the NCC concept were developed, and proof-of-principle tests were carried out on an actual residence with cooperation from the local utility. After proving the basic concepts, three prototype NCC devices were built and tested on a simulated neighborhood power system. Additional prototypes were built for testing by interested EPRI utility members. Results have shown that the NCC prototypes installed on residential service drops reduce net currents to milliampere levels without compromising the safety of the ground system. Although the focus was on application to residential service cables, the NCC concept is applicable to single-phase and three-phase distribution systems as well.

  15. Energetics and Application of Heterotrophy in Acetogenic Bacteria.

    PubMed

    Schuchmann, Kai; Müller, Volker

    2016-07-15

Acetogenic bacteria are a diverse group of strictly anaerobic bacteria that utilize the Wood-Ljungdahl pathway for CO2 fixation and energy conservation. These microorganisms play an important part in the global carbon cycle and are a key component of the anaerobic food web. Their most prominent metabolic feature is autotrophic growth with molecular hydrogen and carbon dioxide as the substrates. However, most members also show an outstanding metabolic flexibility for utilizing a vast variety of different substrates. In contrast to autotrophic growth, which is hardly competitive, metabolic flexibility is seen as a key ability of acetogens to compete in ecosystems and might explain the almost-ubiquitous distribution of acetogenic bacteria in anoxic environments. This review covers the latest findings with respect to the heterotrophic metabolism of acetogenic bacteria, including utilization of carbohydrates, lactate, and different alcohols, especially in the model acetogen Acetobacterium woodii. Modularity of metabolism, a key concept of pathway design in synthetic biology, together with electron bifurcation, to overcome energetic barriers, appears to be the basis for the amazing substrate spectrum. At the same time, acetogens depend on only a relatively small number of enzymes to expand the substrate spectrum. We will discuss the energetic advantages of coupling CO2 reduction to fermentations that exploit otherwise-inaccessible substrates and the ecological advantages, as well as the biotechnological applications of the heterotrophic metabolism of acetogens. Copyright © 2016, American Society for Microbiology. All Rights Reserved.

  16. Energetics and Application of Heterotrophy in Acetogenic Bacteria

    PubMed Central

    Schuchmann, Kai

    2016-01-01

    Acetogenic bacteria are a diverse group of strictly anaerobic bacteria that utilize the Wood-Ljungdahl pathway for CO2 fixation and energy conservation. These microorganisms play an important part in the global carbon cycle and are a key component of the anaerobic food web. Their most prominent metabolic feature is autotrophic growth with molecular hydrogen and carbon dioxide as the substrates. However, most members also show an outstanding metabolic flexibility for utilizing a vast variety of different substrates. In contrast to autotrophic growth, which is hardly competitive, metabolic flexibility is seen as a key ability of acetogens to compete in ecosystems and might explain the almost-ubiquitous distribution of acetogenic bacteria in anoxic environments. This review covers the latest findings with respect to the heterotrophic metabolism of acetogenic bacteria, including utilization of carbohydrates, lactate, and different alcohols, especially in the model acetogen Acetobacterium woodii. Modularity of metabolism, a key concept of pathway design in synthetic biology, together with electron bifurcation, to overcome energetic barriers, appears to be the basis for the amazing substrate spectrum. At the same time, acetogens depend on only a relatively small number of enzymes to expand the substrate spectrum. We will discuss the energetic advantages of coupling CO2 reduction to fermentations that exploit otherwise-inaccessible substrates and the ecological advantages, as well as the biotechnological applications of the heterotrophic metabolism of acetogens. PMID:27208103

  17. Trade-off decisions in distribution utility management

    NASA Astrophysics Data System (ADS)

    Slavickas, Rimas Anthony

As a result of the "unbundling" of traditional monopolistic electricity generation and transmission enterprises into a free-market economy, power distribution utilities are faced with very difficult decisions pertaining to electricity supply options and quality of service to customers. The management of distribution utilities has become so complex, versatile, and dynamic that conventional, non-automated management tools are nearly useless and obsolete. This thesis presents a novel and unified approach to managing electricity supply options and quality of service to customers. The technique formulates the problem in terms of variables, parameters, and constraints. An advanced Mixed Integer Programming (MIP) optimization formulation is developed together with novel, logical decision-making algorithms. These tools enable utility management to optimize various cost components and assess their time-trend impacts, taking into account intangible issues such as customer perception, customer expectation, social pressures, and public response to service deterioration. These concepts are further generalized, and a Logical Proportion Analysis (LPA) methodology and associated software have been developed. Solutions using numbers are replaced with solutions using words (character strings), which more closely emulate the human decision-making process and advance the art of decision-making in the power utility environment. Using practical distribution utility operation data and customer surveys, the developments outlined in this thesis are successfully applied to several important utility management problems. These involve the evaluation of alternative electricity supply options, the impact of rate structures on utility business, and the decision of whether to continue to purchase from a main grid or generate locally (partially or totally) by building Non-Utility Generation (NUG).
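The MIP flavor of the supply-option decision can be illustrated with a toy 0/1 formulation solved by enumeration (a real MIP solver would replace the loop); the options, capacities, and costs below are invented, not data from the thesis.

```python
from itertools import product

# Toy 0/1 supply-option problem: pick the cheapest combination of
# options whose total capacity covers demand. All numbers are invented.
options = {                 # name: (capacity MW, annual cost $k)
    "grid_purchase":  (50, 400),
    "local_NUG":      (30, 280),
    "upgrade_feeder": (20, 150),
}
demand_mw = 60

best = None
for choice in product([0, 1], repeat=len(options)):
    cap = sum(c * options[name][0] for c, name in zip(choice, options))
    cost = sum(c * options[name][1] for c, name in zip(choice, options))
    if cap >= demand_mw and (best is None or cost < best[0]):
        best = (cost, choice)

cost, choice = best
picked = [n for c, n in zip(choice, options) if c]
print(cost, picked)
```

Each binary variable is a build/buy decision; the thesis's formulation additionally weighs intangible service-quality terms, which is where the LPA word-based scoring comes in.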

  18. Energy distributions exhibited during thermal runaway of commercial lithium ion batteries used for human spaceflight applications

    NASA Astrophysics Data System (ADS)

    Yayathi, Sandeep; Walker, William; Doughty, Daniel; Ardebili, Haleh

    2016-10-01

Lithium ion (Li-ion) batteries provide the low-mass, energy-dense solutions necessary for space exploration, but thermal safety concerns impede the utilization of Li-ion technology for human applications. Experimental characterization of thermal runaway energy release with accelerated rate calorimetry supports safer thermal management systems. A 'standard' accelerated rate calorimetry setup provides a means to measure the energy released through the body of a Li-ion cell. This study considers the total energy generated during thermal runaway as a distribution between the cell body and hot gases via inclusion of a unique secondary enclosure inside the calorimeter; this closed system not only contains the cell body and gaseous species, but also captures energy release associated with rapid heat transfer to the system unobserved by measurements taken on the cell body. Experiments include Boston Power Swing 5300, Samsung 18650-26F and MoliCel 18650-J Li-ion cells at varied states-of-charge. An inverse relationship between state-of-charge and onset temperature is observed. Energy contained in the cell body and gaseous species is successfully characterized; gaseous energy is minimal. Significant additional energy is measured with the heating of the secondary enclosure. The improved calorimeter apparatus, including a secondary enclosure, provides an essential capability for measuring total energy release distributions during thermal runaway.
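The energy accounting behind these distributions can be sketched with the sensible-heat relation Q = m·cp·ΔT summed over cell body, vent gas, and enclosure; all masses, heat capacities, and temperature rises below are illustrative, not measurements from the study.

```python
# Back-of-envelope calorimetry bookkeeping: each element's absorbed
# energy is m * cp * dT; the runaway total is the sum over elements.
def sensible_heat_kj(mass_kg, cp_j_per_kg_k, delta_t_k):
    return mass_kg * cp_j_per_kg_k * delta_t_k / 1000.0

components = {                      # mass kg, cp J/(kg*K), dT K
    "cell_body": (0.045, 830.0, 400.0),   # 18650-class cell, invented
    "vent_gas":  (0.002, 1100.0, 350.0),
    "enclosure": (0.500, 500.0, 60.0),
}

energies = {k: sensible_heat_kj(*v) for k, v in components.items()}
total = sum(energies.values())
for name, e in energies.items():
    print(f"{name}: {e:.2f} kJ ({100 * e / total:.0f}%)")
```

The point the study makes falls out of this arithmetic: even a modest temperature rise in a massive enclosure can absorb as much energy as the cell body itself, so cell-body measurements alone undercount the release.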

  19. WATER DISTRIBUTION SYSTEM ANALYSIS: FIELD STUDIES, MODELING AND MANAGEMENT

    EPA Science Inventory

The user's guide entitled “Water Distribution System Analysis: Field Studies, Modeling and Management” is a reference guide for water utilities and an extensive summarization of information designed to provide drinking water utility personnel (and related consultants and research...

  20. Real time testing of intelligent relays for synchronous distributed generation islanding detection

    NASA Astrophysics Data System (ADS)

    Zhuang, Davy

As electric power systems continue to grow to meet ever-increasing energy demand, their security, reliability, and sustainability requirements also become more stringent. The deployment of distributed energy resources (DER), including generation and storage, in conventional passive distribution feeders gives rise to integration problems involving protection and unintentional islanding. Distributed generators need to be disconnected for safety reasons when islanded or isolated from the main feeder, as distributed generator islanding may create hazards to utility and third-party personnel, and possibly damage the distribution system infrastructure, including the distributed generators. This thesis compares several key performance indicators of a newly developed intelligent islanding detection relay against islanding detection devices currently used by the industry. The intelligent relay employs multivariable analysis and data mining methods to arrive at decision trees that contain both the protection handles and the settings. A test methodology is developed to assess the performance of these intelligent relays in a real-time simulation environment using a generic model based on a real-life distribution feeder. The methodology demonstrates the applicability and potential advantages of the intelligent relay by running a large number of tests reflecting a multitude of system operating conditions. The testing indicates that the intelligent relay often outperforms the frequency, voltage and rate-of-change-of-frequency relays currently used for islanding detection, while respecting the islanding detection time constraints imposed by standing distributed generator interconnection guidelines.
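A minimal version of one benchmark device, a rate-of-change-of-frequency (ROCOF) relay, can be sketched as follows; the 0.5 Hz/s pickup and the frequency traces are illustrative, not settings or data from the thesis.

```python
# Toy ROCOF relay: trip when |df/dt| between consecutive frequency
# samples exceeds the pickup. Threshold and traces are invented.
def rocof_trip(freq_hz, dt_s, threshold_hz_per_s=0.5):
    """Return index of first sample where |df/dt| exceeds the pickup."""
    for i in range(1, len(freq_hz)):
        rocof = (freq_hz[i] - freq_hz[i - 1]) / dt_s
        if abs(rocof) > threshold_hz_per_s:
            return i
    return None

# Grid-connected: frequency jitters; islanded: it ramps away quickly.
grid = [60.00, 60.01, 59.99, 60.00, 60.01]
island = [60.00, 59.98, 59.90, 59.75, 59.55]
print(rocof_trip(grid, dt_s=0.1), rocof_trip(island, dt_s=0.1))
```

The weakness the thesis exploits is visible even here: when local generation closely matches local load, the islanded frequency barely moves, so a single-threshold ROCOF relay fails where a decision-tree relay combining several handles can still detect the island.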

  1. Application of Statistically Derived CPAS Parachute Parameters

    NASA Technical Reports Server (NTRS)

    Romero, Leah M.; Ray, Eric S.

    2013-01-01

The Capsule Parachute Assembly System (CPAS) Analysis Team is responsible for determining parachute inflation parameters and dispersions that are ultimately used in verifying system requirements. A model memo is internally released semi-annually documenting parachute inflation and other key parameters reconstructed from flight test data. Dispersion probability distributions published in previous versions of the model memo were uniform because insufficient data were available for determination of statistically based distributions. Uniform distributions do not accurately represent the expected distributions, since extreme parameter values are just as likely to occur as the nominal value. CPAS has taken incremental steps to move away from uniform distributions. Model Memo version 9 (MMv9) made the first use of non-uniform dispersions, but only for the reefing cutter timing, for which a large number of samples was available. In order to maximize the utility of the available flight test data, clusters of parachutes were reconstructed individually starting with Model Memo version 10. This allowed for statistical assessment of steady-state drag area (CDS) and parachute inflation parameters such as the canopy fill distance (n), profile shape exponent (expopen), over-inflation factor (C(sub k)), and ramp-down time (t(sub k)) distributions. Built-in MATLAB distributions were applied to the histograms, and parameters such as scale (sigma) and location (mu) were output. Engineering judgment was used to determine the "best fit" distribution based on the test data. Results include normal, lognormal, and uniform (where available data remain insufficient) fits of nominal and failure (loss of parachute and skipped stage) cases for all CPAS parachutes. This paper discusses the uniform methodology that was previously used, the process and results of the statistical assessment, how the dispersions were incorporated into Monte Carlo analyses, and the application of the distributions in trajectory benchmark testing assessments with parachute inflation parameters, drag area, and reefing cutter timing used by CPAS.
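The shift from uniform dispersions to fitted ones can be illustrated with a lognormal fit, whose maximum-likelihood location and scale are just the mean and standard deviation of the log-samples; the sample values below are invented, not CPAS flight-test data, and the memo's fits used MATLAB rather than Python.

```python
import math
import statistics

# Lognormal MLE: mu and sigma are the mean and (population) standard
# deviation of the log-transformed samples. Data values are invented.
def fit_lognormal(samples):
    logs = [math.log(x) for x in samples]
    mu = statistics.fmean(logs)
    sigma = statistics.pstdev(logs)
    return mu, sigma

fill_distance = [6.1, 7.0, 6.5, 8.2, 5.9, 7.4, 6.8]  # e.g. canopy fill distance n
mu, sigma = fit_lognormal(fill_distance)
# Monte Carlo draws then use exp(gauss(mu, sigma)) in place of uniform().
print(round(mu, 3), round(sigma, 3))
```

Drawing from the fitted distribution concentrates Monte Carlo samples near the nominal value while still producing realistic tails, which is exactly the shortcoming of the earlier uniform dispersions.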

  2. Ontology-based content analysis of US patent applications from 2001-2010.

    PubMed

    Weber, Lutz; Böhme, Timo; Irmer, Matthias

    2013-01-01

Ontology-based semantic text analysis methods make it possible to automatically extract knowledge relationships and data from text documents. In this review, we have applied these technologies to the systematic analysis of pharmaceutical patents. Hierarchical concepts from the knowledge domains of chemical compounds, diseases and proteins were used to annotate full-text US patent applications that deal with pharmacological activities of chemical compounds and were filed in the years 2001-2010. Compounds claimed in these applications have been classified into their respective compound classes to review the distribution of scaffold types or general compound classes such as natural products in a time-dependent manner. Similarly, the target proteins and claimed utility of the compounds have been classified and the most relevant were extracted. The method presented allows the discovery of the main areas of innovation as well as emerging fields of patenting activity, providing a broad statistical basis for competitor analysis and decision-making efforts.
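The annotation-and-roll-up step can be sketched with a toy concept hierarchy; the mini-ontology and the naive substring matching below are invented for illustration and are far simpler than a production annotation system.

```python
# Toy ontology: each concept points to its parent class, so matched
# terms can be rolled up for class-level counting. All entries invented.
PARENT = {
    "aspirin": "salicylate", "salicylate": "small molecule",
    "taxol": "natural product", "natural product": "small molecule",
}

def annotate(text):
    """Naive matcher: return ontology terms appearing in the text."""
    return [term for term in PARENT if term in text.lower()]

def roll_up(concept):
    """Walk the hierarchy from a concept up to its root class."""
    chain = [concept]
    while chain[-1] in PARENT:
        chain.append(PARENT[chain[-1]])
    return chain

hits = annotate("Use of aspirin and taxol in combination therapy")
for h in hits:
    print(h, "->", " -> ".join(roll_up(h)[1:]))
```

Counting the rolled-up classes across patents per filing year is what yields the time-dependent scaffold and compound-class distributions the review reports.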

  3. Spatially selective assembly of quantum dot light emitters in an LED using engineered peptides.

    PubMed

    Demir, Hilmi Volkan; Seker, Urartu Ozgur Safak; Zengin, Gulis; Mutlugun, Evren; Sari, Emre; Tamerler, Candan; Sarikaya, Mehmet

    2011-04-26

    Semiconductor nanocrystal quantum dots are utilized in numerous applications in nano- and biotechnology. In device applications, where several different material components are involved, quantum dots typically need to be assembled at explicit locations for enhanced functionality. Conventional approaches cannot meet these requirements where assembly of nanocrystals is usually material-nonspecific, thereby limiting the control of their spatial distribution. Here we demonstrate directed self-assembly of quantum dot emitters at material-specific locations in a color-conversion LED containing several material components including a metal, a dielectric, and a semiconductor. We achieve a spatially selective immobilization of quantum dot emitters by using the unique material selectivity characteristics provided by the engineered solid-binding peptides as smart linkers. Peptide-decorated quantum dots exhibited several orders of magnitude higher photoluminescence compared to the control groups, thus, potentially opening up novel ways to advance these photonic platforms in applications ranging from chemical to biodetection.

  4. The emerging role of cloud computing in molecular modelling.

    PubMed

    Ebejer, Jean-Paul; Fulle, Simone; Morris, Garrett M; Finn, Paul W

    2013-07-01

    There is a growing recognition of the importance of cloud computing for large-scale and data-intensive applications. The distinguishing features of cloud computing and their relationship to other distributed computing paradigms are described, as are the strengths and weaknesses of the approach. We review the use made to date of cloud computing for molecular modelling projects and the availability of front ends for molecular modelling applications. Although the use of cloud computing technologies for molecular modelling is still in its infancy, we demonstrate its potential by presenting several case studies. Rapid growth can be expected as more applications become available and costs continue to fall; cloud computing can make a major contribution not just in terms of the availability of on-demand computing power, but could also spur innovation in the development of novel approaches that utilize that capacity in more effective ways. Copyright © 2013 Elsevier Inc. All rights reserved.

  5. GLobal Integrated Design Environment (GLIDE): A Concurrent Engineering Application

    NASA Technical Reports Server (NTRS)

    McGuire, Melissa L.; Kunkel, Matthew R.; Smith, David A.

    2010-01-01

The GLobal Integrated Design Environment (GLIDE) is a client-server software application purpose-built to mitigate issues associated with real-time data sharing in concurrent engineering environments and to facilitate discipline-to-discipline interaction between multiple engineers and researchers. GLIDE is implemented in multiple programming languages utilizing standardized web protocols to enable secure parameter data sharing between engineers and researchers across the Internet in closed and/or widely distributed working environments. A well-defined HyperText Transfer Protocol (HTTP)-based Application Programming Interface (API) to the GLIDE client/server environment enables users to interact with GLIDE, and with each other, within common and familiar tools. One such common tool, Microsoft Excel (Microsoft Corporation), paired with its add-in API for GLIDE, is discussed in this paper. The top-level examples given demonstrate how this interface improves the efficiency of the design process of a concurrent engineering study while reducing potential errors associated with manually sharing information between study participants.
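GLIDE's actual API is not reproduced here; the following in-memory stand-in only illustrates the publish/read parameter-sharing pattern the paper describes, with hypothetical parameter names and revision numbers in place of real HTTP calls.

```python
# Hypothetical stand-in for a GLIDE-style shared parameter session:
# each discipline writes named parameters, others read the latest value.
class ParameterSession:
    def __init__(self):
        self._store = {}            # name -> (value, revision)

    def put(self, name, value):     # would be an HTTP PUT in GLIDE
        rev = self._store.get(name, (None, 0))[1] + 1
        self._store[name] = (value, rev)
        return rev

    def get(self, name):            # would be an HTTP GET
        return self._store[name][0]

session = ParameterSession()
session.put("propulsion/thrust_N", 4500.0)   # propulsion engineer writes
session.put("propulsion/thrust_N", 4650.0)   # ...and revises
print(session.get("propulsion/thrust_N"))    # trajectory analyst reads
```

Centralizing the parameters this way, rather than emailing spreadsheets, is what removes the manual-transcription errors the paper highlights; revision numbers make stale reads detectable.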

  6. Application of laminar flow control to high-bypass-ratio turbofan engine nacelles

    NASA Technical Reports Server (NTRS)

    Wie, Y. S.; Collier, F. S., Jr.; Wagner, R. D.

    1991-01-01

    Recently, the concept of the application of hybrid laminar flow to modern commercial transport aircraft was successfully flight tested on a Boeing 757 aircraft. In this limited demonstration, in which only part of the upper surface of the swept wing was designed for the attainment of laminar flow, significant local drag reduction was measured. This paper addresses the potential application of this technology to laminarize the external surface of large, modern turbofan engine nacelles which may comprise as much as 5-10 percent of the total wetted area of future commercial transports. A hybrid-laminar-flow-control (HLFC) pressure distribution is specified and the corresponding nacelle geometry is computed utilizing a predictor/corrector design method. Linear stability calculations are conducted to provide predictions of the extent of the laminar boundary layer. Performance studies are presented to determine potential benefits in terms of reduced fuel consumption.

  7. Si nanocrystals-based multilayers for luminescent and photovoltaic device applications

    NASA Astrophysics Data System (ADS)

    Lu, Peng; Li, Dongke; Cao, Yunqing; Xu, Jun; Chen, Kunji

    2018-06-01

    Low dimensional Si materials have attracted much attention because they can be developed in many kinds of new-generation nano-electronic and optoelectronic devices, among which Si nanocrystals-based multilayered material is one of the most promising candidates and has been extensively studied. By using multilayered structures, the size and distribution of nanocrystals as well as the barrier thickness between two adjacent Si nanocrystal layers can be well controlled, which is beneficial to the device applications. This paper presents an overview of the fabrication and device applications of Si nanocrystals, especially in luminescent and photovoltaic devices. We first introduce the fabrication methods of Si nanocrystals-based multilayers. Then, we systematically review the utilization of Si nanocrystals in luminescent and photovoltaic devices. Finally, some expectations for further development of the Si nanocrystals-based photonic and photovoltaic devices are proposed. Project supported by the National Natural Science Foundation of China (Nos. 11774155, 11274155).

  8. Infrared Imaging Tools for Diagnostic Applications in Dermatology.

    PubMed

    Gurjarpadhye, Abhijit Achyut; Parekh, Mansi Bharat; Dubnika, Arita; Rajadas, Jayakumar; Inayathullah, Mohammed

    Infrared (IR) imaging is a collection of non-invasive imaging techniques that utilize the IR domain of the electromagnetic spectrum for tissue assessment. A subset of these techniques construct images using back-reflected light, while other techniques rely on detection of IR radiation emitted by the tissue as a result of its temperature. Modern IR detectors sense thermal emissions and produce a heat map of surface temperature distribution in tissues. Thus, the IR spectrum offers a variety of imaging applications particularly useful in clinical diagnostic area, ranging from high-resolution, depth-resolved visualization of tissue to temperature variation assessment. These techniques have been helpful in the diagnosis of many medical conditions including skin/breast cancer, arthritis, allergy, burns, and others. In this review, we discuss current roles of IR-imaging techniques for diagnostic applications in dermatology with an emphasis on skin cancer, allergies, blisters, burns and wounds.

  9. Forecast model applications of retrieved three dimensional liquid water fields

    NASA Technical Reports Server (NTRS)

    Raymond, William H.; Olson, William S.

    1990-01-01

Forecasts are made for tropical storm Emily using heating rates derived from the SSM/I physical retrievals described in chapters 2 and 3. Average values of the latent heating rates from the convective and stratiform cloud simulations, used in the physical retrieval, are obtained for individual 1.1 km thick vertical layers. Then, the layer-mean latent heating rates are regressed against the slant path-integrated liquid and ice precipitation water contents to determine the best-fit two-parameter regression coefficients for each layer. The regression formulae and retrieved precipitation water contents are utilized to infer the vertical distribution of heating rates for forecast model applications. In the forecast model, diabatic temperature contributions are calculated and used in a diabatic initialization, or in a diabatic initialization combined with a diabatic forcing procedure. Our forecasts show that the time needed to spin up precipitation processes in tropical storm Emily is greatly accelerated through the application of the data.
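The per-layer two-parameter regression can be sketched as H ≈ a·L + b·I solved via 2×2 normal equations, with L and I the slant-path liquid and ice water contents; the numbers below are synthetic, not SSM/I retrievals.

```python
# Two-parameter least squares (no intercept): fit H = a*L + b*I by
# solving the 2x2 normal equations. Data values are synthetic.
def fit_two_param(L, I, H):
    sLL = sum(l * l for l in L)
    sII = sum(i * i for i in I)
    sLI = sum(l * i for l, i in zip(L, I))
    sLH = sum(l * h for l, h in zip(L, H))
    sIH = sum(i * h for i, h in zip(I, H))
    det = sLL * sII - sLI * sLI
    a = (sLH * sII - sIH * sLI) / det
    b = (sIH * sLL - sLH * sLI) / det
    return a, b

L = [0.5, 1.0, 1.5, 2.0]          # slant-path liquid water, kg/m^2
I = [0.2, 0.1, 0.4, 0.3]          # slant-path ice water, kg/m^2
H = [1.2, 2.1, 3.4, 4.3]          # layer-mean latent heating, K/day
a, b = fit_two_param(L, I, H)
print(round(a, 2), round(b, 2))
```

Repeating this fit for each 1.1 km layer gives a lookup from retrieved water contents to a full vertical heating profile, which is what the forecast model then ingests during initialization.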

  10. Voltage Impacts of Utility-Scale Distributed Wind

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allen, A.

    2014-09-01

Although most utility-scale wind turbines in the United States are added at the transmission level in large wind power plants, distributed wind power offers an alternative that could increase the overall wind power penetration without the need for additional transmission. This report examines the distribution feeder-level voltage issues that can arise when adding utility-scale wind turbines to the distribution system. Four of the Pacific Northwest National Laboratory taxonomy feeders were examined in detail to study the voltage issues associated with adding wind turbines at different distances from the substation. General rules relating feeder resistance up to the point of turbine interconnection to the expected maximum voltage change levels were developed. Additional analysis examined line and transformer overvoltage conditions.
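The relation between feeder resistance and voltage change can be illustrated with the standard per-unit approximation ΔV ≈ (R·P + X·Q)/V²; the feeder impedances, voltage level, and turbine rating below are assumptions for illustration, not the report's feeder data.

```python
# Approximate per-unit voltage rise at the interconnection point:
# dV ~ (R*P + X*Q) / V^2. All impedance and power values are invented.
def voltage_change_pu(r_ohm, x_ohm, p_w, q_var, v_ll=12470.0):
    return (r_ohm * p_w + x_ohm * q_var) / v_ll**2

# 1.5 MW turbine at unity power factor, progressively farther out
for r in (0.5, 1.0, 2.0):          # cumulative feeder resistance, ohms
    dv = voltage_change_pu(r, x_ohm=1.0, p_w=1.5e6, q_var=0.0)
    print(f"R={r} ohm -> dV ~ {100 * dv:.2f}%")
```

The approximation shows why the report's rules key on resistance: at unity power factor the rise scales linearly with R, so a turbine far from the substation can push the local voltage past limits even when the feeder is otherwise lightly loaded.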

  11. Application of Laser Imaging for Bio/geophysical Studies

    NASA Technical Reports Server (NTRS)

    Hummel, J. R.; Goltz, S. M.; Depiero, N. L.; Degloria, D. P.; Pagliughi, F. M.

    1992-01-01

    SPARTA, Inc. has developed a low-cost, portable laser imager that, among other applications, can be used in bio/geophysical applications. In the application to be discussed here, the system was utilized as an imaging system for background features in a forested locale. The SPARTA mini-ladar system was used at the International Paper Northern Experimental Forest near Howland, Maine to assist in a project designed to study the thermal and radiometric phenomenology at forest edges. The imager was used to obtain data from three complex sites, a 'seed' orchard, a forest edge, and a building. The goal of the study was to demonstrate the usefulness of the laser imager as a tool to obtain geometric and internal structure data about complex 3-D objects in a natural background. The data from these images have been analyzed to obtain information about the distributions of the objects in a scene. A range detection algorithm has been used to identify individual objects in a laser image and an edge detection algorithm then applied to highlight the outlines of discrete objects. An example of an image processed in such a manner is shown. Described here are the results from the study. In addition, results are presented outlining how the laser imaging system could be used to obtain other important information about bio/geophysical systems, such as the distribution of woody material in forests.
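The range- and edge-detection steps described above can be sketched on a 1-D scanline, where a range-jump test separates discrete objects from the background; the 0.5 m threshold and the scanline values are illustrative, not data from the study.

```python
# Range-discontinuity edge detection on a 1-D ladar scanline: mark
# indices where adjacent range samples jump by more than jump_m.
def find_edges(ranges, jump_m=0.5):
    return [i for i in range(1, len(ranges))
            if abs(ranges[i] - ranges[i - 1]) > jump_m]

# Scanline crossing background (20 m), a tree trunk (8 m), background
scan = [20.1, 20.0, 8.2, 8.1, 8.3, 20.2, 20.1]
print(find_edges(scan))   # object boundaries at the two range jumps
```

On a full 2-D range image the same test, applied along rows and columns, outlines discrete objects, which is how the study extracted geometric and internal-structure statistics from the forest scenes.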

  12. FAWKES Information Management for Space Situational Awareness

    NASA Astrophysics Data System (ADS)

    Spetka, S.; Ramseyer, G.; Tucker, S.

    2010-09-01

    Current space situational awareness assets can be fully utilized by managing their inputs and outputs in real time. Ideally, sensors are tasked to perform specific functions to maximize their effectiveness. Many sensors are capable of collecting more data than is needed for a particular purpose, leading to the potential to enhance a sensor's utilization by allowing it to be re-tasked in real time when it is determined that sufficient data has been acquired to meet the first task's requirements. In addition, understanding a situation involving fast-traveling objects in space may require inputs from more than one sensor, leading to a need for information sharing in real time. Observations that are not processed in real time may be archived to support forensic analysis for accidents and for long-term studies. Space Situational Awareness (SSA) requires an extremely robust distributed software platform to appropriately manage data collection and distribution, both for real-time decision-making and for analysis. FAWKES is being developed as a Joint Space Operations Center (JSPOC) Mission System (JMS) compliant implementation of the AFRL Phoenix information management architecture. It implements a pub/sub/archive/query (PSAQ) approach to communications designed for high-performance applications. FAWKES provides an easy-to-use, reliable interface for structuring parallel processing, and is particularly well suited to the requirements of SSA. In addition to supporting point-to-point communications, it offers an elegant and robust implementation of collective communications, to scatter, gather, and reduce values. A query capability is also supported that enhances reliability. Archived messages can be queried to re-create a computation or to selectively retrieve previous publications. PSAQ processes express their role in a computation by subscribing to their inputs and by publishing their results.
Sensors on the edge can subscribe to inputs from appropriately authorized users, allowing dynamic tasking capabilities. Previously, the publication of sensor data collected by mobile systems was demonstrated. Thumbnails of infrared imagery that were imaged in real time by an aircraft [1] were published over a grid. The airborne system subscribed to requests for detailed images and then published the requested images. In another experiment, a system employing video subscriptions [2] drove the analysis of live video streams, resulting in a published stream of processed video output. We are currently implementing an SSA system that uses FAWKES to deliver imagery from telescopes through a pipeline of processing steps that are performed on high performance computers. PSAQ facilitates the decomposition of a problem into components that can be distributed across processing assets from the smallest sensors in space to the largest high performance computing (HPC) centers, as well as the integration and distribution of the results, all in real time. FAWKES supports the real-time latency requirements demanded by all of these applications. It also enhances reliability by easily supporting redundant computation. This study shows how FAWKES/PSAQ is utilized in SSA applications, and presents performance results for latency and throughput that meet these needs.
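    The PSAQ pattern — processes subscribe to their inputs, publish their results, and every publication is archived for later query — can be illustrated with a toy in-process broker (all names here are invented for illustration; this is not the FAWKES API):

```python
from collections import defaultdict

class PSAQBroker:
    """Toy publish/subscribe/archive/query broker illustrating the PSAQ
    communication pattern described for FAWKES."""
    def __init__(self):
        self.subscribers = defaultdict(list)  # topic -> callbacks
        self.archive = defaultdict(list)      # topic -> message history

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, message):
        self.archive[topic].append(message)   # archive every publication
        for cb in self.subscribers[topic]:    # fan out to subscribers
            cb(message)

    def query(self, topic, predicate=lambda m: True):
        """Retrieve archived messages, optionally filtered, so that a
        computation can be re-created from past publications."""
        return [m for m in self.archive[topic] if predicate(m)]

# A processing step expresses its role by subscribing to its input topic
# and transforming what arrives:
broker = PSAQBroker()
processed = []
broker.subscribe("thumbnails", lambda m: processed.append(m.upper()))
broker.publish("thumbnails", "frame-001")
broker.publish("thumbnails", "frame-002")
```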

  13. 26 CFR 1.42-10 - Utility allowances.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... into account, among other things, local utility rates, property type, climate and degree-day variables... which the building containing the units is located. (c) Changes in applicable utility allowance—(1) In...)), the applicable utility allowance for units changes, the new utility allowance must be used to compute...

  14. 26 CFR 1.42-10 - Utility allowances.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... into account, among other things, local utility rates, property type, climate and degree-day variables... which the building containing the units is located. (c) Changes in applicable utility allowance—(1) In...)), the applicable utility allowance for units changes, the new utility allowance must be used to compute...

  15. 26 CFR 1.42-10 - Utility allowances.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... into account, among other things, local utility rates, property type, climate and degree-day variables... which the building containing the units is located. (c) Changes in applicable utility allowance—(1) In...)), the applicable utility allowance for units changes, the new utility allowance must be used to compute...

  16. 26 CFR 1.42-10 - Utility allowances.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... into account, among other things, local utility rates, property type, climate and degree-day variables... which the building containing the units is located. (c) Changes in applicable utility allowance—(1) In...)), the applicable utility allowance for units changes, the new utility allowance must be used to compute...

  17. 26 CFR 1.42-10 - Utility allowances.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... into account, among other things, local utility rates, property type, climate and degree-day variables... which the building containing the units is located. (c) Changes in applicable utility allowance—(1) In...)), the applicable utility allowance for units changes, the new utility allowance must be used to compute...

  18. Relative importance of magnetic moments in UXO clearance applications

    USGS Publications Warehouse

    Sanchez, V.; Li, Y.; Nabighian, M.; Wright, D.

    2006-01-01

    The surface magnetic anomaly observed in UXO clearance is mainly dipolar; as a result, the dipole is the only moment used regularly in UXO applications. The dipole moment contains information about the intensity of magnetization but lacks shape information. Unlike the dipole, higher-order moments, such as the quadrupole and octupole, encode asymmetry properties of the magnetization distribution within buried targets. In order to improve our understanding of the magnetization distribution within UXO and non-UXO objects and its potential utility in UXO clearance, we present results of a 3D numerical modeling study for highly susceptible metallic objects. The basis for the modeling is the solution of a nonlinear integral equation describing magnetization within isolated objects, allowing us to compute the magnetic moments of the object, analyze their relationships, and provide a depiction of the surface anomaly produced by the different moments within the object. Our modeling results show significant high-order moments for more asymmetric objects situated at typical UXO burial depths, and suggest that the increased relative contribution to magnetic gradient data from these higher-order moments may provide a practical tool for improved UXO discrimination. © 2005 Society of Exploration Geophysicists.
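    The reason higher-order moments matter mostly at shallow burial depths can be seen from the falloff rates: a dipole anomaly decays as 1/r³ while a quadrupole decays as 1/r⁴. A minimal sketch of the standard point-dipole field (textbook formula, not the paper's integral-equation model; the moment and distances are invented):

```python
import numpy as np

MU0 = 4e-7 * np.pi  # vacuum permeability, T*m/A

def dipole_field(m, r):
    """Magnetic field (T) of a point dipole with moment m (A*m^2)
    observed at position r (m): B = mu0/(4*pi) * (3(m.rhat)rhat - m)/|r|^3."""
    r = np.asarray(r, dtype=float)
    rn = np.linalg.norm(r)
    rhat = r / rn
    return MU0 / (4 * np.pi) * (3 * np.dot(m, rhat) * rhat - m) / rn**3

m = np.array([0.0, 0.0, 1.0])            # 1 A*m^2 vertical dipole
b1 = dipole_field(m, [0.0, 0.0, 1.0])    # field 1 m above the source
b2 = dipole_field(m, [0.0, 0.0, 2.0])    # field 2 m above the source
ratio = np.linalg.norm(b1) / np.linalg.norm(b2)  # 2^3 = 8: 1/r^3 decay
```

Doubling the depth cuts the dipole signal by 8x but a quadrupole signal by 16x, so the higher-order contribution is only practically measurable for shallow, asymmetric targets.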

  19. Range-azimuth decouple beamforming for frequency diverse array with Costas-sequence modulated frequency offsets

    NASA Astrophysics Data System (ADS)

    Wang, Zhe; Wang, Wen-Qin; Shao, Huaizong

    2016-12-01

    Unlike the phased array, which uses the same carrier frequency for each transmit element, the frequency diverse array (FDA) uses a small frequency offset across the array elements to produce a range-angle-dependent transmit beampattern. FDA radar provides new application capabilities and potentials due to its range-dependent transmit array beampattern, but an FDA using linearly increasing frequency offsets produces a transmit beampattern in which range and angle are coupled. In order to decouple the range-azimuth beampattern for FDA radar, this paper proposes a uniform linear array (ULA) FDA using Costas-sequence modulated frequency offsets to produce a random-like energy distribution in the transmit beampattern and a thumbtack transmit-receive beampattern. In doing so, the range and angle of targets can be unambiguously estimated through matched filtering and subspace decomposition algorithms in the receiver signal processor. Moreover, the random-like energy-distributed beampattern can also be utilized for low-probability-of-intercept (LPI) radar applications. Numerical results show that the proposed scheme outperforms the standard FDA in focusing the transmit energy, especially in the range dimension.
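    A Costas sequence is a permutation whose displacement vectors are all distinct, which is what breaks the linear range-angle coupling. One standard way to generate such sequences is the Welch construction (the construction and checker below are textbook material; the specific prime and element count are chosen for illustration, not taken from the paper):

```python
def welch_costas(p, g):
    """Welch construction: for a prime p and a primitive root g mod p,
    the sequence g^1, g^2, ..., g^(p-1) (mod p) is Costas of length p-1."""
    return [pow(g, i, p) for i in range(1, p)]

def is_costas(seq):
    """A sequence is Costas if, for every shift d, the differences
    seq[i+d] - seq[i] are all distinct (distinct displacement vectors)."""
    n = len(seq)
    for d in range(1, n):
        diffs = [seq[i + d] - seq[i] for i in range(n - d)]
        if len(diffs) != len(set(diffs)):
            return False
    return True

offsets = welch_costas(7, 3)  # [3, 2, 6, 4, 5, 1] for a 6-element array
# Element n would then transmit at f_n = f0 + offsets[n] * delta_f,
# instead of the linear f_n = f0 + n * delta_f of the standard FDA.
```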

  20. Simulation and Control Lab Development for Power and Energy Management for NASA Manned Deep Space Missions

    NASA Technical Reports Server (NTRS)

    McNelis, Anne M.; Beach, Raymond F.; Soeder, James F.; McNelis, Nancy B.; May, Ryan; Dever, Timothy P.; Trase, Larry

    2014-01-01

    The development of distributed hierarchical and agent-based control systems will allow for reliable autonomous energy management and power distribution for on-orbit missions. Power is one of the most critical systems on board a space vehicle, requiring quick response time when a fault or emergency is identified. As NASA's missions with human presence extend beyond low Earth orbit, autonomous control of vehicle power systems will be necessary and will need to function reliably for long periods of time. In the design of autonomous electrical power control systems, there is a need to dynamically simulate and verify the EPS controller functionality prior to use on-orbit. This paper presents work at NASA Glenn Research Center in Cleveland, Ohio, where development of a controls laboratory is being completed that will be utilized to demonstrate advanced prototype EPS controllers for space, aeronautical, and terrestrial applications. The control laboratory hardware and software, and the application of an autonomous controller demonstrated with the ISS electrical power system, are the subject of this paper.

  1. Loci-STREAM Version 0.9

    NASA Technical Reports Server (NTRS)

    Wright, Jeffrey; Thakur, Siddharth

    2006-01-01

    Loci-STREAM is an evolving computational fluid dynamics (CFD) software tool for simulating possibly chemically reacting, possibly unsteady flows in diverse settings, including rocket engines, turbomachines, oil refineries, etc. Loci-STREAM implements a pressure- based flow-solving algorithm that utilizes unstructured grids. (The benefit of low memory usage by pressure-based algorithms is well recognized by experts in the field.) The algorithm is robust for flows at all speeds from zero to hypersonic. The flexibility of arbitrary polyhedral grids enables accurate, efficient simulation of flows in complex geometries, including those of plume-impingement problems. The present version - Loci-STREAM version 0.9 - includes an interface with the Portable, Extensible Toolkit for Scientific Computation (PETSc) library for access to enhanced linear-equation-solving programs therein that accelerate convergence toward a solution. The name "Loci" reflects the creation of this software within the Loci computational framework, which was developed at Mississippi State University for the primary purpose of simplifying the writing of complex multidisciplinary application programs to run in distributed-memory computing environments including clusters of personal computers. Loci has been designed to relieve application programmers of the details of programming for distributed-memory computers.

  2. Kinetic model for the vibrational energy exchange in flowing molecular gas mixtures. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Offenhaeuser, F.

    1987-01-01

    The present study is concerned with the development of a computational model for the description of the vibrational energy exchange in flowing gas mixtures, taking into account a given number of energy levels for each vibrational degree of freedom. It is possible to select an arbitrary number of energy levels. The presented model uses values in the range from 10 to approximately 40. The distribution of energy with respect to these levels can differ from the equilibrium distribution. The kinetic model developed can be employed for arbitrary gaseous mixtures with an arbitrary number of vibrational degrees of freedom for each type of gas. The application of the model to CO2-H2O-N2-O2-He mixtures is discussed. The obtained relations can be utilized in a study of the suitability of radiation-related transitional processes, involving the CO2 molecule, for laser applications. It is found that the computational results provided by the model agree very well with experimental data obtained for a CO2 laser. Possibilities for the activation of a 16-micron and 14-micron laser are considered.
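    The equilibrium distribution the model's level populations are compared against is the Boltzmann distribution over vibrational levels. A minimal sketch (the harmonic CO2 asymmetric-stretch ladder and the temperature are illustrative assumptions, not values from the thesis):

```python
import math

def boltzmann_populations(energies_cm, temperature_k):
    """Equilibrium (Boltzmann) fractional populations of vibrational levels
    with energies in cm^-1 at a given temperature in K. A kinetic model of
    the kind described tracks deviations from this distribution."""
    k_cm = 0.6950356  # Boltzmann constant expressed in cm^-1 per K
    weights = [math.exp(-e / (k_cm * temperature_k)) for e in energies_cm]
    z = sum(weights)  # partition sum over the levels considered
    return [w / z for w in weights]

# Harmonic ladder of 10 levels spaced ~2349 cm^-1 (CO2 asymmetric stretch):
levels = [2349.0 * v for v in range(10)]
pops = boltzmann_populations(levels, 300.0)
# At 300 K essentially all molecules sit in v = 0; pumping that shifts
# population into excited levels is what a non-equilibrium model captures.
```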

  3. Advanced Communication and Control Solutions of Distributed Energy Resources (DER)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Asgeirsson, Haukur; Seguin, Richard; Sherding, Cameron

    2007-01-10

    This report covers work performed in Phase II of a two-phase project whose objective was to demonstrate the aggregation of multiple Distributed Energy Resources (DERs) and to offer them into the energy market. The Phase I work (DE-FC36-03CH11161) created an integrated, but distributed, system and procedures to monitor and control multiple DERs from numerous manufacturers connected to the electric distribution system. Procedures were created which protect the distribution network and personnel that may be working on the network. Using the web as the communication medium for control and monitoring of the DERs, the integration of information and security was accomplished through the use of industry-standard protocols such as SSL, VPN, and ICCP. The primary objective of Phase II was to develop the procedures for marketing the power of the Phase I aggregated DERs in the energy market, increase the number of DER units, and implement the marketing procedures (interfacing with ISOs) for the DER-generated power. The team partnered with the Midwest Independent System Operator (MISO), the local ISO, to address the energy market and demonstrate the economic dispatch of DERs in response to market signals. The selection of standards-based communication technologies offers the ability of the system to be deployed and integrated with other utilities' resources. With the use of data historian technology to facilitate the aggregation, the developed algorithms and procedures can be verified, audited, and modified. The team demonstrated monitoring and control of multiple DERs as outlined in the Phase I report, including procedures to perform these operations in a secure and safe manner. In Phase II, additional DER units were added. We also expanded on our Phase I work to enhance communication security and to develop the market model of having DERs, both customer- and utility-owned, participate in the energy market.
    We are proposing a two-part DER energy market model: a utility-need business model and an independent energy aggregator business model. The approach of developing two group models of DER energy participation in the market is unique. The Detroit Edison (DECo, utility)-led team includes: DTE Energy Technologies (DTECH, DER provider), Electrical Distribution Design (EDD, Virginia Tech company supporting EPRI's Distribution Engineering Workstation, DEW), Systems Integration Specialists Company (SISCO, economic scheduling and real-time protocol integrator), and OSIsoft (PI software system for managing real-time information). This team is focused on developing the application engineering, including the software systems necessary for DER integration, control, and sale into the marketplace. Phase II highlights:
    - Installed and tested an ICCP link with SSL (security) between DECo, the utility, and DTE Energy Technologies (DTECH), the aggregator, making DER data available to the utility for both monitoring and control.
    - Installed and tested PI ProcessBook with circuit and DER operational models for use by DECo SOC/ROC operators in monitoring both utility circuit and customer DER parameters. The ProcessBook models also included DER control for the DECo SOC/ROC operators, which was tested and demonstrated.
    - Developed DER Tagging and Operating Procedures, which allowed that control to be done in a safe manner; these were modified to include the required MOC/MISO notification procedures.
    - Modified the Distribution Engineering Workstation (DEW) to include temperature-normalized load research statistics using a 30-hour day-ahead weather feed. This allowed day-ahead forecasting of the customer load profile and of the entire circuit to determine overload and low-voltage problems. The forecast at the point of common coupling was passed to the DTECH DR SOC for use in its economic dispatch algorithm.
    - Developed Standard Work Instructions for DER notification, sale, and operation into the MISO market.
    - Developed a software mechanism, consisting of a suite of new and revised functionality, that integrated with the local ISO such that offers can be made electronically without human intervention.
    - The DR SOC developed a suite of software enabling DER usage in real time and day-ahead: generation information file exchange with PI and the utility power flow; a utility day-ahead information file; an Energy Offer Web Service; a Market Result Web Service; a Real-Time Meter Data Web Service; and a Real-Time Notification Web Service.
    - Registered over 20 DERs with MISO in the Demand Response Market and demonstrated electronic sale to MISO.

  4. Distributed intelligent monitoring and reporting facilities

    NASA Astrophysics Data System (ADS)

    Pavlou, George; Mykoniatis, George; Sanchez-P, Jorge-A.

    1996-06-01

    Distributed intelligent monitoring and reporting facilities are of paramount importance in both service and network management, as they provide the capability to monitor quality-of-service and utilization parameters and to notify of degradation so that corrective action can be taken. By intelligent, we refer to the capability of performing the monitoring tasks in a way that has the smallest possible impact on the managed network, facilitates the observation and summarization of information according to a number of criteria, and, in its most advanced form, permits the specification of these criteria dynamically to suit the particular policy at hand. In addition, intelligent monitoring facilities should minimize the design and implementation effort involved in such activities. The ISO/ITU Metric, Summarization and Performance management functions provide models that only partially satisfy the above requirements. This paper describes our extensions to the proposed models to support further capabilities, with the intention of eventually leading to fully dynamically defined monitoring policies. The concept of distributing intelligence is also discussed, including the consideration of security issues and the applicability of the model in ODP-based distributed processing environments.
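    The core monitor-and-notify behavior referenced above can be sketched as a threshold-crossing gauge monitor (this is a generic illustration of the idea behind the ISO Metric function, not an implementation of the standard; all names are invented):

```python
class GaugeMonitor:
    """Observe a utilization gauge and emit a notification when a high
    threshold is crossed; re-arm only after the gauge falls back below
    the threshold, so sustained overload raises a single alarm."""
    def __init__(self, high_threshold, notify):
        self.high = high_threshold
        self.notify = notify
        self.armed = True

    def observe(self, value):
        if self.armed and value >= self.high:
            self.notify(f"quality degradation: gauge={value} >= {self.high}")
            self.armed = False        # suppress repeated alarms
        elif value < self.high:
            self.armed = True         # re-arm once the value recovers

alarms = []
mon = GaugeMonitor(0.9, alarms.append)
for sample in [0.4, 0.95, 0.97, 0.5, 0.92]:  # summarized utilization samples
    mon.observe(sample)
# Two threshold crossings (0.95 and 0.92) -> two notifications.
```

Summarizing samples before feeding the monitor is one way such a facility keeps its impact on the managed network small.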

  5. 7 CFR 1710.207 - RUS criteria for approval of load forecasts by distribution borrowers not required to maintain an...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... distribution borrowers that are unaffiliated with a power supply borrower, or by distribution borrowers that are members of a power supply borrower that has a total utility plant less than $500 million and that is not itself a member of another power supply borrower with a total utility plant of $500 million or...

  6. Technologies for network-centric C4ISR

    NASA Astrophysics Data System (ADS)

    Dunkelberger, Kirk A.

    2003-07-01

    Three technologies form the heart of any network-centric command, control, communication, intelligence, surveillance, and reconnaissance (C4ISR) system: distributed processing, reconfigurable networking, and distributed resource management. Distributed processing, enabled by automated federation, mobile code, intelligent process allocation, dynamic multiprocessing groups, checkpointing, and other capabilities, creates a virtual peer-to-peer computing network across the force. Reconfigurable networking, consisting of content-based information exchange, dynamic ad-hoc routing, information operations (perception management), and other component technologies, forms the interconnect fabric for fault-tolerant interprocessor and node communication. Distributed resource management, which provides the means for distributed cooperative sensor management, foe sensor utilization, opportunistic collection, symbiotic inductive/deductive reasoning, and other applications, provides the canonical algorithms for network-centric enterprises and warfare. This paper introduces these three core technologies and briefly discusses a sampling of their component technologies and their individual contributions to network-centric enterprises and warfare. Based on the implied requirements, two new algorithms are defined and characterized which provide critical building blocks for network centricity: distributed asynchronous auctioning and predictive dynamic source routing. The first provides a reliable, efficient, effective approach for near-optimal assignment problems; the algorithm has been demonstrated to be a viable implementation for ad-hoc command and control, object/sensor pairing, and weapon/target assignment. The second is founded on traditional dynamic source routing (from mobile ad-hoc networking), but leverages the results of ad-hoc command and control (from the contributed auctioning algorithm) to achieve significant increases in connection reliability through forward prediction.
Emphasis is placed on the advantages gained from the closed-loop interaction of the multiple technologies in the network-centric application environment.
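    Distributed asynchronous auctioning for assignment problems belongs to the family of Bertsekas-style auction algorithms. A minimal, centralized sketch of that family (the epsilon value and the sensor/target example are illustrative assumptions, and this is not the paper's specific algorithm):

```python
def auction_assignment(values, eps=0.01):
    """Auction algorithm for the n x n assignment problem: unassigned
    bidder i bids for its best object at a price that preserves
    eps-complementary slackness; the final total value is within
    n*eps of optimal."""
    n = len(values)
    prices = [0.0] * n
    owner = [None] * n       # object j -> current owning bidder
    assigned = [None] * n    # bidder i -> object it holds
    unassigned = list(range(n))
    while unassigned:
        i = unassigned.pop()
        # Net value of each object at current prices.
        net = [values[i][j] - prices[j] for j in range(n)]
        best = max(range(n), key=lambda j: net[j])
        second = max(net[j] for j in range(n) if j != best)
        # Bid raises the price by the bidder's margin plus eps.
        prices[best] += (net[best] - second) + eps
        if owner[best] is not None:          # outbid the previous owner
            assigned[owner[best]] = None
            unassigned.append(owner[best])
        owner[best] = i
        assigned[i] = best
    return assigned

# Example: 3 sensors bidding on 3 targets (utility of each pairing).
values = [[10.0, 5.0, 1.0],
          [8.0, 9.0, 2.0],
          [1.0, 2.0, 6.0]]
pairing = auction_assignment(values)  # sensor i -> target pairing[i]
```

Each bidder needs only its own values, the current prices, and a way to announce bids, which is why the scheme distributes naturally across network nodes.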

  7. Imaging bio-distribution of a topically applied dermatological cream on minipig skin using fluorescence lifetime imaging microscopy (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Alex, Aneesh; Chaney, Eric J.; Criley, Jennifer M.; Spillman, Darold R.; Hutchison, Phaedra B.; Li, Joanne; Marjanovic, Marina; Frey, Steve; Cook, Steven; Boppart, Stephen A.; Arp, Zane A.

    2017-02-01

    Currently there is a lack of in vivo techniques to evaluate the spatial bio-distribution of dermal drugs over time without the need to take multiple serial biopsies. To address this gap, we investigated the use of multi-photon optical imaging methods to non-invasively track drug distribution on miniature pig (Species: Sus scrofa, Strain: Göttingen) skin in vivo. Minipig skin is the standard comparative research model to human skin, and is anatomically and functionally similar. We employed fluorescence lifetime imaging microscopy (FLIM) to visualize the spatial distribution and residency time of a topically applied experimental dermatological cream. This was made possible by the endogenous fluorescent optical properties of the experimental drug (fluorescence lifetime > 3000 ps). Two different drug formulations were applied on 2 minipigs for 7 consecutive days, with the control creams applied on the contralateral side, followed by 7 days of post-application monitoring using a multi-modal optical imaging system (MPTflex-CARS, JenLab, Germany). FLIM images were obtained from the treated regions 24 hr post-application from day 1 to day 14 that allowed visualization of cellular and sub-cellular features associated with different dermal layers non-invasively to a depth of 200 µm. Five punch biopsies per animal were obtained from the corresponding treated regions between days 8 and 14 for bioanalytical analysis and comparison with results obtained using FLIM. In conclusion, utilization of non-invasive optical biopsy methods for dermal drug evaluation can provide true longitudinal monitoring of drug spatial distribution, remove sampling limitations, and be more time-efficient compared to traditional methods.

  8. 76 FR 13125 - Announcement of Grant Application Deadlines and Funding Levels; Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-10

    ... DEPARTMENT OF AGRICULTURE Rural Utilities Service Announcement of Grant Application Deadlines and Funding Levels; Correction AGENCY: Rural Utilities Service, USDA. ACTION: Notice of Solicitation of Applications; correction. SUMMARY: The United States Department of Agriculture's (USDA) Rural Utilities Service...

  9. 76 FR 12017 - Announcement of Grant Application Deadlines and Funding Levels

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-04

    ... DEPARTMENT OF AGRICULTURE Rural Utilities Service Announcement of Grant Application Deadlines and Funding Levels AGENCY: Rural Utilities Service, USDA. ACTION: Notice of solicitation of applications. SUMMARY: The United States Department of Agriculture's (USDA) Rural Utilities Service (RUS) announces the...

  10. Framework for Development of Object-Oriented Software

    NASA Technical Reports Server (NTRS)

    Perez-Poveda, Gus; Ciavarella, Tony; Nieten, Dan

    2004-01-01

    The Real-Time Control (RTC) Application Framework is a high-level software framework written in C++ that supports the rapid design and implementation of object-oriented application programs. This framework provides built-in functionality that solves common software development problems within distributed client-server, multi-threaded, and embedded programming environments. When using the RTC Framework to develop software for a specific domain, designers and implementers can focus entirely on the details of the domain-specific software rather than on creating custom solutions, utilities, and frameworks for the complexities of the programming environment. The RTC Framework was originally developed as part of a Space Shuttle Launch Processing System (LPS) replacement project called Checkout and Launch Control System (CLCS). As a result of the framework's development, CLCS software development time was reduced by 66 percent. The framework is generic enough for developing applications outside of the launch-processing system domain. Other applicable high-level domains include command and control systems and simulation/training systems.

  11. Evaluation of a black-footed ferret resource utilization function model

    USGS Publications Warehouse

    Eads, D.A.; Millspaugh, J.J.; Biggins, D.E.; Jachowski, D.S.; Livieri, T.M.

    2011-01-01

    Resource utilization function (RUF) models permit evaluation of potential habitat for endangered species; ideally, such models should be evaluated before use in management decision-making. We evaluated the predictive capabilities of a previously developed black-footed ferret (Mustela nigripes) RUF. Using the population-level RUF, generated from ferret observations at an adjacent yet distinct colony, we predicted the distribution of ferrets within a black-tailed prairie dog (Cynomys ludovicianus) colony in the Conata Basin, South Dakota, USA. We evaluated model performance using data collected during post-breeding spotlight surveys (2007-2008) by assessing model agreement via weighted compositional analysis and count-metrics. Compositional analysis of home range use and colony-level availability, and core area use and home range availability, demonstrated ferret selection of the predicted Very high and High occurrence categories in 2007 and 2008. Simple count-metrics corroborated these findings and suggested selection of the Very high category in 2007 and the Very high and High categories in 2008. Collectively, these results suggested that the RUF was useful in predicting occurrence and intensity of space use of ferrets at our study site, the 2 objectives of the RUF. Application of this validated RUF would increase the resolution of habitat evaluations, permitting prediction of the distribution of ferrets within distinct colonies. Additional model evaluation at other sites, on other black-tailed prairie dog colonies of varying resource configuration and size, would increase understanding of influences upon model performance and the general utility of the RUF. © 2011 The Wildlife Society.

  12. Flash Cracking Reactor for Waste Plastic Processing

    NASA Technical Reports Server (NTRS)

    Timko, Michael T.; Wong, Hsi-Wu; Gonzalez, Lino A.; Broadbelt, Linda; Raviknishan, Vinu

    2013-01-01

    Disposal of waste plastic is a growing problem that is especially acute in space exploration applications. Moreover, utilization of heavy hydrocarbon resources (wastes, waxes, etc.) as fuels and chemicals will be a growing need in the future. Existing technologies require a trade-off between product selectivity and feedstock conversion. The objective of this work was to maintain high plastic-to-fuel conversion without sacrificing the liquid yield. The developed technology accomplishes this goal with a combined understanding of thermodynamics, reaction rates, and mass transport to achieve high feed conversion without sacrificing product selectivity. The innovation requires a reaction vessel, hydrocarbon feed, gas feed, and pressure and temperature control equipment. Depending on the feedstock and desired product distribution, a catalyst can be added. The reactor is heated to the desired temperature, pressurized to the desired pressure, and subjected to a sweep flow at the optimized superficial velocity. Software developed under this project can be used to determine optimal values for these parameters. Product is vaporized, transferred to a receiver, and cooled to a liquid - a form suitable for long-term storage as a fuel or chemical. An important NASA application is the use of solar energy to convert waste plastic into a form that can be utilized during periods of low solar energy flux. Unlike previous work in this field, this innovation uses thermodynamic, mass transport, and reaction parameters to tune the product distribution of pyrolysis cracking. Previous work in this field has used some of these variables, but never all in conjunction for process optimization. This method is useful for municipal waste incinerator operators and gas-to-liquids companies.

  13. Application of analyzer based X-ray imaging technique for detection of ultrasound induced cavitation bubbles from a physical therapy unit.

    PubMed

    Izadifar, Zahra; Belev, George; Babyn, Paul; Chapman, Dean

    2015-10-19

    The observation of ultrasound-generated cavitation bubbles deep in tissue is very difficult. The development of an imaging method capable of investigating cavitation bubbles in tissue would improve the efficiency and application of ultrasound in the clinic. Among the previous imaging modalities capable of detecting cavitation bubbles in vivo, the acoustic detection technique has the advantage of in vivo applicability; however, the size of the initial cavitation bubble and the amplitude of the ultrasound that produced the cavitation bubbles affect the timing and amplitude of the cavitation bubbles' emissions. The spatial distribution of cavitation bubbles driven by a 0.8835 MHz therapeutic ultrasound system at an output power of 14 W was studied in water using a synchrotron X-ray imaging technique, Analyzer Based Imaging (ABI). The cavitation bubble distribution was investigated by repeated application of the ultrasound and imaging of the water tank. The spatial frequency of the cavitation bubble pattern was evaluated by Fourier analysis. Acoustic cavitation was imaged at four different locations through the acoustic beam in water at a fixed power level. The pattern of cavitation bubbles in water was detected by synchrotron X-ray ABI. The spatial distribution of cavitation bubbles driven by the therapeutic ultrasound system was observed using the ABI X-ray imaging technique. It was observed that the cavitation bubbles appeared in a periodic pattern. The calculated distance between intervals revealed that the spacing of the frequent cavitation lines is one-half of the acoustic wavelength, consistent with standing waves. This set of experiments demonstrates the utility of synchrotron ABI for visualizing cavitation bubbles formed in water by clinical ultrasound systems working at high frequency and at output powers as low as those of a therapeutic system.
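    The half-wavelength spacing reported above is a direct consequence of standing-wave geometry: pressure antinodes sit λ/2 apart. The arithmetic for the 0.8835 MHz system (the sound speed in water is an assumed nominal value, not taken from the paper):

```python
def standing_wave_antinode_spacing(frequency_hz, sound_speed_m_s=1482.0):
    """Spacing between pressure antinodes in a standing wave is half the
    wavelength, lambda/2 = c / (2*f). The default sound speed is a nominal
    value for water near room temperature (assumed)."""
    wavelength = sound_speed_m_s / frequency_hz
    return wavelength / 2.0

spacing_mm = standing_wave_antinode_spacing(0.8835e6) * 1e3  # ~0.84 mm
```

Periodic cavitation lines roughly 0.84 mm apart are thus the signature of a standing wave set up by reflections in the water tank.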

  14. Distributed photovoltaic systems: Utility interface issues and their present status

    NASA Technical Reports Server (NTRS)

    Hassan, M.; Klein, J.

    1981-01-01

    Major technical issues involving the integration of distributed photovoltaics (PV) into electric utility systems are defined and their impacts are described quantitatively. An extensive literature search, interviews, and analysis yielded information about the work in progress and highlighted problem areas in which additional work and research are needed. The findings from the literature search were used to determine whether satisfactory solutions to the problems exist or whether satisfactory approaches to a solution are underway. It was discovered that very few standards, specifications, or guidelines currently exist that will aid industry in integrating PV into the utility system. Specific areas of concern identified are: (1) protection, (2) stability, (3) system unbalance, (4) voltage regulation and reactive power requirements, (5) harmonics, (6) utility operations, (7) safety, (8) metering, and (9) distribution system planning and design.

  15. Distributed controller clustering in software defined networks

    PubMed Central

    Gani, Abdullah; Akhunzada, Adnan; Talebian, Hamid; Choo, Kim-Kwang Raymond

    2017-01-01

    Software Defined Networking (SDN) is an emerging and promising paradigm for network management because of its centralized network intelligence. However, the centralized control architecture of software-defined networks (SDNs) brings novel challenges of reliability, scalability, fault tolerance and interoperability. In this paper, we propose a novel clustered distributed controller architecture in a real SDN setting. The distributed cluster implementation comprises multiple popular SDN controllers. The proposed mechanism is evaluated using a real-world network topology running on top of an emulated SDN environment. The results show that the proposed distributed controller clustering mechanism significantly reduces average latency from 8.1% to 1.6% and packet loss from 5.22% to 4.15%, compared to distributed controllers without clustering running on HP Virtual Application Network (VAN) SDN and Open Network Operating System (ONOS) controllers, respectively. Moreover, the proposed method shows reasonable CPU utilization. Furthermore, the proposed mechanism makes it possible to handle unexpected load fluctuations while maintaining continuous network operation, even when a controller fails. The paper is a potential contribution towards addressing the issues of reliability, scalability, fault tolerance, and interoperability. PMID:28384312

  16. Failure probability analysis of optical grid

    NASA Astrophysics Data System (ADS)

    Zhong, Yaoquan; Guo, Wei; Sun, Weiqiang; Jin, Yaohui; Hu, Weisheng

    2008-11-01

    Optical grid, the integrated computing environment based on optical network, is expected to be an efficient infrastructure to support advanced data-intensive grid applications. In optical grid, faults of both computational and network resources are inevitable due to the large scale and high complexity of the system. As optical-network-based distributed computing systems are increasingly applied to data processing, application failure probability has become an important indicator of application quality and an important aspect that operators consider. This paper presents a task-based method for analyzing application failure probability in optical grid. The failure probability of an entire application can then be quantified, and the effectiveness of different backup strategies in reducing it can be compared, so that clients' differing requirements on application failure probability can be satisfied. When an application modeled as a DAG (directed acyclic graph) is executed under different backup strategies, the application failure probability and the application completion time differ. This paper proposes a new multi-objective differentiated services algorithm (MDSA). The new application scheduling algorithm guarantees the failure probability requirement while improving network resource utilization, realizing a compromise between the network operator and the application submitter. Differentiated services can thus be achieved in optical grid.
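    The task-based idea can be sketched in a few lines: if each task in the application DAG fails independently with a known probability, and the application fails when any task fails, the overall failure probability factorizes. The independence assumption, the per-task probabilities, and the simple replicate-every-task backup strategy below are illustrative simplifications of ours, not the paper's MDSA algorithm:

```python
# Task-based application failure probability, assuming independent task
# failures and that the application fails if any single task fails.
# The per-task probabilities below are illustrative only.
def app_failure_probability(task_fail_probs):
    """P(app fails) = 1 - product over tasks of (1 - p_task)."""
    p_success = 1.0
    for p in task_fail_probs:
        p_success *= (1.0 - p)
    return 1.0 - p_success

def with_backup(p_primary, p_backup):
    """A replicated task fails only if both primary and backup fail."""
    return p_primary * p_backup

tasks = [0.01, 0.02, 0.005]                        # per-task failure probabilities
print(app_failure_probability(tasks))              # no backups
tasks_backed = [with_backup(p, p) for p in tasks]  # replicate every task
print(app_failure_probability(tasks_backed))       # much smaller, at extra resource cost
```

    The trade-off the paper addresses is visible even here: replication drives failure probability down sharply, but consumes extra network and computational resources, which is what a differentiated-services scheduler must balance.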

  17. Design of a Glenn Research Center Solar Field Grid-Tied Photovoltaic Power System

    NASA Technical Reports Server (NTRS)

    Eichenberg, Dennis J.

    2009-01-01

    The NASA Glenn Research Center (GRC) designed, developed, and installed a 37.5 kW DC photovoltaic (PV) Solar Field in the GRC West Area in the 1970s for the purpose of testing PV panels for various space and terrestrial applications. The PV panels are arranged to provide a nominal 120 VDC. The GRC Solar Field has been extremely successful in meeting its mission. The PV panels and the supporting electrical systems are all near their end of life. GRC has designed a 72 kW DC grid-tied PV power system to replace the existing GRC West Area Solar Field. The 72 kW DC grid-tied PV power system will provide DC solar power for GRC PV testing applications, and AC facility power whenever research power is not required. A grid-tied system is connected directly to the utility distribution grid. Facility power can be obtained from the utility system as normal. The PV system is synchronized with the utility system to provide power for the facility, and excess power is provided to the utility for use by all. The project transfers space technology to terrestrial use via nontraditional partners. GRC personnel glean valuable experience with PV power systems that is directly applicable to various space power systems, and provide valuable space program test data. PV power systems help to reduce harmful emissions and reduce the Nation's dependence on fossil fuels. Power generated by the PV system reduces the GRC utility demand, and the surplus power aids the community. Present global energy concerns reinforce the need for the development of alternative energy systems. Modern PV panels are readily available, reliable, efficient, and economical, with a life expectancy of at least 25 years. Modern electronics has been the enabling technology behind grid-tied power systems, making them likewise safe, reliable, efficient, and economical, with a life expectancy of at least 25 years. The report concludes that the GRC West Area grid-tied PV power system design is viable for a reliable, maintenance-free, long-life power system that is of significant value to NASA and the community.

  18. Massively parallel data processing for quantitative total flow imaging with optical coherence microscopy and tomography

    NASA Astrophysics Data System (ADS)

    Sylwestrzak, Marcin; Szlag, Daniel; Marchand, Paul J.; Kumar, Ashwin S.; Lasser, Theo

    2017-08-01

    We present an application of massively parallel processing of quantitative flow measurement data acquired using spectral optical coherence microscopy (SOCM). The need for massive signal processing of these particular datasets has been a major hurdle for many applications based on SOCM. In view of this difficulty, we implemented and adapted quantitative total flow estimation algorithms on graphics processing units (GPU) and achieved a 150-fold reduction in processing time when compared to a former CPU implementation. As SOCM constitutes the microscopy counterpart to spectral optical coherence tomography (SOCT), the developed processing procedure can be applied to both imaging modalities. We present the developed DLL library integrated in MATLAB (with an example) and have included the source code for adaptations and future improvements. Catalogue identifier: AFBT_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AFBT_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU GPLv3 No. of lines in distributed program, including test data, etc.: 913552 No. of bytes in distributed program, including test data, etc.: 270876249 Distribution format: tar.gz Programming language: CUDA/C, MATLAB. Computer: Intel x64 CPU, GPU supporting CUDA technology. Operating system: 64-bit Windows 7 Professional. Has the code been vectorized or parallelized?: Yes, CPU code has been vectorized in MATLAB, CUDA code has been parallelized. RAM: Dependent on user's parameters, typically between several gigabytes and several tens of gigabytes Classification: 6.5, 18.
    Nature of problem: Speed-up of data processing in optical coherence microscopy. Solution method: Utilization of GPU for massively parallel data processing. Additional comments: Compiled DLL library with source code and documentation, example of utilization (MATLAB script with raw data). Running time: 1.8 s for one B-scan (150× faster in comparison to the CPU data processing time)

  19. Zipf's law in city size from a resource utilization model.

    PubMed

    Ghosh, Asim; Chatterjee, Arnab; Chakrabarti, Anindya S; Chakrabarti, Bikas K

    2014-10-01

    We study a resource utilization scenario characterized by intrinsic fitness. To describe the growth and organization of different cities, we consider a model for resource utilization where many restaurants compete, as in a game, to attract customers using an iterative learning process. Results for the case of restaurants with uniform fitness are reported. When fitness is uniformly distributed, it gives rise to a Zipf law for the number of customers. We perform an exact calculation for the utilization fraction for the case when choices are made independent of fitness. A variant of the model is also introduced where the fitness can be treated as an ability to stay in the business. When a restaurant loses customers, its fitness is replaced by a random fitness. The steady state fitness distribution is characterized by a power law, while the distribution of the number of customers still follows the Zipf law, implying the robustness of the model. Our model serves as a paradigm for the emergence of Zipf law in city size distribution.
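    The iterative choice process described above can be caricatured in a few lines of code. The choice rule below (each agent picks a restaurant with probability proportional to its fitness) is a loose simplification of ours, not the authors' learning dynamics, and is not claimed to reproduce their Zipf result; it only illustrates the kind of agent-based simulation involved:

```python
import random
from collections import Counter

# Toy sketch loosely inspired by the resource-utilization model above:
# n agents repeatedly choose among n restaurants, favoring higher-fitness
# ones. Fitness is uniform random, as in the paper's Zipf case, but the
# choice rule is our simplification, not the authors' iterative learning.
def simulate(n=200, steps=50, seed=42):
    rng = random.Random(seed)
    fitness = [rng.random() for _ in range(n)]
    total = sum(fitness)
    cum, acc = [], 0.0
    for f in fitness:           # cumulative distribution over restaurants
        acc += f
        cum.append(acc / total)
    counts = Counter()
    for _ in range(steps):
        for _agent in range(n):
            r = rng.random()
            # first restaurant whose cumulative fitness share exceeds r
            choice = next(i for i, c in enumerate(cum) if c >= r)
            counts[choice] += 1
    return counts

counts = simulate()
ranked = sorted(counts.values(), reverse=True)
print(ranked[:5])  # rank-ordered customer counts for the busiest restaurants
```

    Replacing the fitness of losing restaurants with fresh random draws, as in the paper's variant, would be a one-line extension of this loop.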

  20. Zipf's law in city size from a resource utilization model

    NASA Astrophysics Data System (ADS)

    Ghosh, Asim; Chatterjee, Arnab; Chakrabarti, Anindya S.; Chakrabarti, Bikas K.

    2014-10-01

    We study a resource utilization scenario characterized by intrinsic fitness. To describe the growth and organization of different cities, we consider a model for resource utilization where many restaurants compete, as in a game, to attract customers using an iterative learning process. Results for the case of restaurants with uniform fitness are reported. When fitness is uniformly distributed, it gives rise to a Zipf law for the number of customers. We perform an exact calculation for the utilization fraction for the case when choices are made independent of fitness. A variant of the model is also introduced where the fitness can be treated as an ability to stay in the business. When a restaurant loses customers, its fitness is replaced by a random fitness. The steady state fitness distribution is characterized by a power law, while the distribution of the number of customers still follows the Zipf law, implying the robustness of the model. Our model serves as a paradigm for the emergence of Zipf law in city size distribution.

  1. Challenges for fuel cells as stationary power resource in the evolving energy enterprise

    NASA Astrophysics Data System (ADS)

    Rastler, Dan

    The primary market challenges for fuel cells as stationary power resources in evolving energy markets are reviewed. Fuel cell power systems have significant barriers to overcome in their anticipated role as decentralized energy power systems. Market segments for fuel cells include combined heat and power; low-cost energy; premium power; peak shaving; and load management and grid support. Understanding the role and fit of fuel cell systems in evolving energy markets and identifying the highest-value applications are major challenges for developers and government funding organizations. The most likely adopters of fuel cell systems and the challenges facing each adopter in the target market segment are reviewed. Adopters include generation companies, utility distribution companies, retail energy service providers and end-users. Key challenges include: overcoming technology risk; achieving retail competitiveness; understanding high-value markets and end-user needs; distribution and service channels; regulatory policy issues; and the integration of these decentralized resources within the electrical distribution system.

  2. Modelling Root Systems Using Oriented Density Distributions

    NASA Astrophysics Data System (ADS)

    Dupuy, Lionel X.

    2011-09-01

    Root architectural models are essential tools to understand how plants access and utilize soil resources during their development. However, root architectural models use complex geometrical descriptions of the root system, which limits their ability to model interactions with the soil. This paper presents the development of continuous models based on the concept of the oriented density distribution function. The growth of the root system is described by a hierarchical system of partial differential equations (PDEs) that incorporates single-root growth parameters such as elongation rate, gravitropism and branching rate, which appear explicitly as coefficients of the PDEs. Acquisition and transport of nutrients are then modelled by extending Darcy's law to oriented density distribution functions. This framework was applied to build a model of the growth and water uptake of the barley root system. This study shows that simplified and computationally efficient continuous models of root system development can be constructed. Such models will allow application of root growth models at field scale.

  3. Chirplet Wigner-Ville distribution for time-frequency representation and its application

    NASA Astrophysics Data System (ADS)

    Chen, G.; Chen, J.; Dong, G. M.

    2013-12-01

    This paper presents a Chirplet Wigner-Ville Distribution (CWVD) that is free of the cross-terms that usually occur in the Wigner-Ville distribution (WVD). By transforming the signal with frequency-rotating operators, several mono-frequency signals without intermittency are obtained; applying the WVD to the rotated signals is then cross-term free, and frequency-shift operators corresponding to the rotating operators are utilized to relocate the signal's instantaneous frequencies (IFs). The operators' parameters come from the estimation of the IFs, which are approximated with polynomial or spline functions. Moreover, through error analysis, the main factors governing the performance of the novel method have been identified, and an effective signal-extension method based on the IF estimation has been developed to improve the energy concentration of the WVD. The excellent performance of the novel method was demonstrated by applying it to estimate the IFs of several numerical signals and of the echolocation signal emitted by the Large Brown Bat.

  4. Chemically modified graphene/polyimide composite films based on utilization of covalent bonding and oriented distribution.

    PubMed

    Huang, Ting; Lu, Renguo; Su, Chao; Wang, Hongna; Guo, Zheng; Liu, Pei; Huang, Zhongyuan; Chen, Haiming; Li, Tongsheng

    2012-05-01

    Herein, we have developed a rather simple composite fabrication approach to achieving molecular-level dispersion and planar orientation of chemically modified graphene (CMG) in the thermosetting polyimide (PI) matrix as well as realizing strong adhesion at the interfacial regions between reinforcing filler and matrix. The covalent adhesion of CMG to PI matrix and oriented distribution of CMG were carefully confirmed and analyzed by detailed investigations. Combination of covalent bonding and oriented distribution could enlarge the effectiveness of CMG in the matrix. Efficient stress transfer was found at the CMG/PI interfaces. Significant improvements in the mechanical performances, thermal stability, electrical conductivity, and hydrophobic behavior were achieved by addition of only a small amount of CMG. Furthermore, it is noteworthy that the hydrophilic-to-hydrophobic transition and the electrical percolation were observed at only 0.2 wt % CMG in this composite system. This facile methodology is believed to afford broad application potential in graphene-based polymer nanocomposites, especially other types of high-performance thermosetting systems.

  5. Pacific Northwest GridWise™ Testbed Demonstration Projects; Part I. Olympic Peninsula Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hammerstrom, Donald J.; Ambrosio, Ron; Carlon, Teresa A.

    2008-01-09

    This report describes the implementation and results of a field demonstration wherein residential electric water heaters and thermostats, commercial building space conditioning, municipal water pump loads, and several distributed generators were coordinated to manage constrained feeder electrical distribution through the two-way communication of load status and electric price signals. The field demonstration took place in Washington and Oregon and was paid for by the U.S. Department of Energy and several northwest utilities. Price is found to be an effective control signal for managing transmission or distribution congestion. Real-time signals at 5-minute intervals are shown to shift controlled load in time. The behaviors of customers and their responses under fixed, time-of-use, and real-time price contracts are compared. Peak loads are effectively reduced on the experimental feeder. A novel application of portfolio theory is applied to the selection of an optimal mix of customer contract types.

  6. Vivaldi: A Domain-Specific Language for Volume Processing and Visualization on Distributed Heterogeneous Systems.

    PubMed

    Choi, Hyungsuk; Choi, Woohyuk; Quan, Tran Minh; Hildebrand, David G C; Pfister, Hanspeter; Jeong, Won-Ki

    2014-12-01

    As the size of image data from microscopes and telescopes increases, the need for high-throughput processing and visualization of large volumetric data has become more pressing. At the same time, many-core processors and GPU accelerators are commonplace, making high-performance distributed heterogeneous computing systems affordable. However, effectively utilizing GPU clusters is difficult for novice programmers, and even experienced programmers often fail to fully leverage the computing power of new parallel architectures due to their steep learning curve and programming complexity. In this paper, we propose Vivaldi, a new domain-specific language for volume processing and visualization on distributed heterogeneous computing systems. Vivaldi's Python-like grammar and parallel processing abstractions provide flexible programming tools for non-experts to easily write high-performance parallel computing code. Vivaldi provides commonly used functions and numerical operators for customized visualization and high-throughput image processing applications. We demonstrate the performance and usability of Vivaldi on several examples ranging from volume rendering to image segmentation.

  7. Biostability analysis for drinking water distribution systems.

    PubMed

    Srinivasan, Soumya; Harrington, Gregory W

    2007-05-01

    The ability to limit regrowth in drinking water is referred to as biological stability and depends on the concentration of disinfectant residual and on the concentration of substrate required for the growth of microorganisms. The biostability curve, based on this fundamental concept of biological stability, is a graphical approach to studying the two competing effects that determine bacterial regrowth in a distribution system: inactivation due to the presence of a disinfectant, and growth due to the presence of a substrate. Biostability curves are a practical, system-specific approach to addressing the problem of bacterial regrowth in distribution systems. This paper presents a standardized algorithm for generating biostability curves, which will enable water utilities to adapt this approach to their site-specific needs. Using data from pilot-scale studies, it was found that this algorithm was applicable to controlling regrowth of heterotrophic plate count (HPC) bacteria in chlorinated systems, where assimilable organic carbon (AOC) is the growth-limiting substrate, and growth of ammonia-oxidizing bacteria (AOB) in chloraminated systems, where ammonia is the growth-limiting substrate.
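    The two competing effects behind a biostability curve can be sketched numerically: for each substrate concentration, find the smallest disinfectant residual at which inactivation outweighs growth. The Monod growth form, the first-order inactivation form, and every rate constant below are generic placeholders of ours, not the paper's calibrated algorithm or data:

```python
# Sketch of a biostability boundary: for each substrate level, find the
# smallest disinfectant residual at which first-order inactivation
# outweighs Monod-type growth. All kinetic forms and constants here are
# generic illustrations, not values from the study.
MU_MAX = 0.5      # 1/h, maximum specific growth rate (assumed)
K_S = 50.0        # ug/L, half-saturation substrate concentration (assumed)
K_INACT = 2.0     # L/(mg*h), inactivation rate constant (assumed)

def net_growth_rate(substrate_ugL, disinfectant_mgL):
    growth = MU_MAX * substrate_ugL / (K_S + substrate_ugL)
    inactivation = K_INACT * disinfectant_mgL
    return growth - inactivation

def biostability_residual(substrate_ugL, step=0.001):
    """Minimum residual (mg/L) holding net growth at or below zero."""
    c = 0.0
    while net_growth_rate(substrate_ugL, c) > 0:
        c += step
    return round(c, 3)

curve = {s: biostability_residual(s) for s in (10, 50, 100, 200)}
print(curve)  # required residual rises as substrate concentration rises
```

    Plotting such (substrate, residual) pairs gives the biostable and non-biostable regions the abstract describes; a utility's site-specific version would substitute measured kinetics for the placeholder constants.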

  8. Laser inscription of pseudorandom structures for microphotonic diffuser applications.

    PubMed

    Alqurashi, Tawfiq; Alhosani, Abdulla; Dauleh, Mahmoud; Yetisen, Ali K; Butt, Haider

    2018-04-19

    Optical diffusers provide a solution for a variety of applications requiring a Gaussian intensity distribution, including imaging systems, biomedical optics, and aerospace. Advances in laser ablation processes have allowed the rapid production of efficient optical diffusers. Here, we demonstrate a novel technique to fabricate high-quality glass optical diffusers cost-efficiently using a continuous-wave CO2 laser. Surface-relief pseudorandom microstructures were patterned on both sides of the glass substrates. A numerical simulation of the temperature distribution showed that the CO2 laser drills a 137 μm hole in the glass for every 2 ms of processing time. FFT simulation was utilized to design predictable optical diffusers. The pseudorandom microstructures were characterized by optical microscopy, Raman spectroscopy, and angle-resolved spectroscopy to assess their chemical properties, optical scattering, transmittance, and polarization response. Increasing the laser exposure and the number of diffusing surfaces enhanced the diffusion and homogenized the incident light. The recorded speckle pattern showed high contrast, with diffusion free of a sharp bright spot in the far-field view range (250 mm). A model of glass surface peeling was also developed to prevent its occurrence during the fabrication process. The demonstrated method provides an economical approach to fabricating optical glass diffusers in a controlled and predictable manner. The produced optical diffusers have applications in fibre optics, LED systems, and spotlights.

  9. An enhanced lumped element electrical model of a double barrier memristive device

    NASA Astrophysics Data System (ADS)

    Solan, Enver; Dirkmann, Sven; Hansen, Mirko; Schroeder, Dietmar; Kohlstedt, Hermann; Ziegler, Martin; Mussenbrock, Thomas; Ochs, Karlheinz

    2017-05-01

    The massively parallel approach of neuromorphic circuits leads to effective methods for solving complex problems. It has turned out that resistive switching devices with a continuous resistance range are potential candidates for such applications. These devices are memristive systems: nonlinear resistors with memory. They are fabricated in nanotechnology, and hence parameter spread during fabrication may hamper reproducible analyses. This issue makes simulation models of memristive devices worthwhile. Kinetic Monte-Carlo simulations based on a distributed model of the device can be used to understand the underlying physical and chemical phenomena. However, such simulations are very time-consuming and convenient neither for investigations of whole circuits nor for real-time applications, e.g. emulation purposes. Instead, a concentrated model of the device can be used for both fast simulations and real-time applications. We introduce an enhanced electrical model of a valence change mechanism (VCM) based double barrier memristive device (DBMD) with a continuous resistance range. This device consists of an ultra-thin memristive layer sandwiched between a tunnel barrier and a Schottky contact. The introduced model enables very fast simulations using standard circuit simulation tools while maintaining physically meaningful parameters. Kinetic Monte-Carlo simulations based on a distributed model, together with experimental data, have been utilized as references to verify the concentrated model.

  10. An innovative privacy preserving technique for incremental datasets on cloud computing.

    PubMed

    Aldeen, Yousra Abdul Alsahib S; Salleh, Mazleena; Aljeroudi, Yazan

    2016-08-01

    Cloud computing (CC) is a magnificent service-based delivery model with gigantic computer processing power and data storage across connected communication channels. It has imparted an overwhelming technological impetus to the internet-mediated IT industry, where users can easily share private data for further analysis and mining. Furthermore, user-friendly CC services enable sundry applications to be deployed economically. Meanwhile, simple data sharing has impelled various phishing attacks and malware-assisted security threats. Some privacy-sensitive applications, like health services on the cloud, that are built with several economic and operational benefits necessitate enhanced security. Thus, absolute cyberspace security and mitigation against phishing attacks became mandatory to protect overall data privacy. Typically, diverse application datasets are anonymized to give owners better privacy, but without extending full secrecy guarantees to newly added records. Some proposed techniques have addressed this issue by re-anonymizing the datasets from scratch. The utmost privacy protection over incremental datasets on CC is far from being achieved. Certainly, the distribution of huge dataset volumes across multiple storage nodes limits privacy preservation. In this view, we propose a new anonymization technique to attain better privacy protection with high data utility over distributed and incremental datasets on CC. The proficiency of data privacy preservation and improved confidentiality requirements is demonstrated through performance evaluation. Copyright © 2016 Elsevier Inc. All rights reserved.
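    As a deliberately simplified illustration of the incremental-anonymization problem the paper targets, the sketch below checks k-anonymity over quasi-identifier columns and shows how a newly appended record can silently break a property that held for the original release. The records, attributes, and choice of k are invented for illustration; this is not the authors' technique:

```python
from collections import Counter

# Minimal k-anonymity check over quasi-identifier columns. Incremental
# inserts can break a property that held for the original release, which
# is the core difficulty with incremental datasets discussed above.
# All records and the choice of k are illustrative.
def is_k_anonymous(records, quasi_ids, k):
    groups = Counter(tuple(r[q] for q in quasi_ids) for r in records)
    return all(count >= k for count in groups.values())

release = [
    {"age": "30-39", "zip": "441**", "diagnosis": "flu"},
    {"age": "30-39", "zip": "441**", "diagnosis": "cold"},
    {"age": "40-49", "zip": "442**", "diagnosis": "flu"},
    {"age": "40-49", "zip": "442**", "diagnosis": "asthma"},
]
qids = ("age", "zip")
print(is_k_anonymous(release, qids, k=2))  # True: every group has >= 2 rows

# One incremental record creates a singleton group and breaks 2-anonymity.
release.append({"age": "50-59", "zip": "443**", "diagnosis": "flu"})
print(is_k_anonymous(release, qids, k=2))  # False
```

    Re-anonymizing from scratch after every insert restores the property but is expensive; avoiding that cost over distributed storage nodes is the gap the proposed technique addresses.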

  11. Two-stage controlled release system possesses excellent initial and long-term efficacy.

    PubMed

    Luo, Jian; Jing, Tong-Fang; Zhang, Da-Xia; Zhang, Xian-Peng; Li, Beixing; Liu, Feng

    2018-05-24

    In this work, a series of polyurea-based lambda-cyhalothrin-loaded microcapsules (MCs) with three different size distributions (average diameters of 1.35 μm, MC-S; 5.13 μm, MC-M; and 21.48 μm, MC-L) were prepared and characterized. The results indicated that MCs with a smaller particle size distribution had a faster release rate and excellent initial efficacy against pests. MC-L had a remarkably slow incipient release rate, outstanding photostability and better later-stage efficacy than the other tested MCs. The results clarified that the diameter distribution of MCs is the key factor determining the release properties and bioactivity of the MC formulations. Subsequently, the binary mixture MC formulations MC(S+M), MC(S+L) and MC(M+L) were obtained by mixing MC-S, MC-M or MC-L at a 1:1 ratio to establish a two-stage release system for foliar application. Greenhouse and field experiments showed that MC(S+L) provided optimal efficacy, and its effective duration was much longer than that of the emulsifiable concentrate (EC) group. Therefore, the release system established in this study is simple and workable for regulating initial and long-term efficacy by adjusting the particle size distribution; in addition, this system has potential applications in other fields such as drug delivery devices. Copyright © 2018 Elsevier B.V. All rights reserved.

  12. Advanced time and wavelength division multiplexing for metropolitan area optical data communication networks

    NASA Astrophysics Data System (ADS)

    Watford, M.; DeCusatis, C.

    2005-09-01

    With the advent of new regulations governing the protection and recovery of sensitive business data, including the Sarbanes-Oxley Act, there has been a renewed interest in business continuity and disaster recovery applications for metropolitan area networks. Specifically, there has been a need for more efficient bandwidth utilization and lower cost per channel to facilitate mirroring of multi-terabit databases. These applications have further blurred the boundary between metropolitan and wide area networks, with synchronous disaster recovery applications running up to 100 km and asynchronous solutions extending to 300 km or more. In this paper, we discuss recent enhancements in the Nortel Optical Metro 5200 Dense Wavelength Division Multiplexing (DWDM) platform, including features recently qualified for data communication applications such as Metro Mirror, Global Mirror, and Geographically Distributed Parallel Sysplex (GDPS). Using a 10 Gigabit/second (Gbit/s) backbone, this solution transports significantly more Fibre Channel protocol traffic with up to five times greater hardware density in the same physical package. This is also among the first platforms to utilize forward error correction (FEC) on the aggregate signals to improve bit error rate (BER) performance beyond industry standards. When combined with encapsulation into wide area network protocols, the use of FEC can compensate for impairments in BER across a service provider infrastructure without impacting application level performance. Design and implementation of these features will be discussed, including results from experimental test beds which validate these solutions for a number of applications. Future extensions of this environment will also be considered, including ways to provide configurable bandwidth on demand, mitigate Fibre Channel buffer credit management issues, and support for other GDPS protocols.
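    The BER benefit of forward error correction mentioned above can be illustrated with the simplest possible code, a 3x repetition code with majority voting. Real DWDM platforms use far stronger codes; nothing below reflects the Optical Metro 5200 implementation:

```python
# Post-FEC error probability for a 3x repetition code with majority
# voting: a decoded bit is wrong only if at least 2 of its 3 copies flip
# on the channel. This is the weakest possible FEC, shown only to
# illustrate why coding improves BER.
def repetition3_ber(channel_ber):
    p = channel_ber
    # exactly two flips (3 ways) plus all three flips
    return 3 * p**2 * (1 - p) + p**3

for raw in (1e-3, 1e-4, 1e-5):
    print(f"raw {raw:.0e} -> coded {repetition3_ber(raw):.1e}")
```

    Even this toy code turns a 1e-3 channel into roughly a 3e-6 link at the cost of tripling the bit rate; production FEC achieves far larger coding gains with far less overhead, which is what lets a lightly impaired service-provider span stay transparent to the application.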

  13. Electrical utilities model for determining electrical distribution capacity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fritz, R. L.

    1997-09-03

    In its simplest form, this model was to obtain meaningful data on the current state of the Site's electrical transmission and distribution assets, and turn this vast collection of data into useful information. The resulting product is an Electrical Utilities Model for Determining Electrical Distribution Capacity, which provides: the current state of the electrical transmission and distribution systems; critical Hanford Site needs based on outyear planning documents; and a decision factor model. This model will enable Electrical Utilities management to improve forecasting of requirements for service levels, budget, schedule, scope, and staffing, and to recommend the best path forward to satisfy customer demands at the minimum risk and least cost to the government. A dynamic document, the model will be updated annually to reflect changes in Hanford Site activities.

  14. [Research on the Application of Lean Management in Medical Consumables Material Logistics Management].

    PubMed

    Yang, Chai; Zhang, Wei; Gu, Wei; Shen, Aizong

    2016-11-01

    To address the problems of high cost, low resource utilization and low quality of medical care in the logistics management of medical consumables, and to make consumables management scientific, the problems existing in domestic hospital medical consumables logistics management are analyzed. Based on the lean management method, SPD (Supply, Processing, Distribution) is applied, combined with the HBOS (Hospital Business Operation System) and HIS (Hospital Information System) systems, to medical consumables management. Lean management is achieved in the purchase, warehouse construction, push distribution, clinical use and tracing of medical consumables. Lean management of medical consumables can effectively control logistics costs, optimize the allocation of resources, free up unnecessary time of medical staff, and improve the quality of medical care. It is a scientific management method.

  15. Low-cost interferometric TDM technology for dynamic sensing applications

    NASA Astrophysics Data System (ADS)

    Bush, Jeff; Cekorich, Allen

    2004-12-01

    A low-cost design approach for Time Division Multiplexed (TDM) fiber-optic interferometric interrogation of multi-channel sensor arrays is presented. This paper describes the evolutionary design process behind it. First, the requisite elements of interferometric interrogation are defined for a single-channel sensor. The concept is then extended to multi-channel sensor interrogation, implementing a TDM multiplexing scheme in which "traditional" design elements are utilized. The cost of the traditional TDM interrogator is investigated and concluded to be too high for entry into many markets. A new design approach is presented which significantly reduces the cost of TDM interrogation. This new approach, in accordance with the cost objectives, shows promise to bring this technology within the threshold of commercial acceptance for a wide range of distributed fiber sensing applications.

  16. A summary of ERTS data applications in Alaska

    NASA Technical Reports Server (NTRS)

    Miller, J. M.; Belon, A. E.

    1974-01-01

    ERTS has proven to be an exceedingly useful tool for the preparation of urgently needed resource surveys in Alaska. For this reason, the wide utilization of ERTS data by federal, state and industrial agencies in Alaska is increasingly directed toward the solution of operational problems in resource inventories, environmental surveys, and land use planning. Examples of some applications are discussed in connection with surveys of potential agricultural lands; mapping of predicted archaeological sites; permafrost terrain and aufeis mapping; snow melt enhancement from Prudhoe Bay roads; geologic interpretations correlated with possible new petroleum fields, with earthquake activity, and with plate tectonic motion along the Denali fault system; hydrology in monitoring surging glaciers and the break-up characteristics of the Chena River watershed; sea-ice morphology correlated with marine mammal distribution; and coastal sediment plume circulation patterns.

  17. Distribution and utilization of curative primary healthcare services in Lahej, Yemen.

    PubMed

    Bawazir, A A; Bin Hawail, T S; Al-Sakkaf, K A Z; Basaleem, H O; Muhraz, A F; Al-Shehri, A M

    2013-09-01

    No evidence-based data exist on the availability, accessibility and utilization of healthcare services in Lahej Governorate, Yemen. The aim of this study was to assess the distribution and utilization of curative services in primary healthcare units and centres in Lahej. Cross-sectional study (clustering sample). This study was conducted in three of the 15 districts in Lahej between December 2009 and August 2010. Household members were interviewed using a questionnaire to determine sociodemographic characteristics and types of healthcare services available in the area. The distribution of health centres, health units and hospitals did not match the size of the populations or areas of the districts included in this study. Geographical accessibility was the main obstacle to utilization. Factors associated with the utilization of curative services were significantly related to the time required to reach the nearest facility, seeking curative services during illness and awareness of the availability of health facilities (P < 0.01). There is an urgent need to look critically and scientifically at the distribution of healthcare services in the region in order to ensure accessibility and quality of services.

  18. Underestimation of Variance of Predicted Health Utilities Derived from Multiattribute Utility Instruments.

    PubMed

    Chan, Kelvin K W; Xie, Feng; Willan, Andrew R; Pullenayegum, Eleanor M

    2017-04-01

    Parameter uncertainty in value sets of multiattribute utility-based instruments (MAUIs) has received little attention previously. Ignoring it gives predicted utilities false precision, which leads to underestimation of the uncertainty of the results of cost-effectiveness analyses. The aim of this study is to examine the use of multiple imputation as a method to account for this uncertainty in MAUI scoring algorithms. We fitted a Bayesian model with random effects for respondents and health states to the data from the original US EQ-5D-3L valuation study, thereby estimating the uncertainty in the EQ-5D-3L scoring algorithm. We applied these results to EQ-5D-3L data from the Commonwealth Fund (CWF) Survey for Sick Adults (n = 3958), comparing the standard error of the estimated mean utility in the CWF population using the predictive distribution from the Bayesian mixed-effect model (i.e., incorporating parameter uncertainty in the value set) with the standard error of the estimated mean utilities based on multiple imputation and the standard error using the conventional approach of using the MAUI (i.e., ignoring uncertainty in the value set). The mean utility in the CWF population based on the predictive distribution of the Bayesian model was 0.827 with a standard error (SE) of 0.011. When utilities were derived using the conventional approach, the estimated mean utility was 0.827 with an SE of 0.003, which is only 25% of the SE based on the full predictive distribution of the mixed-effect model. Using multiple imputation with 20 imputed sets, the mean utility was 0.828 with an SE of 0.011, which is similar to the SE based on the full predictive distribution. Ignoring uncertainty of the predicted health utilities derived from MAUIs could lead to substantial underestimation of the variance of mean utilities. Multiple imputation corrects for this underestimation so that the results of cost-effectiveness analyses using MAUIs can report the correct degree of uncertainty.
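    The pooling step behind multiple imputation can be illustrated with Rubin's rules: the total variance is the average within-imputation variance plus an inflated between-imputation variance, so disagreement between imputed value sets widens the pooled standard error. A minimal sketch (the function name and the numbers in the usage note are illustrative, not taken from the paper):

    ```python
    import math

    def pool_estimates(means, variances):
        """Combine m imputed-dataset results with Rubin's rules.

        means: per-imputation estimates of the mean utility
        variances: per-imputation within-imputation variances (SE^2)
        Returns (pooled mean, pooled standard error).
        """
        m = len(means)
        pooled_mean = sum(means) / m
        within = sum(variances) / m                        # average within-imputation variance
        between = sum((x - pooled_mean) ** 2 for x in means) / (m - 1)
        total = within + (1 + 1 / m) * between             # Rubin's total variance
        return pooled_mean, math.sqrt(total)
    ```

    With any nontrivial between-imputation spread, the pooled SE exceeds the naive within-imputation SE, mirroring the gap the study reports between the conventional SE and the SE from the full predictive distribution.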

  19. Distribution of Hydrocarbon-Utilizing Microorganisms and Hydrocarbon Biodegradation Potentials in Alaskan Continental Shelf Areas

    PubMed Central

    Roubal, George; Atlas, Ronald M.

    1978-01-01

    Hydrocarbon-utilizing microorganisms were enumerated from Alaskan continental shelf areas by using plate counts and a new most-probable-number procedure based on mineralization of 14C-labeled hydrocarbons. Hydrocarbon utilizers were ubiquitously distributed, with no significant overall concentration differences between sampling regions or between surface water and sediment samples. There were, however, significant seasonal differences in numbers of hydrocarbon utilizers. Distribution of hydrocarbon utilizers within Cook Inlet was positively correlated with occurrence of hydrocarbons in the environment. Hydrocarbon biodegradation potentials were measured by using 14C-radiolabeled hydrocarbon-spiked crude oil. There was no significant correlation between numbers of hydrocarbon utilizers and hydrocarbon biodegradation potentials. The biodegradation potentials showed large seasonal variations in the Beaufort Sea, probably due to seasonal depletion of available nutrients. Non-nutrient-limited biodegradation potentials followed the order hexadecane > naphthalene ≫ pristane > benzanthracene. In Cook Inlet, biodegradation potentials for hexadecane and naphthalene were dependent on availability of inorganic nutrients. Biodegradation potentials for pristane and benzanthracene were restricted, probably by resistance to attack by available enzymes in the indigenous population. PMID:655706
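    The most-probable-number idea can be sketched generically: assuming organisms are Poisson-distributed in the inoculum, a tube receiving volume v scores positive (here, mineralizes the labeled hydrocarbon) with probability 1 − exp(−λv), and the density λ is chosen to maximize the binomial likelihood across all dilutions. A hedged illustration, not the authors' actual procedure; a simple grid search stands in for MPN table lookup:

    ```python
    import math

    def mpn_estimate(volumes, positives, tubes, grid=None):
        """Maximum-likelihood most-probable-number estimate.

        volumes: inoculum volume per tube at each dilution (mL)
        positives: number of positive tubes at each dilution
        tubes: total tubes at each dilution
        Returns the density (organisms/mL) maximizing the likelihood.
        """
        if grid is None:
            grid = [10 ** (k / 100) for k in range(-200, 501)]  # 0.01 .. 1e5 /mL

        def loglik(lam):
            ll = 0.0
            for v, p, n in zip(volumes, positives, tubes):
                q = 1.0 - math.exp(-lam * v)       # P(tube is positive)
                ll += p * math.log(q) if p else 0.0
                ll += (n - p) * (-lam * v)         # log P(negative) = -lam * v
            return ll

        return max(grid, key=loglik)
    ```

    For a classic three-dilution, five-tube series (0.1, 0.01, 0.001 mL) scoring 5-3-0 positives, the estimate lands near 79 organisms/mL, consistent with standard MPN tables.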

  20. A Method of Visualizing Three-Dimensional Distribution of Yeast in Bread Dough

    NASA Astrophysics Data System (ADS)

    Maeda, Tatsurou; Do, Gab-Soo; Sugiyama, Junichi; Oguchi, Kosei; Shiraga, Seizaburou; Ueda, Mitsuyoshi; Takeya, Koji; Endo, Shigeru

    A novel technique was developed to monitor the change in three-dimensional (3D) distribution of yeast in frozen bread dough samples in accordance with the progress of the mixing process. Application of a surface engineering technology allowed the identification of yeast in bread dough by bonding EGFP (Enhanced Green Fluorescent Protein) to the surface of yeast cells. The fluorescent yeast (a biomarker) was recognized as bright spots at the wavelength of 520 nm. A Micro-Slicer Image Processing System (MSIPS) with a fluorescence microscope was utilized to acquire cross-sectional images of frozen dough samples sliced at intervals of 1 μm. A set of successive two-dimensional images was reconstructed to analyze the 3D distribution of yeast. Samples were taken from each of four normal mixing stages (i.e., pick up, clean up, development, and final stages) and also from the over mixing stage. In the pick up stage, yeast distribution was uneven, with local areas of dense yeast. As the mixing progressed from clean up to final stages, the yeast became more evenly distributed throughout the dough sample. However, the uniformity in yeast distribution was lost in the over mixing stage, possibly due to the breakdown of the gluten structure within the dough sample.
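    The reconstruction-and-analysis step can be sketched in plain Python: stack the serial 1 μm sections into a volume, threshold the 520 nm bright spots as yeast markers, and score evenness of the distribution by the coefficient of variation of spot counts across subvolumes. The function names and the CV metric are our illustrative choices, not the MSIPS implementation:

    ```python
    def stack_slices(slices):
        """Stack successive 2D section images (lists of rows) into a 3D volume."""
        return list(slices)  # volume[z][y][x]

    def detect_spots(volume, threshold):
        """Return (z, y, x) coordinates of voxels brighter than threshold
        (bright spots at 520 nm correspond to EGFP-tagged yeast)."""
        return [(z, y, x)
                for z, plane in enumerate(volume)
                for y, row in enumerate(plane)
                for x, v in enumerate(row)
                if v >= threshold]

    def uniformity(coords, shape, bins=2):
        """Coefficient of variation of spot counts over bins^3 subvolumes;
        lower values mean a more even yeast distribution."""
        nz, ny, nx = shape
        counts = [0] * (bins ** 3)
        for z, y, x in coords:
            i = (z * bins // nz) * bins * bins + (y * bins // ny) * bins + (x * bins // nx)
            counts[i] += 1
        mean = sum(counts) / len(counts)
        if mean == 0:
            return 0.0
        var = sum((c - mean) ** 2 for c in counts) / len(counts)
        return (var ** 0.5) / mean
    ```

    On synthetic volumes, spots clustered in one corner yield a much higher CV than spots spread one per octant, matching the pick up stage versus final stage contrast described above.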
