Sample records for utility applications define

  1. Selection and development of small solar thermal power applications

    NASA Technical Reports Server (NTRS)

    Bluhm, S. A.; Kuehn, T. J.; Gurfield, R. M.

    1979-01-01

    The paper discusses the approach of the JPL Point Focusing Thermal and Electric Power Applications Project to selecting and developing applications for point-focusing distributed-receiver solar thermal electric power systems. Six application categories are defined. Results of application studies of U.S. utilities are presented. The economic value of solar thermal power systems was found to range from $900 to $2100/kWe in small community utilities of the Southwest.

  2. Aligned Carbon Nanotube Tape for Sensor Applications

    NASA Technical Reports Server (NTRS)

    Tucker, Dennis S.

    2013-01-01

    This effort concentrates on three sensor applications of aligned carbon nanotube (CNT) tape: a vibration gyroscope, which utilizes the piezoelectric properties of the tape and the Coriolis effect; an accelerometer, which utilizes the piezoresistive property; and a strain gauge, which also utilizes the piezoresistive property (the accelerometer and strain gauge can additionally exploit the piezoelectric effect). Planned work includes testing the piezoelectric properties using facilities at the Microfabrication Laboratory (AMRDEC); enhancing the piezoelectric effect using polyvinylidene fluoride and P(VDF-TrFE), which is readily polarizable, by spraying the matrix solution while winding fiber and by sandwiching CNT tape and PVDF film (two-level design of experiments); constructing and testing a prototype vibration gyroscope; constructing and testing a prototype accelerometer using a cantilever design; testing the strain sensitivity of CNT tape against an industrial strain gauge; and embedding CNT tape in composite samples, as well as on the surface, and testing to failure (four-point bend). A piezoelectric device exhibits an electrical response to an applied mechanical stress. It has both capacitance and resistance properties: applying an electric field from a waveform exerts a mechanical stress whose response can be monitored. The typical drive is a sinusoidal waveform of a defined voltage and period; the voltage is driven from 0 V to the positive defined voltage, back to 0, to the negative defined voltage, and back to 0. For example, with Vmax set to 10 V and the period set to 10 ms, the voltage starts at zero, goes to 10 V, returns to zero, goes to -10 V, and returns to zero during the 10 ms. Applying this electric field to a device under test (DUT), the capacitance and resistance responses can be observed. CNT tape is easier to manufacture and cheaper than micromachined silicon or other ceramic piezoelectrics used in gyroscopes and accelerometers; its properties can be modified during manufacture for specific applications; it has enhanced mechanical and thermal properties in addition to unique electrical properties; and, as a strain gauge for structural health monitoring, it provides an excellent material to embed within composite structures.
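
    The drive waveform described in the example (Vmax = 10 V, 10 ms period, swinging 0 to +10 V to 0 to -10 V to 0) is easy to script when automating the capacitance/resistance response test. The sketch below is illustrative only; the parallel-RC response model and its component values are assumptions, not measured CNT-tape properties.

    ```python
    import numpy as np

    def drive_waveform(vmax=10.0, period=0.010, n_samples=1000):
        """Bipolar sinusoidal test drive: 0 -> +Vmax -> 0 -> -Vmax -> 0 over one period."""
        t = np.linspace(0.0, period, n_samples)
        v = vmax * np.sin(2.0 * np.pi * t / period)
        return t, v

    if __name__ == "__main__":
        t, v = drive_waveform()
        # Hypothetical DUT response: current through a parallel RC model of the tape,
        # i = C dV/dt + V/R (placeholder values, not measured CNT-tape properties).
        C, R = 1e-9, 1e6
        i = C * np.gradient(v, t) + v / R
        print(f"Peak drive: {v.max():.1f} V, peak model current: {i.max()*1e6:.2f} uA")
    ```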

  3. Landsat hydrobiological classification for an inland fresh water marsh within Everglades National Park

    NASA Technical Reports Server (NTRS)

    Rose, P. W.; Rosendahl, P. C.

    1981-01-01

    The considered investigation is concerned with the application of Landsat Multispectral Scanner (MSS) data to the classification of vegetative communities and the establishment of flow vectors for the Shark River Slough in Everglades National Park, Florida. A systematic array of 'ground truth' was established utilizing comprehensive hydrologic field data and conventional high altitude infrared aerial photography. A control network was defined that represented all hydrobiological zones (those wetland vegetative communities that directly influence the rate of overland sheet flow) in the Shark River Slough. These data were then directly applied to the Landsat imagery utilizing an interactive multispectral processor which generated hydrographic maps of the slough and defined the surface radiance characteristics of each hydrobiological system. It was found that the application of Landsat imagery for hydrologic applications in a wetlands area, such as the Shark River Slough in Everglades National Park, is definitely a viable tool for resource management.

  4. Research Management--Of What Nature Is the Concept?

    ERIC Educational Resources Information Center

    Cook, Desmond L.

    Research management is defined as the application of both management and management science to a particular field of research and development activities. Seven components of research management include theory and methodology; the planning, implementation, and evaluation of research programs; communications; utilization; and special applications.…

  5. BLAST for Behind-the-Meter Applications Lite Tool | Transportation Research

    Science.gov Websites

    ...provided by NREL's PVWatts calculator. A generic utility rate structure framework makes it possible to define demand charges and energy costs to best represent your utility rate structure of interest; see the BLAST documentation for proper CSV formatting of rate structure values.

  6. Single-step affinity purification for fungal proteomics.

    PubMed

    Liu, Hui-Lin; Osmani, Aysha H; Ukil, Leena; Son, Sunghun; Markossian, Sarine; Shen, Kuo-Fang; Govindaraghavan, Meera; Varadaraj, Archana; Hashmi, Shahr B; De Souza, Colin P; Osmani, Stephen A

    2010-05-01

    A single-step protein affinity purification protocol using Aspergillus nidulans is described. Detailed protocols for cell breakage, affinity purification, and depending on the application, methods for protein release from affinity beads are provided. Examples defining the utility of the approaches, which should be widely applicable, are included.

  7. Experimental verification, and domain definition, of structural alerts for protein binding: epoxides, lactones, nitroso, nitros, aldehydes and ketones.

    PubMed

    Nelms, M D; Cronin, M T D; Schultz, T W; Enoch, S J

    2013-01-01

    This study outlines how a combination of in chemico and Tetrahymena pyriformis data can be used to define the applicability domain of selected structural alerts within the profilers of the OECD QSAR Toolbox. Thirty-three chemicals were profiled using the OECD and OASIS profilers, enabling the applicability domain of six structural alerts to be defined, the alerts being: epoxides, lactones, nitrosos, nitros, aldehydes and ketones. Analysis of the experimental data showed the applicability domains for the epoxide, nitroso, aldehyde and ketone structural alerts to be well defined. In contrast, the data showed the applicability domains for the lactone and nitro structural alerts needed modifying. The accurate definition of the applicability domain for structural alerts within in silico profilers is important due to their use in chemical category formation in predictive and regulatory toxicology. This study highlights the importance of utilizing multiple profilers in category formation.

  8. Utilizing Remote Sensing Data to Ascertain Soil Moisture Applications and Air Quality Conditions

    NASA Technical Reports Server (NTRS)

    Leptoukh, Gregory; Kempler, Steve; Teng, William; Friedl, Lawrence; Lynnes, Chris

    2009-01-01

    Recognizing the significance of NASA remote sensing Earth science data in monitoring and better understanding our planet's natural environment, NASA Earth Applied Sciences has implemented the 'Decision Support Through Earth Science Research Results' program. Several applications support systems have been implemented through collaborations with benefiting organizations. The Goddard Earth Sciences Data and Information Services Center (GES DISC) has participated in this program on two projects (one complete, one ongoing), and has had opportune ad hoc collaborations utilizing NASA Earth science data. GES DISC's understanding of Earth science missions and the resulting data and information enables it to identify the challenges that come with bringing science data to research applications. In this presentation we describe applications research projects utilizing NASA Earth science data and a variety of resulting GES DISC applications support system project experiences. In addition, the definition of metrics that truly evaluate success will be exemplified.

  9. Report of the Action Committee on Bioengineering.

    ERIC Educational Resources Information Center

    Schein, Martin W.

    Bioengineering has been defined as "the application of knowledge gained by a cross fertilization of engineering and the biological sciences so that both will be more fully utilized for the benefit of mankind." Bioengineering has at least six areas of application: (1) medical engineering, (2) environmental health engineering, (3)…

  10. Strange but True: The Physics of Glass, Gels and Jellies Is All Related through Rheology

    ERIC Educational Resources Information Center

    Sarker, Dipak K.

    2017-01-01

    Rheology is an enormously far-reaching branch of physics (or physical chemistry) and has a number of different guises. Rheological descriptions define fluids, semi-solids and conventional solids, and the application of this science defines the performance and utility of materials and substances as diverse as foods (such as yogurt and marmalade),…

  11. CRISPR/Cas9-Mediated Mutagenesis of Human Pluripotent Stem Cells in Defined Xeno-Free E8 Medium.

    PubMed

    Soh, Chew-Li; Huangfu, Danwei

    2017-01-01

    The recent advent of engineered nucleases including the CRISPR/Cas9 system has greatly facilitated genome manipulation in human pluripotent stem cells (hPSCs). In addition to facilitating hPSC-based disease studies, the application of genome engineering in hPSCs has also opened up new avenues for cell replacement therapy. To improve consistency and reproducibility of hPSC-based studies, and to meet the safety and regulatory requirements for clinical translation, it is necessary to use a defined, xeno-free cell culture system. This chapter describes protocols for CRISPR/Cas9 genome editing in an inducible Cas9 hPSC-based system, using cells cultured in chemically defined, xeno-free E8 Medium on a recombinant human vitronectin substrate. We detail procedures for the design and transfection of CRISPR guide RNAs, colony selection, and the expansion and validation of clonal mutant lines, all within this fully defined culture condition. These methods may be applied to a wide range of genome-engineering applications in hPSCs, including those that utilize different types of site-specific nucleases such as zinc finger nucleases (ZFNs) and TALENs, and form a closer step towards clinical utility of these cells.

  12. A Literature Review: Website Design and User Engagement.

    PubMed

    Garett, Renee; Chiu, Jason; Zhang, Ly; Young, Sean D

    2016-07-01

    Proper design has become a critical element needed to engage website and mobile application users. However, little research has been conducted to define the specific elements used in effective website and mobile application design. We attempt to review and consolidate research on effective design and to define a short list of elements frequently used in research. The design elements mentioned most frequently in the reviewed literature were navigation, graphical representation, organization, content utility, purpose, simplicity, and readability. We discuss how previous studies define and evaluate these seven elements. This review and the resulting short list of design elements may be used to help designers and researchers to operationalize best practices for facilitating and predicting user engagement.

  13. A Literature Review: Website Design and User Engagement

    PubMed Central

    Garett, Renee; Chiu, Jason; Zhang, Ly; Young, Sean D.

    2015-01-01

    Proper design has become a critical element needed to engage website and mobile application users. However, little research has been conducted to define the specific elements used in effective website and mobile application design. We attempt to review and consolidate research on effective design and to define a short list of elements frequently used in research. The design elements mentioned most frequently in the reviewed literature were navigation, graphical representation, organization, content utility, purpose, simplicity, and readability. We discuss how previous studies define and evaluate these seven elements. This review and the resulting short list of design elements may be used to help designers and researchers to operationalize best practices for facilitating and predicting user engagement. PMID:27499833

  14. 7 CFR 4288.110 - Applicant eligibility.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... RURAL UTILITIES SERVICE, DEPARTMENT OF AGRICULTURE PAYMENT PROGRAMS Advanced Biofuel Payment Program....119 present the requirements associated with advanced biofuel producer eligibility, biofuel... advanced biofuel producer, as defined in this subpart. (b) Eligibility determination. The Agency will...

  15. Space Software Defined Radio Characterization to Enable Reuse

    NASA Technical Reports Server (NTRS)

    Mortensen, Dale J.; Bishop, Daniel W.; Chelmins, David

    2012-01-01

    NASA's Space Communication and Navigation Testbed is beginning operations on the International Space Station this year. The objective is to promote new software defined radio technologies and associated software application reuse, enabled by this first flight of NASA's Space Telecommunications Radio System architecture standard. The Space Station payload has three software defined radios onboard that allow for a wide variety of communications applications; however, each radio was only launched with one waveform application. By design the testbed allows new waveform applications to be uploaded and tested by experimenters in and outside of NASA. During the system integration phase of the testbed special waveform test modes and stand-alone test waveforms were used to characterize the SDR platforms for the future experiments. Characterization of the Testbed's JPL SDR using test waveforms and specialized ground test modes is discussed in this paper. One of the test waveforms, a record and playback application, can be utilized in a variety of ways, including new satellite on-orbit checkout as well as independent on-board testbed experiments.

  16. Developing a UAS Program for Electric Utilities

    NASA Astrophysics Data System (ADS)

    Keltgen, James

    New innovations and technologies using unmanned aerial systems (UAS), or drones, have created unique opportunities for commercial applications. Electric utilities likewise recognize the benefits of using UAS as a tool in utility operations. Although the opportunities exist, establishing a UAS program for an electric utility is largely an endeavor of trial and error or research and development, with no clear path defined. By reviewing UAS use case examples and integrating lessons learned with Federal Aviation Administration (FAA) regulations, UAS best practices, unique electric utility values, legal and insurance perspectives, equipment selection, and thoughtful planning and preparation, a solution model is developed to establish a UAS program for electric utilities.

  17. Guideline for the utilization of commercial grade items in nuclear safety related applications: Final report. [Contains Glossary

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tulay, M.P.; Yurich, F.J.; Schremser, F.M. Jr.

    1988-06-01

    This guideline provides direction for the procurement and use of Commercial Grade Items (CGI) in safety-related applications. It is divided into five major sections. A glossary of terms and definitions, an acronym listing, and seven appendices have been included. The glossary defines terms used in this guideline. In certain instances, the definitions may be unique to this guideline. Identification of acronyms utilized in this guideline is also provided. Section 1 provides a background of the commercial grade item issues facing the nuclear industry. It provides a historical perspective of commercial grade item issues. Section 2 discusses the generic process for the acceptance of a commercial grade item for safety-related use. Section 3 defines the four distinct methods used to accept commercial grade items for safety-related applications. Section 4 lists specific references that are identified in this guideline. Section 5 is a bibliography of documents that were considered in developing this guideline, but were not directly referenced in the document.

  18. Advertising expenses examined in recent rulings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    Courts and commissions have handed down numerous rulings on the subject of utility advertising expenditures. Much controversy still abounds over who should be made to pay for such costs. Some specific cases concerning this issue are discussed. The majority of recent decisions agree that ratepayers should not have to pay for promotional or political advertising campaigns conducted by regulated utilities. Disagreement does exist as to how those terms should be defined in practice. Titles I and III of the Public Utility Regulatory Policies Act of 1978 direct state regulatory agencies to investigate advertising expenses by electric and natural gas utilities and to adopt, if appropriate, a policy denying recovery (from ratepayers) of expenses incurred for promotional or political advertising. Importantly, the act also purports to define those terms by explaining what types of expenditures do and do not fall within those categories. Titles I and III contain parallel definitions for both electric and natural gas utilities. Only those portions of the act applicable to electric utilities are discussed, with the troublesome area of nuclear advertising noted. (MCW)

  19. Software-defined Radio Based Measurement Platform for Wireless Networks

    PubMed Central

    Chao, I-Chun; Lee, Kang B.; Candell, Richard; Proctor, Frederick; Shen, Chien-Chung; Lin, Shinn-Yan

    2015-01-01

    End-to-end latency is critical to many distributed applications and services that are based on computer networks. There has been a dramatic push to adopt wireless networking technologies and protocols (such as WiFi, ZigBee, WirelessHART, Bluetooth, ISA100.11a, etc.) into time-critical applications. Examples of such applications include industrial automation, telecommunications, power utility, and financial services. While performance measurement of wired networks has been extensively studied, measuring and quantifying the performance of wireless networks face new challenges and demand different approaches and techniques. In this paper, we describe the design of a measurement platform based on the technologies of software-defined radio (SDR) and IEEE 1588 Precision Time Protocol (PTP) for evaluating the performance of wireless networks. PMID:27891210

  20. Software-defined Radio Based Measurement Platform for Wireless Networks.

    PubMed

    Chao, I-Chun; Lee, Kang B; Candell, Richard; Proctor, Frederick; Shen, Chien-Chung; Lin, Shinn-Yan

    2015-10-01

    End-to-end latency is critical to many distributed applications and services that are based on computer networks. There has been a dramatic push to adopt wireless networking technologies and protocols (such as WiFi, ZigBee, WirelessHART, Bluetooth, ISA100.11a, etc. ) into time-critical applications. Examples of such applications include industrial automation, telecommunications, power utility, and financial services. While performance measurement of wired networks has been extensively studied, measuring and quantifying the performance of wireless networks face new challenges and demand different approaches and techniques. In this paper, we describe the design of a measurement platform based on the technologies of software-defined radio (SDR) and IEEE 1588 Precision Time Protocol (PTP) for evaluating the performance of wireless networks.
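
    The latency calculation such a platform enables can be illustrated with a minimal sketch, assuming both endpoints timestamp packets with clocks disciplined by IEEE 1588 PTP, so that one-way latency is simply the receive timestamp minus the transmit timestamp. The packet record layout and field names below are illustrative assumptions, not the platform's actual format.

    ```python
    from dataclasses import dataclass
    from statistics import mean

    @dataclass
    class TimestampedPacket:
        seq: int
        tx_time_ns: int  # stamped by the sender's PTP-disciplined clock
        rx_time_ns: int  # stamped by the receiver's PTP-disciplined clock

    def one_way_latencies_us(packets):
        """One-way latency per packet; meaningful only because both clocks share a PTP timebase."""
        return sorted((p.rx_time_ns - p.tx_time_ns) / 1000.0 for p in packets)

    def summarize(packets):
        lat = one_way_latencies_us(packets)
        return {"mean_us": mean(lat),
                "p99_us": lat[int(0.99 * (len(lat) - 1))],
                "max_us": lat[-1]}

    # Synthetic example: 100 packets sent 1 ms apart with ~0.85-0.89 ms one-way delay.
    pkts = [TimestampedPacket(i, i * 1_000_000, i * 1_000_000 + 850_000 + 10_000 * (i % 5))
            for i in range(100)]
    print(summarize(pkts))
    ```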

  1. Information systems requirements for the Microgravity Science and Applications Program

    NASA Technical Reports Server (NTRS)

    Kicza, M. E.; Kreer, J. R.

    1991-01-01

    NASA's Microgravity Science and Applications (MSAD) Program is presented. Additionally, the types of information produced within the program and the anticipated growth in information system requirements as the program transitions to Space Station Freedom utilization are discussed. Plans for payload operations support in the Freedom era are addressed, as well as current activities to define research community requirements for data and sample archives.

  2. Information systems requirements for the microgravity science and applications program

    NASA Technical Reports Server (NTRS)

    Kicza, M. E.; Kreer, J. R.

    1990-01-01

    NASA's Microgravity Science and Applications (MSAD) Program is presented. Additionally, the types of information produced within the program and the anticipated growth in information system requirements as the program transitions to Space Station Freedom utilization are discussed. Plans for payload operations support in the Freedom era are addressed, as well as current activities to define research community requirements for data and sample archives.

  3. 7 CFR 4280.193 - Combined funding.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... RURAL UTILITIES SERVICE, DEPARTMENT OF AGRICULTURE LOANS AND GRANTS Renewable Energy Systems and Energy... a project for which an applicant is seeking a combined grant and guaranteed loan are defined as....107 and the borrower eligibility requirements specified in § 4280.121. Projects must meet the project...

  4. 7 CFR 4280.193 - Combined funding.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... RURAL UTILITIES SERVICE, DEPARTMENT OF AGRICULTURE LOANS AND GRANTS Renewable Energy Systems and Energy... a project for which an applicant is seeking a combined grant and guaranteed loan are defined as....107 and the borrower eligibility requirements specified in § 4280.121. Projects must meet the project...

  5. Assembly and microscopic characterization of DNA origami structures.

    PubMed

    Scheible, Max; Jungmann, Ralf; Simmel, Friedrich C

    2012-01-01

    DNA origami is a revolutionary method for the assembly of molecular nanostructures from DNA with precisely defined dimensions and with an unprecedented yield. This can be utilized to arrange nanoscale components such as proteins or nanoparticles into pre-defined patterns. For applications it will now be of interest to arrange such components into functional complexes and study their geometry-dependent interactions. While commonly DNA nanostructures are characterized by atomic force microscopy or electron microscopy, these techniques often lack the time-resolution to study dynamic processes. It is therefore of considerable interest to also apply fluorescence microscopic techniques to DNA nanostructures. Of particular importance here is the utilization of novel super-resolved microscopy methods that enable imaging beyond the classical diffraction limit.

  6. EVAL mission requirements, phase 1

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The aspects of NASA's applications mission were enhanced by utilization of shuttle/spacelab, and payload groupings which optimize the cost of achieving the mission goals were defined. Preliminary Earth Viewing Application Laboratory (EVAL) missions, experiments, sensors, and sensor groupings were developed. The major technological EVAL themes and objectives which NASA will be addressing during the 1980 to 2000 time period were investigated. Missions/experiments which addressed technique development, sensor development, application development, and/or operational data collection were considered as valid roles for EVAL flights.

  7. Artificial Intelligence and Expert Systems Research and Their Possible Impact on Information Science.

    ERIC Educational Resources Information Center

    Borko, Harold

    1985-01-01

    Defines artificial intelligence (AI) and expert systems; describes library applications utilizing AI to automate creation of document representations, request formulations, and design and modify search strategies for information retrieval systems; discusses expert system development for information services; and reviews impact of these…

  8. Metadata and network API aspects of a framework for storing and retrieving civil infrastructure monitoring data

    NASA Astrophysics Data System (ADS)

    Wong, John-Michael; Stojadinovic, Bozidar

    2005-05-01

    A framework has been defined for storing and retrieving civil infrastructure monitoring data over a network. The framework consists of two primary components: metadata and network communications. The metadata component provides the descriptions and data definitions necessary for cataloging and searching monitoring data. The communications component provides Java classes for remotely accessing the data. Packages of Enterprise JavaBeans and data handling utility classes are written to use the underlying metadata information to build real-time monitoring applications. The utility of the framework was evaluated using wireless accelerometers on a shaking table earthquake simulation test of a reinforced concrete bridge column. The NEESgrid data and metadata repository services were used as a backend storage implementation. A web interface was created to demonstrate the utility of the data model and provides an example health monitoring application.

  9. A technique for increasing the accuracy of the numerical inversion of the Laplace transform with applications

    NASA Technical Reports Server (NTRS)

    Berger, B. S.; Duangudom, S.

    1973-01-01

    A technique is introduced which extends the range of useful approximation of numerical inversion techniques to many cycles of an oscillatory function without requiring either the evaluation of the image function for many values of s or the computation of higher-order terms. The technique consists in reducing a given initial value problem defined over some interval into a sequence of initial value problems defined over a set of subintervals. Several numerical examples demonstrate the utility of the method.

  10. Social Area Analysis in Program Evaluation and Planning.

    ERIC Educational Resources Information Center

    Piasecki, Joseph R.; Kamis-Gould, Edna

    1981-01-01

    Social area analysis (SAA) is defined, and its conceptual roots and principal applications identified. The utility of SAA is assessed by reviewing use of demographic data among several disciplines. Authors review seminal and recent contributions to the field. Ecological fallacies and other problematic facets of SAA are considered. (Author/DWH)

  11. Reframing Knowing, Being, and Doing in the Seminary Classroom

    ERIC Educational Resources Information Center

    Cahalan, Kathleen A.

    2011-01-01

    Seminary education requires that students learn a complex body of theological knowledge, engage in the practices of ministry, and develop as persons of faith and vocation. Utilizing the six aspects of significant learning experiences defined by L. Dee Fink--foundational knowledge, application, integration, the human dimension, caring, and learning…

  12. Operations and maintenance manual for the LDUA supervisory control and data acquisition system (LDUA System 4200) and control network (LDUA System 4400)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barnes, G.A.

    1998-03-11

    This document defines the requirements applicable to the operation, maintenance and storage of the Supervisory Control and Data Acquisition System (SCADAS) and Control Network in support of the Light Duty Utility Arm (LDUA) operations.

  13. Experimental demonstration of OpenFlow-enabled media ecosystem architecture for high-end applications over metro and core networks.

    PubMed

    Ntofon, Okung-Dike; Channegowda, Mayur P; Efstathiou, Nikolaos; Rashidi Fard, Mehdi; Nejabati, Reza; Hunter, David K; Simeonidou, Dimitra

    2013-02-25

    In this paper, a novel Software-Defined Networking (SDN) architecture is proposed for high-end Ultra High Definition (UHD) media applications. UHD media applications require huge amounts of bandwidth that can only be met with high-capacity optical networks. In addition, there are requirements for control frameworks capable of delivering effective application performance with efficient network utilization. A novel SDN-based Controller that tightly integrates application-awareness with network control and management is proposed for such applications. An OpenFlow-enabled test-bed demonstrator is reported with performance evaluations of advanced online and offline media- and network-aware schedulers.

  14. Nowcasting Earthquakes: A Comparison of Induced Earthquakes in Oklahoma and at the Geysers, California

    NASA Astrophysics Data System (ADS)

    Luginbuhl, Molly; Rundle, John B.; Hawkins, Angela; Turcotte, Donald L.

    2018-01-01

    Nowcasting is a new method of statistically classifying seismicity and seismic risk (Rundle et al. 2016). In this paper, the method is applied to the induced seismicity at the Geysers geothermal region in California and the induced seismicity due to fluid injection in Oklahoma. Nowcasting utilizes the catalogs of seismicity in these regions. Two earthquake magnitude thresholds are selected, one large, say M_λ ≥ 4, and one small, say M_σ ≥ 2. The method utilizes the number of small earthquakes that occur between pairs of large earthquakes, and the cumulative probability distribution of these interevent counts is obtained. The earthquake potential score (EPS) is defined by the number of small earthquakes that have occurred since the last large earthquake; the point where this count falls on the cumulative probability distribution of interevent counts gives the EPS. A major advantage of nowcasting is that it utilizes "natural time", earthquake counts between events, rather than clock time. Thus, it is not necessary to decluster aftershocks, and the results are applicable if the level of induced seismicity varies in time. The application of natural time to the accumulation of the seismic hazard depends on the applicability of Gutenberg-Richter (GR) scaling. The increasing number of small earthquakes that occur after a large earthquake can be scaled to give the risk of a large earthquake occurring. To illustrate our approach, we utilize the number of M_σ ≥ 2.75 earthquakes in Oklahoma to nowcast the number of M_λ ≥ 4.0 earthquakes in Oklahoma. The applicability of the scaling is illustrated during the rapid build-up of injection-induced seismicity between 2012 and 2016, and the subsequent reduction in seismicity associated with a reduction in fluid injections. The same method is applied to the geothermal-induced seismicity at the Geysers, California, for comparison.
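
    The earthquake potential score (EPS) calculation described above reduces to a short computation over a time-ordered catalog of magnitudes. The sketch below follows the abstract's definition of natural-time interevent counts and their empirical cumulative distribution; the thresholds are those quoted for Oklahoma, but the catalog is synthetic rather than the actual Oklahoma or Geysers data.

    ```python
    import numpy as np

    def earthquake_potential_score(magnitudes, m_small=2.75, m_large=4.0):
        """EPS via natural-time nowcasting: the counts of small events between large events
        define an empirical CDF, and the EPS is the CDF value of the count of small events
        since the most recent large event. Magnitudes must be in time order."""
        interevent_counts, current = [], 0
        for m in magnitudes:
            if m >= m_large:
                interevent_counts.append(current)  # close out the current natural-time interval
                current = 0
            elif m >= m_small:
                current += 1
        if not interevent_counts:
            raise ValueError("catalog contains no large events; EPS is undefined")
        counts = np.sort(interevent_counts)
        # Empirical cumulative probability of the count accumulated since the last large event.
        return np.searchsorted(counts, current, side="right") / len(counts)

    # Toy Gutenberg-Richter-like catalog (time-ordered magnitudes), not real data:
    rng = np.random.default_rng(0)
    catalog = 2.0 + rng.exponential(0.4, size=5000)
    print(f"EPS = {earthquake_potential_score(catalog):.2f}")
    ```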

  15. Power Budget Analysis for High Altitude Airships

    NASA Technical Reports Server (NTRS)

    Choi, Sang H.; Elliott, James R.; King, Glen C.

    2006-01-01

    The High Altitude Airship (HAA) has various potential applications and mission scenarios that require onboard energy harvesting and power distribution systems. The energy source considered for the HAA's power budget is solar photon energy, which allows the use of either photovoltaic (PV) cells or advanced thermoelectric (ATE) converters. Both PV cells and an ATE system utilizing high performance thermoelectric materials were briefly compared to identify the advantages of ATE for HAA applications in this study. The ATE can generate a higher quantity of harvested energy than PV cells by utilizing the cascaded efficiency of a three-staged ATE in a tandem mode configuration. Assuming that each stage of ATE material has a figure of merit of 5, the cascaded efficiency of a three-staged ATE system approaches an overall conversion efficiency greater than 60%. Based on this estimated efficiency, the configuration of a HAA and the power utility modules are defined.
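
    As a check on the quoted figure, the cascade arithmetic can be written out explicitly: each ATE stage converts a fraction of the heat rejected by the stage above it, so the stage efficiencies combine multiplicatively. The per-stage value used below (about 26%) is an assumption chosen to be consistent with the abstract's quoted result; the actual per-stage efficiency of a ZT = 5 material depends on the temperature drop across each stage.

    ```latex
    % Three thermoelectric stages in a thermal cascade: stage i converts a fraction
    % \eta_i of the heat it receives and passes the remainder on to the next stage.
    \eta_{\text{cascade}} \;=\; 1 - \prod_{i=1}^{3}\left(1 - \eta_i\right)
    % With an assumed equal per-stage efficiency \eta_i \approx 0.26:
    \eta_{\text{cascade}} \;\approx\; 1 - (1 - 0.26)^3 \;\approx\; 0.60
    ```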

  16. Materials selection guidelines for geothermal energy utilization systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ellis, P.F. II; Conover, M.F.

    1981-01-01

    This manual includes geothermal fluid chemistry, corrosion test data, and materials operating experience. Systems using geothermal energy in El Salvador, Iceland, Italy, Japan, Mexico, New Zealand, and the United States are described. The manual provides materials selection guidelines for surface equipment of future geothermal energy systems. The key chemical species that are significant in determining the corrosiveness of geothermal fluids are identified. The utilization modes of geothermal energy are defined, as well as the various physical fluid parameters that affect corrosiveness. Both detailed and summarized results of materials performance tests and applicable operating experiences from forty sites throughout the world are presented. The application of various non-metal materials in geothermal environments is discussed. Included in appendices are: corrosion behavior of specific alloy classes in geothermal fluids, corrosion in seawater desalination plants, worldwide geothermal power production, DOE-sponsored utilization projects, plant availability, relative costs of alloys, and composition of alloys. (MHR)

  17. BOILER DESIGN CRITERIA FOR DRY SORBENT SO2 CONTROL WITH LOW-NOX BURNERS: NEW UNIT APPLICATIONS

    EPA Science Inventory

    The report describes a study to define boiler modifications required to achieve 70% SO2 removal with sorbent injection on a large tangentially fired utility boiler without supplemental spray drying. The study is a follow on to a recently completed broader evaluation of boiler des...

  18. EVALUATE THE UTILITY OF ENTEROCOCCI AS INDICATORS OF THE SOURCES OF FECAL CONTAMINATION IN IMPAIRED SUBWATERSHEDS THROUGH DNA-BASED MOLECULAR TECHNIQUES

    EPA Science Inventory

    Microbial source tracking (MST) is based on the assumption that specific strains of bacteria are associated with specific host species. MST methods are attractive because their application on environmental samples could help define the nature of water quality problems in impaire...

  19. Calculating the configurational entropy of a landscape mosaic

    Treesearch

    Samuel A. Cushman

    2016-01-01

    Applications of entropy and the second law of thermodynamics in landscape ecology are rare and poorly developed. This is a fundamental limitation given the centrally important role the second law plays in all physical and biological processes. A critical first step to exploring the utility of thermodynamics in landscape ecology is to define the configurational...

  20. MPD thruster application study

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Developmental considerations for the magneto-plasma-dynamic (MPD) thruster are defined. General characteristics of an MPD engine are compared to those of chemical propulsion and ion bombardment engines and performance criteria which are mission specific are examined. Requirements for thruster ground testing facilities are discussed and the utilization of the space shuttle for an orbital flight test is addressed.

  1. Analyzing Security Breaches in the U.S.: A Business Analytics Case-Study

    ERIC Educational Resources Information Center

    Parks, Rachida F.; Adams, Lascelles

    2016-01-01

    This is a real-world applicable case-study and includes background information, functional organization requirements, and real data. Business analytics has been defined as the technologies, skills, and practices needed to iteratively investigate historical performance to gain insight or spot trends. You are asked to utilize/apply critical thinking…

  2. Performance Evaluation of Hyperbolic Position Location Technique in Cellular Wireless Networks

    DTIC Science & Technology

    2002-03-13

    number of applications called location-based services, which can be defined as value-added services that utilize the knowledge of the mobile user's... Location-based services: "4-1-1", location-specific information such as local weather, mobile yellow pages, etc.; and mobile e-commerce, wireless

  3. Future directions for LDEF ionizing radiation modeling and assessments

    NASA Technical Reports Server (NTRS)

    Armstrong, T. W.; Colborn, B. L.

    1993-01-01

    A calculational program utilizing data from radiation dosimetry measurements aboard the Long Duration Exposure Facility (LDEF) satellite to reduce the uncertainties in current models defining the ionizing radiation environment is in progress. Most of the effort to date has been on using LDEF radiation dose measurements to evaluate models defining the geomagnetically trapped radiation, which has provided results applicable to radiation design assessments being performed for Space Station Freedom. Plans for future data comparisons, model evaluations, and assessments using additional LDEF data sets (LET spectra, induced radioactivity, and particle spectra) are discussed.

  4. Conference on Occupational Health Aspects of Advanced Composite Technology in the Aerospace Industry Held in Dayton, Ohio on 6-9 February 1989. Volume 2. Proceedings

    DTIC Science & Technology

    1989-03-01

    fibers do not appear to be a significant inhalation hazard nor are they biologically active in several in vitro test systems. Minor skin and eye...Additional emphasis on defining various methods to be utilized to define exposure including biological monitoring and application of various skin absorption...Threshold Limit Values and Biological Indices for 1988-1989, Cincinnati, Ohio Bartek, M.J., LaBulde, J.A., and Maibach, H.I. (1983). Skin permeability

  5. An application of LANDSAT multispectral imagery for the classification of hydrobiological systems, Shark River Slough, Everglades National Park, Florida

    NASA Technical Reports Server (NTRS)

    Rose, P. W.; Rosendahl, P. C. (Principal Investigator)

    1979-01-01

    Multivariant hydrologic parameters over the Shark River Slough were investigated. Ground truth was established utilizing U-2 infrared photography and comprehensive field data to define a control network which represented all hydrobiological systems in the slough. These data were then applied to LANDSAT imagery utilizing an interactive multispectral processor which generated hydrographic maps through classification of the slough and defined the multispectral surface radiance characteristics of the wetlands areas in the park. The spectral response of each hydrobiological zone was determined and plotted to formulate multispectral relationships between the emitted energy from the slough in order to determine the best possible multispectral wavelength combinations to enhance classification results. The extent of each hydrobiological zone in the slough was determined and flow vectors for water movement throughout the slough established.

  6. Plan of research for integrated soil moisture studies. Recommendations of the Soil Moisture Working Group

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Soil moisture information is a potentially powerful tool for applications in agriculture, water resources, and climate. At present, it is difficult for users of this information to clearly define their needs in terms of accuracy, resolution and frequency because of the current sparsity of data. A plan is described for defining and conducting an integrated and coordinated research effort to develop and refine remote sensing techniques which will determine spatial and temporal variations of soil moisture and to utilize soil moisture information in support of agricultural, water resources, and climate applications. The soil moisture requirements of these three different application areas were reviewed in relation to each other so that one plan covering the three areas could be formulated. Four subgroups were established to write and compile the plan, namely models, ground-based studies, aircraft experiments, and spacecraft missions.

  7. Development of a Model for the Representation of Nanotechnology-Specific Terminology

    PubMed Central

    Bailey, LeeAnn O.; Kennedy, Christopher H.; Fritts, Martin J.; Hartel, Francis W.

    2006-01-01

    Nanotechnology is an important, rapidly-evolving, multidisciplinary field [1]. The tremendous growth in this area necessitates the establishment of a common, open-source terminology to support the diverse biomedical applications of nanotechnology. Currently, the consensus process to define and categorize conceptual entities pertaining to nanotechnology is in a rudimentary stage. We have constructed a nanotechnology-specific conceptual hierarchy that can be utilized by end users to retrieve accurate, controlled terminology regarding emerging nanotechnology and corresponding clinical applications. PMID:17238469

  8. Phase 1 of the First Small Power System Experiment (engineering Experiment No. 1). Volume 4: Commercial System Definition. [development and testing of a solar thermal power plant

    NASA Technical Reports Server (NTRS)

    Holl, R. J.

    1979-01-01

    The development and design of a modular solar thermal power system for application in the 1 to 10 MWe range is described. The system is used in remote utility applications, small communities, rural areas, and for industrial uses. The operational reliability, the minimum risk of failure, and the maintenance and repair characteristics are determined and the commercial system design is defined.

  9. Photovoltaic village power application: Assessment of the near-term market

    NASA Technical Reports Server (NTRS)

    Rosenblum, L.; Bifano, W. J.; Poley, W. A.; Scudder, L. R.

    1978-01-01

    The village power application represents a potential market for photovoltaics. The price of energy for photovoltaic systems was compared to that of utility line extensions and diesel generators. The potential domestic demand was defined in both the government and commercial sectors. The foreign demand and sources of funding for village power systems in the developing countries were also discussed briefly. It was concluded that a near term domestic market of at least 12 MW min and a foreign market of about 10 GW exists.

  10. Photovoltaic water pumping applications: Assessment of the near-term market

    NASA Technical Reports Server (NTRS)

    Rosenblum, L.; Bifano, W. J.; Scudder, L. R.; Poley, W. A.; Cusick, J. P.

    1978-01-01

    Water pumping applications represent a potential market for photovoltaics. The price of energy for photovoltaic systems was compared to that of utility line extensions and diesel generators. The potential domestic demand was defined in the government, commercial/institutional and public sectors. The foreign demand and sources of funding for water pumping systems in the developing countries were also discussed briefly. It was concluded that a near term domestic market of at least 240 megawatts and a foreign market of about 6 gigawatts exist.

  11. Open Smart Energy Gateway (OpenSEG)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    The Open Smart Energy Gateway (OpenSEG) aims to provide near-real time smart meter data to consumers without the delays or latencies associated with it being transported to the utility data center and then back to the consumer's application. To do this, the gateway queries the local Smart Meter to which it is bound to get energy consumption information at pre-defined intervals (minimum interval is 4 seconds). OpenSEG then stores the resulting data internally for retrieval by an external application.
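
    A minimal polling-loop sketch of the behavior described above follows, assuming a configurable interval no shorter than the documented 4-second minimum and a local ring buffer for consumer retrieval. The meter-reading callable and the class interface are hypothetical placeholders for illustration, not the actual OpenSEG API.

    ```python
    import time
    from collections import deque

    MIN_INTERVAL_S = 4  # documented minimum polling interval

    class MeterGateway:
        def __init__(self, read_meter, interval_s=MIN_INTERVAL_S, buffer_len=3600):
            if interval_s < MIN_INTERVAL_S:
                raise ValueError(f"interval must be >= {MIN_INTERVAL_S} s")
            self.read_meter = read_meter            # hypothetical callable returning watts
            self.interval_s = interval_s
            self.buffer = deque(maxlen=buffer_len)  # local store for external retrieval

        def poll_once(self):
            """Query the bound smart meter and keep the reading locally."""
            self.buffer.append((time.time(), self.read_meter()))

        def latest(self, n=1):
            """What an external application fetches instead of waiting on the utility back end."""
            return list(self.buffer)[-n:]

    # Usage with a stand-in meter reading:
    gw = MeterGateway(read_meter=lambda: 742.0, interval_s=4)
    gw.poll_once()
    print(gw.latest())
    ```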

  12. EVALUATE THE UTILITY OF ENTEROCOCCI AND BACTEROIDES AS INDICATORS OF THE SOURCES OF FECAL CONTAMINATION IN IMPAIRED SUBWATERSHEDS THROUGH DNA-BASED MOLECULAR TECHNIQUES.

    EPA Science Inventory

    Microbial source tracking (MST) is based on the assumption that specific strains of bacteria are associated with specific host species. MST methods are attractive because their application on environmental samples could help define the nature of water quality problems in impaire...

  13. 7 CFR 1717.860 - Lien accommodations and subordinations under section 306E of the RE Act.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... (Continued) RURAL UTILITIES SERVICE, DEPARTMENT OF AGRICULTURE POST-LOAN POLICIES AND PROCEDURES COMMON TO... Regulatory Study Costs, and account 182.3, Other Regulatory Assets, as defined in 7 CFR part 1767. (c... § 1717.858, if requested by a borrower that meets the 110 percent equity test and all other applicable...

  14. Standardizing terminology and definitions of medication adherence and persistence in research employing electronic databases.

    PubMed

    Raebel, Marsha A; Schmittdiel, Julie; Karter, Andrew J; Konieczny, Jennifer L; Steiner, John F

    2013-08-01

    To propose a unifying set of definitions for prescription adherence research utilizing electronic health record prescribing databases, prescription dispensing databases, and pharmacy claims databases and to provide a conceptual framework to operationalize these definitions consistently across studies. We reviewed recent literature to identify definitions in electronic database studies of prescription-filling patterns for chronic oral medications. We then develop a conceptual model and propose standardized terminology and definitions to describe prescription-filling behavior from electronic databases. The conceptual model we propose defines 2 separate constructs: medication adherence and persistence. We define primary and secondary adherence as distinct subtypes of adherence. Metrics for estimating secondary adherence are discussed and critiqued, including a newer metric (New Prescription Medication Gap measure) that enables estimation of both primary and secondary adherence. Terminology currently used in prescription adherence research employing electronic databases lacks consistency. We propose a clear, consistent, broadly applicable conceptual model and terminology for such studies. The model and definitions facilitate research utilizing electronic medication prescribing, dispensing, and/or claims databases and encompasses the entire continuum of prescription-filling behavior. Employing conceptually clear and consistent terminology to define medication adherence and persistence will facilitate future comparative effectiveness research and meta-analytic studies that utilize electronic prescription and dispensing records.
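
    The distinction drawn here between primary adherence (whether a new prescription is ever filled) and secondary adherence (whether refills are obtained in time) can be made concrete with a simple days-covered calculation over dispensing records. The sketch below is a generic proportion-of-days-covered illustration under a common carry-forward convention, not the New Prescription Medication Gap measure proposed by the authors.

    ```python
    from datetime import date, timedelta

    def proportion_of_days_covered(fills, start, end):
        """A common secondary-adherence metric: fraction of days in [start, end) on which the
        patient had drug on hand, given (dispense_date, days_supply) dispensing records.
        Overlapping supplies are carried forward (a common simplifying convention)."""
        covered = set()
        carry = start
        for dispense_date, days_supply in sorted(fills):
            begin = max(dispense_date, carry)
            for d in range(days_supply):
                day = begin + timedelta(days=d)
                if start <= day < end:
                    covered.add(day)
            carry = begin + timedelta(days=days_supply)
        return len(covered) / (end - start).days

    # Three 30-day fills over a 120-day observation window -> PDC = 0.75
    fills = [(date(2013, 1, 1), 30), (date(2013, 2, 10), 30), (date(2013, 4, 1), 30)]
    print(f"PDC = {proportion_of_days_covered(fills, date(2013, 1, 1), date(2013, 5, 1)):.2f}")
    ```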

  15. Institutional design and utilization of evaluation: a contribution to a theory of evaluation influence based on Swiss experience.

    PubMed

    Balthasar, Andreas

    2009-06-01

    Growing interest in the institutionalization of evaluation in the public administration raises the question as to which institutional arrangement offers optimal conditions for the utilization of evaluations. Institutional arrangement denotes the formal organization of processes and competencies, together with procedural rules, that are applicable independently of individual evaluation projects. It reflects the evaluation practice of an institution and defines the distance between evaluators and evaluees. This article outlines the results of a broad-based study of all 300 or so evaluations that the Swiss Federal Administration completed from 1999 to 2002. On this basis, it derives a theory of the influence of institutional factors on the utilization of evaluations.

  16. Domestic applications for aerospace waste and water management technologies

    NASA Technical Reports Server (NTRS)

    Disanto, F.; Murray, R. W.

    1972-01-01

    Some of the aerospace developments in solid waste disposal and water purification, which are applicable to specific domestic problems are explored. Also provided is an overview of the management techniques used in defining the need, in utilizing the available tools, and in synthesizing a solution. Specifically, several water recovery processes will be compared for domestic applicability. Examples are filtration, distillation, catalytic oxidation, reverse osmosis, and electrodialysis. Solid disposal methods will be discussed, including chemical treatment, drying, incineration, and wet oxidation. The latest developments in reducing household water requirements and some concepts for reusing water will be outlined.

  17. Defined-size DNA triple crossover construct for molecular electronics: modification, positioning and conductance properties.

    PubMed

    Linko, Veikko; Leppiniemi, Jenni; Paasonen, Seppo-Tapio; Hytönen, Vesa P; Toppari, J Jussi

    2011-07-08

    We present a novel, defined-size, small and rigid DNA template, a so-called B-A-B complex, based on DNA triple crossover motifs (TX tiles), which can be utilized in molecular scale patterning for nanoelectronics, plasmonics and sensing applications. The feasibility of the designed construct is demonstrated by functionalizing the TX tiles with one biotin-triethylene glycol (TEG) and efficiently decorating them with streptavidin, and furthermore by positioning and anchoring single thiol-modified B-A-B complexes to certain locations on a chip via dielectrophoretic trapping. Finally, we characterize the conductance properties of the non-functionalized construct, first by measuring DC conductivity and second by utilizing AC impedance spectroscopy in order to describe the conductivity mechanism of a single B-A-B complex using a detailed equivalent circuit model. This analysis also reveals further information about the conductivity of DNA structures in general.
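
    For readers unfamiliar with equivalent-circuit fitting, a minimal circuit element of the kind such AC impedance analyses build on is a series resistance plus a parallel RC pair. The paper's actual model for the B-A-B complex is more detailed; the expression below is only the generic form, with assumed element names.

    ```latex
    % Series resistance R_s (electrodes/contacts) in series with a parallel R_p C_p pair
    % (the molecular junction); \omega is the angular frequency of the applied AC signal.
    Z(\omega) \;=\; R_s + \frac{R_p}{1 + j\,\omega R_p C_p}
    % High-frequency limit: Z \to R_s;  low-frequency (DC) limit: Z \to R_s + R_p.
    ```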

  18. Developing Use Cases for Evaluation of ADMS Applications to Accelerate Technology Adoption: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Veda, Santosh; Wu, Hongyu; Martin, Maurice

    Grid modernization for distribution systems comprises the ability to effectively monitor and manage unplanned events while ensuring reliable operations. Integration of Distributed Energy Resources (DERs) and the proliferation of autonomous smart controllers like microgrids and smart inverters in the distribution networks challenge the status quo of distribution system operations. Advanced Distribution Management System (ADMS) technologies are being increasingly deployed to manage the complexities of operating distribution systems. The ability to evaluate ADMS applications in specific utility environments and for future scenarios will accelerate wider adoption of the ADMS and will lower the risks and costs of their implementation. This paper addresses the first step: identifying and defining the use cases for evaluating these applications. The applications selected for this discussion include Volt-VAr Optimization (VVO), Fault Location Isolation and Service Restoration (FLISR), Online Power Flow (OLPF)/Distribution System State Estimation (DSSE), and Market Participation. A technical description and general operational requirements for each of these applications are presented. The test scenarios that are most relevant to the utility challenges are also addressed.

  19. Application of Relational Contracting Methods to Federal Construction Projects

    DTIC Science & Technology

    2011-03-24

    Figure 1: Wittgenstein Model (Chan et al., 2010)...a precise and comprehensive definition of the concept (Chan et al., 2010). Ludwig Wittgenstein argued that complex concepts are unable to be defined...recently, Chan et al. (2010) utilized the Wittgenstein concept and both of these previous researchers' work to develop a model of the elements of

  20. A Review of LIDAR Radiometric Processing: From Ad Hoc Intensity Correction to Rigorous Radiometric Calibration.

    PubMed

    Kashani, Alireza G; Olsen, Michael J; Parrish, Christopher E; Wilson, Nicholas

    2015-11-06

    In addition to precise 3D coordinates, most light detection and ranging (LIDAR) systems also record "intensity", loosely defined as the strength of the backscattered echo for each measured point. To date, LIDAR intensity data have proven beneficial in a wide range of applications because they are related to surface parameters, such as reflectance. While numerous procedures have been introduced in the scientific literature, and even commercial software, to enhance the utility of intensity data through a variety of "normalization", "correction", or "calibration" techniques, the current situation is complicated by a lack of standardization, as well as confusing, inconsistent use of terminology. In this paper, we first provide an overview of basic principles of LIDAR intensity measurements and applications utilizing intensity information from terrestrial, airborne topographic, and airborne bathymetric LIDAR. Next, we review effective parameters on intensity measurements, basic theory, and current intensity processing methods. We define terminology adopted from the most commonly-used conventions based on a review of current literature. Finally, we identify topics in need of further research. Ultimately, the presented information helps lay the foundation for future standards and specifications for LIDAR radiometric calibration.
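
    One of the simplest intensity "corrections" surveyed in such reviews is normalizing the raw return for its range dependence. The sketch below applies the common inverse-square normalization to a reference range, with an optional incidence-angle term; in the paper's terminology this is an ad hoc correction, not a rigorous radiometric calibration, and the reference range is an assumed value.

    ```python
    import numpy as np

    def normalize_intensity(intensity, range_m, ref_range_m=100.0, incidence_rad=None):
        """Range-normalize raw LIDAR intensity: I_corr = I_raw * (R / R_ref)**2.
        Optionally divide by cos(incidence angle) to approximate a Lambertian target."""
        corrected = np.asarray(intensity, dtype=float) * (np.asarray(range_m) / ref_range_m) ** 2
        if incidence_rad is not None:
            corrected = corrected / np.cos(np.asarray(incidence_rad))
        return corrected

    # Two returns with equal raw intensity but different ranges normalize very differently:
    print(normalize_intensity([1200, 1200], [50.0, 200.0]))
    ```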

  1. A Review of LIDAR Radiometric Processing: From Ad Hoc Intensity Correction to Rigorous Radiometric Calibration

    PubMed Central

    Kashani, Alireza G.; Olsen, Michael J.; Parrish, Christopher E.; Wilson, Nicholas

    2015-01-01

    In addition to precise 3D coordinates, most light detection and ranging (LIDAR) systems also record “intensity”, loosely defined as the strength of the backscattered echo for each measured point. To date, LIDAR intensity data have proven beneficial in a wide range of applications because they are related to surface parameters, such as reflectance. While numerous procedures have been introduced in the scientific literature, and even commercial software, to enhance the utility of intensity data through a variety of “normalization”, “correction”, or “calibration” techniques, the current situation is complicated by a lack of standardization, as well as confusing, inconsistent use of terminology. In this paper, we first provide an overview of basic principles of LIDAR intensity measurements and applications utilizing intensity information from terrestrial, airborne topographic, and airborne bathymetric LIDAR. Next, we review effective parameters on intensity measurements, basic theory, and current intensity processing methods. We define terminology adopted from the most commonly-used conventions based on a review of current literature. Finally, we identify topics in need of further research. Ultimately, the presented information helps lay the foundation for future standards and specifications for LIDAR radiometric calibration. PMID:26561813

  2. Remote sensing applications for range management

    NASA Technical Reports Server (NTRS)

    Haas, R. H.

    1981-01-01

    The use of satellite information for range management is discussed. The use of infrared photography and color photography for analysis of vegetation cover is described. The methods of interpreting LANDSAT imagery are highlighted and possible applications of such interpretive methods to range management are considered. The concept of using LANDSAT as a sampling frame for renewable natural resource inventories was examined. It is concluded that a blending of LANDSAT vegetation data with soils and digital terrain data will define a basic sampling unit that is appropriate for range management utilization.

  3. Application of symbolic computations to the constitutive modeling of structural materials

    NASA Technical Reports Server (NTRS)

    Arnold, Steven M.; Tan, H. Q.; Dong, X.

    1990-01-01

    In applications involving elevated temperatures, the derivation of mathematical expressions (constitutive equations) describing the material behavior can be quite time consuming, involved and error-prone. Therefore intelligent application of symbolic systems to facilitate this tedious process can be of significant benefit. Presented here is a problem oriented, self contained symbolic expert system, named SDICE, which is capable of efficiently deriving potential based constitutive models in analytical form. This package, running under DOE MACSYMA, has the following features: (1) potential differentiation (chain rule); (2) tensor computations (utilizing index notation), including both algebra and calculus; (3) efficient solution of sparse systems of equations; (4) automatic expression substitution and simplification; (5) back substitution of invariant and tensorial relations; (6) the ability to form the Jacobian and Hessian matrix; and (7) a relational data base. Limited aspects of invariant theory were also incorporated into SDICE due to the utilization of potentials as a starting point and the desire for these potentials to be frame invariant (objective). The uniqueness of SDICE resides in its ability to manipulate expressions in a general yet pre-defined order and simplify expressions so as to limit expression growth. Results are displayed, when applicable, utilizing index notation. SDICE was designed to aid and complement the human constitutive model developer. A number of examples are utilized to illustrate the various features contained within SDICE. It is expected that this symbolic package can and will provide a significant incentive to the development of new constitutive theories.
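
    The "potential differentiation" feature can be illustrated with an off-the-shelf symbolic package: the sketch below derives stress components by chain-rule differentiation of a simple frame-invariant potential written in terms of strain invariants, in the spirit of what SDICE automates. The choice of potential, the 2x2 case, and the use of SymPy rather than MACSYMA are illustrative assumptions, not SDICE's built-in models.

    ```python
    import sympy as sp

    # Treat the four tensor components as independent symbols and impose symmetry at the end,
    # which keeps the component-wise derivative sigma_ij = dW/d(epsilon_ij) correct.
    e = sp.Matrix(2, 2, lambda i, j: sp.Symbol(f"epsilon_{i+1}{j+1}", real=True))
    lam, mu = sp.symbols("lambda mu", positive=True)

    I1 = e.trace()          # first strain invariant
    I2 = (e * e).trace()    # second (quadratic) invariant

    # A simple frame-invariant potential: linear-elastic strain energy W(I1, I2).
    W = sp.Rational(1, 2) * lam * I1**2 + mu * I2

    # Potential differentiation: sigma_ij = dW/d(epsilon_ij), then enforce epsilon_21 = epsilon_12.
    sigma = sp.Matrix(2, 2, lambda i, j: sp.diff(W, e[i, j]))
    sigma = sigma.subs(e[1, 0], e[0, 1])
    sp.pprint(sp.simplify(sigma))   # recovers sigma = lambda*tr(eps)*I + 2*mu*eps
    ```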

  4. Theoretical models for application in school health education research.

    PubMed

    Parcel, G S

    1984-01-01

    Theoretical models that may be useful to research studies in school health education are reviewed. Selected, well-defined theories include social learning theory, problem-behavior theory, the theory of reasoned action, communications theory, coping theory, social competence, and social and family theories. Also reviewed are multiple-theory models, including models of health-related behavior, the PRECEDE Framework, social-psychological approaches, and the Activated Health Education Model. Two major reviews of teaching models are also discussed. The paper concludes with a brief outline of the general applications of theory to the field of school health education, including applications to basic research, development and design of interventions, program evaluation, and program utilization.

  5. Reliability Centered Maintenance - Methodologies

    NASA Technical Reports Server (NTRS)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  6. Material Processing Opportunities Utilizing a Free Electron Laser

    NASA Astrophysics Data System (ADS)

    Todd, Alan

    1996-11-01

    Many properties of photocathode-driven Free Electron Lasers (FEL) are extremely attractive for material processing applications. These include: 1) broad-band tunability across the IR and UV spectra, which permits wavelength optimization, depth deposition control and utilization of resonance phenomena; 2) picosecond pulse structure with continuous nanosecond spacing for optimum deposition efficiency and minimal collateral damage; 3) high peak and average radiated power for economic processing in quantity; and 4) high brightness for spatially defined energy deposition and intense energy density in small spots. We discuss five areas where IR or UV material processing will find application if the economics are favorable: polymer, metal and electronic material processing; micromachining; and defense applications. Specific examples in the IR and UV, such as surface texturing of polymers for improved look and feel, and anti-microbial food packaging films, which have been demonstrated using UV excimer lamps and lasers, will be given. Unfortunately, although the process utility is readily proven, the power levels and costs of lamps and lasers do not scale to production margins. However, from these examples, application-specific cost targets ranging from 0.1¢/kJ to 10¢/kJ of delivered radiation at power levels from 10 kW to 500 kW have been developed and are used to define strawman FEL processing systems. Since FEL radiation energy extraction from the generating electron beam is typically a few percent, at these high average power levels economic considerations dictate the use of a superconducting RF accelerator with energy recovery to minimize cavity and beam dump power loss. Such a 1 kW IR FEL, funded by the US Navy, is presently under construction at the Thomas Jefferson National Accelerator Facility. This dual-use device, scheduled to generate first light in late 1997, will test both the viability of high-power FELs for shipboard self-defense against cruise missiles and, for the first time, provide an industrial testbed capable of processing various materials in market evaluation quantities.

  7. The Effect of Explicit Metapragmatic Instruction on Request Speech Act Awareness of Intermediate EFL Students at Institute Level

    ERIC Educational Resources Information Center

    Masouleh, Fatemeh Abdollahizadeh; Arjmandi, Masoumeh; Vahdany, Fereydoon

    2014-01-01

    This study deals with the application of pragmatics research to EFL teaching. The need for language learners to utilize speech acts such as requests, which involve a series of strategies, was the significance of the study. Although the definition of different speech acts has been established since the 1960s, recently there has been a shift towards…

  8. SQA of finite element method (FEM) codes used for analyses of pit storage/transport packages

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Russel, E.

    1997-11-01

    This report contains viewgraphs on the software quality assurance of finite element method codes used for analyses of pit storage and transport projects. The methodology utilizes ISO 9000-3, the guideline for applying ISO 9001 to the development, supply, and maintenance of software, to establish well-defined software engineering processes that consistently maintain high-quality management approaches.

  9. Artificial Intelligence Applications to Testability.

    DTIC Science & Technology

    1984-10-01

    general software assistant; examining testability utilization of it should wait a few years until the software assistant is a well-defined product ...ago. It provides a single host which satisfies the needs of developers, product developers, and end users. As shown in table 5.10-2, it also provides...follows a trend towards more user-oriented design approaches to interactive computer systems. The implicit goal in this trend is the

  10. mGrid: A load-balanced distributed computing environment for the remote execution of the user-defined Matlab code

    PubMed Central

    Karpievitch, Yuliya V; Almeida, Jonas S

    2006-01-01

    Background: Matlab, a powerful and productive language that allows for rapid prototyping, modeling and simulation, is widely used in computational biology. Modeling and simulation of large biological systems often require more computational resources than are available on a single computer. Existing distributed computing environments like the Distributed Computing Toolbox, MatlabMPI, Matlab*G and others allow for the remote (and possibly parallel) execution of Matlab commands with varying support for features like an easy-to-use application programming interface, load-balanced utilization of resources, extensibility over the wide area network, and minimal system administration skill requirements. However, all of these environments require some level of access to participating machines to manually distribute the user-defined libraries that the remote call may invoke. Results: mGrid augments the usual process distribution seen in other similar distributed systems by adding facilities for user code distribution. mGrid's client-side interface is an easy-to-use native Matlab toolbox that transparently executes user-defined code on remote machines (i.e. the user is unaware that the code is executing somewhere else). Run-time variables are automatically packed and distributed with the user-defined code and automated load-balancing of remote resources enables smooth concurrent execution. mGrid is an open source environment. Apart from the programming language itself, all other components are also open source, freely available tools: light-weight PHP scripts and the Apache web server. Conclusion: Transparent, load-balanced distribution of user-defined Matlab toolboxes and rapid prototyping of many simple parallel applications can now be done with a single easy-to-use Matlab command. Because mGrid utilizes only Matlab, light-weight PHP scripts and the Apache web server, installation and configuration are very simple. Moreover, the web-based infrastructure of mGrid allows for it to be easily extensible over the Internet. PMID:16539707
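
    mGrid's actual mechanism is a Matlab toolbox backed by PHP scripts and Apache; purely as a hedged illustration of the underlying pattern (serializing a user-defined function together with its run-time variables and dispatching it round-robin to workers), here is a generic Python sketch. All names are hypothetical and both "workers" run locally.

    ```python
    import pickle
    from itertools import cycle

    def pack_task(func, *args, **kwargs):
        """Serialize a user-defined callable together with its run-time arguments."""
        return pickle.dumps((func, args, kwargs))

    def run_packed_task(blob):
        """Worker side: unpack and execute the shipped task."""
        func, args, kwargs = pickle.loads(blob)
        return func(*args, **kwargs)

    def square(x):          # stand-in for a user-defined toolbox function
        return x * x

    # Round-robin "load balancing" over two hypothetical workers (both local here).
    workers = cycle([run_packed_task, run_packed_task])
    results = [next(workers)(pack_task(square, x)) for x in range(4)]
    print(results)          # [0, 1, 4, 9]
    ```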

  11. mGrid: a load-balanced distributed computing environment for the remote execution of the user-defined Matlab code.

    PubMed

    Karpievitch, Yuliya V; Almeida, Jonas S

    2006-03-15

    Matlab, a powerful and productive language that allows for rapid prototyping, modeling and simulation, is widely used in computational biology. Modeling and simulation of large biological systems often require more computational resources than are available on a single computer. Existing distributed computing environments like the Distributed Computing Toolbox, MatlabMPI, Matlab*G and others allow for the remote (and possibly parallel) execution of Matlab commands with varying support for features like an easy-to-use application programming interface, load-balanced utilization of resources, extensibility over the wide area network, and minimal system administration skill requirements. However, all of these environments require some level of access to participating machines to manually distribute the user-defined libraries that the remote call may invoke. mGrid augments the usual process distribution seen in other similar distributed systems by adding facilities for user code distribution. mGrid's client-side interface is an easy-to-use native Matlab toolbox that transparently executes user-defined code on remote machines (i.e. the user is unaware that the code is executing somewhere else). Run-time variables are automatically packed and distributed with the user-defined code and automated load-balancing of remote resources enables smooth concurrent execution. mGrid is an open source environment. Apart from the programming language itself, all other components are also open source, freely available tools: light-weight PHP scripts and the Apache web server. Transparent, load-balanced distribution of user-defined Matlab toolboxes and rapid prototyping of many simple parallel applications can now be done with a single easy-to-use Matlab command. Because mGrid utilizes only Matlab, light-weight PHP scripts and the Apache web server, installation and configuration are very simple. Moreover, the web-based infrastructure of mGrid allows for it to be easily extensible over the Internet.

  12. OIL—Output input language for data connectivity between geoscientific software applications

    NASA Astrophysics Data System (ADS)

    Amin Khan, Khalid; Akhter, Gulraiz; Ahmad, Zulfiqar

    2010-05-01

    Geoscientific computing has become so complex that no single software application can perform all the processing steps required to get the desired results. Thus for a given set of analyses, several specialized software applications are required, which must be interconnected for electronic flow of data. In this network of applications the outputs of one application become inputs of other applications. Each of these applications usually involves more than one data type and may have its own data formats, making it incompatible with other applications in terms of data connectivity. Consequently several data format conversion utilities are developed in-house to provide data connectivity between applications. Practically there is no end to this problem, as each time a new application is added to the system, a set of new data conversion utilities needs to be developed. This paper presents a flexible data format engine, programmable through a platform-independent, interpreted language named Output Input Language (OIL). Its unique architecture allows input and output formats to be defined independent of each other by two separate programs. Thus read and write for each format is coded only once, and a data connectivity link between two formats is established by a combination of their read and write programs. This results in fewer programs with no redundancy and maximum reuse, enabling rapid application development and easy maintenance of data connectivity links.
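
    The key architectural point, that readers and writers are defined independently and any conversion is simply one format's read composed with another's write, can be sketched as follows. This is a hedged toy registry in Python, not the OIL interpreter itself; the format names and record layout are invented.

    ```python
    # Each format registers one reader and one writer; an A-to-B link is read_A + write_B.
    readers = {
        "csv_xy": lambda text: [tuple(map(float, ln.split(","))) for ln in text.splitlines() if ln],
        "ssv_xy": lambda text: [tuple(map(float, ln.split())) for ln in text.splitlines() if ln],
    }
    writers = {
        "csv_xy": lambda recs: "\n".join(f"{x},{y}" for x, y in recs),
        "ssv_xy": lambda recs: "\n".join(f"{x} {y}" for x, y in recs),
    }

    def convert(text, src, dst):
        """Connect any two registered formats without a dedicated converter."""
        return writers[dst](readers[src](text))

    print(convert("1.0,2.0\n3.0,4.0", "csv_xy", "ssv_xy"))   # 1.0 2.0 / 3.0 4.0 on two lines
    ```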

  13. Technological aspects of lift-slab method in high-rise-building construction.

    NASA Astrophysics Data System (ADS)

    Gaidukov, Pavel V.; Pugach, Evgeny M.

    2018-03-01

    This article considers the efficiency of slab-lifting technology for high-rise building construction. The main task is to identify the capabilities of the technology and thereby establish where the method can be applied. Lift-slab technology is compared with conventional sequential concrete-frame erection: the parameters of the former are defined, and an organizational model of the latter is developed; this model also establishes the boundaries within which each method can be used. A mathematical model describing the boundary conditions for applying these technologies is then created, allowing construction efficiency to be predicted for buildings with different numbers of storeys.

  14. Burst switching without guard interval in all-optical software-defined star intra-data center network

    NASA Astrophysics Data System (ADS)

    Ji, Philip N.; Wang, Ting

    2014-02-01

    Optical switching has been introduced in intra-data center networks (DCNs) to increase capacity and to reduce power consumption. Recently we proposed a star MIMO OFDM-based all-optical DCN with burst switching and software-defined networking. Here, we introduce the control procedure for the star DCN in detail for the first time. The timing, signaling, and operation are described for each step to achieve efficient bandwidth resource utilization. Furthermore, guidelines for selecting the burst assembling period that allows burst switching without a guard interval are discussed. The star all-optical DCN offers flexible and efficient control for next-generation data center applications.

  15. Validity and reliability of an application review process using dedicated reviewers in one stage of a multi-stage admissions model.

    PubMed

    Zeeman, Jacqueline M; McLaughlin, Jacqueline E; Cox, Wendy C

    2017-11-01

    With increased emphasis placed on non-academic skills in the workplace, a need exists to identify an admissions process that evaluates these skills. This study assessed the validity and reliability of an application review process involving three dedicated application reviewers in a multi-stage admissions model. A multi-stage admissions model was utilized during the 2014-2015 admissions cycle. After advancing through the academic review, each application was independently reviewed by two dedicated application reviewers utilizing a six-construct rubric (written communication, extracurricular and community service activities, leadership experience, pharmacy career appreciation, research experience, and resiliency). Rubric scores were extrapolated to a three-tier ranking to select candidates for on-site interviews. Kappa statistics were used to assess interrater reliability. A three-facet Many-Facet Rasch Model (MFRM) determined reviewer severity, candidate suitability, and rubric construct difficulty. The kappa statistic for candidates' tier rank score (n = 388 candidates) was 0.692 with a perfect agreement frequency of 84.3%. There was substantial interrater reliability between reviewers for the tier ranking (kappa: 0.654-0.710). Highest construct agreement occurred in written communication (kappa: 0.924-0.984). A three-facet MFRM analysis explained 36.9% of variance in the ratings, with 0.06% reflecting application reviewer scoring patterns (i.e., severity or leniency), 22.8% reflecting candidate suitability, and 14.1% reflecting construct difficulty. Utilization of dedicated application reviewers and a defined tiered rubric provided a valid and reliable method to effectively evaluate candidates during the application review process. These analyses provide insight into opportunities for improving the application review process among schools and colleges of pharmacy. Copyright © 2017 Elsevier Inc. All rights reserved.
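
    The interrater-reliability statistic reported above can be reproduced in outline with scikit-learn's cohen_kappa_score; the tier ranks below are synthetic values for two hypothetical reviewers, not the study's data.

    ```python
    from sklearn.metrics import cohen_kappa_score

    # Synthetic tier ranks (1 = advance, 2 = hold, 3 = decline) for two reviewers
    reviewer_a = [1, 1, 2, 3, 2, 1, 3, 2, 2, 1]
    reviewer_b = [1, 2, 2, 3, 2, 1, 3, 2, 1, 1]
    print(round(cohen_kappa_score(reviewer_a, reviewer_b), 3))
    ```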

  16. Beyond Widgets -- Systems Incentive Programs for Utilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Regnier, Cindy; Mathew, Paul; Robinson, Alastair

    Utility incentive programs remain one of the most significant means of deploying commercialized, but underutilized building technologies to scale. However, these programs have been largely limited to component-based products (e.g., lamps, RTUs). While some utilities do provide ‘custom’ incentive programs with whole building and system level technical assistance, these programs require deeper levels of analysis, resulting in higher program costs. This results in custom programs being restricted to utilities with greater resources, and they are typically applied mainly to large or energy-intensive facilities, leaving much of the market without cost-effective access and incentives for these solutions. In addition, with increasingly stringent energy codes, cost-effective component-based solutions that achieve significant savings are dwindling. Building systems (e.g., integrated façade, HVAC and/or lighting solutions) can deliver higher savings that translate into large sector-wide savings if deployed at the scale of these programs. However, systems application poses a number of challenges – baseline energy use must be defined and measured; the metrics for energy and performance must be defined and tested against; in addition, system savings must be validated under well understood conditions. This paper presents a sample of findings of a project to develop validated utility incentive program packages for three specific integrated building systems, in collaboration with Xcel Energy (CO, MN), ComEd, and a consortium of California Public Owned Utilities (CA POUs) (Northern California Power Agency (NCPA) and the Southern California Public Power Authority (SCPPA)). Furthermore, these program packages consist of system specifications, system performance, M&V protocols, streamlined assessment methods, market assessment and implementation guidance.

  17. Underwater manipulator's kinematic analysis for sustainable and energy efficient water hydraulics system

    NASA Astrophysics Data System (ADS)

    Hassan, Siti Nor Habibah; Yusof, Ahmad Anas; Tuan, Tee Boon; Saadun, Mohd Noor Asril; Ibrahim, Mohd Qadafie; Nik, Wan Mohd Norsani Wan

    2015-05-01

    In promoting energy saving and sustainability, this paper presents the development of a water hydraulics manipulator test rig for underwater applications. A kinematic analysis of the manipulator has been carried out in order to identify the workspace of the fabricated manipulator. The workspace is important because it defines the working area to be developed on the test rig for studying the effectiveness of using a water hydraulics system for underwater manipulation. An underwater manipulator that can utilize the surrounding sea water itself as the power and energy carrier offers clear advantages in sustainability and performance.
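
    As a hedged illustration of the workspace analysis described above, the sketch below sweeps the joint angles of a hypothetical two-link planar arm and reports the radial reach of the resulting end-effector points; the link lengths and joint limits are assumptions, not the test rig's actual kinematics.

    ```python
    import numpy as np

    def planar_2link_workspace(l1=0.4, l2=0.3, n=60):
        """Sweep joint angles of an assumed 2-link planar arm and return
        the reachable (x, y) end-effector points."""
        th1 = np.linspace(-np.pi / 2, np.pi / 2, n)
        th2 = np.linspace(-np.pi / 2, np.pi / 2, n)
        t1, t2 = np.meshgrid(th1, th2)
        x = l1 * np.cos(t1) + l2 * np.cos(t1 + t2)
        y = l1 * np.sin(t1) + l2 * np.sin(t1 + t2)
        return x.ravel(), y.ravel()

    x, y = planar_2link_workspace()
    print(f"radial reach: {np.hypot(x, y).min():.2f} .. {np.hypot(x, y).max():.2f} m")
    ```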

  18. Telerobotic management system: coordinating multiple human operators with multiple robots

    NASA Astrophysics Data System (ADS)

    King, Jamie W.; Pretty, Raymond; Brothers, Brendan; Gosine, Raymond G.

    2003-09-01

    This paper describes an application called the Tele-robotic management system (TMS) for coordinating multiple operators with multiple robots for applications such as underground mining. TMS utilizes several graphical interfaces to allow the user to define a partially ordered plan for multiple robots. This plan is then converted to a Petri net for execution and monitoring. TMS uses a distributed framework to allow robots and operators to easily integrate with the applications. This framework allows robots and operators to join the network and advertise their capabilities through services. TMS then decides whether tasks should be dispatched to a robot or a remote operator based on the services offered by the robots and operators.
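
    The conversion of a partially ordered plan into a Petri net for execution and monitoring can be illustrated with a minimal token-firing sketch; the places, transition, and tokens below are invented and do not reflect TMS's actual plan model.

    ```python
    # A transition fires only when every one of its input places holds a token.
    places = {"task_A_done": 1, "robot_free": 1, "task_B_done": 0}
    transitions = {
        "dispatch_B": {"in": ["task_A_done", "robot_free"], "out": ["task_B_done"]},
    }

    def fire(name):
        t = transitions[name]
        if all(places[p] > 0 for p in t["in"]):
            for p in t["in"]:
                places[p] -= 1
            for p in t["out"]:
                places[p] += 1
            return True
        return False

    print(fire("dispatch_B"), places)
    ```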

  19. Study of robotics systems applications to the space station program

    NASA Technical Reports Server (NTRS)

    Fox, J. C.

    1983-01-01

    Applications of robotics systems to potential uses of the Space Station as an assembly facility, and secondarily as a servicing facility, are considered. A typical robotics system mission is described along with the pertinent application guidelines and Space Station environmental assumptions utilized in developing the robotic task scenarios. A functional description of a supervised dual-robot space structure construction system is given, and four key areas of robotic technology are defined, described, and assessed. Alternate technologies for implementing the more routine space technology support subsystems that will be required to support the Space Station robotic systems in assembly and servicing tasks are briefly discussed. The environmental conditions impacting on the robotic configuration design and operation are reviewed.

  20. Nowcasting Induced Seismicity at the Groningen Gas Field in the Netherlands

    NASA Astrophysics Data System (ADS)

    Luginbuhl, M.; Rundle, J. B.; Turcotte, D. L.

    2017-12-01

    The Groningen natural gas field in the Netherlands has recently been a topic of controversy for many residents in the surrounding area. The gas field provides energy for the majority of the country; however, for a minority of Dutch citizens who live nearby, the seismicity induced by the gas field is a cause for major concern. Since the early 2000's, the region has seen an increase in both the number and magnitude of events, the largest of which was a magnitude 3.6 in 2012. Earthquakes of this size and smaller easily cause infrastructural damage to older houses and farms built with single brick walls. Nowcasting is a new method of statistically classifying seismicity and seismic risk. In this paper, the method is applied to the induced seismicity at the natural gas field in Groningen, Netherlands. Nowcasting utilizes the catalogs of seismicity in these regions. Two earthquake magnitude thresholds are selected, one large and one small. The method utilizes the number of small earthquakes that occur between pairs of large earthquakes, and the cumulative probability distribution of these interevent counts is obtained. The earthquake potential score (EPS) is defined by the number of small earthquakes that have occurred since the last large earthquake; the point where this number falls on the cumulative probability distribution of interevent counts defines the EPS. A major advantage of nowcasting is that it utilizes "natural time", earthquake counts between events, rather than clock time. Thus, it is not necessary to decluster aftershocks, and the results remain applicable if the level of induced seismicity varies in time, which it does in this case. The application of natural time to the accumulation of seismic hazard depends on the applicability of Gutenberg-Richter (GR) scaling. The increasing number of small earthquakes that occur after a large earthquake can be scaled to give the risk of a large earthquake occurring. To illustrate our approach, we utilize the Groningen catalog itself to nowcast seismicity in Groningen. The applicability of the scaling is illustrated during the rapid build-up of seismicity between 2004 and 2016. The method can now be used to forecast the expected reduction in seismicity associated with the reduction in gas production.
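
    A hedged sketch of the EPS calculation as described (count small events between successive large events, form the empirical cumulative distribution of those counts, and read off where the current count falls) is given below. The magnitude thresholds and the synthetic catalog are illustrative assumptions, not the Groningen data.

    ```python
    import numpy as np

    def earthquake_potential_score(mags, m_small, m_large, current_count):
        """EPS sketch: fraction of historical interevent small-quake counts
        that do not exceed the current count since the last large event."""
        mags = np.asarray(mags)
        large_idx = np.flatnonzero(mags >= m_large)
        counts = np.sort([
            np.sum((mags[i + 1:j] >= m_small) & (mags[i + 1:j] < m_large))
            for i, j in zip(large_idx[:-1], large_idx[1:])
        ])
        return np.searchsorted(counts, current_count, side="right") / len(counts)

    rng = np.random.default_rng(0)
    catalog = rng.uniform(1.0, 3.8, size=2000)   # synthetic magnitudes in event (natural-time) order
    print(earthquake_potential_score(catalog, m_small=2.0, m_large=3.5, current_count=40))
    ```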

  1. Absorbent product to absorb fluids. [for collection of human wastes

    NASA Technical Reports Server (NTRS)

    Dawn, F. S.; Correale, J. V. (Inventor)

    1982-01-01

    A multi-layer absorbent product for use in contact with the skin to absorb fluids is discussed. The product utilizes a water-pervious facing layer for contacting the skin, overlaid by a first fibrous wicking layer, the wicking layer preferably being of the one-way variety in which fluid or liquid is moved away from the facing layer. The product further includes a first container section defined by inner and outer layers of a water-pervious wicking material between which is disposed a first absorbent mass, and a second container section defined by inner and outer layers between which are disposed a second absorbent mass and a liquid-impermeable/gas-permeable layer. Spacesuit applications are discussed.

  2. Method and apparatus for determining and utilizing a time-expanded decision network

    NASA Technical Reports Server (NTRS)

    de Weck, Olivier (Inventor); Silver, Matthew (Inventor)

    2012-01-01

    A method, apparatus and computer program for determining and utilizing a time-expanded decision network are presented. A set of potential system configurations is defined. Next, switching costs are quantified to create a "static network" that captures the difficulty of switching among these configurations. A time-expanded decision network is produced by expanding the static network in time, including chance and decision nodes. Minimum-cost paths through the network are evaluated under plausible operating scenarios. The set of initial design configurations is iteratively modified to exploit high-leverage switches, and the process is repeated to convergence. Time-expanded decision networks are applicable, but not limited to, the design of systems, products, services and contracts.
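
    A minimal sketch of evaluating a minimum-cost path through a small time-expanded network is shown below; the configurations, periods, and edge costs are invented for illustration, and chance nodes are omitted for brevity.

    ```python
    import networkx as nx

    # Nodes are (configuration, period); edge weights bundle operating plus switching cost.
    G = nx.DiGraph()
    G.add_weighted_edges_from([
        (("start", 0), ("small", 1), 5), (("start", 0), ("large", 1), 9),
        (("small", 1), ("small", 2), 5), (("small", 1), ("large", 2), 8),
        (("large", 1), ("large", 2), 4), (("large", 1), ("small", 2), 7),
        (("small", 2), ("end", 3), 0),   (("large", 2), ("end", 3), 0),
    ])
    path = nx.shortest_path(G, ("start", 0), ("end", 3), weight="weight")
    cost = sum(G[u][v]["weight"] for u, v in zip(path, path[1:]))
    print(path, cost)
    ```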

  3. Performance evaluation of multi-stratum resources integration based on network function virtualization in software defined elastic data center optical interconnect.

    PubMed

    Yang, Hui; Zhang, Jie; Ji, Yuefeng; Tian, Rui; Han, Jianrui; Lee, Young

    2015-11-30

    Data center interconnect with elastic optical networks is a promising scenario to meet the high burstiness and high-bandwidth requirements of data center services. In our previous work, we implemented multi-stratum resilience between IP and elastic optical networks that allows data center services to be accommodated. This study extends that work to consider resource integration by breaking the limits of individual network devices, which can enhance resource utilization. We propose a novel multi-stratum resources integration (MSRI) architecture based on network function virtualization in a software defined elastic data center optical interconnect. A resource integrated mapping (RIM) scheme for MSRI is introduced in the proposed architecture. The MSRI can accommodate data center services through resource integration when a single function or resource is too scarce to provision the services, and it enhances the globally integrated optimization of optical network and application resources. The overall feasibility and efficiency of the proposed architecture are experimentally verified on the control plane of an OpenFlow-based enhanced software defined networking (eSDN) testbed. The performance of the RIM scheme under a heavy traffic load scenario is also quantitatively evaluated based on the MSRI architecture in terms of path blocking probability, provisioning latency and resource utilization, compared with other provisioning schemes.

  4. Measuring Road Network Vulnerability with Sensitivity Analysis

    PubMed Central

    Jun-qiang, Leng; Long-hai, Yang; Liu, Wei-yi; Zhao, Lin

    2017-01-01

    This paper focuses on the development of a method for road network vulnerability analysis, from the perspective of capacity degradation, which seeks to identify the critical infrastructures in the road network and the operational performance of the whole traffic system. This research involves defining the traffic utility index and modeling the vulnerability of road segments, routes, OD (Origin-Destination) pairs and the road network. Meanwhile, a sensitivity analysis method is utilized to calculate the change in the traffic utility index due to capacity degradation. This method, compared to traditional traffic assignment, can improve calculation efficiency and make the application of vulnerability analysis to large actual road networks possible. Finally, all the above models and the calculation method are applied to an actual road network evaluation to verify their efficiency and utility. This approach can be used as a decision-supporting tool for evaluating the performance of road networks and identifying critical infrastructures in transportation planning and management, especially in resource allocation for mitigation and recovery. PMID:28125706

  5. Maori responsiveness in health and medical research: key issues for researchers (part 1).

    PubMed

    Sporle, Andrew; Koea, Jonathan

    2004-08-06

    Application for contestable government-research funding and ethical approval requires researchers to outline how their intended research project contributes to Maori development or advancement. When formulating their research proposals, the key issues for researchers are research utility, defining Maori, informed consent, confidentiality, issues with human tissues and genetic material, participant remuneration and recognition (koha), intellectual property, and involvement of local Maori health or social services. The most common Maori responsiveness issues in research applications can be readily approached by researchers who address straightforward methodological concerns, by working through precedents established by peers and colleagues, as well as by working with end-users of their research.

  6. Forestry applications project/timber resource. Sam Houston National forest inventory and development of a survey planning model

    NASA Technical Reports Server (NTRS)

    Colwell, R. N.

    1976-01-01

    The Forestry Applications Project has been directed towards solving the problem of meeting informational needs of the resource managers utilizing remote sensing data sources including satellite data, conventional aerial photography, and direct measurement on the ground in such combinations as needed to best achieve these goals. It is recognized that sampling plays an important role in generating relevant information for managing large geographic populations. The central problem, therefore, is to define the kind and amount of sampling and the place of remote sensing data sources in that sampling system to do the best possible job of meeting the manager's informational needs.

  7. Development and Applications of a Stage Stacking Procedure

    NASA Technical Reports Server (NTRS)

    Kulkarni, Sameer; Celestina, Mark L.; Adamczyk, John J.

    2012-01-01

    The preliminary design of multistage axial compressors in gas turbine engines is typically accomplished with mean-line methods. These methods, which rely on empirical correlations, estimate compressor performance well near the design point, but may become less reliable off-design. For land-based applications of gas turbine engines, off-design performance estimates are becoming increasingly important, as turbine plant operators desire peaking or load-following capabilities and hot-day operability. The current work develops a one-dimensional stage stacking procedure, including a newly defined blockage term, which is used to estimate the off-design performance and operability range of a 13-stage axial compressor used in a power generating gas turbine engine. The new blockage term is defined to give mathematical closure on static pressure, and values of blockage are shown to collapse to curves as a function of stage inlet flow coefficient and corrected shaft speed. In addition to these blockage curves, the stage stacking procedure utilizes stage characteristics of ideal work coefficient and adiabatic efficiency. These curves are constructed using flow information extracted from computational fluid dynamics (CFD) simulations of groups of stages within the compressor. Performance estimates resulting from the stage stacking procedure are shown to match the results of CFD simulations of the entire compressor to within 1.6% in overall total pressure ratio and within 0.3 points in overall adiabatic efficiency. Utility of the stage stacking procedure is demonstrated by estimation of the minimum corrected speed which allows stable operation of the compressor. Further utility of the stage stacking procedure is demonstrated with a bleed sensitivity study, which estimates a bleed schedule to expand the compressor's operating range.
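
    A hedged sketch of the stage-stacking loop is given below: each stage's ideal work coefficient and adiabatic efficiency are read from assumed characteristic curves of flow coefficient, and the stage pressure ratios are multiplied through the machine. The curve shapes, blade speed, and other numbers are placeholders, and rematching effects such as the paper's blockage term are ignored.

    ```python
    def stage_stack(phi=0.5, n_stages=13, gamma=1.4, cp=1004.5, t_in=288.15, u_blade=280.0):
        """Multiply assumed per-stage total-pressure ratios through an axial compressor."""
        t_total, pr_overall = t_in, 1.0
        for _ in range(n_stages):
            psi = 0.45 - 0.3 * (phi - 0.5)        # assumed ideal work coefficient curve
            eta = 0.90 - 2.0 * (phi - 0.5) ** 2   # assumed adiabatic efficiency curve
            d_t = psi * u_blade ** 2 / cp         # stage total-temperature rise (Euler work / cp)
            pr_overall *= (1.0 + eta * d_t / t_total) ** (gamma / (gamma - 1.0))
            t_total += d_t
        return pr_overall

    print(f"overall total-pressure ratio ~ {stage_stack():.1f}")
    ```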

  8. A pyrosequencing assay for the quantitative methylation analysis of the PCDHB gene cluster, the major factor in neuroblastoma methylator phenotype.

    PubMed

    Banelli, Barbara; Brigati, Claudio; Di Vinci, Angela; Casciano, Ida; Forlani, Alessandra; Borzì, Luana; Allemanni, Giorgio; Romani, Massimo

    2012-03-01

    Epigenetic alterations are hallmarks of cancer and powerful biomarkers, whose clinical utilization is made difficult by the absence of standardization and of common methods of data interpretation. The coordinate methylation of many loci in cancer is defined as 'CpG island methylator phenotype' (CIMP) and identifies clinically distinct groups of patients. In neuroblastoma (NB), CIMP is defined by a methylation signature, which includes different loci, but its predictive power on outcome is entirely recapitulated by the PCDHB cluster only. We have developed a robust and cost-effective pyrosequencing-based assay that could facilitate the clinical application of CIMP in NB. This assay permits the unbiased simultaneous amplification and sequencing of 17 out of 19 genes of the PCDHB cluster for quantitative methylation analysis, taking into account all the sequence variations. As some of these variations were at CpG doublets, we bypassed the data interpretation conducted by the methylation analysis software to assign the corrected methylation value at these sites. The final result of the assay is the mean methylation level of 17 gene fragments in the protocadherin B (PCDHB) cluster. We have utilized this assay to compare the methylation levels of the PCDHB cluster between high-risk and very low-risk NB patients, confirming the predictive value of CIMP. Our results demonstrate that the pyrosequencing-based assay herein described is a powerful instrument for the analysis of this gene cluster that may simplify the data comparison between different laboratories and, in perspective, could facilitate its clinical application. Furthermore, our results demonstrate that, in principle, pyrosequencing can be efficiently utilized for the methylation analysis of gene clusters with high internal homologies.

  9. Software-defined optical network for metro-scale geographically distributed data centers.

    PubMed

    Samadi, Payman; Wen, Ke; Xu, Junjie; Bergman, Keren

    2016-05-30

    The emergence of cloud computing and big data has rapidly increased the deployment of small and mid-sized data centers. Enterprises and cloud providers require an agile network among these data centers to empower application reliability and flexible scalability. We present a software-defined inter data center network to enable on-demand scale out of data centers on a metro-scale optical network. The architecture consists of a combined space/wavelength switching platform and a Software-Defined Networking (SDN) control plane equipped with a wavelength and routing assignment module. It enables establishing transparent and bandwidth-selective connections from L2/L3 switches, on-demand. The architecture is evaluated in a testbed consisting of 3 data centers, 5-25 km apart. We successfully demonstrated end-to-end bulk data transfer and Virtual Machine (VM) migrations across data centers with less than 100 ms connection setup time and close to full link capacity utilization.

  10. [Generalization of the results of clinical studies through the analysis of subgroups].

    PubMed

    Costa, João; Fareleira, Filipa; Ascensão, Raquel; Vaz Carneiro, António

    2012-01-01

    Subgroup analyses in clinical trials are usually performed to explore potential heterogeneity of the treatment effect in relation to baseline risk, pathophysiology, the practical application of therapy, or the under-utilization in clinical practice of effective interventions due to uncertainties about their benefit/risk ratio. When appropriately planned, subgroup analyses are a valid methodology to define benefits in subgroups of patients, thus providing good-quality evidence to support clinical decision making. However, to be correct, subgroup analyses should be defined a priori, kept small in number, fully reported and, most importantly, subjected to statistical tests for interaction. In this paper we present an example from the treatment of post-menopausal osteoporosis, in which the benefit of an intervention with a specific agent (bazedoxifene), namely that the higher the fracture risk, the greater the benefit, was only disclosed after a post-hoc analysis of the initial global trial sample.

  11. A safety-based decision making architecture for autonomous systems

    NASA Technical Reports Server (NTRS)

    Musto, Joseph C.; Lauderbaugh, L. K.

    1991-01-01

    Engineering systems designed specifically for space applications often exhibit a high level of autonomy in the control and decision-making architecture. As the level of autonomy increases, more emphasis must be placed on assimilating the safety functions normally executed at the hardware level or by human supervisors into the control architecture of the system. The development of a decision-making structure which utilizes information on system safety is detailed. A quantitative measure of system safety, called the safety self-information, is defined. This measure is analogous to the reliability self-information defined by McInroy and Saridis, but includes weighting of task constraints to provide a measure of both reliability and cost. An example is presented in which the safety self-information is used as a decision criterion in a mobile robot controller. The safety self-information is shown to be consistent with the entropy-based Theory of Intelligent Machines defined by Saridis.

  12. Technology Utilization House Study Report. [For Energy Conservation

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The objectives of Project TECH are: (1) to construct a single family detached dwelling for demonstrating the application of advanced technology and minimizing the requirement for energy and utility services, and (2) to help influence future development in home construction by defining the interaction of integrated energy and water management systems with building configuration and construction materials. Components and methods expected to be cost-effective over a 20-year span were studied. Emphasis was placed on the utilization of natural heating and cooling characteristics. Orientation and location of windows, landscaping, natural ventilation, and characteristics of the local climate and microclimate were intended to be used to best advantage. Energy-conserving homes are most efficient when designed for specific sites; therefore, Project TECH should not be considered a prototype design suitable for all locations. However, it does provide ideas and analytical methods which can be applied to some degree in all housing.

  13. Multifunctional carbon nanoelectrodes fabricated by focused ion beam milling.

    PubMed

    Thakar, Rahul; Weber, Anna E; Morris, Celeste A; Baker, Lane A

    2013-10-21

    We report a strategy for fabrication of sub-micron, multifunctional carbon electrodes and application of these electrodes as probes for scanning electrochemical microscopy (SECM) and scanning ion conductance microscopy (SICM). The fabrication process utilized chemical vapor deposition of parylene, followed by thermal pyrolysis to form conductive carbon and then further deposition of parylene to form an insulation layer. To achieve well-defined electrode geometries, two methods of electrode exposure were utilized. In the first method, carbon probes were masked in polydimethylsiloxane (PDMS) to obtain a cone-shaped electrode. In the second method, the electrode area was exposed via milling with a focused ion beam (FIB) to reveal a carbon ring electrode, carbon ring/platinum disk electrode, or carbon ring/nanopore electrode. Carbon electrodes were batch fabricated (~35/batch) through the vapor deposition process and were characterized with scanning electron microscopy (SEM), scanning transmission electron microscopy (STEM), and cyclic voltammetry (CV) measurements. Additionally, Raman spectroscopy was utilized to examine the effects of Ga(+) ion implantation, a result of FIB milling. Constant-height, feedback mode SECM was performed with conical carbon electrodes and carbon ring electrodes. We demonstrate the utility of carbon ring/nanopore electrodes with SECM-SICM to simultaneously collect topography, ion current and electrochemical current images. In addition, carbon ring/nanopore electrodes were utilized in substrate generation/tip collection (SG/TC) SECM. In SG/TC SECM, localized delivery of redox molecules affords a higher resolution, than when the redox molecules are present in the bath solution. Multifunctional geometries of carbon electrode probes will find utility in electroanalytical applications, in general, and more specifically with electrochemical microscopy as discussed herein.

  14. Swarm formation control utilizing elliptical surfaces and limiting functions.

    PubMed

    Barnes, Laura E; Fields, Mary Anne; Valavanis, Kimon P

    2009-12-01

    In this paper, we present a strategy for organizing swarms of unmanned vehicles into a formation by utilizing artificial potential fields that were generated from normal and sigmoid functions. These functions construct the surface on which swarm members travel, controlling the overall swarm geometry and the individual member spacing. Nonlinear limiting functions are defined to provide tighter swarm control by modifying and adjusting a set of control variables that force the swarm to behave according to set constraints, formation, and member spacing. The artificial potential functions and limiting functions are combined to control swarm formation, orientation, and swarm movement as a whole. Parameters are chosen based on desired formation and user-defined constraints. This approach is computationally efficient and scales well to different swarm sizes, to heterogeneous systems, and to both centralized and decentralized swarm models. Simulation results are presented for a swarm of 10 and 40 robots that follow circle, ellipse, and wedge formations. Experimental results are included to demonstrate the applicability of the approach on a swarm of four custom-built unmanned ground vehicles (UGVs).
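
    A minimal sketch of the idea, under assumed functional forms rather than the paper's exact ones, is a Gaussian-type potential whose minimum lies on the formation ellipse and a sigmoid limiting function that caps each member's commanded speed:

    ```python
    import numpy as np

    def formation_potential(p, a=4.0, b=2.0):
        """Assumed potential with its minimum on the ellipse (x/a)^2 + (y/b)^2 = 1."""
        r = (p[0] / a) ** 2 + (p[1] / b) ** 2
        return 1.0 - np.exp(-(r - 1.0) ** 2)

    def limited_velocity(p, vmax=1.5, eps=1e-3):
        """Descend the potential with a sigmoid cap on commanded speed (numerical gradient)."""
        g = np.zeros(2)
        for i in range(2):
            dp = np.zeros(2); dp[i] = eps
            g[i] = (formation_potential(p + dp) - formation_potential(p - dp)) / (2 * eps)
        speed = np.linalg.norm(g)
        limit = vmax * (2.0 / (1.0 + np.exp(-speed)) - 1.0)   # sigmoid limiting function
        return -g / (speed + 1e-9) * limit

    print(limited_velocity(np.array([5.0, 0.0])))
    ```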

  15. Applications of space observations to the management and utilization of coastal fishery resources

    NASA Technical Reports Server (NTRS)

    Kemmerer, A. J.; Savastano, K. J.; Faller, K. H.

    1977-01-01

    Information needs of those concerned with the harvest and management of coastal fishery resources can be satisfied in part through applications of satellite remote sensing. Recently completed and ongoing investigations have demonstrated potentials for defining fish distribution patterns from multispectral data, monitoring fishing distribution and effort with synthetic aperture radar systems, forecasting recruitment of certain estuarine-dependent species, and tracking marine mammals. These investigations, which are reviewed in this paper, have relied on Landsat 1 and 2, Skylab-3, and Nimbus-6 supported sensors and sensors carried by aircraft and mounted on surface platforms to simulate applications from Seasat-A and other future spacecraft systems. None of the systems are operational as all were designed to identify and demonstrate applications and to aid in the specification of requirements for future spaceborne systems.

  16. Packet utilisation definitions for the ESA XMM mission

    NASA Technical Reports Server (NTRS)

    Nye, H. R.

    1994-01-01

    XMM, ESA's X-Ray Multi-Mirror satellite, due for launch at the end of 1999, will be the first ESA scientific spacecraft to implement the ESA packet telecommand and telemetry standards and will be the first ESOC-controlled science mission to take advantage of the new flight control system infrastructure development (based on object-oriented design and distributed-system architecture) due for deployment in 1995. The implementation of the packet standards is well defined at the packet transport level. However, the standard relevant to the application level (the ESA Packet Utilization Standard) covers a wide range of on-board 'services' applicable in varying degrees to the needs of XMM. In defining which parts of the ESA PUS to implement, the XMM project first considered the mission objectives and the derived operations concept and went on to identify a minimum set of packet definitions compatible with these aspects. This paper sets the scene as above and then describes the services needed for XMM and the telecommand and telemetry packet types necessary to support each service.

  17. Integrated Power and Attitude Control Systems for Space Station

    NASA Technical Reports Server (NTRS)

    Oglevie, R. E.; Eisenhaure, D. B.

    1985-01-01

    Integrated Power and Attitude Control Systems (IPACS) studies performed over a decade ago established the feasibility of simultaneously storing electrical energy in wheels and utilizing the resulting momentum for spacecraft attitude control. It was shown that such a system possessed many advantages over other contemporary energy storage and attitude control systems in many applications. More recent technology advances in composite rotors, magnetic bearings, and power control electronics have triggered new optimism regarding the feasibility and merits of such a system. The paper presents the results of a recent study whose focus was to define an advanced IPACS and to evaluate its merits for the Space Station application. A system and component design concept is developed to establish the system performance capability. A system level trade study, including life-cycle costing, is performed to define the merits of the system relative to two other candidate systems. It is concluded that an advanced IPACS concept is not only feasible, but offers substantial savings in mass, and life-cycle cost.

  18. Solar thermal plant impact analysis and requirements definition

    NASA Technical Reports Server (NTRS)

    Gupta, Y. P.

    1980-01-01

    Progress is summarized on a continuing study comprising ten tasks directed at defining the impact and requirements of solar thermal power systems (SPS), 1 to 10 MWe each in capacity, installed during 1985 through the year 2000 in utility or nonutility loads in the United States. Point focus distributed receiver (PFDR) solar power systems are emphasized. Tasks 1 through 4, completed to date, include the development of a comprehensive data base on SPS configurations and their performance, cost, availability, and potential applications; user loads and regional characteristics; and an analytic methodology that incorporates generally accepted utility financial planning methods along with several unique modifications to treat the significant and specific characteristics of solar power systems deployed in either central or distributed power generation modes.

  19. National Aeronautics and Space Administration fundamental research program. Information utilization and evaluation

    NASA Technical Reports Server (NTRS)

    Estes, J. E.; Eisgruber, L.

    1981-01-01

    In the second half of the 1980's, NASA can expect to face difficult choices among alternative fundamental research, applied research, and development projects that could potentially lead to improvements in the information systems used to manage renewable resources. The working group on information utilization and evaluation believes that effective choices cannot be made without a better understanding of the current and prospective problems and opportunities involved in the application of remote sensing to improve renewable resource information systems. A renewable resources information system is defined in a broad context to include a flow of data/information from acquisition through processing, storage, integration with other data, analysis, graphic presentation, decision making, and assessment of the effects of those decisions.

  20. Use of digital technologies for nasal prosthesis manufacturing.

    PubMed

    Palousek, David; Rosicky, Jiri; Koutny, Daniel

    2014-04-01

    Digital technology is becoming more accessible for common use in medical applications; however, its uptake in prosthetic and orthotic laboratories is not large because of the persistent image of difficult applicability to real patients. This article aims to offer a real example in the area of human facial prostheses. It describes the utilization of optical digitization, computational modelling, rapid prototyping, mould fabrication and manufacturing of a nasal silicone prosthesis. This technical note defines the key points of the methodology and aspires to contribute to the introduction of a certified manufacturing procedure. The results show that the technologies used reduce manufacturing time, reflect the patient's requirements and allow the manufacture of high-quality prostheses for missing facial asymmetric parts. The methodology provides a good basis for further development and is usable in clinical practice. Clinical relevance: Utilization of digital technologies in the facial prosthesis manufacturing process can contribute to higher patient comfort and higher production efficiency, but with higher initial investment and demands for experience with software tools.

  1. Redox Bulk Energy Storage System Study, Volume 1

    NASA Technical Reports Server (NTRS)

    Ciprios, G.; Erskine, W., Jr.; Grimes, P. G.

    1977-01-01

    Opportunities were found for electrochemical energy storage devices in the U.S. electric utility industry. Application requirements for these devices were defined, including techno-economic factors. A new device, the Redox storage battery was analyzed. The Redox battery features a decoupling of energy storage and power conversion functions. General computer methods were developed to simulate Redox system operations. These studies showed that the Redox system is potentially attractive if certain performance goals can be achieved. Pathways for reducing the cost of the Redox system were identified.

  2. The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology: Capabilities and Applications

    NASA Technical Reports Server (NTRS)

    Evers, Ken H.; Bachert, Robert F.

    1987-01-01

    The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology has been formulated and applied over a five-year period. It has proven to be a unique, integrated approach utilizing a top-down, structured technique to define and document the system of interest; a knowledge engineering technique to collect and organize system descriptive information; a rapid prototyping technique to perform preliminary system performance analysis; and a sophisticated simulation technique to perform in-depth system performance analysis.

  3. Nonlinear aerodynamic wing design

    NASA Technical Reports Server (NTRS)

    Bonner, Ellwood

    1985-01-01

    The applicability of new nonlinear theoretical techniques is demonstrated for supersonic wing design. The new technology was utilized to define outboard panels for an existing advanced tactical fighter model. Mach 1.6 maneuver point design and multi-operating point compromise surfaces were developed and tested. High aerodynamic efficiency was achieved at the design conditions. A corollary result was that only modest supersonic penalties were incurred to meet multiple aerodynamic requirements. The nonlinear potential analysis of a practical configuration arrangement correlated well with experimental data.

  4. Q14 - Standards Development Plan, Ada Interfaces to X Window System, Analysis and Recommendations

    DTIC Science & Technology

    1989-03-20

    portability and reusability. Introduction: Two major thrusts of the STARS program, and industry as a whole, are application...and IEEE, and in industry consortiums to show the directions X is taking and the opportunities for Ada to utilize this work. X is not the only window...and actually prohibit portability, but to avoid this the X developers formed the X Consortium, consisting of industry and academic members, who define

  5. Skylab IMSS checklist application study for emergency medical care. [emergency medical care operations involving the use and operation of the portable ambulance module

    NASA Technical Reports Server (NTRS)

    Carl, J. G.; Furukawa, S.

    1975-01-01

    A manual is presented that provides basic technical documentation to support the operation and utilization of the Portable Ambulance Module (PAM) in the field. The PAM is designed to be used for emergency resuscitation and victim monitoring. The functions of all the controls, displays, and stowed equipment of the unit are defined. Supportive medical and physiological data in those areas directly related to the uses of the PAM unit are presented.

  6. A methodology for comprehensive strategic planning and program prioritization

    NASA Astrophysics Data System (ADS)

    Raczynski, Christopher Michael

    2008-10-01

    The process developed in this work, Strategy Optimization for the Allocation of Resources (SOAR), is a strategic planning methodology based on Integrated Product and Process Development and systems engineering techniques. Utilizing a top-down approach, the process starts with the creation of the organization's vision and its measures of effectiveness. These measures are prioritized based on their application to external-world scenarios which will frame the future. The programs which will be used to accomplish this vision are identified by decomposing the problem. Information is gathered on each program's application, cost, schedule, risk, and other pertinent attributes. The relationships between the levels of the hierarchy are mapped utilizing subject matter experts. These connections are then utilized to determine the overall benefit of the programs to the vision of the organization. Through a Multi-Objective Genetic Algorithm (MOGA), a tradespace of potential program portfolios can be created, amongst which the decision maker can allocate resources. The information and portfolios are presented to the decision maker through the use of a Decision Support System (DSS) which collects and visualizes all the data in a single location. This methodology was tested on a science and technology planning exercise conducted by the United States Navy. A thorough decomposition was defined and technology programs were identified which had the potential to provide benefit to the vision. The prioritization of the top-level capabilities was performed through the use of a rank-ordering scheme, and a previous naval application was used to demonstrate a cumulative voting scheme. Voting was performed utilizing the Nominal Group Technique to capture the relationships between the levels of the hierarchy. Interrelationships between the technologies were identified, a MOGA was utilized to optimize portfolios with respect to these constraints, and the resulting information was placed in a DSS. This formulation allowed the decision makers to assess which portfolio could provide the greatest benefit to the Navy while still fitting within the funding profile.
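
    The tradespace idea can be illustrated with a hedged sketch that scores random portfolios and keeps the non-dominated ones in (benefit, cost) space; the benefit scores, costs, and portfolio count are synthetic, and the full MOGA machinery is omitted.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_prog = 12
    benefit = rng.uniform(1.0, 10.0, n_prog)   # hypothetical benefit of each program to the vision
    cost = rng.uniform(0.5, 5.0, n_prog)       # hypothetical program costs

    # Random portfolios stand in for the MOGA population; keep the Pareto-efficient ones.
    portfolios = rng.integers(0, 2, size=(500, n_prog)).astype(float)
    total_benefit = portfolios @ benefit
    total_cost = portfolios @ cost
    pareto = [i for i in range(len(total_benefit))
              if not np.any((total_benefit > total_benefit[i]) & (total_cost <= total_cost[i]))]
    print(f"{len(pareto)} non-dominated portfolios out of {len(total_benefit)}")
    ```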

  7. Genetic data simulators and their applications: an overview

    PubMed Central

    Peng, Bo; Chen, Huann-Sheng; Mechanic, Leah E.; Racine, Ben; Clarke, John; Gillanders, Elizabeth; Feuer, Eric J.

    2016-01-01

    Computer simulations have played an indispensable role in the development and application of statistical models and methods for genetic studies across multiple disciplines. The need to simulate complex evolutionary scenarios and pseudo-datasets for various studies has fueled the development of dozens of computer programs with varying reliability, performance, and application areas. To help researchers compare and choose the most appropriate simulators for their studies, we have created the Genetic Simulation Resources (GSR) website, which allows authors of simulation software to register their applications and describe them with more than 160 defined attributes. This article summarizes the properties of 93 simulators currently registered at GSR and provides an overview of the development and applications of genetic simulators. Unlike other review articles that address technical issues or compare simulators for particular application areas, we focus on software development, maintenance, and features of simulators, often from a historical perspective. Publications that cite these simulators are used to summarize both the applications of genetic simulations and the utilization of simulators. PMID:25504286

  8. Fabrication of 3D SiO x structures using patterned PMMA sacrificial layer

    NASA Astrophysics Data System (ADS)

    Li, Zhiqin; Xiang, Quan; Zheng, Mengjie; Bi, Kaixi; Chen, Yiqin; Chen, Keqiu; Duan, Huigao

    2018-02-01

    Three-dimensional (3D) nanofabrication based on electron-beam lithography (EBL) has drawn wide attention for various applications with its high patterning resolution and design flexibility. In this work, we present a bilayer EBL process to obtain 3D freestanding SiO x structures via the release of the bottom sacrificial layer. This new kind of bilayer process enables us to define various 3D freestanding SiO x structures with high resolution and low edge roughness. As a proof of concept for applications, metal-coated freestanding SiO x microplates with an underlying air gap were fabricated to form asymmetric Fabry-Perot resonators, which can be utilized for colorimetric refractive index sensing and thus also have application potential for biochemical detection, anti-counterfeiting and smart active nano-optical devices.

  9. Wideband unbalanced waveguide power dividers and combiners

    DOEpatents

    Halligan, Matthew; McDonald, Jacob Jeremiah; Strassner, II, Bernd H.

    2016-05-17

    The various technologies presented herein relate to waveguide dividers and waveguide combiners for application in radar systems, wireless communications, etc. Waveguide dividers-combiners can be manufactured in accordance with custom dimensions, as well as in accordance with waveguide standards such that the input and output ports are of a defined dimension and have a common impedance. Various embodiments are presented which can incorporate one or more septum(s), one or more pairs of septums, an iris, an input matching region, a notch located on the input waveguide arm, waveguide arms having stepped transformer regions, etc. The various divider configurations presented herein can be utilized in high fractional bandwidth applications, e.g., a fractional bandwidth of about 30%, and RF applications in the Ka frequency band (e.g., 26.5-40 GHz).
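
    For reference, fractional bandwidth is conventionally the passband width divided by the centre frequency; the short check below uses a hypothetical Ka-band sub-band (not figures taken from the patent) to show what a roughly 30% fractional bandwidth looks like.

        # Fractional bandwidth = (f_high - f_low) / f_center, with f_center = (f_high + f_low) / 2.
        # Hypothetical Ka-band sub-band, used only to illustrate a ~30% figure.
        f_low, f_high = 30.0e9, 40.5e9                       # Hz
        f_center = 0.5 * (f_low + f_high)
        fractional_bw = (f_high - f_low) / f_center
        print(f"fractional bandwidth = {fractional_bw:.1%}")  # about 30%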

  10. [Utilization and coverage of a Food and Nutritional Surveillance System in Rio Grande do Sul state, Brazil].

    PubMed

    Jung, Natália Miranda; Bairros, Fernanda de Souza; Neutzling, Marilda Borges

    2014-05-01

    This article seeks to describe the utilization and coverage percentage of the Nutritional and Food Surveillance System (SISVAN-Web) in the Regional Health Offices of Rio Grande do Sul in 2010 and to assess its correlation with socio-economic, demographic and health system organization variables at the time. It is an ecological study that used secondary data from the SISVAN-Web, the Department of Primary Health Care, the IT Department of the Unified Health System and the Brazilian Institute of Geography and Statistics. The evaluation of utilization and coverage data was restricted to nutritional status. The percentage of utilization of SISVAN-Web refers to the proportion of cities that fed data into the system. Total coverage was defined as the percentage of individuals in all stages of the life cycle monitored by SISVAN-Web. It was found that 324 cities fed the system, corresponding to a utilization percentage of 65.3%. Greater system coverage was observed in all Regional Health Coordination (RHC) Units for ages 0 to 5 years and 5 to 10 years. There was a significant association between the percentage of utilization of SISVAN-Web and Family Health Strategy coverage in each RHC Unit. The results of this study indicated low percentages of utilization and coverage of SISVAN-Web in Rio Grande do Sul.

  11. Experiment Software and Projects on the Web with VISPA

    NASA Astrophysics Data System (ADS)

    Erdmann, M.; Fischer, B.; Fischer, R.; Geiser, E.; Glaser, C.; Müller, G.; Rieger, M.; Urban, M.; von Cube, R. F.; Welling, C.

    2017-10-01

    The Visual Physics Analysis (VISPA) project defines a toolbox for accessing software via the web. It is based on the latest web technologies and provides a powerful extension mechanism that makes it possible to interface a wide range of applications. Beyond basic applications such as a code editor, a file browser, or a terminal, it meets the demands of sophisticated experiment-specific use cases that focus on physics data analyses and typically require a high degree of interactivity. As an example, we developed a data inspector that is capable of browsing interactively through the event content of several data formats, e.g., MiniAOD, which is utilized by the CMS collaboration. The VISPA extension mechanism can also be used to embed external web-based applications that benefit from dynamic allocation of user-defined computing resources via SSH. For example, by wrapping the JSROOT project, ROOT files located on any remote machine can be inspected directly through a VISPA server instance. We introduced domains that combine groups of users and role-based permissions. Thereby, tailored projects are enabled, e.g. for teaching, where access to students' homework is restricted to a team of tutors, or for experiment-specific data that may only be accessible to members of the collaboration. We present the extension mechanism including corresponding applications and give an outlook onto the new permission system.

  12. Multiple Signals Govern Utilization of a Polysaccharide in the Gut Bacterium Bacteroides thetaiotaomicron.

    PubMed

    Schwalm, Nathan D; Townsend, Guy E; Groisman, Eduardo A

    2016-10-11

    The utilization of simple sugars is widespread across all domains of life. In contrast, the breakdown of complex carbohydrates is restricted to a subset of organisms. A regulatory paradigm for integration of complex polysaccharide breakdown with simple sugar utilization was established in the mammalian gut symbiont Bacteroides thetaiotaomicron, whereby sensing of monomeric fructose regulates catabolism of both fructose and polymeric fructans. We now report that a different regulatory paradigm governs utilization of monomeric arabinose and the arabinose polymer arabinan. We establish that (i) arabinan utilization genes are controlled by a transcriptional activator that responds to arabinan and by a transcriptional repressor that responds to arabinose, (ii) arabinose utilization genes are regulated directly by the arabinose-responding repressor but indirectly by the arabinan-responding activator, and (iii) activation of both arabinan and arabinose utilization genes requires a pleiotropic transcriptional regulator necessary for survival in the mammalian gut. Genomic analysis predicts that this paradigm is broadly applicable to the breakdown of other polysaccharides in both B. thetaiotaomicron and other gut Bacteroides spp. The uncovered mechanism enables regulation of polysaccharide utilization genes in response to both the polysaccharide and its breakdown products. Breakdown of complex polysaccharides derived from "dietary fiber" is achieved by the mammalian gut microbiota. This breakdown creates a critical nutrient source for both the microbiota and its mammalian host. Because the availability of individual polysaccharides fluctuates with variations in the host diet, members of the microbiota strictly control expression of polysaccharide utilization genes. Our findings define a regulatory architecture that controls the breakdown of a polysaccharide by a gut bacterium in response to three distinct signals. This architecture integrates perception of a complex polysaccharide and its monomeric constituent as well as feedback of central metabolism. Moreover, it is broadly applicable to several prominent members of the mammalian gut microbiota. The identified regulatory strategy may contribute to the abundance of gut Bacteroides, despite fluctuations in the host diet. Copyright © 2016 Schwalm et al.

  13. Investigation of bus transit schedule behavior modeling using advanced techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kalaputapu, R.; Demetsky, M.J.

    This research focused on investigating the application of artificial neural networks (ANN) and the Box-Jenkins technique for developing and testing schedule behavior models using data obtained for a test route from Tidewater Regional Transit's AVL system. The three ANN architectures investigated were: Feedforward Network, Elman Network and Jordan Network. In addition, five different model structures were investigated. The time-series methodology was adopted for developing the schedule behavior models. Finally, the role of a schedule behavior model within the framework of an intelligent transit management system is defined and the potential utility of the schedule behavior model is discussed using an example application.
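
    A minimal stand-in for the time-series modeling idea is sketched below: a small feedforward network trained on lagged schedule deviations. The data are synthetic and the architecture is not one of the study's models; it only illustrates how lagged observations feed a neural predictor.

        # Minimal feedforward time-series sketch: predict the next schedule deviation
        # from the three previous deviations (synthetic data; illustrative only).
        import numpy as np

        rng = np.random.default_rng(0)
        t = np.arange(300)
        dev = 2.0 * np.sin(t / 15.0) + 0.3 * rng.standard_normal(300)   # deviations (min)

        lags = 3
        X = np.stack([dev[i:i + lags] for i in range(len(dev) - lags)])
        y = dev[lags:]

        # One hidden layer, tanh activation, trained by plain gradient descent.
        W1 = 0.1 * rng.standard_normal((lags, 8)); b1 = np.zeros(8)
        W2 = 0.1 * rng.standard_normal(8);         b2 = 0.0
        lr = 0.01
        for _ in range(2000):
            h = np.tanh(X @ W1 + b1)
            pred = h @ W2 + b2
            err = pred - y
            gW2 = h.T @ err / len(y); gb2 = err.mean()
            gh = np.outer(err, W2) * (1 - h ** 2)
            gW1 = X.T @ gh / len(y);  gb1 = gh.mean(axis=0)
            W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

        rmse = np.sqrt(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))
        print("training RMSE:", float(rmse))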

  14. Well-defined protein-polymer conjugates--synthesis and potential applications.

    PubMed

    Thordarson, Pall; Le Droumaguet, Benjamin; Velonia, Kelly

    2006-11-01

    During recent decades, numerous studies have focused on combining the unique catalytic/functional properties and structural characteristics of proteins and enzymes with those of synthetic molecules and macromolecules. The aim of such multidisciplinary studies is to improve the properties of the natural component, combine them with those of the synthetic one, and create novel biomaterials on the nanometer scale. The specific coupling of polymers onto protein structures has proved to be one of the most straightforward and applicable approaches in that sense. In this article, we focus on the synthetic pathways that have been or can be utilized to specifically couple proteins to polymers. The different categories of well-defined protein-polymer conjugates and the effect of the polymer on protein function are discussed. Studies have shown that the specific conjugation of a synthetic polymer to a protein conveys its physico-chemical properties and, therefore, modifies the biodistribution and solubility of the protein, making it in certain cases soluble and active in organic solvents. An overview of the applications derived from such bioconjugates in the pharmaceutical industry, biocatalysis, and supramolecular nanobiotechnology is presented in the final part of the article.

  15. Geothermal energy: opportunities for California commerce. Phase I report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Longyear, A.B.

    1981-12-01

    The potential geothermal direct-use energy market and its application to projects in California are assessed. Project identification effort is to be focused on those that have the highest probability for near-term successful commercial operations. Near-term herein means 2 to 5 years for project implementation. Phase I has been focused on defining and assessing: (1) the geothermal direct-use resources that are suitable for near-term utilization; and (2) the generic applications (municipal heating districts, horticultural greenhouse firms, laundries, etc.) that are suitable for near-term projects. Five economic development regions in the state, containing recognized geothermal direct-use resources, have been defined. Thirty-eight direct-use resources have been evaluated in these regions. After assessment against pre-selected criteria, twenty-seven have been rated with a priority of I, II or III, thereby qualifying them for further marketing effort. The five areas with a priority of I are summarized. These areas have no perceived impediments to near-term development. Twenty-nine generic categories of applications were assessed against previously selected criteria to determine their near-term potential for direct use of geothermal fluids. Some twenty industry, commercial and institutional application categories were rated with a priority of I, II or III and warrant further marketing efforts. The seven categories with a priority of I are listed. These categories were found to have the least impediments to near-term application projects.

  16. Demand Activated Manufacturing Architecture (DAMA) model for supply chain collaboration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    CHAPMAN,LEON D.; PETERSEN,MARJORIE B.

    The Demand Activated Manufacturing Architecture (DAMA) project during the last five years of work with the U.S. Integrated Textile Complex (retail, apparel, textile, and fiber sectors) has developed an inter-enterprise architecture and collaborative model for supply chains. This model will enable improved collaborative business across any supply chain. The DAMA Model for Supply Chain Collaboration is a high-level model for collaboration to achieve Demand Activated Manufacturing. The five major elements of the architecture to support collaboration are (1) activity or process, (2) information, (3) application, (4) data, and (5) infrastructure. These five elements are tied to the application of the DAMA architecture to three phases of collaboration - prepare, pilot, and scale. There are six collaborative activities that may be employed in this model: (1) Develop Business Planning Agreements, (2) Define Products, (3) Forecast and Plan Capacity Commitments, (4) Schedule Product and Product Delivery, (5) Expedite Production and Delivery Exceptions, and (6) Populate Supply Chain Utility. The Supply Chain Utility is a set of applications implemented to support collaborative product definition, forecast visibility, planning, scheduling, and execution. The DAMA architecture and model will be presented along with the process for implementing this DAMA model.

  17. Concise Review: Cell Surface N-Linked Glycoproteins as Potential Stem Cell Markers and Drug Targets.

    PubMed

    Boheler, Kenneth R; Gundry, Rebekah L

    2017-01-01

    Stem cells and their derivatives hold great promise to advance regenerative medicine. Critical to the progression of this field is the identification and utilization of antibody-accessible cell-surface proteins for immunophenotyping and cell sorting, techniques essential for assessment and isolation of defined cell populations with known functional and therapeutic properties. Beyond their utility for cell identification and selection, cell-surface proteins are also major targets for pharmacological intervention. Although comprehensive cell-surface protein maps are highly valuable, they have been difficult to define until recently. In this review, we discuss the application of a contemporary targeted chemoproteomic-based technique for defining the cell-surface proteomes of stem and progenitor cells. In applying this approach to pluripotent stem cells (PSCs), these studies have improved the biological understanding of these cells, led to the enhanced use and development of antibodies suitable for immunophenotyping and sorting, and contributed to the repurposing of existing drugs without the need for high-throughput screening. The utility of this latter approach was first demonstrated with human PSCs (hPSCs) through the identification of small molecules that are selectively toxic to hPSCs and have the potential for eliminating confounding and tumorigenic cells in hPSC-derived progeny destined for research and transplantation. Overall, the cutting-edge technologies reviewed here will accelerate the development of novel cell-surface protein targets for immunophenotyping, new reagents to improve the isolation of therapeutically qualified cells, and pharmacological studies to advance the treatment of intractable diseases amenable to cell-replacement therapies. Stem Cells Translational Medicine 2017;6:131-138. © 2016 The Authors Stem Cells Translational Medicine published by Wiley Periodicals, Inc. on behalf of AlphaMed Press.

  18. HWDA: A coherence recognition and resolution algorithm for hybrid web data aggregation

    NASA Astrophysics Data System (ADS)

    Guo, Shuhang; Wang, Jian; Wang, Tong

    2017-09-01

    To address the problem of object conflict recognition and resolution in hybrid distributed data stream aggregation, a distributed data-stream object coherence technology is proposed. First, the framework for object coherence conflict recognition and resolution, named HWDA, is defined. Second, an object coherence recognition technique is proposed based on formal language description logic and the hierarchical dependency relationships between logic rules. Third, a conflict traversal recognition algorithm is proposed based on the defined dependency graph. Next, the conflict resolution technique is proposed based on resolution pattern matching, including the definition of three types of conflict, the conflict resolution matching pattern, and the arbitration resolution method. Finally, experiments on two kinds of web test data sets validate the effectiveness of the HWDA conflict recognition and resolution technology.
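
    The sketch below illustrates, with assumed records and a hypothetical rule-dependency graph, the two core steps described above: flagging conflicting attribute values for the same object, then traversing the dependency graph to find every downstream rule the conflict touches. It is not the HWDA implementation itself.

        # Hedged sketch: recognize attribute conflicts between records for the same object,
        # then traverse a (hypothetical) rule-dependency graph to find affected rules.
        from collections import defaultdict

        records = [  # (object id, attribute, value, source)
            ("p1", "price", 19.9, "siteA"),
            ("p1", "price", 24.5, "siteB"),
            ("p1", "brand", "Acme", "siteA"),
        ]

        # Step 1: group by (object, attribute) and flag differing values as conflicts.
        groups = defaultdict(set)
        for oid, attr, val, src in records:
            groups[(oid, attr)].add(val)
        conflicts = [key for key, vals in groups.items() if len(vals) > 1]

        # Step 2: traverse the dependency graph of rules that consume the conflicting attribute.
        rule_deps = {"price": ["avg_price"], "avg_price": ["ranking"], "brand": []}  # hypothetical
        def affected(attr, graph, seen=None):
            seen = seen or set()
            for nxt in graph.get(attr, []):
                if nxt not in seen:
                    seen.add(nxt)
                    affected(nxt, graph, seen)
            return seen

        for oid, attr in conflicts:
            print(oid, attr, "conflict; downstream rules:", affected(attr, rule_deps))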

  19. Using partial site aggregation to reduce bias in random utility travel cost models

    NASA Astrophysics Data System (ADS)

    Lupi, Frank; Feather, Peter M.

    1998-12-01

    We propose a "partial aggregation" strategy for defining the recreation sites that enter choice sets in random utility models. Under the proposal, the most popular sites and sites that will be the subject of policy analysis enter choice sets as individual sites while remaining sites are aggregated into groups of similar sites. The scheme balances the desire to include all potential substitute sites in the choice sets with practical data and modeling constraints. Unlike fully aggregate models, our analysis and empirical applications suggest that the partial aggregation approach reasonably approximates the results of a disaggregate model. The partial aggregation approach offers all of the data and computational advantages of models with aggregate sites but does not suffer from the same degree of bias as fully aggregate models.

  20. Bridging the Gap: Towards a Cell-Type Specific Understanding of Neural Circuits Underlying Fear Behaviors

    PubMed Central

    McCullough, KM; Morrison, FG; Ressler, KJ

    2016-01-01

    Fear and anxiety-related disorders are remarkably common and debilitating, and are often characterized by dysregulated fear responses. Rodent models of fear learning and memory have taken great strides towards elucidating the specific neuronal circuitries underlying the learning of fear responses. The present review addresses recent research utilizing optogenetic approaches to parse circuitries underlying fear behaviors. It also highlights the powerful advances made when optogenetic techniques are utilized in a genetically defined, cell-type specific, manner. The application of next-generation genetic and sequencing approaches in a cell-type specific context will be essential for a mechanistic understanding of the neural circuitry underlying fear behavior and for the rational design of targeted, circuit specific, pharmacologic interventions for the treatment and prevention of fear-related disorders. PMID:27470092

  1. System approach to distributed sensor management

    NASA Astrophysics Data System (ADS)

    Mayott, Gregory; Miller, Gordon; Harrell, John; Hepp, Jared; Self, Mid

    2010-04-01

    Since 2003, the US Army's RDECOM CERDEC Night Vision Electronic Sensor Directorate (NVESD) has been developing a distributed Sensor Management System (SMS) that utilizes a framework which demonstrates application layer, net-centric sensor management. The core principles of the design support distributed and dynamic discovery of sensing devices and processes through a multi-layered implementation. This results in a sensor management layer that acts as a system with defined interfaces for which the characteristics, parameters, and behaviors can be described. Within the framework, the definition of a protocol is required to establish the rules for how distributed sensors should operate. The protocol defines the behaviors, capabilities, and message structures needed to operate within the functional design boundaries. The protocol definition addresses the requirements for a device (sensors or processes) to dynamically join or leave a sensor network, dynamically describe device control and data capabilities, and allow dynamic addressing of publish and subscribe functionality. The message structure is a multi-tiered definition that identifies standard, extended, and payload representations that are specifically designed to accommodate the need for standard representations of common functions, while supporting the need for feature-based functions that are typically vendor specific. The dynamic qualities of the protocol give a user GUI application the flexibility to map widget-level controls to each device based on reported capabilities in real time. The SMS approach is designed to accommodate scalability and flexibility within a defined architecture. The distributed sensor management framework and its application to a tactical sensor network will be described in this paper.
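
    The multi-tiered message idea can be pictured with the small sketch below: a standard tier every device understands, an extended tier for dynamically reported capabilities and publish/subscribe topics, and a vendor-specific payload. All field names are hypothetical and do not reproduce the SMS protocol definition.

        # Hedged sketch of a multi-tiered message: standard header, extended capability
        # tier, and vendor payload. Field names are hypothetical placeholders.
        from dataclasses import dataclass, field, asdict
        import json

        @dataclass
        class StandardTier:
            msg_type: str        # e.g. "JOIN", "CAPABILITIES", "DATA"
            device_id: str
            timestamp: float

        @dataclass
        class ExtendedTier:
            capabilities: list = field(default_factory=list)   # e.g. ["pan", "tilt", "zoom"]
            topics: list = field(default_factory=list)          # publish/subscribe topics

        @dataclass
        class Message:
            standard: StandardTier
            extended: ExtendedTier
            payload: dict = field(default_factory=dict)          # vendor-specific block

        join = Message(StandardTier("JOIN", "eo-cam-07", 1700000000.0),
                       ExtendedTier(capabilities=["pan", "tilt"], topics=["imagery/eo"]),
                       payload={"vendor": "example", "fw": "1.2.3"})
        print(json.dumps(asdict(join), indent=2))   # what a GUI could use to build widgets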

  2. Gate-defined Quantum Confinement in Suspended Bilayer Graphene

    NASA Astrophysics Data System (ADS)

    Allen, Monica

    2013-03-01

    Quantum confined devices in carbon-based materials offer unique possibilities for applications ranging from quantum computation to sensing. In particular, nanostructured carbon is a promising candidate for spin-based quantum computation due to the ability to suppress hyperfine coupling to nuclear spins, a dominant source of spin decoherence. Yet graphene lacks an intrinsic bandgap, which poses a serious challenge for the creation of such devices. We present a novel approach to quantum confinement utilizing tunnel barriers defined by local electric fields that break sublattice symmetry in suspended bilayer graphene. This technique electrostatically confines charges via band structure control, thereby eliminating the edge and substrate disorder that hinders on-chip etched nanostructures to date. We report clean single electron tunneling through gate-defined quantum dots in two regimes: at zero magnetic field using the energy gap induced by a perpendicular electric field and at finite magnetic fields using Landau level confinement. The observed Coulomb blockade periodicity agrees with electrostatic simulations based on local top-gate geometry, a direct demonstration of local control over the band structure of graphene. This technology integrates quantum confinement with pristine device quality and access to vibrational modes, enabling wide applications from electromechanical sensors to quantum bits. More broadly, the ability to externally tailor the graphene bandgap over nanometer scales opens a new unexplored avenue for creating quantum devices.

  3. Evaluative methodology for prioritizing transportation energy conservation strategies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pang, L.M.G.

    An analytical methodology was developed for the purpose of prioritizing a set of transportation energy conservation (TEC) strategies within an urban environment. Steps involved in applying the methodology consist of 1) defining the goals, objectives and constraints of the given urban community, 2) identifying potential TEC strategies, 3) assessing the impact of the strategies, 4) applying the TEC evaluation model, and 5) utilizing a selection process to determine the optimal set of strategies for implementation. This research provides an overview of 21 TEC strategies, a quick-response technique for estimating energy savings, a multiattribute utility theory approach for assessing subjective impacts, and a computer program for making the strategy evaluations, all of which assist in expediting the execution of the entire methodology procedure. The critical element of the methodology is the strategy evaluation model which incorporates a number of desirable concepts including 1) a comprehensive accounting of all relevant impacts, 2) the application of multiobjective decision-making techniques, 3) an approach to assure compatibility among quantitative and qualitative impact measures, 4) the inclusion of the decision maker's preferences in the evaluation procedure, and 5) the cost-effectiveness concept. Application of the methodology to Salt Lake City, Utah demonstrated its utility, ease of use and favorability by decision makers.
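
    The strategy evaluation step can be illustrated with a minimal additive multiattribute utility sketch, shown below with invented weights, attributes, and strategy scores; the actual model also handles qualitative measures, decision-maker preferences, and cost-effectiveness, which are omitted here.

        # Minimal additive multiattribute utility sketch (weights, scores, strategies hypothetical).
        # Each attribute score is assumed already normalized to a 0-1 single-attribute utility.
        weights = {"energy_savings": 0.4, "cost": 0.3, "public_acceptance": 0.2, "equity": 0.1}

        strategies = {
            "ridesharing_program": {"energy_savings": 0.6, "cost": 0.8, "public_acceptance": 0.7, "equity": 0.6},
            "signal_timing":       {"energy_savings": 0.4, "cost": 0.9, "public_acceptance": 0.9, "equity": 0.8},
            "parking_pricing":     {"energy_savings": 0.7, "cost": 0.6, "public_acceptance": 0.3, "equity": 0.5},
        }

        def utility(scores):
            return sum(weights[a] * scores[a] for a in weights)

        for name, scores in sorted(strategies.items(), key=lambda kv: utility(kv[1]), reverse=True):
            print(f"{name:22s} U = {utility(scores):.2f}")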

  4. Enriching the national map database for multi-scale use: Introducing the visibilityfilter attribution

    USGS Publications Warehouse

    Stauffer, Andrew J.; Webinger, Seth; Roche, Brittany

    2016-01-01

    The US Geological Survey’s (USGS) National Geospatial Technical Operations Center is prototyping and evaluating the ability to filter data through a range of scales using 1:24,000-scale The National Map (TNM) datasets as the source. A “VisibilityFilter” attribute is under evaluation that can be added to all TNM vector data themes and will permit filtering of data to eight target scales between 1:24,000 and 1:5,000,000, thus defining each feature’s smallest applicable scale-of-use. For a prototype implementation, map specifications for 1:100,000- and 1:250,000-scale USGS Topographic Map Series are being utilized to define feature content appropriate at fixed mapping scales to guide generalization decisions that are documented in a ScaleMaster diagram. This paper defines the VisibilityFilter attribute, the generalization decisions made for each TNM data theme, and how these decisions are embedded into the data to support efficient data filtering.

  5. Modeling of ultrasonic processes utilizing a generic software framework

    NASA Astrophysics Data System (ADS)

    Bruns, P.; Twiefel, J.; Wallaschek, J.

    2017-06-01

    Modeling of ultrasonic processes is typically characterized by a high degree of complexity. Different domains and size scales must be regarded, so that it is rather difficult to build up a single detailed overall model. Developing partial models is a common approach to overcome this difficulty. In this paper a generic but simple software framework is presented which allows arbitrary partial models to be coupled via slave modules with well-defined interfaces and a master module for coordination. Two examples are given to present the developed framework. The first one is the parameterization of a load model for ultrasonically-induced cavitation. The piezoelectric oscillator, its mounting, and the process load are described individually by partial models. These partial models then are coupled using the framework. The load model is composed of spring-damper-elements which are parameterized by experimental results. In the second example, the ideal mounting position for an oscillator utilized in ultrasonic assisted machining of stone is determined. Partial models for the ultrasonic oscillator, its mounting, the simplified contact process, and the workpiece’s material characteristics are presented. For both applications input and output variables are defined to meet the requirements of the framework’s interface.
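
    A minimal sketch of the master/slave coupling pattern is given below: each slave module declares its inputs and outputs, and a master module steps the modules and routes data between them. The module names and variables are placeholders, not the partial models from the paper.

        # Generic master/slave coupling sketch with declared inputs/outputs per module.
        class Slave:
            inputs: tuple = ()
            outputs: tuple = ()
            def step(self, data):
                raise NotImplementedError

        class Oscillator(Slave):
            outputs = ("tip_velocity",)
            def step(self, data):
                return {"tip_velocity": 1.2}                         # stand-in oscillator model

        class ProcessLoad(Slave):
            inputs = ("tip_velocity",)
            outputs = ("load_force",)
            def step(self, data):
                return {"load_force": 50.0 * data["tip_velocity"]}   # spring-damper stand-in

        def master(modules, n_passes=3):
            data = {}
            for _ in range(n_passes):                                # simple coupling iteration
                for m in modules:
                    data.update(m.step(data))
            return data

        print(master([Oscillator(), ProcessLoad()]))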

  6. Perfect joint remote state preparation of arbitrary six-qubit cluster-type states

    NASA Astrophysics Data System (ADS)

    Choudhury, Binayak S.; Samanta, Soumen

    2018-07-01

    In this paper, a joint remote state preparation protocol, which is applicable to six-qubit cluster states, is presented. The scheme is performed with the help of three quantum channels constituted by eight qubits. A new index of efficiency for JRSP protocols is defined. A comparison is made with the existing similar schemes from which it is concluded that the present scheme utilizes its resources more efficiently. The work is a part of the line of research on transfer and remote preparation of entanglement.

  7. Toward Reliable and Energy Efficient Wireless Sensing for Space and Extreme Environments

    NASA Technical Reports Server (NTRS)

    Choi, Baek-Young; Boyd, Darren; Wilkerson, DeLisa

    2017-01-01

    Reliability is the critical challenge of wireless sensing in space systems operating in extreme environments. Energy efficiency is another concern for battery-powered wireless sensors. Considering the physics of wireless communications, we propose an approach called Software-Defined Wireless Communications (SDC) that dynamically decides on a reliable channel or channels, avoiding unnecessary channel redundancy, out of multiple distinct electromagnetic frequency bands such as radio and infrared frequencies. We validate the concept with Android and Raspberry Pi sensors and pseudo-extreme experiments. SDC can be utilized in many areas beyond space applications.

  8. Development of a support software system for real-time HAL/S applications

    NASA Technical Reports Server (NTRS)

    Smith, R. S.

    1984-01-01

    Methodologies employed in defining and implementing a software support system for the HAL/S computer language for real-time operations on the Shuttle are detailed. Attention is also given to the management and validation techniques used during software development and software maintenance. Utilities developed to support the real-time operating conditions are described. Because the support system is produced on Cyber computers, with executable code then processed through Cyber or PDP machines, it has production-level status and can serve as a model for other software development projects.

  9. Wheel configurations for combined energy storage and attitude control systems

    NASA Technical Reports Server (NTRS)

    Oglevie, R. E.

    1985-01-01

    Integrated power and attitude control system (IPACS) studies performed over a decade ago established the feasibility of simultaneously storing electrical energy in wheels and utilizing the resulting momentum for spacecraft attitude control. It was shown that such a system possessed many advantages over other contemporary energy storage and attitude control systems in many applications. More recent technology advances in composite rotors, magnetic bearings, and power control electronics have triggered new optimism regarding the feasibility and merits of such a system. This paper presents the results of a recent study whose focus was to define an advanced IPACS and to evaluate its merits for the Space Station application. Emphasis is given to the selection of the wheel configuration to perform the combined functions. A component design concept is developed to establish the system performance capability. A system-level trade study, including life-cycle costing, is performed to define the merits of the system relative to two other candidate systems. It is concluded that an advanced IPACS concept is not only feasible but offers substantial savings in mass and life-cycle cost.

  10. Efficient Testing Combining Design of Experiment and Learn-to-Fly Strategies

    NASA Technical Reports Server (NTRS)

    Murphy, Patrick C.; Brandon, Jay M.

    2017-01-01

    Rapid modeling and efficient testing methods are important in a number of aerospace applications. In this study efficient testing strategies were evaluated in a wind tunnel test environment and combined to suggest a promising approach for both ground-based and flight-based experiments. Benefits of using Design of Experiment techniques, well established in scientific, military, and manufacturing applications are evaluated in combination with newly developing methods for global nonlinear modeling. The nonlinear modeling methods, referred to as Learn-to-Fly methods, utilize fuzzy logic and multivariate orthogonal function techniques that have been successfully demonstrated in flight test. The blended approach presented has a focus on experiment design and identifies a sequential testing process with clearly defined completion metrics that produce increased testing efficiency.

  11. Crowdsourcing applications for public health.

    PubMed

    Brabham, Daren C; Ribisl, Kurt M; Kirchner, Thomas R; Bernhardt, Jay M

    2014-02-01

    Crowdsourcing is an online, distributed, problem-solving, and production model that uses the collective intelligence of networked communities for specific purposes. Although its use has benefited many sectors of society, it has yet to be fully realized as a method for improving public health. This paper defines the core components of crowdsourcing and proposes a framework for understanding the potential utility of crowdsourcing in the domain of public health. Four discrete crowdsourcing approaches are described (knowledge discovery and management; distributed human intelligence tasking; broadcast search; and peer-vetted creative production types) and a number of potential applications for crowdsourcing for public health science and practice are enumerated. © 2013 American Journal of Preventive Medicine Published by American Journal of Preventive Medicine All rights reserved.

  12. MARTe: A Multiplatform Real-Time Framework

    NASA Astrophysics Data System (ADS)

    Neto, André C.; Sartori, Filippo; Piccolo, Fabio; Vitelli, Riccardo; De Tommasi, Gianmaria; Zabeo, Luca; Barbalace, Antonio; Fernandes, Horacio; Valcarcel, Daniel F.; Batista, Antonio J. N.

    2010-04-01

    Development of real-time applications is usually associated with nonportable code targeted at specific real-time operating systems. The boundary between hardware drivers, system services, and user code is commonly not well defined, making the development in the target host significantly difficult. The Multithreaded Application Real-Time executor (MARTe) is a framework built over a multiplatform library that allows the execution of the same code in different operating systems. The framework provides the high-level interfaces with hardware, external configuration programs, and user interfaces, assuring at the same time hard real-time performances. End-users of the framework are required to define and implement algorithms inside a well-defined block of software, named Generic Application Module (GAM), that is executed by the real-time scheduler. Each GAM is reconfigurable with a set of predefined configuration meta-parameters and interchanges information using a set of data pipes that are provided as inputs and required as output. Using these connections, different GAMs can be chained either in series or parallel. GAMs can be developed and debugged in a non-real-time system and, only once the robustness of the code and correctness of the algorithm are verified, deployed to the real-time system. The software also supplies a large set of utilities that greatly ease the interaction and debugging of a running system. Among the most useful are a highly efficient real-time logger, HTTP introspection of real-time objects, and HTTP remote configuration. MARTe is currently being used to successfully drive the plasma vertical stabilization controller on the largest magnetic confinement fusion device in the world, with a control loop cycle of 50 µs and a jitter under 1 µs. In this particular project, MARTe is used with the Real-Time Application Interface (RTAI)/Linux operating system exploiting the new x86 multicore processors technology.

  13. Fly ashes from coal and petroleum coke combustion: current and innovative potential applications.

    PubMed

    González, Aixa; Navia, Rodrigo; Moreno, Natalia

    2009-12-01

    Coal fly ashes (CFA) are generated in large amounts worldwide. Current combustion technologies allow the burning of fuels with high sulfur content such as petroleum coke, generating non-CFA, such as petroleum coke fly ash (PCFA), mainly from fluidized bed combustion processes. The disposal of CFA and PCFA fly ashes can have severe impacts in the environment such as a potential groundwater contamination by the leaching of heavy metals and/or particulate matter emissions; making it necessary to treat or reuse them. At present CFA are utilized in several applications fields such as cement and concrete production, agriculture and soil stabilization. However, their reuse is restricted by the quality parameters of the end-product or requirements defined by the production process. Therefore, secondary material markets can use a limited amount of CFA, which implies the necessity of new markets for the unused CFA. Some potential future utilization options reviewed herein are zeolite synthesis and valuable metals extraction. In comparison to CFA, PCFA are characterized by a high Ca content, suggesting a possible use as neutralizers of acid wastewaters from mining operations, opening a new potential application area for PCFA that could solve contamination problems in emergent and mining countries such as Chile. However, this potential application may be limited by PCFA heavy metals leaching, mainly V and Ni, which are present in PCFA in high concentrations.

  14. KSC's work flow assistant

    NASA Technical Reports Server (NTRS)

    Wilkinson, John; Johnson, Earl

    1991-01-01

    The work flow assistant (WFA) is an advanced technology project under the shuttle processing data management system (SPDMS) at Kennedy Space Center (KSC). It will be utilized for short range scheduling, controlling work flow on the floor, and providing near real-time status for all major space transportation systems (STS) work centers at KSC. It will increase personnel and STS safety and improve productivity through deeper active scheduling that includes tracking and correlation of STS and ground support equipment (GSE) configuration and work. It will also provide greater accessibility to this data. WFA defines a standards concept for scheduling data which permits both commercial off-the-shelf (COTS) scheduling tools and WFA developed applications to be reused. WFA will utilize industry standard languages and workstations to achieve a scalable, adaptable, and portable architecture which may be used at other sites.

  15. Land-Use Requirements for Solar Power Plants in the United States

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ong, S.; Campbell, C.; Denholm, P.

    2013-06-01

    This report provides data and analysis of the land use associated with utility-scale ground-mounted solar facilities, defined as installations greater than 1 MW. We begin by discussing standard land-use metrics as established in the life-cycle assessment literature and then discuss their applicability to solar power plants. We present total and direct land-use results for various solar technologies and system configurations, on both a capacity and an electricity-generation basis. The total area corresponds to all land enclosed by the site boundary. The direct area comprises land directly occupied by solar arrays, access roads, substations, service buildings, and other infrastructure. As of the third quarter of 2012, the solar projects we analyze represent 72% of installed and under-construction utility-scale PV and CSP capacity in the United States.
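
    The two reporting bases can be illustrated with the short, hypothetical example below (the plant size, areas, and capacity factor are invented, not values from the report): land use per unit of capacity and per unit of annual generation.

        # Worked example of the capacity basis and the generation basis (hypothetical plant).
        total_area_acres = 400.0      # everything inside the site boundary
        direct_area_acres = 300.0     # arrays, roads, substations, buildings
        capacity_mw = 50.0
        capacity_factor = 0.25

        annual_gwh = capacity_mw * 8760 * capacity_factor / 1000.0
        print(f"total:  {total_area_acres / capacity_mw:.1f} acres/MW, "
              f"{total_area_acres / annual_gwh:.1f} acres/GWh/yr")
        print(f"direct: {direct_area_acres / capacity_mw:.1f} acres/MW, "
              f"{direct_area_acres / annual_gwh:.1f} acres/GWh/yr")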

  16. Embedded parallel processing based ground control systems for small satellite telemetry

    NASA Technical Reports Server (NTRS)

    Forman, Michael L.; Hazra, Tushar K.; Troendly, Gregory M.; Nickum, William G.

    1994-01-01

    The use of networked terminals which utilize embedded processing techniques results in totally integrated, flexible, high speed, reliable, and scalable systems suitable for telemetry and data processing applications such as mission operations centers (MOC). Synergies of these terminals, coupled with the capability of each terminal to receive incoming data, allow the viewing of any defined display by any terminal from the start of data acquisition. There is no single point of failure (other than with network input) such as exists with configurations where all input data goes through a single front end processor and then to a serial string of workstations. Missions dedicated to NASA's ozone measurements program utilize the methodologies which are discussed, and result in a multimission configuration of low cost, scalable hardware and software which can be run by one flight operations team with low risk.

  17. 47 CFR 1.1402 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Procedures § 1.1402 Definitions. (a) The term utility means any person that is a local exchange carrier or an electric, gas, water, steam, or other public utility, and who owns or controls poles, ducts, conduits, or... telecommunications services (as defined in 47 U.S.C. 226) or incumbent local exchange carriers (as defined in 47 U.S...

  18. Well-Defined Peapod-like Magnetic Nanoparticles and Their Controlled Modification for Effective Imaging Guided Gene Therapy.

    PubMed

    Wang, Ranran; Hu, Yang; Zhao, Nana; Xu, Fu-Jian

    2016-05-11

    Due to their unique properties, one-dimensional (1D) magnetic nanostructures are of great significance for biorelated applications. A facile and straightforward strategy to fabricate 1D magnetic structure with special shapes is highly desirable. In this work, well-defined peapod-like 1D magnetic nanoparticles (Fe3O4@SiO2, p-FS) are readily synthesized by a facile method without assistance of any templates, magnetic string or magnetic field. There are few reports on 1D gene carriers based on Fe3O4 nanoparticles. BUCT-PGEA (ethanolamine-functionalized poly(glycidyl methacrylate)) is subsequently grafted from the surface of p-FS nanoparticles by atom transfer radical polymerization to construct highly efficient gene vectors (p-FS-PGEA) for effective biomedical applications. Peapod-like p-FS nanoparticles were proven to largely improve gene transfection performance compared with ordinary spherical Fe3O4@SiO2 nanoparticles (s-FS). External magnetic field was also utilized to further enhance the transfection efficiency. Moreover, the as-prepared p-FS-PGEA gene carriers could combine the magnetic characteristics of p-FS to well achieve noninvasive magnetic resonance imaging (MRI). We show here novel and multifunctional magnetic nanostructures fabricated for biomedical applications that realized efficient gene delivery and real-time imaging at the same time.

  19. SU-F-T-25: Design and Implementation of a Multi-Purpose Applicator for Pelvic Brachytherapy Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bogue, J; Parsai, E

    Purpose: The current generation of inflatable multichannel brachytherapy applicators, such as the Varian Capri, has limited implementation to only vaginal and rectal cancers. While there are similar designs utilizing rigid, non-inflatable applicators, these alternatives could cause increased dose to surrounding tissue due to air gaps. Modification of the Capri could allow for easier treatment planning by reducing the number of channels and increased versatility by modifying the applicator to include an attachable single tandem for cervical or multiple tandems for endometrial applications. Methods: A Varian Capri applicator was simulated in water to replicate a patient. Multiple plans were optimized to deliver a prescribed dose of 100 cGy at 5 mm away from the exterior of the applicator using six to thirteen existing channels. The current model was expanded upon to include a detachable tandem or multiple tandems to increase its functionality to both cervical and endometrial cancers. Models were constructed in both three-dimensional rendering software and Monte Carlo to allow prototyping and simulations. Results: Treatment plans utilizing six to thirteen channels produced limited dosimetric differences between channel arrangements, with a seven-channel plan very closely approximating the thirteen channels. It was concluded that only seven channels would be necessary in future simulations to give an accurate representation of the applicator. Tandem attachments were prototyped for the applicator to demonstrate the ease with which they could be included. Future simulation in treatment planning software and Monte Carlo results will be presented to further define the ideal applicator geometry. Conclusion: The current Capri applicator design could be easily modified to increase applicability to include cervical and endometrial treatments in addition to vaginal and rectal cancers. This new design provides a more versatile single-use applicator that can easily be inserted and further reduces dose to critical structures during brachytherapy treatments.

  20. Ore minerals textural characterization by hyperspectral imaging

    NASA Astrophysics Data System (ADS)

    Bonifazi, Giuseppe; Picone, Nicoletta; Serranti, Silvia

    2013-02-01

    The utilization of hyperspectral detection devices for natural resources mapping/exploitation through remote sensing techniques dates back to the early 1970s. From the first devices utilizing a one-dimensional profile spectrometer, HyperSpectral Imaging (HSI) devices have been developed. Thus, from specific customized devices originally developed by governmental agencies (e.g. NASA, specialized research labs, etc.), a wide range of HSI-based equipment is today available at the commercial level. Parallel to this huge increase in hyperspectral systems development and manufacturing addressed to airborne applications, a strong increase also occurred in the development of HSI-based devices for "ground" utilization, that is, sensing units able to operate inside a laboratory, a processing plant and/or in the open field. Thanks to this diffusion, more and more applications have been developed and tested in recent years in the materials sectors as well. Such an approach, when successful, is quite challenging, being usually reliable, robust and characterised by lower costs compared with those usually associated with commonly applied off-line and/or on-line analytical approaches. In this paper such an approach is presented with reference to ore minerals characterization. According to the different phases and stages of ore minerals and products characterization, and starting from the analysis of the detected hyperspectral signatures, it is possible to derive useful information about mineral flow stream properties and their physical-chemical attributes. This last aspect can be utilized to define innovative process mineralogy strategies and to implement on-line procedures at the processing level. The present study discusses the effects related to the adoption of different hardware configurations, the utilization of different logics to perform the analysis and the selection of different algorithms according to the different characterization, inspection and quality control actions to apply.

  1. A System-Wide Approach to Physician Efficiency and Utilization Rates for Non-Operating Room Anesthesia Sites.

    PubMed

    Tsai, Mitchell H; Huynh, Tinh T; Breidenstein, Max W; O'Donnell, Stephen E; Ehrenfeld, Jesse M; Urman, Richard D

    2017-07-01

    There has been little development or application of operating room (OR) management metrics to non-operating room anesthesia (NORA) sites, in contrast to the well-developed management framework for the OR. We hypothesized that by adopting the concept of physician efficiency, we could determine the applicability of this clinical productivity benchmark to physicians providing services for NORA cases at a tertiary care center. We conducted a retrospective data analysis of NORA sites at an academic, rural hospital, including both adult and pediatric patients. Using the time stamps from WiseOR® (Palo Alto, CA), we calculated site utilization and physician efficiency for each day. We defined scheduling efficiency (SE) as the number of staffed anesthesiologists divided by the number of staffed sites and stratified the data into three categories (SE < 1, SE = 1, and SE > 1). The mean physician efficiency was 0.293 (95% CI, [0.281, 0.305]), and the mean site utilization was 0.328 (95% CI, [0.314, 0.343]). When days were stratified by scheduling efficiency (SE < 1, = 1, or > 1), we found differences between physician efficiency and site utilization. On days where scheduling efficiency was less than 1, that is, when there are more sites than physicians, mean physician efficiency (95% CI, [0.326, 0.402]) was higher than mean site utilization (95% CI, [0.250, 0.296]). We demonstrate that scheduling efficiency and physician efficiency as OR management metrics diverge when anesthesiologists travel between NORA sites. When the opportunity to scale operational efficiencies is limited, increasing scheduling efficiency by incorporating different NORA sites into a "block" allocation on any given day may be the only suitable tactical alternative.
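
    The stratification step can be sketched as below, using the abstract's definition of scheduling efficiency (staffed anesthesiologists divided by staffed sites) and hypothetical per-day utilization values rather than the study data.

        # Stratify days by scheduling efficiency (SE) and compare mean site utilization.
        # The per-day numbers are hypothetical inputs, not data from the study.
        days = [  # (anesthesiologists, staffed NORA sites, site utilization that day)
            (2, 3, 0.27), (3, 3, 0.33), (4, 3, 0.36), (2, 4, 0.25), (3, 2, 0.38),
        ]

        buckets = {"SE<1": [], "SE=1": [], "SE>1": []}
        for docs, sites, util in days:
            se = docs / sites
            key = "SE<1" if se < 1 else ("SE=1" if se == 1 else "SE>1")
            buckets[key].append(util)

        for key, vals in buckets.items():
            mean = sum(vals) / len(vals) if vals else float("nan")
            print(f"{key}: n={len(vals)}, mean site utilization={mean:.2f}")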

  2. Extending the applicability of the Goldschmidt tolerance factor to arbitrary ionic compounds

    PubMed Central

    Sato, Toyoto; Takagi, Shigeyuki; Deledda, Stefano; Hauback, Bjørn C.; Orimo, Shin-ichi

    2016-01-01

    Crystal structure determination is essential for characterizing materials and their properties, and can be facilitated by various tools and indicators. For instance, the Goldschmidt tolerance factor (T) for perovskite compounds is acknowledged for evaluating crystal structures in terms of the ionic packing. However, its applicability is limited to perovskite compounds. Here, we report on extending the applicability of T to ionic compounds with arbitrary ionic arrangements and compositions. By focussing on the occupancy of constituent spherical ions in the crystal structure, we define the ionic filling fraction (IFF), which is obtained from the volumes of crystal structure and constituent ions. Ionic compounds, including perovskites, are arranged linearly by the IFF, providing consistent results with T. The linearity guides towards finding suitable unit cell and composition, thus tackling the main obstacle for determining new crystal structures. We demonstrate the utility of the IFF by solving the structure of three hydrides with new crystal structures. PMID:27032978

  3. Extending the applicability of the Goldschmidt tolerance factor to arbitrary ionic compounds.

    PubMed

    Sato, Toyoto; Takagi, Shigeyuki; Deledda, Stefano; Hauback, Bjørn C; Orimo, Shin-ichi

    2016-04-01

    Crystal structure determination is essential for characterizing materials and their properties, and can be facilitated by various tools and indicators. For instance, the Goldschmidt tolerance factor (T) for perovskite compounds is acknowledged for evaluating crystal structures in terms of the ionic packing. However, its applicability is limited to perovskite compounds. Here, we report on extending the applicability of T to ionic compounds with arbitrary ionic arrangements and compositions. By focussing on the occupancy of constituent spherical ions in the crystal structure, we define the ionic filling fraction (IFF), which is obtained from the volumes of crystal structure and constituent ions. Ionic compounds, including perovskites, are arranged linearly by the IFF, providing consistent results with T. The linearity guides towards finding suitable unit cell and composition, thus tackling the main obstacle for determining new crystal structures. We demonstrate the utility of the IFF by solving the structure of three hydrides with new crystal structures.
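
    As a worked illustration of both indicators, the sketch below evaluates cubic SrTiO3 with approximate Shannon radii and lattice constant; the radius set, the cell volume, and the reading of the IFF as (summed ion volumes)/(cell volume) are assumptions made for illustration, not values or definitions quoted from the article.

        # Illustrative check of the tolerance factor and the ionic filling fraction for
        # cubic SrTiO3, using approximate radii and lattice constant (assumed values).
        import math

        r_A, r_B, r_X = 1.44, 0.605, 1.40          # Sr2+, Ti4+, O2- radii (angstrom, approx.)
        a = 3.905                                   # cubic lattice constant (angstrom, approx.)

        t = (r_A + r_X) / (math.sqrt(2) * (r_B + r_X))   # Goldschmidt tolerance factor

        def sphere(r):
            return 4.0 / 3.0 * math.pi * r ** 3

        ion_volume = sphere(r_A) + sphere(r_B) + 3 * sphere(r_X)  # one ABX3 formula unit
        iff = ion_volume / a ** 3                                  # ionic filling fraction

        print(f"t = {t:.2f}, IFF = {iff:.2f}")      # t close to 1 for an ideal perovskite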

  4. GLobal Integrated Design Environment (GLIDE): A Concurrent Engineering Application

    NASA Technical Reports Server (NTRS)

    McGuire, Melissa L.; Kunkel, Matthew R.; Smith, David A.

    2010-01-01

    The GLobal Integrated Design Environment (GLIDE) is a client-server software application purpose-built to mitigate issues associated with real time data sharing in concurrent engineering environments and to facilitate discipline-to-discipline interaction between multiple engineers and researchers. GLIDE is implemented in multiple programming languages utilizing standardized web protocols to enable secure parameter data sharing between engineers and researchers across the Internet in closed and/or widely distributed working environments. A well defined, HyperText Transfer Protocol (HTTP) based Application Programming Interface (API) to the GLIDE client/server environment enables users to interact with GLIDE, and each other, within common and familiar tools. One such common tool, Microsoft Excel (Microsoft Corporation), paired with its add-in API for GLIDE, is discussed in this paper. The top-level examples given demonstrate how this interface improves the efficiency of the design process of a concurrent engineering study while reducing potential errors associated with manually sharing information between study participants.

  5. Latency Requirements for Head-Worn Display S/EVS Applications

    NASA Technical Reports Server (NTRS)

    Bailey, Randall E.; Trey Arthur, J. J., III; Williams, Steven P.

    2004-01-01

    NASA s Aviation Safety Program, Synthetic Vision Systems Project is conducting research in advanced flight deck concepts, such as Synthetic/Enhanced Vision Systems (S/EVS), for commercial and business aircraft. An emerging thrust in this activity is the development of spatially-integrated, large field-of-regard information display systems. Head-worn or helmet-mounted display systems are being proposed as one method in which to meet this objective. System delays or latencies inherent to spatially-integrated, head-worn displays critically influence the display utility, usability, and acceptability. Research results from three different, yet similar technical areas flight control, flight simulation, and virtual reality are collectively assembled in this paper to create a global perspective of delay or latency effects in head-worn or helmet-mounted display systems. Consistent definitions and measurement techniques are proposed herein for universal application and latency requirements for Head-Worn Display S/EVS applications are drafted. Future research areas are defined.

  6. Latency requirements for head-worn display S/EVS applications

    NASA Astrophysics Data System (ADS)

    Bailey, Randall E.; Arthur, Jarvis J., III; Williams, Steven P.

    2004-08-01

    NASA's Aviation Safety Program, Synthetic Vision Systems Project is conducting research in advanced flight deck concepts, such as Synthetic/Enhanced Vision Systems (S/EVS), for commercial and business aircraft. An emerging thrust in this activity is the development of spatially-integrated, large field-of-regard information display systems. Head-worn or helmet-mounted display systems are being proposed as one method in which to meet this objective. System delays or latencies inherent to spatially-integrated, head-worn displays critically influence the display utility, usability, and acceptability. Research results from three different, yet similar technical areas - flight control, flight simulation, and virtual reality - are collectively assembled in this paper to create a global perspective of delay or latency effects in head-worn or helmet-mounted display systems. Consistent definitions and measurement techniques are proposed herein for universal application and latency requirements for Head-Worn Display S/EVS applications are drafted. Future research areas are defined.

  7. UceWeb: a web-based collaborative tool for collecting and sharing quality of life data.

    PubMed

    Parimbelli, E; Sacchi, L; Rubrichi, S; Mazzanti, A; Quaglini, S

    2015-01-01

    This work aims at building a platform where quality-of-life data, namely utility coefficients, can be elicited not only for immediate use, but also systematically stored together with patient profiles to build a public repository to be further exploited in studies on specific target populations (e.g. cost/utility analyses). We capitalized on utility theory and previous experience to define a set of desirable features such a tool should show to facilitate sound elicitation of quality of life. A set of visualization tools and algorithms has been developed to this purpose. To make it easily accessible for potential users, the software has been designed as a web application. A pilot validation study has been performed on 20 atrial fibrillation patients. A collaborative platform, UceWeb, has been developed and tested. It implements the standard gamble, time trade-off and rating-scale utility elicitation methods. It allows doctors and patients to choose the mode of interaction to maximize patients’ comfort in answering difficult questions. Every utility elicitation may contribute to the growth of the repository. UceWeb can become a unique source of data allowing researchers both to perform more reliable comparisons among healthcare interventions and build statistical models to gain deeper insight into quality of life data.
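
    For context, the sketch below shows the textbook way each of the three elicitation methods maps a respondent's indifference point to a utility coefficient; the answers are hypothetical and UceWeb's own interface and algorithms are not reproduced.

        # Standard textbook mappings from an indifference point to a utility coefficient.
        def standard_gamble(p_indifference):
            # utility = probability of the best outcome at which the patient is indifferent
            return p_indifference

        def time_trade_off(years_full_health, years_in_state):
            # utility = x / t when x years in full health ~ t years in the health state
            return years_full_health / years_in_state

        def rating_scale(value, worst=0.0, best=100.0):
            return (value - worst) / (best - worst)

        answers = {"SG": standard_gamble(0.85),
                   "TTO": time_trade_off(8.0, 10.0),
                   "RS": rating_scale(75.0)}
        print(answers)   # each coefficient would be stored with the patient profile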

  8. Application of Ionic Liquids to Energy Storage and Conversion Materials and Devices.

    PubMed

    Watanabe, Masayoshi; Thomas, Morgan L; Zhang, Shiguo; Ueno, Kazuhide; Yasuda, Tomohiro; Dokko, Kaoru

    2017-05-24

    Ionic liquids (ILs) are liquids consisting entirely of ions and can be further defined as molten salts having melting points lower than 100 °C. One of the most important research areas for IL utilization is undoubtedly their energy application, especially for energy storage and conversion materials and devices, because there is a continuously increasing demand for clean and sustainable energy. In this article, various applications of ILs are reviewed by focusing on their use as electrolyte materials for Li/Na ion batteries, Li-sulfur batteries, Li-oxygen batteries, and nonhumidified fuel cells and as carbon precursors for electrode catalysts of fuel cells and electrode materials for batteries and supercapacitors. Due to their characteristic properties such as nonvolatility, high thermal stability, and high ionic conductivity, ILs appear to meet the rigorous demands/criteria of these various applications. However, for further development, specific applications for which these characteristic properties become unique (i.e., not easily achieved by other materials) must be explored. Thus, through strong demands for research and consideration of ILs' unique properties, we will be able to identify indispensable applications for ILs.

  9. Application-Level Interoperability Across Grids and Clouds

    NASA Astrophysics Data System (ADS)

    Jha, Shantenu; Luckow, Andre; Merzky, Andre; Erdely, Miklos; Sehgal, Saurabh

    Application-level interoperability is defined as the ability of an application to utilize multiple distributed heterogeneous resources. Such interoperability is becoming increasingly important as volumes of data, sources of data, and resource types all grow. The primary aim of this chapter is to understand different ways in which application-level interoperability can be provided across distributed infrastructure. We achieve this by (i) using the canonical wordcount application, based on an enhanced version of MapReduce that scales-out across clusters, clouds, and HPC resources, (ii) establishing how SAGA enables the execution of the wordcount application using MapReduce and other programming models such as Sphere concurrently, and (iii) demonstrating the scale-out of ensemble-based biomolecular simulations across multiple resources. We show user-level control of the relative placement of compute and data and also provide simple performance measures and analysis of SAGA-MapReduce when using multiple, different, heterogeneous infrastructures concurrently for the same problem instance. Finally, we discuss Azure and some of the system-level abstractions that it provides and show how it is used to support ensemble-based biomolecular simulations.

  10. A condition-specific codon optimization approach for improved heterologous gene expression in Saccharomyces cerevisiae

    PubMed Central

    2014-01-01

    Background Heterologous gene expression is an important tool for synthetic biology that enables metabolic engineering and the production of non-natural biologics in a variety of host organisms. The translational efficiency of heterologous genes can often be improved by optimizing synonymous codon usage to better match the host organism. However, traditional approaches for optimization neglect to take into account many factors known to influence synonymous codon distributions. Results Here we define an alternative approach for codon optimization that utilizes systems level information and codon context for the condition under which heterologous genes are being expressed. Furthermore, we utilize a probabilistic algorithm to generate multiple variants of a given gene. We demonstrate improved translational efficiency using this condition-specific codon optimization approach with two heterologous genes, the fluorescent protein-encoding eGFP and the catechol 1,2-dioxygenase gene CatA, expressed in S. cerevisiae. For the latter case, optimization for stationary phase production resulted in nearly 2.9-fold improvements over commercial gene optimization algorithms. Conclusions Codon optimization is now often a standard tool for protein expression, and while a variety of tools and approaches have been developed, they do not guarantee improved performance for all hosts or applications. Here, we suggest an alternative method for condition-specific codon optimization and demonstrate its utility in Saccharomyces cerevisiae as a proof of concept. However, this technique should be applicable to any organism for which gene expression data can be generated and is thus of potential interest for a variety of applications in metabolic and cellular engineering. PMID:24636000

  11. An algebraic multigrid method for Q2-Q1 mixed discretizations of the Navier-Stokes equations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prokopenko, Andrey; Tuminaro, Raymond S.

    Algebraic multigrid (AMG) preconditioners are considered for discretized systems of partial differential equations (PDEs) where unknowns associated with different physical quantities are not necessarily co-located at mesh points. Specifically, we investigate a Q2-Q1 mixed finite element discretization of the incompressible Navier-Stokes equations where the number of velocity nodes is much greater than the number of pressure nodes. Consequently, some velocity degrees-of-freedom (dofs) are defined at spatial locations where there are no corresponding pressure dofs. Thus, AMG approaches leveraging this co-located structure are not applicable. This paper instead proposes an automatic AMG coarsening that mimics certain pressure/velocity dof relationships of the Q2-Q1 discretization. The main idea is to first automatically define coarse pressures in a somewhat standard AMG fashion and then to carefully (but automatically) choose coarse velocity unknowns so that the spatial location relationship between pressure and velocity dofs resembles that on the finest grid. To define coefficients within the inter-grid transfers, an energy minimization AMG (EMIN-AMG) is utilized. EMIN-AMG is not tied to specific coarsening schemes and grid transfer sparsity patterns, and so it is applicable to the proposed coarsening. Numerical results highlighting solver performance are given on Stokes and incompressible Navier-Stokes problems.

  12. An algebraic multigrid method for Q2-Q1 mixed discretizations of the Navier-Stokes equations

    DOE PAGES

    Prokopenko, Andrey; Tuminaro, Raymond S.

    2016-07-01

    Algebraic multigrid (AMG) preconditioners are considered for discretized systems of partial differential equations (PDEs) where unknowns associated with different physical quantities are not necessarily co-located at mesh points. Specifically, we investigate a Q2-Q1 mixed finite element discretization of the incompressible Navier-Stokes equations where the number of velocity nodes is much greater than the number of pressure nodes. Consequently, some velocity degrees-of-freedom (dofs) are defined at spatial locations where there are no corresponding pressure dofs. Thus, AMG approaches leveraging this co-located structure are not applicable. This paper instead proposes an automatic AMG coarsening that mimics certain pressure/velocity dof relationships of the Q2-Q1 discretization. The main idea is to first automatically define coarse pressures in a somewhat standard AMG fashion and then to carefully (but automatically) choose coarse velocity unknowns so that the spatial location relationship between pressure and velocity dofs resembles that on the finest grid. To define coefficients within the inter-grid transfers, an energy minimization AMG (EMIN-AMG) is utilized. EMIN-AMG is not tied to specific coarsening schemes and grid transfer sparsity patterns, and so it is applicable to the proposed coarsening. Numerical results highlighting solver performance are given on Stokes and incompressible Navier-Stokes problems.

  13. Software automation tools for increased throughput metabolic soft-spot identification in early drug discovery.

    PubMed

    Zelesky, Veronica; Schneider, Richard; Janiszewski, John; Zamora, Ismael; Ferguson, James; Troutman, Matthew

    2013-05-01

    The ability to supplement high-throughput metabolic clearance data with structural information defining the site of metabolism should allow design teams to streamline their synthetic decisions. However, broad application of metabolite identification in early drug discovery has been limited, largely due to the time required for data review and structural assignment. The advent of mass defect filtering and its application toward metabolite scouting paved the way for the development of software automation tools capable of rapidly identifying drug-related material in complex biological matrices. Two semi-automated commercial software applications, MetabolitePilot™ and Mass-MetaSite™, were evaluated to assess the relative speed and accuracy of structural assignments using data generated on a high-resolution MS platform. Review of these applications has demonstrated their utility in providing accurate results in a time-efficient manner, leading to acceleration of metabolite identification initiatives while highlighting the continued need for biotransformation expertise in the interpretation of more complex metabolic reactions.
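    The mass defect filtering mentioned above can be illustrated with a minimal sketch: keep only observed ions whose fractional mass (defect) lies within a tolerance window of the parent drug's, since common biotransformations change nominal mass far more than the decimal part. The masses, tolerance, and the simple working definition of "mass defect" below are assumptions for illustration, not parameters of the evaluated software.

```python
# Minimal sketch of mass defect filtering; all masses and the window are hypothetical.

def mass_defect(mz: float) -> float:
    """Fractional part of the measured m/z (a simple working definition)."""
    return mz - int(mz)

def mdf(observed_mz, parent_mz, window_mda=50.0):
    """Return ions whose defect is within +/- window_mda (milli-Da) of the parent's."""
    parent_defect = mass_defect(parent_mz)
    tol = window_mda / 1000.0
    return [mz for mz in observed_mz
            if abs(mass_defect(mz) - parent_defect) <= tol]

if __name__ == "__main__":
    parent = 285.1362                      # hypothetical drug [M+H]+
    ions = [285.1360, 301.1311, 149.0233, 327.0981, 461.1686]
    print(mdf(ions, parent))               # ions plausibly drug-related by mass defect
```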

  14. Ionic Liquid/Metal-Organic Framework Composites: From Synthesis to Applications.

    PubMed

    Kinik, Fatma Pelin; Uzun, Alper; Keskin, Seda

    2017-07-21

    Metal-organic frameworks (MOFs) have been widely studied for different applications owing to their fascinating properties such as large surface areas, high porosities, tunable pore sizes, and acceptable thermal and chemical stabilities. Ionic liquids (ILs) have been recently incorporated into the pores of MOFs as cavity occupants to change the physicochemical properties and gas affinities of MOFs. Several recent studies have shown that IL/MOF composites show superior performances compared with pristine MOFs in various fields, such as gas storage, adsorption and membrane-based gas separation, catalysis, and ionic conductivity. In this review, we address the recent advances in syntheses of IL/MOF composites and provide a comprehensive overview of their applications. Opportunities and challenges of using IL/MOF composites in many applications are reviewed and the requirements for the utilization of these composite materials in real industrial processes are discussed to define the future directions in this field. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. Architecting a Simulation Framework for Model Rehosting

    NASA Technical Reports Server (NTRS)

    Madden, Michael M.

    2004-01-01

    The utility of vehicle math models extends beyond human-in-the-loop simulation. It is desirable to deploy a given model across a multitude of applications that target design, analysis, and research. However, the vehicle model alone represents an incomplete simulation. One must also replicate the environment models (e.g., atmosphere, gravity, terrain) to achieve identical vehicle behavior across all applications. Environment models are increasing in complexity and represent a substantial investment to re-engineer for a new application. A software component that can be rehosted in each application is one solution to the deployment problem. The component must encapsulate both the vehicle and environment models. The component must have a well-defined interface that abstracts the bulk of the logic to operate the models. This paper examines the characteristics of a rehostable modeling component from the perspective of a human-in-the-loop simulation framework. The Langley Standard Real-Time Simulation in C++ (LaSRS++) is used as an example. LaSRS++ was recently redesigned to transform its modeling package into a rehostable component.

  16. Advances and applications of occupancy models

    USGS Publications Warehouse

    Bailey, Larissa; MacKenzie, Darry I.; Nichols, James D.

    2013-01-01

    Summary: The past decade has seen an explosion in the development and application of models aimed at estimating species occurrence and occupancy dynamics while accounting for possible non-detection or species misidentification. We discuss some recent occupancy estimation methods and the biological systems that motivated their development. Collectively, these models offer tremendous flexibility, but simultaneously place added demands on the investigator. Unlike many mark–recapture scenarios, investigators utilizing occupancy models have the ability, and responsibility, to define their sample units (i.e. sites), replicate sampling occasions, time period over which species occurrence is assumed to be static and even the criteria that constitute ‘detection’ of a target species. Subsequent biological inference and interpretation of model parameters depend on these definitions and the ability to meet model assumptions. We demonstrate the relevance of these definitions by highlighting applications from a single biological system (an amphibian–pathogen system) and discuss situations where the use of occupancy models has been criticized. Finally, we use these applications to suggest future research and model development.

  17. Central station market development strategies for photovoltaics

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Federal market development strategies designed to accelerate the market penetration of central station applications of photovoltaic energy systems are analyzed. Since no specific goals were set for the commercialization of central station applications, strategic principles are explored which, when coupled with specific objectives for central stations, can produce a market development implementation plan. The study includes (1) background information on the National Photovoltaic Program, photovoltaic technology, and central stations; (2) a brief market assessment; (3) a discussion of the viewpoints of the electric utility industry with respect to solar energy; (4) a discussion of commercialization issues; and (5) strategy principles. It is recommended that a set of specific goals and objectives be defined for the photovoltaic central station program, and that these goals and objectives evolve into an implementation plan that identifies the appropriate federal role.

  18. Central station market development strategies for photovoltaics

    NASA Astrophysics Data System (ADS)

    1980-11-01

    Federal market development strategies designed to accelerate the market penetration of central station applications of photovoltaic energy systems are analyzed. Since no specific goals were set for the commercialization of central station applications, strategic principles are explored which, when coupled with specific objectives for central stations, can produce a market development implementation plan. The study includes (1) background information on the National Photovoltaic Program, photovoltaic technology, and central stations; (2) a brief market assessment; (3) a discussion of the viewpoints of the electric utility industry with respect to solar energy; (4) a discussion of commercialization issues; and (5) strategy principles. It is recommended that a set of specific goals and objectives be defined for the photovoltaic central station program, and that these goals and objectives evolve into an implementation plan that identifies the appropriate federal role.

  19. Low-cost interferometric TDM technology for dynamic sensing applications

    NASA Astrophysics Data System (ADS)

    Bush, Jeff; Cekorich, Allen

    2004-12-01

    A low-cost design approach for Time Division Multiplexed (TDM) fiber-optic interferometric interrogation of multi-channel sensor arrays is presented. This paper describes the evolutionary design process of the subject design. First, the requisite elements of interferometric interrogation are defined for a single channel sensor. The concept is then extended to multi-channel sensor interrogation implementing a TDM multiplex scheme where "traditional" design elements are utilized. The cost of the traditional TDM interrogator is investigated and concluded to be too high for entry into many markets. A new design approach is presented which significantly reduces the cost for TDM interrogation. This new approach, in accordance with the cost objectives, shows promise to bring this technology to within the threshold of commercial acceptance for a wide range of distributed fiber sensing applications.

  20. Medicare program; revisions to payment policies under the Physician Fee Schedule, and other part B payment policies for CY 2008; delay of the date of applicability of the revised anti-markup provisions for certain services furnished in certain locations (Sec. 414.50). Final rule.

    PubMed

    2008-01-03

    This final rule delays until January 1, 2009 the applicability of the anti-markup provisions in Sec. 414.50, as revised at 72 FR 66222, except with respect to the technical component of a purchased diagnostic test and with respect to any anatomic pathology diagnostic testing services furnished in space that: Is utilized by a physician group practice as a "centralized building" (as defined at Sec. 411.351 of this chapter) for purposes of complying with the physician self-referral rules; and does not qualify as a "same building" under Sec. 411.355(b)(2)(i) of this chapter.

  1. Utility of the clue - From assessing the investigative contribution of forensic science to supporting the decision to use traces.

    PubMed

    Bitzer, Sonja; Albertini, Nicola; Lock, Eric; Ribaux, Olivier; Delémont, Olivier

    2015-12-01

    In an attempt to grasp the effectiveness of forensic science in the criminal justice process, a number of studies introduced some form of performance indicator. However, most of these indicators suffer from different weaknesses, from the definition of forensic science itself to problems of reliability and validity. We suggest the introduction of the concept of utility of the clue as an internal evaluation indicator of forensic science in the investigation. Utility of the clue is defined as added value of information, gained by the use of traces. This concept could be used to assess the contribution of the trace in the context of the case. By extension, a second application of this concept is suggested. By formalising and considering, a priori, the perceived utility of using traces, we introduce the notion of expected utility that could be used as decision factor when choosing which traces to use, once they have been collected at the crime scene or from an object in the laboratory. In a case-based approach, utility can be assessed in the light of the available information to evaluate the investigative contribution of forensic science. In the decision-making process, the projection or estimation of the utility of the clue is proposed to be a factor to take into account when triaging the set of traces. Copyright © 2015 The Chartered Society of Forensic Sciences. Published by Elsevier Ireland Ltd. All rights reserved.
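    As an illustration of the expected-utility idea sketched in this abstract, the following minimal example ranks a set of collected traces by the product of a probability of yielding usable information and an added-value score. The trace labels, probabilities, and scores are hypothetical; this is a sketch of the decision concept, not the authors' formal model.

```python
# Rank collected traces by expected utility (probability of informative result
# times assessed added value). All names and numbers below are hypothetical.

from dataclasses import dataclass

@dataclass
class Trace:
    label: str
    p_informative: float   # estimated probability the trace yields usable information
    added_value: float     # assessed added value of that information for the case (0-10)

    @property
    def expected_utility(self) -> float:
        return self.p_informative * self.added_value

def triage(traces):
    """Return traces ordered from highest to lowest expected utility."""
    return sorted(traces, key=lambda t: t.expected_utility, reverse=True)

if __name__ == "__main__":
    collected = [
        Trace("shoe mark, entry door", 0.4, 3.0),
        Trace("DNA swab, window handle", 0.7, 8.0),
        Trace("fingermark, glass fragment", 0.5, 6.0),
    ]
    for t in triage(collected):
        print(f"{t.label:30s} expected utility = {t.expected_utility:.2f}")
```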

  2. Clustering execution in a processing system to increase power savings

    DOEpatents

    Bose, Pradip; Buyuktosunoglu, Alper; Jacobson, Hans M.; Vega, Augusto J.

    2018-03-20

    Embodiments relate to clustering execution in a processing system. An aspect includes accessing a control flow graph that defines a data dependency and an execution sequence of a plurality of tasks of an application that executes on a plurality of system components. The execution sequence of the tasks in the control flow graph is modified as a clustered control flow graph that clusters active and idle phases of a system component while maintaining the data dependency. The clustered control flow graph is sent to an operating system, where the operating system utilizes the clustered control flow graph for scheduling the tasks.
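    One plausible way to read the clustering step described above is as a dependency-respecting reordering that keeps tasks assigned to the same system component adjacent, lengthening each component's active and idle phases. The sketch below implements such a greedy reordering under that interpretation; the task names, component labels, and tie-breaking rule are illustrative assumptions, not the patented algorithm.

```python
# Greedy, dependency-respecting reordering that clusters tasks by component.
# Illustrative interpretation only; not taken from the patent.

from collections import defaultdict

def clustered_order(tasks, deps):
    """tasks: dict task -> component; deps: dict task -> set of prerequisite tasks."""
    indegree = {t: len(deps.get(t, set())) for t in tasks}
    dependents = defaultdict(set)
    for t, prereqs in deps.items():
        for p in prereqs:
            dependents[p].add(t)

    ready = {t for t, d in indegree.items() if d == 0}
    order, current = [], None
    while ready:
        # Prefer a ready task on the component that is already active.
        same = [t for t in ready if tasks[t] == current]
        nxt = min(same or ready)          # deterministic tie-break
        ready.remove(nxt)
        order.append(nxt)
        current = tasks[nxt]
        for d in dependents[nxt]:
            indegree[d] -= 1
            if indegree[d] == 0:
                ready.add(d)
    return order

if __name__ == "__main__":
    tasks = {"a": "cpu0", "b": "cpu1", "c": "cpu0", "d": "cpu1", "e": "cpu0"}
    deps = {"c": {"a"}, "d": {"b"}, "e": {"c"}}
    print(clustered_order(tasks, deps))   # e.g. ['a', 'c', 'e', 'b', 'd']
```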

  3. Compact collimators designed with a modified point approximation for light-emitting diodes

    NASA Astrophysics Data System (ADS)

    Luo, Tao; Wang, Gang

    2017-09-01

    We present a novel freeform lens design method for application to LED collimating illumination. The method is derived from a basic geometric-optics analysis and construction approach. Using this method, a compact collimating lens with an aspect ratio of 0.219 is presented. Moreover, the utility efficiency (UE) inside the angle defined by the ideal concentrator hypothesis with different lens-to-LED size ratios is presented for both this lens and a TIR lens. A prototype of the collimator lens was also made to verify the practical performance of the lens, which has a light distribution very compatible with the simulation results.

  4. Fluid Stochastic Petri Nets: Theory, Applications, and Solution

    NASA Technical Reports Server (NTRS)

    Horton, Graham; Kulkarni, Vidyadhar G.; Nicol, David M.; Trivedi, Kishor S.

    1996-01-01

    In this paper we introduce a new class of stochastic Petri nets in which one or more places can hold fluid rather than discrete tokens. We define a class of fluid stochastic Petri nets in such a way that the discrete and continuous portions may affect each other. Following this definition we provide equations for their transient and steady-state behavior. We present several examples showing the utility of the construct in communication network modeling and reliability analysis, and discuss important special cases. We then discuss numerical methods for computing the transient behavior of such nets. Finally, some numerical examples are presented.

  5. Accounting in the Social Menu

    ERIC Educational Resources Information Center

    González, José Villacís

    2010-01-01

    This paper was born out of combinatorics. It defines a level of utility which, though it cannot be measured, can be preferred to another in each specific combination of goods. In turn, each combination defines a menu, meaning that there will be as many menus as there are combinations of goods. In this manner, we have a menu and a utility for each…

  6. Geometric representation methods for multi-type self-defining remote sensing data sets

    NASA Technical Reports Server (NTRS)

    Anuta, P. E.

    1980-01-01

    Efficient and convenient representation of remote sensing data is highly important for effective utilization. The task of merging different data types is currently dealt with by treating each case as an individual problem. A description is provided of work carried out to standardize the multidata merging process. The basic concept of the new approach is that of the self-defining data set (SDDS). The creation of a standard is proposed. This standard would be such that data which may be of interest in a large number of earth resources remote sensing applications would be in a format which allows convenient and automatic merging. Attention is given to details regarding the multidata merging problem, a geometric description of multitype data sets, image reconstruction from track-type data, a data set generation system, and an example multitype data set.

  7. Web-based resources for mass-spectrometry-based metabolomics: a user's guide.

    PubMed

    Tohge, Takayuki; Fernie, Alisdair R

    2009-03-01

    In recent years, a plethora of web-based tools aimed at supporting mass-spectrometry-based metabolite profiling and metabolomics applications have appeared. Given the huge hurdles presented by the chemical diversity and dynamic range of the metabolites present in the plant kingdom, profiling the levels of a broad range of metabolites is highly challenging. Given the scale and costs involved in defining the plant metabolome, it is imperative that data are effectively shared between laboratories pursuing this goal. However, ensuring accurate comparison of samples run on the same machine within the same laboratory, let alone cross-machine and cross-laboratory comparisons, requires both careful experimentation and data interpretation. In this review, we present an overview of currently available software that aids either in peak identification or in the related field of peak alignment as well as those with utility in defining structural information of compounds and metabolic pathways.

  8. Geosynchronous platform definition study. Volume 3: Geosynchronous mission characteristics

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The objectives of the study were to examine the nature of currently planned and new evolutionary geosynchronous programs, to analyze alternative ways of conducting missions, to establish concepts for new systems to support geosynchronous programs in an effective and economical manner, and to define the logistic support to carry out these programs. In order to meet these objectives, it was necessary to define and examine general geosynchronous mission characteristics and the potentially applicable electromagnetic spectrum characteristics. An organized compilation of these data is given with emphasis on the development and use of the data. Fundamental geosynchronous orbit time histories, mission profile characteristics, and delivery system characteristics are presented. In addition, electromagnetic spectrum utilization is discussed in terms of the usable frequency spectrum, the spectrum potentially available considering established frequency allocations, and the technology status as it affects the ability to operate within specific frequency bands.

  9. The long underestimated carbonyl function of carbohydrates – an organocatalyzed shot into carbohydrate chemistry.

    PubMed

    Mahrwald, R

    2015-09-21

    The aggressive and strong development of organocatalysis provides several protocols for the convenient utilization of the carbonyl function of unprotected carbohydrates in C-C-bond formation processes. These amine-catalyzed mechanisms enable multiple cascade-protocols for the synthesis of a wide range of carbohydrate-derived compound classes. Several, only slightly different protocols, have been developed for the application of 1,3-dicarbonyl compounds in the stereoselective chain-elongation of unprotected carbohydrates and the synthesis of highly functionalized C-glycosides of defined configuration. In addition, C-glycosides can also be accessed by amine-catalyzed reactions with methyl ketones. By a one-pot cascade reaction of isocyanides with unprotected aldoses and amino acids access to defined configured glycopeptide mimetics is achieved. Depending on the reaction conditions different origins to control the installation of configuration during the bond-formation process were observed.

  10. Generic functional requirements for a NASA general-purpose data base management system

    NASA Technical Reports Server (NTRS)

    Lohman, G. M.

    1981-01-01

    Generic functional requirements for a general-purpose, multi-mission data base management system (DBMS) for application to remotely sensed scientific data bases are detailed. The motivation for utilizing DBMS technology in this environment is explained. The major requirements include: (1) a DBMS for scientific observational data; (2) a multi-mission capability; (3) user friendliness; (4) extensive and integrated information about data; (5) robust languages for defining data structures and formats; (6) scientific data types and structures; (7) flexible physical access mechanisms; (8) ways of representing spatial relationships; (9) a high-level nonprocedural interactive query and data manipulation language; (10) data base maintenance utilities; (11) high-rate input/output and large data volume storage; and (12) adaptability to a distributed data base and/or data base machine configuration. Detailed functions are specified in a top-down hierarchic fashion. Implementation, performance, and support requirements are also given.

  11. Applied choline-omics: lessons from human metabolic studies for the integration of genomics research into nutrition practice.

    PubMed

    West, Allyson A; Caudill, Marie A

    2014-08-01

    Nutritional genomics, defined as the study of reciprocal interactions among nutrients, metabolic intermediates, and the genome, along with other closely related nutritional -omic fields (eg, epigenomics, transcriptomics, and metabolomics) have become vital areas of nutrition study and knowledge. Utilizing results from human metabolic research on the essential nutrient choline, this article illustrates how nutrigenetic, nutrigenomic, and inter-related -omic research has provided new insights into choline metabolism and its effect on physiologic processes. Findings from highlighted choline research are also discussed in the context of translation to clinical and public health nutrition applications. Overall, this article underscores the utility of -omic research methods in elucidating nutrient metabolism as well as the potential for nutritional -omic concepts and discoveries to be broadly applied in nutritional practice. Copyright © 2014 Academy of Nutrition and Dietetics. Published by Elsevier Inc. All rights reserved.

  12. In silico design of context-responsive mammalian promoters with user-defined functionality

    PubMed Central

    Gibson, Suzanne J.; Hatton, Diane

    2017-01-01

    Abstract Comprehensive de novo-design of complex mammalian promoters is restricted by unpredictable combinatorial interactions between constituent transcription factor regulatory elements (TFREs). In this study, we show that modular binding sites that do not function cooperatively can be identified by analyzing host cell transcription factor expression profiles, and subsequently testing cognate TFRE activities in varying homotypic and heterotypic promoter architectures. TFREs that displayed position-insensitive, additive function within a specific expression context could be rationally combined together in silico to create promoters with highly predictable activities. As TFRE order and spacing did not affect the performance of these TFRE-combinations, compositions could be specifically arranged to preclude the formation of undesirable sequence features. This facilitated simple in silico-design of promoters with context-required, user-defined functionalities. To demonstrate this, we de novo-created promoters for biopharmaceutical production in CHO cells that exhibited precisely designed activity dynamics and long-term expression-stability, without causing observable retroactive effects on cellular performance. The design process described can be utilized for applications requiring context-responsive, customizable promoter function, particularly where co-expression of synthetic TFs is not suitable. Although the synthetic promoter structure utilized does not closely resemble native mammalian architectures, our findings also provide additional support for a flexible billboard model of promoter regulation. PMID:28977454
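    The additive design rule described above can be illustrated with a small sketch: if each TFRE contributes a position-insensitive, additive activity in a given expression context, a candidate promoter's relative activity is simply the sum of its constituent contributions. The TFRE names and per-copy activity values below are hypothetical placeholders, not data from the study.

```python
# Predict relative promoter activity as the sum of additive TFRE contributions.
# TFRE names and per-copy activities are hypothetical illustration values.

TFRE_ACTIVITY = {          # measured per-copy contribution in the target cell context
    "NFkB-RE": 1.8,
    "CRE":     1.1,
    "ARE":     0.6,
}

def predicted_activity(design):
    """design: list of TFRE names making up the synthetic promoter (order-independent)."""
    return sum(TFRE_ACTIVITY[tfre] for tfre in design)

if __name__ == "__main__":
    candidate = ["NFkB-RE", "NFkB-RE", "CRE", "ARE"]
    print(f"predicted relative activity: {predicted_activity(candidate):.1f}")
```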

  13. The application of iodine and magnetic susceptibility surface geochemical surveys in the Lodgepole Play, Eastern Williston Basin, North Dakota

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tedesco, S.A.

    1996-06-01

    The use of surface geochemistry as a first-pass exploration tool is becoming more prevalent in petroleum exploration. This is especially true due to the high cost of 2-D and 3-D surveys in defining small targets such as the Waulsortian mounds of the Lodgepole Formation. Surface geochemical surveys are very effective in pinpointing specific target areas for seismic surveying and thus reducing costs. Presented are examples of surface geochemical surveys utilizing magnetic susceptibility and iodine methods in delineating reservoirs in the Lodgepole, Mission Canyon and Red River formations. The types of surveys presented vary from reconnaissance to detail, and examples of how to define a grid will be discussed. Surface geochemical surveys can be very effective when the areal extent of the target(s) and the purpose of the survey are clearly defined prior to implementation. By determining which areas have microseepage and which areas do not, surface geochemistry can be a very effective tool in focusing exploration efforts and maximizing exploration dollars.

  14. Distributed controller clustering in software defined networks.

    PubMed

    Abdelaziz, Ahmed; Fong, Ang Tan; Gani, Abdullah; Garba, Usman; Khan, Suleman; Akhunzada, Adnan; Talebian, Hamid; Choo, Kim-Kwang Raymond

    2017-01-01

    Software Defined Networking (SDN) is an emerging, promising paradigm for network management because of its centralized network intelligence. However, the centralized control architecture of software-defined networks (SDNs) brings novel challenges of reliability, scalability, fault tolerance and interoperability. In this paper, we propose a novel clustered distributed controller architecture in a real setting of SDNs. The distributed cluster implementation comprises multiple popular SDN controllers. The proposed mechanism is evaluated using a real-world network topology running on top of an emulated SDN environment. The results show that the proposed distributed controller clustering mechanism is able to significantly reduce the average latency from 8.1% to 1.6% and the packet loss from 5.22% to 4.15%, compared to distributed controllers without clustering running on HP Virtual Application Network (VAN) SDN and Open Network Operating System (ONOS) controllers respectively. Moreover, the proposed method also shows reasonable CPU utilization results. Furthermore, the proposed mechanism makes it possible to handle unexpected load fluctuations while maintaining continuous network operation, even when there is a controller failure. The paper is a potential contribution stepping towards addressing the issues of reliability, scalability, fault tolerance, and interoperability.

  15. Reusable rocket engine turbopump health monitoring system, part 3

    NASA Technical Reports Server (NTRS)

    Perry, John G.

    1989-01-01

    Degradation mechanisms and sensor identification/selection resulted in a list of degradation modes and a list of sensors that are utilized in the diagnosis of these degradation modes. The sensor list is divided into primary and secondary indicators of the corresponding degradation modes. The signal conditioning requirements are discussed, describing the methods of producing the Space Shuttle Main Engine (SSME) post-hot-fire test data to be utilized by the Health Monitoring System. Development of the diagnostic logic and algorithms is also presented. The knowledge engineering approach, as utilized, includes the knowledge acquisition effort, characterization of the expert's problem solving strategy, conceptually defining the form of the applicable knowledge base, and rule base, and identifying an appropriate inferencing mechanism for the problem domain. The resulting logic flow graphs detail the diagnosis/prognosis procedure as followed by the experts. The nature and content of required support data and databases is also presented. The distinction between deep and shallow types of knowledge is identified. Computer coding of the Health Monitoring System is shown to follow the logical inferencing of the logic flow graphs/algorithms.

  16. Human factors guidelines for applications of 3D perspectives: a literature review

    NASA Astrophysics Data System (ADS)

    Dixon, Sharon; Fitzhugh, Elisabeth; Aleva, Denise

    2009-05-01

    Once considered too processing-intense for general utility, application of the third dimension to convey complex information is facilitated by the recent proliferation of technological advancements in computer processing, 3D displays, and 3D perspective (2.5D) renderings within a 2D medium. The profusion of complex and rapidly-changing dynamic information being conveyed in operational environments has elevated interest in possible military applications of 3D technologies. 3D can be a powerful mechanism for clearer information portrayal, facilitating rapid and accurate identification of key elements essential to mission performance and operator safety. However, implementation of 3D within legacy systems can be costly, making integration prohibitive. Therefore, identifying which tasks may benefit from 3D or 2.5D versus simple 2D visualizations is critical. Unfortunately, there is no "bible" of human factors guidelines for usability optimization of 2D, 2.5D, or 3D visualizations nor for determining which display best serves a particular application. Establishing such guidelines would provide an invaluable tool for designers and operators. Defining issues common to each will enhance design effectiveness. This paper presents the results of an extensive review of open source literature addressing 3D information displays, with particular emphasis on comparison of true 3D with 2D and 2.5D representations and their utility for military tasks. Seventy-five papers are summarized, highlighting militarily relevant applications of 3D visualizations and 2.5D perspective renderings. Based on these findings, human factors guidelines for when and how to use these visualizations, along with recommendations for further research are discussed.

  17. Risk and utility in portfolio optimization

    NASA Astrophysics Data System (ADS)

    Cohen, Morrel H.; Natoli, Vincent D.

    2003-06-01

    Modern portfolio theory (MPT) addresses the problem of determining the optimum allocation of investment resources among a set of candidate assets. In the original mean-variance approach of Markowitz, volatility is taken as a proxy for risk, conflating uncertainty with risk. There have been many subsequent attempts to alleviate that weakness which, typically, combine utility and risk. We present here a modification of MPT based on the inclusion of separate risk and utility criteria. We define risk as the probability of failure to meet a pre-established investment goal. We define utility as the expectation of a utility function with positive and decreasing marginal value as a function of yield. The emphasis throughout is on long investment horizons for which risk-free assets do not exist. Analytic results are presented for a Gaussian probability distribution. Risk-utility relations are explored via empirical stock-price data, and an illustrative portfolio is optimized using the empirical data.
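    For a Gaussian yield distribution, the two separate criteria defined above reduce to simple closed forms, sketched below: risk is the normal tail probability of finishing below the goal, and expected utility is evaluated here for an exponential utility function, chosen purely as one example of positive, decreasing marginal value. The parameter values are hypothetical and the utility function is an assumption, not necessarily the one used in the paper.

```python
# Risk (probability of missing a yield goal) and expected utility for a Gaussian
# yield Y ~ N(mu, sigma^2), using an exponential utility u(y) = 1 - exp(-a*y) as
# an example of a concave, increasing utility. Parameters are hypothetical.

import math

def risk(mu, sigma, goal):
    """P(yield < goal) for yield ~ N(mu, sigma^2)."""
    z = (goal - mu) / sigma
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def expected_utility(mu, sigma, a=2.0):
    """E[1 - exp(-a*Y)] for Y ~ N(mu, sigma^2), via the normal moment generating function."""
    return 1.0 - math.exp(-a * mu + 0.5 * a * a * sigma * sigma)

if __name__ == "__main__":
    for mu, sigma in [(0.07, 0.15), (0.05, 0.08)]:    # two candidate allocations
        print(f"mu={mu:.2f} sigma={sigma:.2f} "
              f"risk(goal=4%)={risk(mu, sigma, 0.04):.3f} "
              f"E[u]={expected_utility(mu, sigma):.3f}")
```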

  18. ATHENA: A Personalized Platform to Promote an Active Lifestyle and Wellbeing Based on Physical, Mental and Social Health Primitives

    PubMed Central

    Fahim, Muhammad; Idris, Muhammad; Ali, Rahman; Nugent, Christopher; Kang, Byeong; Huh, Eui-Nam; Lee, Sungyoung

    2014-01-01

    Technology provides ample opportunities for the acquisition and processing of physical, mental and social health primitives. However, several challenges remain for researchers, such as how to define the relationship between reported physical activities, mood and social interaction in order to characterize an active lifestyle. We are conducting a project, ATHENA (activity-awareness for human-engaged wellness applications), to design and integrate the relationship between these basic health primitives to approximate the human lifestyle and provide real-time recommendations for wellbeing services. Our goal is to develop a system to promote an active lifestyle for individuals and to recommend to them valuable interventions by making comparisons to their past habits. The proposed system processes sensory data through our developed machine learning algorithms inside smart devices and utilizes cloud infrastructure to reduce the cost. We exploit big data infrastructure for massive sensory data storage and fast retrieval for recommendations. Our contributions include the development of a prototype system to promote an active lifestyle and a visual design capable of engaging users in the goal of increasing self-motivation. We believe that our study will impact the design of future ubiquitous wellness applications. PMID:24859031

  19. A new complexity measure for time series analysis and classification

    NASA Astrophysics Data System (ADS)

    Nagaraj, Nithin; Balasubramanian, Karthi; Dey, Sutirth

    2013-07-01

    Complexity measures are used in a number of applications including extraction of information from data such as ecological time series, detection of non-random structure in biomedical signals, testing of random number generators, language recognition and authorship attribution etc. Different complexity measures proposed in the literature like Shannon entropy, Relative entropy, Lempel-Ziv, Kolmogorov and Algorithmic complexity are mostly ineffective in analyzing short sequences that are further corrupted with noise. To address this problem, we propose a new complexity measure ETC and define it as the "Effort To Compress" the input sequence by a lossless compression algorithm. Here, we employ the lossless compression algorithm known as Non-Sequential Recursive Pair Substitution (NSRPS) and define ETC as the number of iterations needed for NSRPS to transform the input sequence to a constant sequence. We demonstrate the utility of ETC in two applications. ETC is shown to have better correlation with Lyapunov exponent than Shannon entropy even with relatively short and noisy time series. The measure also has a greater rate of success in automatic identification and classification of short noisy sequences, compared to entropy and a popular measure based on Lempel-Ziv compression (implemented by Gzip).
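    A minimal sketch of the ETC computation described above follows: repeatedly substitute the most frequent adjacent pair with a new symbol (the NSRPS step) and count the iterations until the sequence becomes constant. The pair-counting and tie-breaking details here are simplified relative to the published measure, and the example input is arbitrary.

```python
# "Effort To Compress" sketch: count NSRPS iterations until the sequence is constant.
# Simplified illustration of the measure described in the abstract.

from collections import Counter

def most_frequent_pair(seq):
    pairs = Counter()
    for i in range(len(seq) - 1):
        pairs[(seq[i], seq[i + 1])] += 1   # simple overlapping count; adequate for a sketch
    return pairs.most_common(1)[0][0]

def substitute(seq, pair, new_symbol):
    out, i = [], 0
    while i < len(seq):
        if i < len(seq) - 1 and (seq[i], seq[i + 1]) == pair:
            out.append(new_symbol)
            i += 2                         # non-overlapping replacement
        else:
            out.append(seq[i])
            i += 1
    return out

def etc(sequence):
    """Number of NSRPS iterations until the sequence is constant (or length 1)."""
    seq = list(sequence)
    steps = 0
    next_symbol = max(seq) + 1 if seq else 0
    while len(set(seq)) > 1:
        pair = most_frequent_pair(seq)
        seq = substitute(seq, pair, next_symbol)
        next_symbol += 1
        steps += 1
    return steps

if __name__ == "__main__":
    print(etc([0, 1, 0, 0, 1, 1, 0, 1]))   # small binary example
```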

  20. The historical development of the magnetic method in exploration

    USGS Publications Warehouse

    Nabighian, M.N.; Grauch, V.J.S.; Hansen, R.O.; LaFehr, T.R.; Li, Y.; Peirce, J.W.; Phillips, J.D.; Ruder, M.E.

    2005-01-01

    The magnetic method, perhaps the oldest of geophysical exploration techniques, blossomed after the advent of airborne surveys in World War II. With improvements in instrumentation, navigation, and platform compensation, it is now possible to map the entire crustal section at a variety of scales, from strongly magnetic basement at regional scale to weakly magnetic sedimentary contacts at local scale. Methods of data filtering, display, and interpretation have also advanced, especially with the availability of low-cost, high-performance personal computers and color raster graphics. The magnetic method is the primary exploration tool in the search for minerals. In other arenas, the magnetic method has evolved from its sole use for mapping basement structure to include a wide range of new applications, such as locating intrasedimentary faults, defining subtle lithologic contacts, mapping salt domes in weakly magnetic sediments, and better defining targets through 3D inversion. These new applications have increased the method's utility in all realms of exploration - in the search for minerals, oil and gas, geothermal resources, and groundwater, and for a variety of other purposes such as natural hazards assessment, mapping impact structures, and engineering and environmental studies. © 2005 Society of Exploration Geophysicists. All rights reserved.

  1. Performance evaluation of data center service localization based on virtual resource migration in software defined elastic optical network.

    PubMed

    Yang, Hui; Zhang, Jie; Ji, Yuefeng; Tan, Yuanlong; Lin, Yi; Han, Jianrui; Lee, Young

    2015-09-07

    Data center interconnection with elastic optical networks is a promising scenario to meet the high burstiness and high-bandwidth requirements of data center services. In our previous work, we implemented cross-stratum optimization of optical network and application stratum resources, which allows data center services to be accommodated. In view of this, this study extends the data center resources to the user side to enhance the end-to-end quality of service. We propose a novel data center service localization (DCSL) architecture based on virtual resource migration in a software defined elastic data center optical network. A migration evaluation scheme (MES) is introduced for DCSL based on the proposed architecture. The DCSL can enhance the responsiveness to dynamic end-to-end data center demands, and effectively reduce the blocking probability to globally optimize optical network and application resources. The overall feasibility and efficiency of the proposed architecture are experimentally verified on the control plane of our OpenFlow-based enhanced SDN testbed. The performance of the MES scheme under a heavy traffic load scenario is also quantitatively evaluated based on the DCSL architecture in terms of path blocking probability, provisioning latency and resource utilization, compared with other provisioning schemes.

  2. Development and evaluation of a web-based application for digital findings and documentation in physiotherapy education.

    PubMed

    Spieler, Bernadette; Burgsteiner, Harald; Messer-Misak, Karin; Gödl-Purrer, Barbara; Salchinger, Beate

    2015-01-01

    Findings in physiotherapy follow standardized approaches to treatment, but there is also a significant margin of difference in how these standards are implemented. Clinical decisions require experience and continuous learning processes to consolidate personal values and opinions, and studies suggest that lecturers can influence students positively. Recently, the study course of Physiotherapy at the University of Applied Science in Graz has offered a paper-based findings document. This document supported decisions through the adaptation of the clinical reasoning process. The document was the starting point for our learning application called "EasyAssess", a Java-based web application for digital findings documentation. A central point of our work was to ensure the efficiency, effectiveness and usability of the web application through usability tests performed by both students and lecturers. Results show that our application fulfills the previously defined requirements and can be used efficiently in daily routine, largely because of its simple user interface and its modest design. Due to the close cooperation with the study course Physiotherapy, the application has incorporated the various needs of the target audiences and confirmed the usefulness of our application.

  3. Breeding goals for the Kenya dual purpose goat. I. Model development and application to smallholder production systems.

    PubMed

    Bett, R C; Kosgey, I S; Bebe, B O; Kahi, A K

    2007-10-01

    A deterministic model was developed and applied to evaluate biological and economic variables that characterize smallholder production systems utilizing the Kenya Dual Purpose goat (KDPG) in Kenya. The systems were defined as: smallholder low-potential (SLP), smallholder medium-potential (SMP) and smallholder high-potential (SHP). The model was able to predict revenues and costs to the system. Revenues were from sale of milk, surplus yearlings and cull-for-age animals, while costs included those incurred for feeds, husbandry, marketing and fixed assets (fixed costs). Of the total outputs, revenue from meat and milk accounted for about 55% and 45%, respectively, in SMP and 39% and 61% in SHP. Total costs comprised mainly variable costs (98%), with husbandry costs being the highest in both SMP and SLP. The total profit per doe per year was KSh 315.48 in SMP, KSh -1352.75 in SLP and KSh -80.22 in SHP. Results suggest that the utilization of the KDPG goat in Kenya is more profitable in the smallholder medium-potential production system. The implications of applying the model to smallholder production systems in Kenya are discussed.
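    The bookkeeping behind the reported profit figures can be sketched in a few lines: profit per doe per year is the sum of the revenue streams minus the sum of the cost categories named in the abstract. The numbers below are placeholders, not values from the paper.

```python
# Profit per doe per year = total revenue - total costs, using the revenue and
# cost categories named in the abstract. All figures below are hypothetical.

def profit_per_doe(revenues: dict, costs: dict) -> float:
    return sum(revenues.values()) - sum(costs.values())

if __name__ == "__main__":
    revenues = {"milk": 2600.0, "surplus_yearlings": 1400.0, "cull_for_age": 600.0}  # KSh/doe/yr
    costs = {"feed": 1900.0, "husbandry": 2100.0, "marketing": 150.0, "fixed": 120.0}
    print(f"profit per doe per year: KSh {profit_per_doe(revenues, costs):.2f}")
```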

  4. Encapsulating Non-Human Primate Multipotent Stromal Cells in Alginate via High Voltage for Cell-Based Therapies and Cryopreservation

    PubMed Central

    Gryshkov, Oleksandr; Pogozhykh, Denys; Hofmann, Nicola; Pogozhykh, Olena; Mueller, Thomas; Glasmacher, Birgit

    2014-01-01

    Alginate cell-based therapy requires further development focused on clinical application. To assess engraftment, risk of mutations and therapeutic benefit, studies should be performed in an appropriate non-human primate model, such as the common marmoset (Callithrix jacchus). In this work we encapsulated amnion-derived multipotent stromal cells (MSCs) from Callithrix jacchus in defined-size alginate beads using a high voltage technique. Our results indicate that i) the alginate-cell mixing procedure and cell concentration do not affect the diameter of alginate beads, ii) encapsulation of high cell numbers (up to 10×10^6 cells/ml) can be performed in alginate beads utilizing high voltage and iii) high voltage (15–30 kV) does not alter the viability, proliferation and differentiation capacity of MSCs post-encapsulation compared with alginate-encapsulated cells produced by the traditional air-flow method. Consistent results were obtained over a period of 7 days of encapsulated MSC culture and after cryopreservation utilizing a slow cooling procedure (1 K/min). The results of this work show that high voltage encapsulation can be further maximized to develop cell-based therapies with alginate beads in a non-human primate model towards human application. PMID:25259731

  5. Combining wet and dry research: experience with model development for cardiac mechano-electric structure-function studies

    PubMed Central

    Quinn, T. Alexander; Kohl, Peter

    2013-01-01

    Since the development of the first mathematical cardiac cell model 50 years ago, computational modelling has become an increasingly powerful tool for the analysis of data and for the integration of information related to complex cardiac behaviour. Current models build on decades of iteration between experiment and theory, representing a collective understanding of cardiac function. All models, whether computational, experimental, or conceptual, are simplified representations of reality and, like tools in a toolbox, suitable for specific applications. Their range of applicability can be explored (and expanded) by iterative combination of ‘wet’ and ‘dry’ investigation, where experimental or clinical data are used to first build and then validate computational models (allowing integration of previous findings, quantitative assessment of conceptual models, and projection across relevant spatial and temporal scales), while computational simulations are utilized for plausibility assessment, hypotheses-generation, and prediction (thereby defining further experimental research targets). When implemented effectively, this combined wet/dry research approach can support the development of a more complete and cohesive understanding of integrated biological function. This review illustrates the utility of such an approach, based on recent examples of multi-scale studies of cardiac structure and mechano-electric function. PMID:23334215

  6. Utilization of Skills in the Care of the Patient with Common, Well-Defined Health Deviations I (NS 217): Competency-Based Course Syllabus.

    ERIC Educational Resources Information Center

    Green, Elizabeth G.; Yates, Laura H.

    "Utilization of Skills in the Care of the Patient with Common, Well-Defined Health Deviations I" (NS 217) is an associate degree nursing course offered at Chattanooga State Technical Community College to help students develop new competencies necessary for the care of patients with deviations of the cardiovascular, endocrine, integumentary, and…

  7. Results of the Software Process Improvement Efforts of the Early Adopters in NAVAIR 4.0

    DTIC Science & Technology

    2007-12-01

    AIRSpeed utilizes a structured problem-solving methodology called DMAIC (Define, Measure, Analyze, Improve, Control), widely used in business, aimed at reducing costs and improving productivity and customer satisfaction. DMAIC leads project teams through the logical steps from problem definition to problem resolution. Each phase has a specific set...

  8. Lessons Learned from Deploying an Analytical Task Management Database

    NASA Technical Reports Server (NTRS)

    O'Neil, Daniel A.; Welch, Clara; Arceneaux, Joshua; Bulgatz, Dennis; Hunt, Mitch; Young, Stephen

    2007-01-01

    Defining requirements, missions, technologies, and concepts for space exploration involves multiple levels of organizations, teams of people with complementary skills, and analytical models and simulations. Analytical activities range from filling a To-Be-Determined (TBD) in a requirement to creating animations and simulations of exploration missions. In a program as large as returning to the Moon, there are hundreds of simultaneous analysis activities. A way to manage and integrate efforts of this magnitude is to deploy a centralized database that provides the capability to define tasks, identify resources, describe products, schedule deliveries, and generate a variety of reports. This paper describes a web-accessible task management system and explains the lessons learned during the development and deployment of the database. Through the database, managers and team leaders can define tasks, establish review schedules, assign teams, link tasks to specific requirements, identify products, and link the task data records to external repositories that contain the products. Data filters and spreadsheet export utilities provide a powerful capability to create custom reports. Import utilities provide a means to populate the database from previously filled form files. Within a four-month period, a small team analyzed requirements, developed a prototype, conducted multiple system demonstrations, and deployed a working system supporting hundreds of users across the aerospace community. Open-source technologies and agile software development techniques, applied by a skilled team, enabled this impressive achievement. Topics in the paper cover the web application technologies, agile software development, an overview of the system's functions and features, dealing with increasing scope, and deploying new versions of the system.
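    A hedged sketch of the kind of relational schema such a task management database might use, based only on the entities named in the abstract (tasks, teams, requirements, review schedules, and links to external product repositories). The table and column names are illustrative assumptions, not taken from the actual system.

```python
# Illustrative task-management schema built with the standard-library sqlite3 module.
# Table/column names and sample rows are hypothetical.

import sqlite3

DDL = """
CREATE TABLE team        (id INTEGER PRIMARY KEY, name TEXT NOT NULL);
CREATE TABLE requirement (id INTEGER PRIMARY KEY, text TEXT NOT NULL);
CREATE TABLE task (
    id             INTEGER PRIMARY KEY,
    title          TEXT NOT NULL,
    team_id        INTEGER REFERENCES team(id),
    requirement_id INTEGER REFERENCES requirement(id),
    review_date    TEXT,              -- ISO date of scheduled review
    product_uri    TEXT               -- link to external repository holding the product
);
"""

def build_demo_db(path=":memory:"):
    con = sqlite3.connect(path)
    con.executescript(DDL)
    con.execute("INSERT INTO team (name) VALUES ('Lunar Surface Analysis')")
    con.execute("INSERT INTO requirement (text) VALUES ('Resolve TBD in power budget')")
    con.execute(
        "INSERT INTO task (title, team_id, requirement_id, review_date, product_uri) "
        "VALUES ('Power budget trade study', 1, 1, '2007-03-01', 'file://repo/trade_study')"
    )
    con.commit()
    return con

if __name__ == "__main__":
    con = build_demo_db()
    for row in con.execute("SELECT title, review_date FROM task"):
        print(row)
```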

  9. Defining Toll Fee of Wheeling Renewable with Reference to a Gas Pipeline in Indonesia

    NASA Astrophysics Data System (ADS)

    Hakim, Amrullah

    2017-07-01

    Indonesia has a huge number of renewable energy sources (RE); however, their utilization is currently very low. The main challenge of power production is its alignment with consumption levels; supply should equal demand at all times. There is a strong initiative from corporations with high energy demand, compared to other sectors, to apply a renewable portfolio standard for their energy input, e.g. 15% of their energy consumption requirement must come from a renewable energy source. To support this initiative, the utilization of power wheeling will help large factories on industrial estates to source firm and steady renewables from remote sites. Wheeling renewables via PLN’s transmission lines was regulated under a Ministry Decree in 2015; however, the tariff or toll fee has not yet been defined. The potential project to apply wheeling renewables will obtain power supply from a geothermal power plant, with power demand from scattered factories under one company. This is the concept driving the application of power wheeling in the effort to push the growth of renewable energy in Indonesia. Given that PLN’s transmission lines normally have large capacity and are less congested compared to distribution lines, wheeling renewables can accommodate the scattered factory locations, which results in a cheaper toll fee. Defining the best toll fee is the main topic of this paper, with comparison to the toll fee of gas pipeline infrastructure in Indonesia, so that it can be applied on a massive scale to achieve COP21’s commitment.

  10. groHMM: a computational tool for identifying unannotated and cell type-specific transcription units from global run-on sequencing data.

    PubMed

    Chae, Minho; Danko, Charles G; Kraus, W Lee

    2015-07-16

    Global run-on coupled with deep sequencing (GRO-seq) provides extensive information on the location and function of coding and non-coding transcripts, including primary microRNAs (miRNAs), long non-coding RNAs (lncRNAs), and enhancer RNAs (eRNAs), as well as yet undiscovered classes of transcripts. However, few computational tools tailored toward this new type of sequencing data are available, limiting the applicability of GRO-seq data for identifying novel transcription units. Here, we present groHMM, a computational tool in R, which defines the boundaries of transcription units de novo using a two state hidden-Markov model (HMM). A systematic comparison of the performance between groHMM and two existing peak-calling methods tuned to identify broad regions (SICER and HOMER) favorably supports our approach on existing GRO-seq data from MCF-7 breast cancer cells. To demonstrate the broader utility of our approach, we have used groHMM to annotate a diverse array of transcription units (i.e., primary transcripts) from four GRO-seq data sets derived from cells representing a variety of different human tissue types, including non-transformed cells (cardiomyocytes and lung fibroblasts) and transformed cells (LNCaP and MCF-7 cancer cells), as well as non-mammalian cells (from flies and worms). As an example of the utility of groHMM and its application to questions about the transcriptome, we show how groHMM can be used to analyze cell type-specific enhancers as defined by newly annotated enhancer transcripts. Our results show that groHMM can reveal new insights into cell type-specific transcription by identifying novel transcription units, and serve as a complete and useful tool for evaluating functional genomic elements in cells.
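    The core segmentation idea, a two-state hidden Markov model decoded over binned read counts, can be sketched generically as below. This is an illustration of the approach rather than the groHMM implementation or its API; the Poisson emission model, parameters, and example counts are assumptions made for the sketch.

```python
# Two-state Viterbi segmentation of binned counts (0 = background, 1 = transcribed).
# Generic illustration of the HMM idea; not the groHMM code. Parameters are assumed.

import math

def log_poisson(k, lam):
    return k * math.log(lam) - lam - math.lgamma(k + 1)

def viterbi_two_state(counts, lam=(0.2, 5.0), p_switch=1e-3):
    """Return a 0/1 state path over binned counts via Viterbi decoding."""
    log_trans = [[math.log(1 - p_switch), math.log(p_switch)],
                 [math.log(p_switch), math.log(1 - p_switch)]]
    v = [[log_poisson(counts[0], lam[s]) for s in (0, 1)]]   # uniform initial state prior
    back = []
    for k in counts[1:]:
        row, ptr = [], []
        for s in (0, 1):
            best_prev = max((0, 1), key=lambda p: v[-1][p] + log_trans[p][s])
            row.append(v[-1][best_prev] + log_trans[best_prev][s] + log_poisson(k, lam[s]))
            ptr.append(best_prev)
        v.append(row)
        back.append(ptr)
    state = max((0, 1), key=lambda s: v[-1][s])
    path = [state]
    for ptr in reversed(back):
        state = ptr[state]
        path.append(state)
    return list(reversed(path))

if __name__ == "__main__":
    bins = [0, 0, 1, 6, 7, 5, 8, 0, 0, 0, 4, 6, 0]
    print(viterbi_two_state(bins))   # contiguous runs of 1s approximate transcription units
```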

  11. The subject-fixated coaxially sighted corneal light reflex: a clinical marker for centration of refractive treatments and devices.

    PubMed

    Chang, Daniel H; Waring, George O

    2014-11-01

    To describe the inconsistencies in definition, application, and usage of the ocular reference axes (optical axis, visual axis, line of sight, pupillary axis, and topographic axis) and angles (angle kappa, lambda, and alpha) and to propose a precise, reproducible, clinically defined reference marker and axis for centration of refractive treatments and devices. Perspective. Literature review of papers dealing with ocular reference axes, angles, and centration. The inconsistent definitions and usage of the current ocular axes, as derived from eye models, limit their clinical utility. With a clear understanding of Purkinje images and a defined alignment of the observer, light source/fixation target, and subject eye, the subject-fixated coaxially sighted corneal light reflex can be a clinically useful reference marker. The axis formed by connecting the subject-fixated coaxially sighted corneal light reflex and the fixation point, the subject-fixated coaxially sighted corneal light reflex axis, is independent of pupillary dilation and phakic status of the eye. The relationship of the subject-fixated coaxially sighted corneal light reflex axis to a refined definition of the visual axis without reference to nodal points, the foveal-fixation axis, is discussed. The displacement between the subject-fixated coaxially sighted corneal light reflex and pupil center is described not by an angle, but by a chord, here termed chord mu. The application of the subject-fixated coaxially sighted corneal light reflex to the surgical centration of refractive treatments and devices is discussed. As a clinically defined reference marker, the subject-fixated coaxially sighted corneal light reflex avoids the shortcomings of current ocular axes for clinical application and may contribute to better consensus in the literature and improved patient outcomes. Copyright © 2014 Elsevier Inc. All rights reserved.

  12. AUTOMATED UTILITY SERVICE AREA ASSESSMENT UNDER EMERGENCY CONDITIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    G. TOOLE; S. LINGER

    2001-01-01

    All electric utilities serve power to their customers through a variety of functional levels, notably substations. The majority of these components are distribution substations operating at lower voltages, while a small fraction are transmission substations. Each substation has an associated geographical area that encompasses the customers it serves, defined as the service area. Analysis of substation service areas is greatly complicated by several factors: distribution networks are often highly interconnected, which allows a multitude of possible switching operations, and utilities dynamically alter the network topology in order to respond to emergency events. As a result, the service area for a substation can change radically. A utility will generally attempt to minimize the number of customers without service by switching affected loads to alternate substations. In this manner, all or a portion of a disabled substation's load may be served by one or more adjacent substations. This paper describes a suite of analytical tools developed at Los Alamos National Laboratory (LANL) which address the problem of determining how a utility might respond to such emergency events. The estimated outage areas derived using the tools are overlaid onto other geographical and electrical layers in a geographic information system (GIS) software application. The effects of a power outage on a population, other infrastructures, or other physical features can be inferred from the proximity of these features to the estimated outage area.
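
    Once an estimated outage polygon is available, overlaying it with other infrastructure layers reduces to standard GIS geometry tests. The snippet below is an illustrative sketch of that final step, not the LANL tool itself; it assumes the shapely library, and the polygon, facility locations, and proximity buffer are invented.

```python
# Illustrative sketch: flag infrastructure features by containment in, or
# proximity to, an estimated outage polygon. Coordinates are made up.
from shapely.geometry import Point, Polygon

outage_area = Polygon([(0, 0), (10, 0), (10, 8), (0, 8)])   # estimated outage footprint
facilities = {
    "hospital": Point(4, 5),
    "water_pump": Point(11, 7),
    "cell_tower": Point(25, 25),
}

for name, location in facilities.items():
    if outage_area.contains(location):
        status = "inside estimated outage area"
    elif outage_area.distance(location) < 2.0:   # assumed proximity buffer of 2 map units
        status = "near estimated outage area"
    else:
        status = "unaffected"
    print(f"{name}: {status}")
```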

  13. System Analyses of Pneumatic Technology for High Speed Civil Transport Aircraft

    NASA Technical Reports Server (NTRS)

    Mavris, Dimitri N.; Tai, Jimmy C.; Kirby, Michelle M.; Roth, Bryce A.

    1999-01-01

    The primary aspiration of this study was to objectively assess the feasibility of the application of a low speed pneumatic technology, in particular Circulation Control (CC), to an HSCT concept. Circulation Control has been chosen as an enabling technology to be applied on a generic High Speed Civil Transport (HSCT). This technology has been proven for various subsonic vehicles, including flight tests on a Navy A-6 and computational application on a Boeing 737. Yet CC has not been widely accepted for general commercial fixed-wing use, although its potential has been extensively investigated for decades in wind tunnels across the globe for application to rotorcraft. More recently, an experimental investigation was performed at Georgia Tech Research Institute (GTRI) with application to an HSCT-type configuration. The data from those experiments were to be applied to a full-scale vehicle to assess the impact from a system level point of view. Hence, this study attempted to quantitatively assess the impact of this technology on an HSCT. The study objective was achieved in three primary steps: 1) Defining the need for CC technology; 2) Wind tunnel data reduction; 3) Detailed takeoff/landing performance assessment. Defining the need for the CC technology application to an HSCT encompassed a preliminary system level analysis. This was accomplished through the utilization of recent developments in modern aircraft design theory at the Aerospace Systems Design Laboratory (ASDL). These developments include the creation of techniques and methods needed for the identification of technical feasibility show-stoppers. These techniques and methods allow the designer to rapidly assess a design space and disciplinary metric enhancements to enlarge or improve the design space. The takeoff and landing field lengths were identified as the concept "show-stoppers". Once the need for CC was established, the actual application of data and trends was assessed. This assessment entailed a reduction of the wind tunnel data from the experiments performed by Mr. Bob Englar at the GTRI. Relevant data were identified and manipulated based on the required format of the analysis tools utilized. Propulsive, aerodynamic, duct sizing, and vehicle sizing investigations were performed, and the resulting information was supplied to a detailed takeoff and landing tool. From the assessments, CC was shown to improve the low speed performance metrics, which were previously not satisfied. An HSCT with CC augmentation does show potential for full-scale application. Yet an economic assessment of an HSCT with and without CC showed that a moderate penalty was incurred from the increased RDT&E costs associated with developing the CC technology and slight increases in empty weight.

  14. APPLICATION OF NEURAL NETWORK ALGORITHMS FOR BPM LINEARIZATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Musson, John C.; Seaton, Chad; Spata, Mike F.

    2012-11-01

    Stripline BPM sensors contain inherent non-linearities, as a result of field distortions from the pickup elements. Many methods have been devised to facilitate corrections, often employing polynomial fitting. The cost of computation makes real-time correction difficult, particularly when integer math is utilized. The application of neural-network technology, particularly the multi-layer perceptron algorithm, is proposed as an efficient alternative for electrode linearization. A process of supervised learning is initially used to determine the weighting coefficients, which are subsequently applied to the incoming electrode data. A non-linear layer, known as an activation layer, is responsible for the removal of saturation effects. Implementation of a perceptron in an FPGA-based software-defined radio (SDR) is presented, along with performance comparisons. In addition, efficient calculation of the sigmoidal activation function via the CORDIC algorithm is presented.
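
    The following sketch illustrates the supervised-learning idea behind the perceptron linearizer: a small two-layer network with a sigmoidal activation layer is trained to map distorted electrode signals back to beam positions. It is a conceptual analogue in NumPy, not the FPGA implementation; the layer sizes, learning rate, and the synthetic distortion model standing in for real electrode responses are all assumptions.

```python
# Hedged sketch: train a tiny multi-layer perceptron to invert an assumed
# non-linear "electrode response" and recover (x, y) beam positions.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training set: true positions and a deliberately non-linear
# response standing in for the distorted pickup signals (an assumption).
true_xy = rng.uniform(-1, 1, size=(2000, 2))
electrodes = np.tanh(1.5 * true_xy) + 0.3 * true_xy**3

W1 = rng.normal(0, 0.5, (2, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 2)); b2 = np.zeros(2)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

lr = 0.1
for epoch in range(2000):
    h = sigmoid(electrodes @ W1 + b1)       # non-linear activation layer
    pred = h @ W2 + b2                      # linear output layer
    err = pred - true_xy
    # Backpropagation for the mean-squared-error loss.
    gW2 = h.T @ err / len(err); gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * h * (1 - h)
    gW1 = electrodes.T @ dh / len(err); gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1

# The learned weights would then be frozen and applied to incoming electrode data.
```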

  15. An Analysis of Applications Development Systems for Remotely Sensed, Multispectral Data for the Earth Observations Division of the NASA Lyndon B. Johnson Space Center

    NASA Technical Reports Server (NTRS)

    Vanrooy, D. L.; Smith, R. M.; Lynn, M. S.

    1974-01-01

    An application development system (ADS) is examined for remotely sensed, multispectral data at the Earth Observations Division (EOD) at Johnson Space Center. Design goals are detailed, along with design objectives that an ideal system should contain. The design objectives were arranged according to the priorities of EOD's program objectives. Four systems available to EOD were then measured against the ideal ADS as defined by the design objectives and their associated priorities. This was accomplished by rating each of the systems on each of the design objectives. Utilizing the established priorities, it was determined how each system stood up as an ADS. Recommendations were made as to possible courses of action for EOD to pursue to obtain a more efficient ADS.

  16. A study of the utilization of ERTS-1 data from the Wabash River Basin

    NASA Technical Reports Server (NTRS)

    Landgrebe, D. A. (Principal Investigator)

    1973-01-01

    The author has identified the following significant results. Nine projects are defined: five ERTS data applications experiments and four supporting technology tasks. The most significant applications results were achieved in the soil association mapping, earth surface feature identification, and urban land use mapping efforts. Four soil association boundaries were accurately delineated from ERTS-1 imagery. A data bank has been developed to test surface feature classifications obtained from ERTS-1 data. Preliminary forest cover classifications indicated that the estimated acreage tended to exceed the actual acreage by 25%. Urban land use analysis of ERTS-1 data indicated that highly accurate classification could be obtained for many urban categories. The wooded residential category tended to be misclassified as woods or agricultural land. Further statistical analysis revealed that these classes could be separated using sample variance.

  17. Implementing Multidisciplinary and Multi-Zonal Applications Using MPI

    NASA Technical Reports Server (NTRS)

    Fineberg, Samuel A.

    1995-01-01

    Multidisciplinary and multi-zonal applications are an important class of applications in the area of Computational Aerosciences. In these codes, two or more distinct parallel programs or copies of a single program are utilized to model a single problem. To support such applications, it is common to use a programming model where a program is divided into several single program multiple data stream (SPMD) applications, each of which solves the equations for a single physical discipline or grid zone. These SPMD applications are then bound together to form a single multidisciplinary or multi-zonal program in which the constituent parts communicate via point-to-point message passing routines. Unfortunately, simple message passing models, like Intel's NX library, only allow point-to-point and global communication within a single system-defined partition. This makes implementation of these applications quite difficult, if not impossible. In this report it is shown that the new Message Passing Interface (MPI) standard is a viable portable library for implementing the message passing portion of multidisciplinary applications. Further, with the extension of a portable loader, fully portable multidisciplinary application programs can be developed. Finally, the performance of MPI is compared to that of some native message passing libraries. This comparison shows that MPI can be implemented to deliver performance commensurate with native message libraries.
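
    A minimal sketch of the programming model described above, assuming mpi4py and an even number of ranks: the world communicator is split into two disciplinary SPMD groups, each of which would run its own solver on its sub-communicator, while coupled boundary data are exchanged with point-to-point messages addressed by world rank. This illustrates the MPI features involved, not the paper's actual codes.

```python
# Sketch only: split MPI_COMM_WORLD into two disciplinary SPMD groups and
# exchange boundary data between paired ranks. Assumes an even world size.
from mpi4py import MPI

world = MPI.COMM_WORLD
world_rank = world.Get_rank()
world_size = world.Get_size()

# First half of the ranks model discipline/zone A, second half discipline/zone B.
color = 0 if world_rank < world_size // 2 else 1
local = world.Split(color=color, key=world_rank)   # SPMD communicator for this discipline

# ... each discipline would solve its own equations using `local` collectives ...
boundary_data = {"zone": color, "rank": world_rank}

# Partner rank in the other discipline (simple pairing scheme for illustration).
partner = world_rank + world_size // 2 if color == 0 else world_rank - world_size // 2
received = world.sendrecv(boundary_data, dest=partner, source=partner)
```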

  18. Outcomes in the Orthopaedic Sports Medicine Fellowship Match, 2010-2017.

    PubMed

    Mulcahey, Mary K; Hayes, Meghan K; Smith, Christopher M; Kraeutler, Matthew J; Trojan, Jeffrey D; McCarty, Eric C

    2018-05-01

    Sports medicine is one of the most competitive fellowships in orthopaedic surgery. Despite its popularity, fellowship applicants have limited understanding of the orthopaedic sports medicine fellowship match process. To define key outcomes in the orthopaedic sports medicine fellowship match, including the overall match rate, number of programs filled, and number of applicants ranked by programs that filled between 2010 and 2017. Cross-sectional study. This study utilized data regarding the orthopaedic sports medicine fellowship match collected by the American Orthopaedic Society for Sports Medicine (AOSSM) from 2010 through 2017. Applicant data included number of applicants, number of matched and unmatched applicants, and percentage of applicants matching into their top choices. Fellowship program data included number of programs participating in the match and number of applicants ranked by filled and unfilled programs. Between 2010 and 2017, the mean number of orthopaedic sports medicine fellowship applicants was 244.8. On average, 92.0% of applicants matched into a fellowship program. The mean number of programs participating in the fellowship match was 92.9, with a mean of 219.9 accredited positions and 5.4 nonaccredited positions. Over the time period studied, a mean of 75.8% of programs matched all available positions. Programs that matched fully ranked 9.0 applicants per position, on average, compared with a mean of 6.5 applicants ranked per position among programs that did not fully match ( P = .0016). From 2010 to 2017, the number of applicants, positions available, overall match rate, and number of programs participating in the orthopaedic sports medicine fellowship match have remained consistent. The mean number of applicants per position ranked by fully matched fellowship programs was 9.0 compared with a mean of 6.5 applicants per position ranked by programs that did not fully match. These data may be helpful as we look to the future of orthopaedic sports medicine fellowship positions and the match process. In addition, this study reveals characteristics that divide sports medicine fellowship programs that fully match from those that do not. Applicants and/or fellowship program directors may utilize this information to modify their approach to the match process going forward.

  19. Outcomes in the Orthopaedic Sports Medicine Fellowship Match, 2010-2017

    PubMed Central

    Mulcahey, Mary K.; Hayes, Meghan K.; Smith, Christopher M.; Kraeutler, Matthew J.; Trojan, Jeffrey D.; McCarty, Eric C.

    2018-01-01

    Background: Sports medicine is one of the most competitive fellowships in orthopaedic surgery. Despite its popularity, fellowship applicants have limited understanding of the orthopaedic sports medicine fellowship match process. Purpose: To define key outcomes in the orthopaedic sports medicine fellowship match, including the overall match rate, number of programs filled, and number of applicants ranked by programs that filled between 2010 and 2017. Study Design: Cross-sectional study. Methods: This study utilized data regarding the orthopaedic sports medicine fellowship match collected by the American Orthopaedic Society for Sports Medicine (AOSSM) from 2010 through 2017. Applicant data included number of applicants, number of matched and unmatched applicants, and percentage of applicants matching into their top choices. Fellowship program data included number of programs participating in the match and number of applicants ranked by filled and unfilled programs. Results: Between 2010 and 2017, the mean number of orthopaedic sports medicine fellowship applicants was 244.8. On average, 92.0% of applicants matched into a fellowship program. The mean number of programs participating in the fellowship match was 92.9, with a mean of 219.9 accredited positions and 5.4 nonaccredited positions. Over the time period studied, a mean of 75.8% of programs matched all available positions. Programs that matched fully ranked 9.0 applicants per position, on average, compared with a mean of 6.5 applicants ranked per position among programs that did not fully match (P = .0016). Conclusion: From 2010 to 2017, the number of applicants, positions available, overall match rate, and number of programs participating in the orthopaedic sports medicine fellowship match have remained consistent. The mean number of applicants per position ranked by fully matched fellowship programs was 9.0 compared with a mean of 6.5 applicants per position ranked by programs that did not fully match. These data may be helpful as we look to the future of orthopaedic sports medicine fellowship positions and the match process. In addition, this study reveals characteristics that divide sports medicine fellowship programs that fully match from those that do not. Applicants and/or fellowship program directors may utilize this information to modify their approach to the match process going forward. PMID:29796398

  20. Evaluating gambles using dynamics

    NASA Astrophysics Data System (ADS)

    Peters, O.; Gell-Mann, M.

    2016-02-01

    Gambles are random variables that model possible changes in wealth. Classic decision theory transforms money into utility through a utility function and defines the value of a gamble as the expectation value of utility changes. Utility functions aim to capture individual psychological characteristics, but their generality limits predictive power. Expectation value maximizers are defined as rational in economics, but expectation values are only meaningful in the presence of ensembles or in systems with ergodic properties, whereas decision-makers have no access to ensembles, and the variables representing wealth in the usual growth models do not have the relevant ergodic properties. Simultaneously addressing the shortcomings of utility and those of expectations, we propose to evaluate gambles by averaging wealth growth over time. No utility function is needed, but a dynamic must be specified to compute time averages. Linear and logarithmic "utility functions" appear as transformations that generate ergodic observables for purely additive and purely multiplicative dynamics, respectively. We highlight inconsistencies throughout the development of decision theory, whose correction clarifies that our perspective is legitimate; these corrections also invalidate a commonly cited argument for bounded utility functions.
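
    The paper's central distinction can be reproduced numerically with a commonly used multiplicative coin-toss gamble (wealth multiplied by 1.5 on heads, 0.6 on tails): the expectation value grows each round, yet the time-average growth rate recovered by the logarithmic transformation is negative, and a single long trajectory decays. The short simulation below is an illustration of that point, not code from the paper.

```python
# Expectation value vs. time-average growth for a multiplicative gamble.
import numpy as np

up, down = 1.5, 0.6
expected_factor = 0.5 * up + 0.5 * down                     # 1.05 > 1: ensemble average grows
time_avg_growth = 0.5 * np.log(up) + 0.5 * np.log(down)     # < 0: a single trajectory decays

rng = np.random.default_rng(1)
wealth = 1.0
for _ in range(10_000):                                     # one long trajectory
    wealth *= up if rng.random() < 0.5 else down

print(f"expectation factor per round: {expected_factor:.3f}")
print(f"time-average growth rate (log): {time_avg_growth:.3f}")
print(f"wealth after 10,000 rounds of one trajectory: {wealth:.3e}")
```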

  1. A chemically defined substrate for the expansion and neuronal differentiation of human pluripotent stem cell-derived neural progenitor cells.

    PubMed

    Tsai, Yihuan; Cutts, Josh; Kimura, Azuma; Varun, Divya; Brafman, David A

    2015-07-01

    Due to the limitation of current pharmacological therapeutic strategies, stem cell therapies have emerged as a viable option for treating many incurable neurological disorders. Specifically, human pluripotent stem cell (hPSC)-derived neural progenitor cells (hNPCs), a multipotent cell population that is capable of near indefinite expansion and subsequent differentiation into the various cell types that comprise the central nervous system (CNS), could provide an unlimited source of cells for such cell-based therapies. However the clinical application of these cells will require (i) defined, xeno-free conditions for their expansion and neuronal differentiation and (ii) scalable culture systems that enable their expansion and neuronal differentiation in numbers sufficient for regenerative medicine and drug screening purposes. Current extracellular matrix protein (ECMP)-based substrates for the culture of hNPCs are expensive, difficult to isolate, subject to batch-to-batch variations, and, therefore, unsuitable for clinical application of hNPCs. Using a high-throughput array-based screening approach, we identified a synthetic polymer, poly(4-vinyl phenol) (P4VP), that supported the long-term proliferation and self-renewal of hNPCs. The hNPCs cultured on P4VP maintained their characteristic morphology, expressed high levels of markers of multipotency, and retained their ability to differentiate into neurons. Such chemically defined substrates will eliminate critical roadblocks for the utilization of hNPCs for human neural regenerative repair, disease modeling, and drug discovery. Copyright © 2015. Published by Elsevier B.V.

  2. The applicability of the UK Public Health Skills and Knowledge Framework to the practitioner workforce: lessons for competency framework development.

    PubMed

    Shickle, Darren; Stroud, Laura; Day, Matthew; Smith, Kevin

    2018-06-05

    Many countries have developed competency frameworks for public health practice. While the number of competencies vary, frameworks cover similar knowledge and skills although they are not explicitly based on competency theory. A total of 15 qualitative group interviews (of up to six people), were conducted with 51 public health practitioners in 8 local authorities to assess the extent to which practitioners utilize competencies defined within the UK Public Health Skills and Knowledge Framework (PHSKF). Framework analysis was applied to the transcribed interviews. The overall framework was seen positively although no participants had previously read or utilized the PHSKF. Most could provide evidence, although some PHSKF competencies required creative thinking to fit expectations of practitioners and to reflect variation across the domains of practice which are impacted by job role and level of seniority. Evidence from previous NHS jobs or education may be needed as some competencies were not regularly utilized within their current local authority role. Further development of the PHSKF is required to provide guidance on how it should be used for practitioners and other members of the public health workforce. Empirical research can help benchmark knowledge/skills for workforce levels so improving the utility of competency frameworks.

  3. Optimal threshold estimator of a prognostic marker by maximizing a time-dependent expected utility function for a patient-centered stratified medicine.

    PubMed

    Dantan, Etienne; Foucher, Yohann; Lorent, Marine; Giral, Magali; Tessier, Philippe

    2018-06-01

    Defining thresholds of prognostic markers is essential for stratified medicine. Such thresholds are mostly estimated from purely statistical measures, regardless of patient preferences, potentially leading to unacceptable medical decisions. Quality-Adjusted Life-Years are a widely used preference-based measure of health outcomes. We develop a time-dependent Quality-Adjusted Life-Years-based expected utility function for censored data that should be maximized to estimate an optimal threshold. We performed a simulation study to compare estimated thresholds when using the proposed expected utility approach and purely statistical estimators. Two applications illustrate the usefulness of the proposed methodology, which was implemented in the R package ROCt ( www.divat.fr ). First, by reanalysing data from a randomized clinical trial comparing the efficacy of prednisone vs. placebo in patients with chronic liver cirrhosis, we demonstrate the utility of treating patients with a prothrombin level higher than 89%. Second, we reanalyze the data of an observational cohort of kidney transplant recipients and conclude that the Kidney Transplant Failure Score is not useful for adapting the frequency of clinical visits. Applying such a patient-centered methodology may improve the future transfer of novel prognostic scoring systems or markers into clinical practice.
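
    The sketch below shows the principle of utility-maximizing thresholds in a deliberately simplified, static form: sweep candidate thresholds and keep the one maximizing an expected utility built from sensitivity, specificity, prevalence, and assumed outcome utilities. The paper's estimator is time-dependent, QALY-based, and handles censoring (implemented in the R package ROCt); the marker distributions, prevalence, and utility values here are invented.

```python
# Illustrative sketch only: pick a marker threshold by maximizing a simple
# expected-utility function. Not the paper's time-dependent estimator.
import numpy as np

rng = np.random.default_rng(2)
marker_healthy = rng.normal(0.0, 1.0, 5000)
marker_diseased = rng.normal(1.5, 1.0, 5000)
prevalence = 0.3

# Assumed utilities (QALY-like units) of the four decision outcomes.
u_tp, u_fn, u_tn, u_fp = 0.80, 0.40, 1.00, 0.90

best_t, best_eu = None, -np.inf
for t in np.linspace(-3, 5, 400):
    sens = np.mean(marker_diseased > t)
    spec = np.mean(marker_healthy <= t)
    eu = (prevalence * (sens * u_tp + (1 - sens) * u_fn)
          + (1 - prevalence) * (spec * u_tn + (1 - spec) * u_fp))
    if eu > best_eu:
        best_eu, best_t = eu, t

print(f"utility-maximizing threshold: {best_t:.2f} (expected utility {best_eu:.3f})")
```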

  4. Terahertz plasmonic laser radiating in an ultra-narrow beam

    DOE PAGES

    Wu, Chongzhao; Khanal, Sudeep; Reno, John L.; ...

    2016-07-07

    Plasmonic lasers (spasers) generate coherent surface plasmon polaritons (SPPs) and could be realized at subwavelength dimensions in metallic cavities for applications in nanoscale optics. Plasmonic cavities are also utilized for terahertz quantum-cascade lasers (QCLs), which are the brightest available solid-state sources of terahertz radiation. A long-standing challenge for spasers utilized as nanoscale sources of radiation is their poor coupling to far-field radiation. Unlike conventional lasers, which can produce directional beams, spasers have highly divergent radiation patterns due to their subwavelength apertures. Here, we theoretically and experimentally demonstrate a new technique for implementing distributed feedback (DFB) that is distinct from any previously utilized DFB scheme for semiconductor lasers. The so-termed antenna-feedback scheme leads to single-mode operation in plasmonic lasers, couples the resonant SPP mode to a highly directional far-field radiation pattern, and integrates hybrid SPPs in the surrounding medium into the operation of the DFB lasers. Experimentally, the antenna-feedback method, which does not require phase matching to a well-defined effective index, is implemented for terahertz QCLs, and single-mode terahertz QCLs with a beam divergence as small as 4°×4° are demonstrated, which is the narrowest beam reported for any terahertz QCL to date. Moreover, in contrast to the negligible radiative field in conventional photonic band-edge lasers, in which the periodicity follows an integer multiple of half-wavelengths inside the active medium, antenna-feedback breaks this integer limit for the first time and enhances the radiative field of the lasing mode. Terahertz lasers with narrow-beam emission will find applications for integrated as well as standoff terahertz spectroscopy and sensing. Furthermore, the antenna-feedback scheme is generally applicable to any plasmonic laser with a Fabry–Perot cavity, irrespective of its operating wavelength, and could bring plasmonic lasers closer to practical applications.

  5. Finite-element modeling of compression and gravity on a population of breast phantoms for multimodality imaging simulation.

    PubMed

    Sturgeon, Gregory M; Kiarashi, Nooshin; Lo, Joseph Y; Samei, E; Segars, W P

    2016-05-01

    The authors are developing a series of computational breast phantoms based on breast CT data for imaging research. In this work, the authors develop a program that will allow a user to alter the phantoms to simulate the effect of gravity and compression of the breast (craniocaudal or mediolateral oblique) making the phantoms applicable to multimodality imaging. This application utilizes a template finite-element (FE) breast model that can be applied to their presegmented voxelized breast phantoms. The FE model is automatically fit to the geometry of a given breast phantom, and the material properties of each element are set based on the segmented voxels contained within the element. The loading and boundary conditions, which include gravity, are then assigned based on a user-defined position and compression. The effect of applying these loads to the breast is computed using a multistage contact analysis in FEBio, a freely available and well-validated FE software package specifically designed for biomedical applications. The resulting deformation of the breast is then applied to a boundary mesh representation of the phantom that can be used for simulating medical images. An efficient script performs the above actions seamlessly. The user only needs to specify which voxelized breast phantom to use, the compressed thickness, and orientation of the breast. The authors utilized their FE application to simulate compressed states of the breast indicative of mammography and tomosynthesis. Gravity and compression were simulated on example phantoms and used to generate mammograms in the craniocaudal or mediolateral oblique views. The simulated mammograms show a high degree of realism illustrating the utility of the FE method in simulating imaging data of repositioned and compressed breasts. The breast phantoms and the compression software can become a useful resource to the breast imaging research community. These phantoms can then be used to evaluate and compare imaging modalities that involve different positioning and compression of the breast.
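
    One concrete step in that pipeline, assigning each element its material from the segmented voxels it contains, can be sketched as a majority vote over the voxel labels inside the element. The snippet below is a minimal illustration of that step only; the label codes, array sizes, and element size are assumptions, not the authors' implementation.

```python
# Sketch: per-element material assignment by majority vote over contained voxels.
import numpy as np

# Segmented phantom: 0 = air, 1 = adipose, 2 = glandular, 3 = skin (assumed codes).
labels = np.random.default_rng(3).integers(0, 4, size=(60, 60, 60))
elem_size = 10                                  # each hexahedral element spans 10^3 voxels

nz, ny, nx = labels.shape
element_material = np.zeros((nz // elem_size, ny // elem_size, nx // elem_size), dtype=int)
for k in range(element_material.shape[0]):
    for j in range(element_material.shape[1]):
        for i in range(element_material.shape[2]):
            block = labels[k*elem_size:(k+1)*elem_size,
                           j*elem_size:(j+1)*elem_size,
                           i*elem_size:(i+1)*elem_size]
            element_material[k, j, i] = np.bincount(block.ravel()).argmax()

# element_material would then drive per-element material properties in the FE solver.
```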

  6. 42 CFR 456.51 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... institution for mental disease, as defined in § 440.10; (2) [Reserved] (3) Services provided in specialty hospitals and (b) Exclude services provided in mental hospitals. Utilization control requirements for mental... ASSISTANCE PROGRAMS UTILIZATION CONTROL Utilization Control: Hospitals § 456.51 Definitions. As used in this...

  7. Clustering execution in a processing system to increase power savings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bose, Pradip; Buyuktosunoglu, Alper; Jacobson, Hans M.

    Embodiments relate to clustering execution in a processing system. An aspect includes accessing a control flow graph that defines a data dependency and an execution sequence of a plurality of tasks of an application that executes on a plurality of system components. The execution sequence of the tasks in the control flow graph is modified as a clustered control flow graph that clusters active and idle phases of a system component while maintaining the data dependency. The clustered control flow graph is sent to an operating system, where the operating system utilizes the clustered control flow graph for scheduling the tasks.
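
    The idea can be illustrated with a small greedy heuristic: traverse the task DAG in topological order, but among the ready tasks prefer one that runs on the component currently active, so that a component's active phase is clustered while every data dependency is still respected. The sketch below is only an illustration of that idea, not the patented mechanism; the example graph and component mapping are invented.

```python
# Greedy, dependency-preserving ordering that clusters tasks by component.
from collections import defaultdict, deque

edges = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}     # control flow graph
component = {"A": "cpu", "B": "accel", "C": "cpu", "D": "accel"}

indeg = defaultdict(int)
for u, vs in edges.items():
    for v in vs:
        indeg[v] += 1

ready = deque(t for t in edges if indeg[t] == 0)
order, current = [], None
while ready:
    # Prefer a ready task on the component we are already using, if any.
    pick = next((t for t in ready if component[t] == current), ready[0])
    ready.remove(pick)
    order.append(pick)
    current = component[pick]
    for v in edges[pick]:
        indeg[v] -= 1
        if indeg[v] == 0:
            ready.append(v)

print(order)   # e.g. ['A', 'C', 'B', 'D'] clusters the two 'cpu' tasks together
```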

  8. Research pressure instrumentation for NASA Space Shuttle main engine, modification no. 5

    NASA Technical Reports Server (NTRS)

    Anderson, P. J.; Nussbaum, P.; Gustafson, G.

    1984-01-01

    The objective of the research project described is to define and demonstrate methods to advance the state of the art of pressure sensors for the space shuttle main engine (SSME). Silicon piezoresistive technology was utilized in completing tasks: generation and testing of three transducer design concepts for solid state applications; silicon resistor characterization at cryogenic temperatures; experimental chip mounting characterization; frequency response optimization and prototype design and fabrication. Excellent silicon sensor performance was demonstrated at liquid nitrogen temperature. A silicon resistor ion implant dose was customized for SSME temperature requirements. A basic acoustic modeling software program was developed as a design tool to evaluate frequency response characteristics.

  9. Heavy Lift Launch Capability with a New Hydrocarbon Engine

    NASA Technical Reports Server (NTRS)

    Threet, Grady E., Jr.; Holt, James B.; Philips, Alan D.; Garcia, Jessica A.

    2011-01-01

    The Advanced Concepts Office at NASA's George C. Marshall Space Flight Center was tasked to define the thrust requirement of a new liquid oxygen rich staged combustion cycle hydrocarbon engine that could be utilized in a launch vehicle to meet NASA's future heavy lift needs. Launch vehicle concepts were sized using this engine for different heavy lift payload classes. Engine out capabilities for one of the heavy lift configurations were also analyzed for increased reliability that may be desired for high value payloads or crewed missions. The applicability of this engine in vehicle concepts to meet military and commercial class payloads comparable to current ELV capability was also evaluated.

  10. High-intensity focused ultrasound: advances in technology and experimental trials support enhanced utility of focused ultrasound surgery in oncology

    PubMed Central

    Malietzis, G; Monzon, L; Hand, J; Wasan, H; Leen, E; Abel, M; Muhammad, A; Abel, P

    2013-01-01

    High-intensity focused ultrasound (HIFU) is a rapidly maturing technology with diverse clinical applications. In the field of oncology, the use of HIFU to non-invasively cause tissue necrosis in a defined target, a technique known as focused ultrasound surgery (FUS), has considerable potential for tumour ablation. In this article, we outline the development and underlying principles of HIFU, overview the limitations and commercially available equipment for FUS, then summarise some of the recent technological advances and experimental clinical trials that we predict will have a positive impact on extending the role of FUS in cancer therapy. PMID:23403455

  11. Use of Six Sigma strategies to pull the line on central line-associated bloodstream infections in a neurotrauma intensive care unit.

    PubMed

    Loftus, Kelli; Tilley, Terry; Hoffman, Jason; Bradburn, Eric; Harvey, Ellen

    2015-01-01

    The creation of a consistent culture of safety and quality in an intensive care unit is challenging. We applied the Six Sigma Define-Measure-Analyze-Improve-Control (DMAIC) model for quality improvement (QI) to develop a long-term solution to improve outcomes in a high-risk neurotrauma intensive care unit. We sought to reduce central line utilization as a cornerstone in preventing central line-associated bloodstream infections (CLABSIs). This study describes the successful application of the DMAIC model in the creation and implementation of evidence-based quality improvement designed to reduce CLABSIs to below national benchmarks.

  12. Pressure loss modulus correlation for Delta p across uniformly distributed-loss devices

    NASA Technical Reports Server (NTRS)

    Nunz, Gregory J.

    1994-01-01

    A dimensionless group, called a pressure loss modulus (N(sub PL)), is introduced that, in conjunction with an appropriately defined Reynolds number, is of considerable engineering utility in correlating steady-state Delta p vs flow calibration data and subsequently as a predictor, using the same or a different fluid, in uniformly distributed pressure loss devices. It is particularly useful under operation in the transition regime. Applications of this simple bivariate correlation to three diverse devices of particular interest for small liquid rocket engine fluid systems are discussed: large L/D capillary tube restrictors, packed granular catalyst beds, and stacked vortex-loss disk restrictors.

  13. A psycho-endocrinological overview of transsexualism.

    PubMed

    Michel, A; Mormont, C; Legros, J J

    2001-10-01

    The technical possibility of surgical sex change has opened up a debate concerning the legitimacy and utility of carrying out such an intervention at the request of the transsexual. Diagnostic, psychological, medical and ethical arguments have been brought forth, both for and against. Nonetheless, anatomical transformation by surgical means has currently become a practice as the frequency of serious gender identity disorders is constantly progressing. After a brief introduction, the present paper will consider typological, aetiological and epidemiological aspects of transsexualism. Treatment of the sex change applicant is then defined and discussed in terms of psychological, psychiatric, endocrinological and surgical aspects. Finally, the question of post-operation follow-up will be examined.

  14. LSST system analysis and integration task for an advanced science and application space platform

    NASA Technical Reports Server (NTRS)

    1980-01-01

    To support the development of an advanced science and application space platform (ASASP), requirements were defined for a representative set of payloads requiring large separation distances, selected from the Science and Applications Space Platform data base. These payloads were a 100 meter diameter atmospheric gravity wave antenna, a 100 meter by 100 meter particle beam injection experiment, a 2 meter diameter, 18 meter long astrometric telescope, and a 15 meter diameter, 35 meter long large ambient deployable IR telescope. A low earth orbit at 500 km altitude and 56 deg inclination was selected as the best compromise for meeting payload requirements. Platform subsystems were defined which would support the payload requirements, and a physical platform concept was developed. Structural system requirements, which included utilities accommodation, interface requirements, and platform strength and stiffness requirements, were developed. An attitude control system concept was also described. The resultant ASASP concept was analyzed, and technological developments deemed necessary in the area of large space systems were recommended.

  15. Mobile Tele-Mental Health: Increasing Applications and a Move to Hybrid Models of Care

    PubMed Central

    Chan, Steven Richard; Torous, John; Hinton, Ladson; Yellowlees, Peter

    2014-01-01

    Mobile telemental health is defined as the use of mobile phones and other wireless devices as applied to psychiatric and mental health practice. Applications of such include treatment monitoring and adherence, health promotion, ecological momentary assessment, and decision support systems. Advantages of mobile telemental health are underscored by its interactivity, just-in-time interventions, and low resource requirements and portability. Challenges in realizing this potential of mobile telemental health include the low penetration rates of health applications on mobile devices in part due to health literacy, the delay in current published research in evaluating newer technologies, and outdated research methodologies. Despite such challenges, one immediate opportunity for mobile telemental health is utilizing mobile devices as videoconferencing mediums for psychotherapy and psychosocial interventions enhanced by novel sensor based monitoring and behavior-prediction algorithms. This paper provides an overview of mobile telemental health and its current trends, as well as future opportunities as applied to patient care in both academic research and commercial ventures. PMID:27429272

  16. Preliminary design of a solar central receiver for site-specific repowering application (Saguaro Power Plant). Volume II. Preliminary design. Final report, October 1982-September 1983

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weber, E.R.

    1983-09-01

    The solar central receiver technology, site, and specific unit for repowering were selected in prior analyses and studies. The objectives of this preliminary design study were to develop a solar central receiver repowering design for Saguaro that (1) has the potential to be economically competitive with fossil fueled plants in near and long term applications, (2) has the greatest chance for completion without further government funding, (3) will further define the technical and economic feasibility of a 66 MWe gross size plant that is adequate to meet the requirements for utility and industrial process heat applications, (4) can potentially be constructed and operated within the next five years, and (5) incorporates solar central receiver technology and represents state-of-the-art development. This volume on the preliminary design includes the following sections: executive summary; introduction; changes from advanced conceptual design; preliminary design; system characteristics; economic analysis; and development plan.

  17. An introduction to adaptive management for threatened and endangered species

    USGS Publications Warehouse

    Runge, Michael C.

    2011-01-01

    Management of threatened and endangered species would seem to be a perfect context for adaptive management. Many of the decisions are recurrent and plagued by uncertainty, exactly the conditions that warrant an adaptive approach. But although the potential of adaptive management in these settings has been extolled, there are limited applications in practice. The impediments to practical implementation are manifold and include semantic confusion, institutional inertia, misperceptions about the suitability and utility, and a lack of guiding examples. In this special section of the Journal of Fish and Wildlife Management, we hope to reinvigorate the appropriate application of adaptive management for threatened and endangered species by framing such management in a decision-analytical context, clarifying misperceptions, classifying the types of decisions that might be amenable to an adaptive approach, and providing three fully developed case studies. In this overview paper, I define terms, review the past application of adaptive management, challenge perceived hurdles, and set the stage for the case studies which follow.

  18. Mobile Tele-Mental Health: Increasing Applications and a Move to Hybrid Models of Care.

    PubMed

    Chan, Steven Richard; Torous, John; Hinton, Ladson; Yellowlees, Peter

    2014-05-06

    Mobile telemental health is defined as the use of mobile phones and other wireless devices as applied to psychiatric and mental health practice. Applications of such include treatment monitoring and adherence, health promotion, ecological momentary assessment, and decision support systems. Advantages of mobile telemental health are underscored by its interactivity, just-in-time interventions, and low resource requirements and portability. Challenges in realizing this potential of mobile telemental health include the low penetration rates of health applications on mobile devices in part due to health literacy, the delay in current published research in evaluating newer technologies, and outdated research methodologies. Despite such challenges, one immediate opportunity for mobile telemental health is utilizing mobile devices as videoconferencing mediums for psychotherapy and psychosocial interventions enhanced by novel sensor based monitoring and behavior-prediction algorithms. This paper provides an overview of mobile telemental health and its current trends, as well as future opportunities as applied to patient care in both academic research and commercial ventures.

  19. Bioconversion of Chitin to Bioactive Chitooligosaccharides: Amelioration and Coastal Pollution Reduction by Microbial Resources.

    PubMed

    Kumar, Manish; Brar, Amandeep; Vivekanand, V; Pareek, Nidhi

    2018-04-10

    Chitin-metabolizing products are of high industrial relevance in the current scenario due to their wide biological applications, relatively lower cost, greater abundance, and sustainable supply. Chitooligosaccharides have a remarkably wide spectrum of applications in therapeutics, such as antitumor agents, immunomodulators, drug delivery, gene therapy, wound dressings, and chitinase inhibitors to prevent malaria. The hypocholesterolemic and antimicrobial activities of chitooligosaccharides make them a molecule of choice for the food industry, and their functional profile depends on their physicochemical characteristics. Recently, chitin-based nanomaterials have also gained tremendous importance in biomedical and agricultural applications. The crystallinity and insolubility of chitin impose a major hurdle in the way of polymer utilization. Chemical production processes are known to produce chitooligosaccharides with variable degrees of polymerization and properties, along with ecological concerns. Biological production routes mainly involve chitinases, chitosanases, and chitin-binding proteins. Development of bio-catalytic production routes for chitin will not only enhance the production of commercially viable chitooligosaccharides with defined molecular properties but will also provide a means to combat marine pollution with value addition.

  20. A Data Filter for Identifying Steady-State Operating Points in Engine Flight Data for Condition Monitoring Applications

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.; Litt, Jonathan S.

    2010-01-01

    This paper presents an algorithm that automatically identifies and extracts steady-state engine operating points from engine flight data. It calculates the mean and standard deviation of select parameters contained in the incoming flight data stream. If the standard deviation of the data falls below defined constraints, the engine is assumed to be at a steady-state operating point, and the mean measurement data at that point are archived for subsequent condition monitoring purposes. The fundamental design of the steady-state data filter is completely generic and applicable for any dynamic system. Additional domain-specific logic constraints are applied to reduce data outliers and variance within the collected steady-state data. The filter is designed for on-line real-time processing of streaming data as opposed to post-processing of the data in batch mode. Results of applying the steady-state data filter to recorded helicopter engine flight data are shown, demonstrating its utility for engine condition monitoring applications.
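
    The core logic is a windowed mean/standard-deviation test: when the standard deviation of every monitored parameter falls below its constraint, the windowed means are archived as a steady-state point. The sketch below illustrates this under assumed parameter names, window length, and thresholds; it omits the domain-specific outlier logic mentioned above.

```python
# Minimal sketch of a steady-state data filter over streaming samples.
import numpy as np

def steady_state_filter(stream, window=50, std_limits=None):
    """stream: dict of parameter name -> 1-D array of samples (same length)."""
    names = list(stream)
    n = len(stream[names[0]])
    archived = []
    for end in range(window, n + 1):
        means, is_steady = {}, True
        for name in names:
            seg = stream[name][end - window:end]
            means[name] = seg.mean()
            if seg.std() > std_limits[name]:
                is_steady = False
                break
        if is_steady:
            archived.append((end, means))   # archive windowed means at this point
    return archived

# Example with synthetic "engine" data: a transient followed by a steady segment.
# Parameter names and thresholds are assumptions for illustration.
t = np.arange(600)
data = {"N1": np.where(t < 300, 70 + 0.05 * t, 85.0) + np.random.normal(0, 0.05, 600),
        "EGT": np.where(t < 300, 500 + 0.3 * t, 590.0) + np.random.normal(0, 0.5, 600)}
points = steady_state_filter(data, window=50, std_limits={"N1": 0.2, "EGT": 2.0})
```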

  1. Resource utilization in surgery after the revision of surgical fee schedule in Japan.

    PubMed

    Nakata, Yoshinori; Yoshimura, Tatsuya; Watanabe, Yuichi; Otake, Hiroshi; Oiso, Giichiro; Sawa, Tomohiro

    2015-01-01

    The purpose of this paper is to examine whether the current surgical reimbursement system in Japan reflects resource utilization after the revision of the fee schedule in 2014. The authors collected data on all the surgical procedures performed at Teikyo University Hospital from April 1 through September 30, 2014. The authors defined the decision-making unit as the surgeon with the highest academic rank in the surgery. Inputs were defined as the number of medical doctors who assisted the surgery and the time of operation from skin incision to closure. The output was defined as the surgical fee. The authors calculated surgeons' efficiency scores using data envelopment analysis. The efficiency scores of each surgical specialty were significantly different (p=0.000). This result demonstrates that the Japanese surgical reimbursement scales still fail to reflect resource utilization despite the revision of the surgical fee schedule.
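
    For illustration, an input-oriented CCR formulation of data envelopment analysis, the kind of efficiency scoring used here, can be solved as one small linear program per decision-making unit. The sketch below uses scipy and entirely made-up inputs (assisting doctors, operating time) and outputs (surgical fee); it is not the study's data or code.

```python
# Conceptual sketch of input-oriented CCR data envelopment analysis (DEA).
import numpy as np
from scipy.optimize import linprog

X = np.array([[2, 120], [3, 200], [1, 90], [4, 260]], dtype=float)   # inputs per surgeon
Y = np.array([[800], [1500], [700], [1600]], dtype=float)            # outputs per surgeon
n, m = X.shape
s = Y.shape[1]

def ccr_efficiency(o):
    """Solve: min theta s.t. X^T lam <= theta * x_o, Y^T lam >= y_o, lam >= 0."""
    # Decision variables: [theta, lambda_1, ..., lambda_n]
    c = np.concatenate(([1.0], np.zeros(n)))
    A_in = np.hstack([-X[o].reshape(-1, 1), X.T])    # sum_j lam_j x_ij - theta x_io <= 0
    b_in = np.zeros(m)
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])      # -sum_j lam_j y_rj <= -y_ro
    b_out = -Y[o]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.concatenate([b_in, b_out]),
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    return res.fun

for o in range(n):
    print(f"surgeon {o}: efficiency = {ccr_efficiency(o):.3f}")
```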

  2. Computer technology applications in industrial and organizational psychology.

    PubMed

    Crespin, Timothy R; Austin, James T

    2002-08-01

    This article reviews computer applications developed and utilized by industrial-organizational (I-O) psychologists, both in practice and in research. A primary emphasis is on applications developed for Internet usage, because this "network of networks" changes the way I-O psychologists work. The review focuses on traditional and emerging topics in I-O psychology. The first topic involves information technology applications in measurement, defined broadly across levels of analysis (persons, groups, organizations) and domains (abilities, personality, attitudes). Discussion then focuses on individual learning at work, both in formal training and in coping with continual automation of work. A section on job analysis follows, illustrating the role of computers and the Internet in studying jobs. Shifting focus to the group level of analysis, we briefly review how information technology is being used to understand and support cooperative work. Finally, special emphasis is given to the emerging "third discipline" in I-O psychology research: computational modeling of behavioral events in organizations. Throughout this review, themes of innovation and dissemination underlie a continuum between research and practice. The review concludes by setting a framework for I-O psychology in a computerized and networked world.

  3. Magnetic Resonance Guided Focused Ultrasound Surgery: Part 2 – A Review of Current and Future Applications

    PubMed Central

    Medel, Ricky; Monteith, Stephen J.; Elias, W. Jeffrey; Eames, Matthew; Snell, John; Sheehan, Jason P.; Wintermark, Max; Jolesz, Ferenc A.; Kassell, Neal F.

    2014-01-01

    Magnetic Resonance guided Focused Ultrasound Surgery (MRgFUS) represents a novel combination of technologies that is actively being realized as a non-invasive therapeutic tool for a myriad of conditions. These applications are reviewed with a focus on neurological utilization. A combined search of Pubmed and Medline was performed to identify the key events and current status of MRgFUS, with a focus on neurological applications. MRgFUS is a potentially ideal modality for the treatment of neurological diseases: because it operates in near real-time, it allows monitoring of treatment location and energy deposition; it is noninvasive, limiting or eliminating disruption of normal tissue; it provides focal delivery of therapeutic agents; it enhances radiation delivery; and it permits modulation of neural function. Multiple applications are currently in clinical use and many more are under active preclinical investigation. The therapeutic potential of MRgFUS is expanding rapidly. Although clinically in its infancy, preclinical and early phase I clinical trials in neurosurgery suggest a promising future for MRgFUS. Further investigation is necessary to define its true potential and impact. PMID:22791029

  4. Stem cell bioprinting for applications in regenerative medicine.

    PubMed

    Tricomi, Brad J; Dias, Andrew D; Corr, David T

    2016-11-01

    Many regenerative medicine applications seek to harness the biologic power of stem cells in architecturally complex scaffolds or microenvironments. Traditional tissue engineering methods cannot create such intricate structures, nor can they precisely control cellular position or spatial distribution. These limitations have spurred advances in the field of bioprinting, aimed to satisfy these structural and compositional demands. Bioprinting can be defined as the programmed deposition of cells or other biologics, often with accompanying biomaterials. In this concise review, we focus on recent advances in stem cell bioprinting, including performance, utility, and applications in regenerative medicine. More specifically, this review explores the capability of bioprinting to direct stem cell fate, engineer tissue(s), and create functional vascular networks. Furthermore, the unique challenges and concerns related to bioprinting living stem cells, such as viability and maintaining multi- or pluripotency, are discussed. The regenerative capacity of stem cells, when combined with the structural/compositional control afforded by bioprinting, provides a unique and powerful tool to address the complex demands of tissue engineering and regenerative medicine applications. © 2016 New York Academy of Sciences.

  5. Automatic corpus callosum segmentation for standardized MR brain scanning

    NASA Astrophysics Data System (ADS)

    Xu, Qing; Chen, Hong; Zhang, Li; Novak, Carol L.

    2007-03-01

    Magnetic Resonance (MR) brain scanning is often planned manually with the goal of aligning the imaging plane with key anatomic landmarks. The planning is time-consuming and subject to inter- and intra-operator variability. An automatic and standardized planning of brain scans is highly useful for clinical applications, and for maximum utility should work on patients of all ages. In this study, we propose a method for fully automatic planning that utilizes the landmarks from two orthogonal images to define the geometry of the third scanning plane. The corpus callosum (CC) is segmented in sagittal images by an active shape model (ASM), and the result is further improved by weighting the boundary movement with confidence scores and incorporating region based refinement. Based on the extracted contour of the CC, several important landmarks are located and then combined with landmarks from the coronal or transverse plane to define the geometry of the third plane. Our automatic method is tested on 54 MR images from 24 patients and 3 healthy volunteers, with ages ranging from 4 months to 70 years old. The average errors with respect to two manually labeled points on the CC were 3.54 mm and 4.19 mm, and the result differed by an average of 2.48 degrees from the orientation of the line connecting them, demonstrating that our method is sufficiently accurate for clinical use.
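
    The final planning step, defining the third plane from landmarks found in two orthogonal localizer images, is simple geometry: three non-collinear landmark points determine a plane through a cross product. The sketch below illustrates only that step with invented coordinates; it is not the segmentation method itself.

```python
# Define a scan plane (point + unit normal) from three landmark points.
import numpy as np

# Two corpus-callosum landmarks from the sagittal image and one from the
# coronal/transverse image, in scanner coordinates (mm). Values are invented.
p_genu = np.array([0.0, 60.0, 20.0])
p_splenium = np.array([0.0, -35.0, 15.0])
p_third = np.array([40.0, 10.0, 18.0])

v1 = p_splenium - p_genu
v2 = p_third - p_genu
normal = np.cross(v1, v2)
normal /= np.linalg.norm(normal)
center = (p_genu + p_splenium + p_third) / 3.0

# The scan plane is {x : normal . (x - center) = 0}; its centre and orientation
# would be passed to the scanner's geometry prescription.
print("plane centre:", center, "normal:", normal)
```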

  6. Defined xenogeneic-free and hypoxic environment provides superior conditions for long-term expansion of human adipose-derived stem cells.

    PubMed

    Yang, Sufang; Pilgaard, Linda; Chase, Lucas G; Boucher, Shayne; Vemuri, Mohan C; Fink, Trine; Zachar, Vladimir

    2012-08-01

    Development and implementation of therapeutic protocols based on stem cells or tissue-engineered products relies on methods that enable the production of substantial numbers of cells while complying with stringent quality and safety demands. In the current study, we aimed to assess the benefits of maintaining cultures of adipose-derived stem cells (ASCs) in a defined culture system devoid of xenogeneic components (xeno-free) and hypoxia over a 49-day growth period. Our data provide evidence that conditions involving StemPro mesenchymal stem cells serum-free medium (SFM) Xeno-Free and hypoxia (5% oxygen concentration) in the culture atmosphere provide a superior proliferation rate compared to a standard growth environment comprised of alpha-modified Eagle medium (A-MEM) supplemented with fetal calf serum (FCS) and ambient air (20% oxygen concentration) or that of A-MEM supplemented with FCS and hypoxia. Furthermore, a flow cytometric analysis and in vitro differentiation assays confirmed the immunophenotype stability and maintained multipotency of ASCs when expanded under xeno-free conditions and hypoxia. In conclusion, our data demonstrate that growth conditions utilizing a xeno-free and hypoxic environment not only provide an improved environment for the expansion of ASCs, but also set the stage as a culture system with the potential broad spectrum utility for regenerative medicine and tissue engineering applications.

  7. EXACT2: the semantics of biomedical protocols

    PubMed Central

    2014-01-01

    Background: The reliability and reproducibility of experimental procedures is a cornerstone of scientific practice. There is a pressing technological need for better representation of biomedical protocols to enable other agents (human or machine) to reproduce results. A framework that ensures that all information required for the replication of experimental protocols is captured is essential to achieving reproducibility. Methods: We have developed the ontology EXACT2 (EXperimental ACTions), which is designed to capture the full semantics of biomedical protocols required for their reproducibility. To construct EXACT2 we manually inspected hundreds of published and commercial biomedical protocols from several areas of biomedicine. After establishing a clear pattern for extracting the required information we utilized text-mining tools to translate the protocols into a machine amenable format. We have verified the utility of EXACT2 through the successful processing of previously 'unseen' (not used for the construction of EXACT2) protocols. Results: The paper reports on a fundamentally new version of EXACT2 that supports the semantically-defined representation of biomedical protocols. The ability of EXACT2 to capture the semantics of biomedical procedures was verified through a text mining use case, in which EXACT2 is used as a reference model for text mining tools to identify terms pertinent to experimental actions, and their properties, in biomedical protocols expressed in natural language. An EXACT2-based framework for the translation of biomedical protocols to a machine amenable format is proposed. Conclusions: The EXACT2 ontology is sufficient to record, in a machine processable form, the essential information about biomedical protocols. EXACT2 defines explicit semantics of experimental actions and can be used by various computer applications. It can serve as a reference model for the translation of biomedical protocols in natural language into a semantically-defined format. PMID:25472549

  8. Distributed controller clustering in software defined networks

    PubMed Central

    Gani, Abdullah; Akhunzada, Adnan; Talebian, Hamid; Choo, Kim-Kwang Raymond

    2017-01-01

    Software Defined Networking (SDN) is an emerging and promising paradigm for network management because of its centralized network intelligence. However, the centralized control architecture of software-defined networks (SDNs) brings novel challenges of reliability, scalability, fault tolerance and interoperability. In this paper, we propose a novel clustered distributed controller architecture in a real SDN setting. The distributed cluster implementation comprises multiple popular SDN controllers. The proposed mechanism is evaluated using a real-world network topology running on top of an emulated SDN environment. The results show that the proposed distributed controller clustering mechanism is able to significantly reduce the average latency from 8.1% to 1.6% and the packet loss from 5.22% to 4.15%, compared to distributed controllers without clustering running on HP Virtual Application Network (VAN) SDN and Open Network Operating System (ONOS) controllers, respectively. Moreover, the proposed method also shows reasonable CPU utilization. Furthermore, the proposed mechanism makes it possible to handle unexpected load fluctuations while maintaining continuous network operation, even when there is a controller failure. The paper is a potential contribution stepping towards addressing the issues of reliability, scalability, fault tolerance, and interoperability. PMID:28384312
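
    A toy illustration of the load-distribution idea, not the paper's mechanism: switches are assigned to the lowest-latency controller in the cluster, and the switches of a failed controller are re-assigned to the best surviving one, so operation continues through a controller failure. The latency matrix below is invented.

```python
# Latency-based switch-to-controller assignment with simple failover.
import numpy as np

latency_ms = np.array([            # rows: switches, columns: controllers
    [2.0, 9.0, 7.0],
    [8.0, 1.5, 6.0],
    [3.0, 4.0, 2.5],
    [9.0, 2.0, 8.0],
])

def assign(latency, alive):
    """Map each switch to the lowest-latency controller that is still alive."""
    masked = np.where(alive, latency, np.inf)
    return masked.argmin(axis=1)

alive = np.array([True, True, True])
print("normal operation:", assign(latency_ms, alive))

alive[1] = False                   # controller 1 fails
print("after failover:  ", assign(latency_ms, alive))
```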

  9. Analysis and evaluation of the applicability of green energy technology

    NASA Astrophysics Data System (ADS)

    Xu, Z. J.; Song, Y. K.

    2017-11-01

    As environmental problems grow more serious and resources become scarcer, the applicability of green energy technology has attracted increasing attention from scholars in different fields. However, current research is often narrow in perspective and simple in method. Drawing on the Theory of Applicable Technology, this paper analyzes and defines green energy technology and its applicability from the combined perspectives of economy, society, environment, and science and technology, and constructs a corresponding evaluation index system. The paper then applies the Fuzzy Comprehensive Evaluation method to assess applicability, discusses the evaluation models and methods in depth, and illustrates them with a detailed example. The author holds that the applicability of green energy technology involves many aspects of economy, society, environment, and science and technology, and can be evaluated comprehensively by an index system composed of a number of independent indexes. The evaluation is multi-objective, multi-factor, and multi-level, and the Fuzzy Comprehensive Evaluation method is an effective and feasible way to carry it out. Understanding and comprehensively evaluating the applicability of green energy technology is of vital theoretical and practical significance for the rational development and utilization of green energy and for promoting the sustainable development of humanity and nature.
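    For orientation, the core of a standard Fuzzy Comprehensive Evaluation is a weighted composition of a fuzzy membership matrix over an index set and a set of comment grades. The sketch below is the generic textbook formulation, with illustrative index and grade names; it is not the paper's exact index system or weight vector.

      % Generic Fuzzy Comprehensive Evaluation (illustrative; not the paper's exact model)
      % Index set U = {u_1, ..., u_m}: e.g., economic, social, environmental, technological indexes.
      % Grade set V = {v_1, ..., v_n}: e.g., high, medium, low applicability.
      \[
      W = (w_1, \ldots, w_m), \qquad \sum_{i=1}^{m} w_i = 1
      \]
      \[
      R = (r_{ij})_{m \times n}, \qquad r_{ij} = \text{membership of index } u_i \text{ in grade } v_j
      \]
      \[
      B = W \circ R = (b_1, \ldots, b_n), \qquad
      b_j = \sum_{i=1}^{m} w_i\, r_{ij}
      \quad \text{(weighted-average operator; } b_j = \max_i \min(w_i, r_{ij}) \text{ is also common)}
      \]
      % The overall applicability grade is the v_j with the largest b_j (maximum-membership principle).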

  10. 10 CFR 205.375 - Guidelines defining inadequate fuel or energy supply.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 3 2010-01-01 2010-01-01 false Guidelines defining inadequate fuel or energy supply. 205.375 Section 205.375 Energy DEPARTMENT OF ENERGY OIL ADMINISTRATIVE PROCEDURES AND SANCTIONS Electric... Electric Power § 205.375 Guidelines defining inadequate fuel or energy supply. An inadequate utility system...

  11. 10 CFR 205.375 - Guidelines defining inadequate fuel or energy supply.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 3 2011-01-01 2011-01-01 false Guidelines defining inadequate fuel or energy supply. 205.375 Section 205.375 Energy DEPARTMENT OF ENERGY OIL ADMINISTRATIVE PROCEDURES AND SANCTIONS Electric... Electric Power § 205.375 Guidelines defining inadequate fuel or energy supply. An inadequate utility system...

  12. 10 CFR 205.375 - Guidelines defining inadequate fuel or energy supply.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 3 2014-01-01 2014-01-01 false Guidelines defining inadequate fuel or energy supply. 205.375 Section 205.375 Energy DEPARTMENT OF ENERGY OIL ADMINISTRATIVE PROCEDURES AND SANCTIONS Electric... Electric Power § 205.375 Guidelines defining inadequate fuel or energy supply. An inadequate utility system...

  13. 10 CFR 205.375 - Guidelines defining inadequate fuel or energy supply.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 3 2013-01-01 2013-01-01 false Guidelines defining inadequate fuel or energy supply. 205.375 Section 205.375 Energy DEPARTMENT OF ENERGY OIL ADMINISTRATIVE PROCEDURES AND SANCTIONS Electric... Electric Power § 205.375 Guidelines defining inadequate fuel or energy supply. An inadequate utility system...

  14. 10 CFR 205.375 - Guidelines defining inadequate fuel or energy supply.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 3 2012-01-01 2012-01-01 false Guidelines defining inadequate fuel or energy supply. 205.375 Section 205.375 Energy DEPARTMENT OF ENERGY OIL ADMINISTRATIVE PROCEDURES AND SANCTIONS Electric... Electric Power § 205.375 Guidelines defining inadequate fuel or energy supply. An inadequate utility system...

  15. Evaluation of expert system application based on usability aspects

    NASA Astrophysics Data System (ADS)

    Munaiseche, C. P. C.; Liando, O. E. S.

    2016-04-01

    Usability is usually defined as the degree of user acceptance of a product or system, based on understanding of and correct reaction to its interface. The performance of a web application is influenced by the quality of its interface in supporting the information transfer process. Preferably, before expert system applications are installed in an operational environment, they should first be evaluated by usability testing. This research aimed to measure the usability of an expert system application using tasks as the interaction media. The study uses an expert system application for diagnosing human skin diseases, with a questionnaire method that utilizes tasks as the interaction media for measuring usability. Participants executed specific tasks so that the usability of the application could be observed. The usability aspects observed were learnability, efficiency, memorability, errors, and satisfaction. Each questionnaire item represents an aspect of usability. The results present the usability value for each aspect; the overall average across the five usability aspects was 4.28, indicating that the tested expert system application falls in the excellent range for usability and can be deployed for operational use. The main contribution of the study is that it is a first step toward using a task model in usability evaluation of expert system application software.

  16. Utility of coupling nonlinear optimization methods with numerical modeling software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Murphy, M.J.

    1996-08-05

    Results of using GLO (Global Local Optimizer), a general-purpose nonlinear optimization software package for investigating multi-parameter problems in science and engineering, are discussed. The package consists of the modular optimization control system (GLO), a graphical user interface (GLO-GUI), a pre-processor (GLO-PUT), a post-processor (GLO-GET), and nonlinear optimization software modules, GLOBAL & LOCAL. GLO is designed for controlling, and easy coupling to, any scientific software application. GLO runs the optimization module and the scientific software application in an iterative loop. At each iteration, the optimization module defines new values for the set of parameters being optimized. GLO-PUT inserts the new parameter values into the input file of the scientific application. GLO runs the application with the new parameter values. GLO-GET determines the value of the objective function by extracting the results of the analysis and comparing them to the desired result. GLO continues to run the scientific application over and over until it finds the "best" set of parameters by minimizing (or maximizing) the objective function. An example problem showing the optimization of a material model is presented (Taylor cylinder impact test).
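    The iterative loop described above (propose parameter values, rewrite the application's input file, run the application, extract and score the result) can be sketched generically as follows. This is a minimal stand-in, not the GLO code: the template and output file names, the run_simulation command, and the target value are hypothetical placeholders, and scipy's optimizer stands in for the GLOBAL/LOCAL modules.

      import subprocess
      from scipy.optimize import minimize  # stands in for GLO's GLOBAL/LOCAL modules

      def write_input(params, template="model_template.in", out="model.in"):
          """GLO-PUT analogue: substitute trial parameter values into the application's input file."""
          text = open(template).read().format(yield_strength=params[0], hardening=params[1])
          open(out, "w").write(text)

      def read_objective(result_file="model.out", target=25.4):
          """GLO-GET analogue: extract the simulated result and compare it to the desired value."""
          simulated = float(open(result_file).read().split()[-1])  # last number in the output file
          return (simulated - target) ** 2                          # squared error to be minimized

      def objective(params):
          write_input(params)
          subprocess.run(["run_simulation", "model.in"], check=True)  # the coupled scientific application
          return read_objective()

      if __name__ == "__main__":
          best = minimize(objective, x0=[300.0, 0.1], method="Nelder-Mead")
          print("best-fit parameters:", best.x)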

  17. Application of borehole geophysics to water-resources investigations

    USGS Publications Warehouse

    Keys, W.S.; MacCary, L.M.

    1971-01-01

    This manual is intended to be a guide for hydrologists using borehole geophysics in ground-water studies. The emphasis is on the application and interpretation of geophysical well logs, and not on the operation of a logger. It describes in detail those logging techniques that have been utilized within the Water Resources Division of the U.S. Geological Survey, and those used in petroleum investigations that have potential application to hydrologic problems. Most of the logs described can be made by commercial logging service companies, and many can be made with small water-well loggers. The general principles of each technique and the rules of log interpretation are the same, regardless of differences in instrumentation. Geophysical well logs can be interpreted to determine the lithology, geometry, resistivity, formation factor, bulk density, porosity, permeability, moisture content, and specific yield of water-bearing rocks, and to define the source, movement, and chemical and physical characteristics of ground water. Numerous examples of logs are used to illustrate applications and interpretation in various ground-water environments. The interrelations between various types of logs are emphasized, and the following aspects are described for each of the important logging techniques: Principles and applications, instrumentation, calibration and standardization, radius of investigation, and extraneous effects.

  18. Introduction to Generalized Functions with Applications in Aerodynamics and Aeroacoustics

    NASA Technical Reports Server (NTRS)

    Farassat, F.

    1994-01-01

    Generalized functions have many applications in science and engineering. One useful aspect is that discontinuous functions can be handled as easily as continuous or differentiable functions, providing a powerful tool for formulating and solving many problems of aerodynamics and acoustics. Furthermore, generalized function theory elucidates and unifies many ad hoc mathematical approaches used by engineers and scientists. We define generalized functions as continuous linear functionals on the space of infinitely differentiable functions with compact support, then introduce the concept of generalized differentiation. Generalized differentiation is the most important concept in generalized function theory, and the applications we present utilize mainly this concept. First, some results of classical analysis are derived using generalized function theory. Other applications of the generalized function theory in aerodynamics discussed here are the derivations of general transport theorems for deriving governing equations of fluid mechanics, the interpretation of the finite part of divergent integrals, the derivation of the Oswatitsch integral equation of transonic flow, and the analysis of velocity field discontinuities as sources of vorticity. Applications in aeroacoustics include the derivation of the Kirchhoff formula for moving surfaces, the noise from moving surfaces, and shock noise source strength based on the Ffowcs Williams-Hawkings equation.
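    As a reminder of the definition referred to above, the generalized (distributional) derivative is obtained by moving the derivative onto the test function; this is the standard form, not notation specific to this report.

      % A generalized function T is a continuous linear functional on test functions
      % \phi \in C_c^\infty(\mathbb{R}^n) (infinitely differentiable, compact support).
      % Generalized differentiation moves the derivative onto the test function:
      \[
      \Big\langle \frac{\partial T}{\partial x_i},\, \phi \Big\rangle
        = -\Big\langle T,\, \frac{\partial \phi}{\partial x_i} \Big\rangle .
      \]
      % Example: the Heaviside step H(x) has generalized derivative \delta(x), since
      \[
      \langle H',\phi \rangle = -\int_{0}^{\infty} \phi'(x)\,dx = \phi(0) = \langle \delta,\phi \rangle .
      \]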

  19. The PSRO hospital review system.

    PubMed

    Goran, M J; Roberts, J S; Kellogg, M A; Fielding, J; Jessee, W

    1975-04-01

    The 1972 Social Security amendments contained the landmark Professional Standards Review Organization (PSRO) provisions as well as several sections upgrading existing utilization review (UR) requirements under Medicare and Medicaid. With issuance of the PSRO Program Manual and the recent publication of the new UR regulations, HEW for the first time has brought Medicare and Medicaid hospital review requirements into conformity and made them compatible with and supportive of the PSRO program. This article defines the PSRO hospital review system, describes how the three major components (concurrent review, medical care evaluation studies, and profile analysis) interrelate, and provides examples of each of these components. Under utilization review requirements or PSRO, hospitals will be required to implement an integrated system of review designed to assure appropriate utilization practices and improve the quality of care. These aims are to be accomplished through the application of concepts of peer review; the use of norms, criteria, and standards; the identification of deficiencies in the quality, administration, or appropriateness of health care services; and their correction through linkage with programs of continuing medical education. Although PSROs are initially responsible for review in hospitals, they will likely provide the locus for a community-wide system of peer review for all services provided under National Health Insurance.

  20. Recent advances in immunosensor for narcotic drug detection

    PubMed Central

    Gandhi, Sonu; Suman, Pankaj; Kumar, Ashok; Sharma, Prince; Capalash, Neena; Suri, C. Raman

    2015-01-01

    Introduction: Immunosensors for illicit drugs have gained immense interest and have found several applications in drug abuse monitoring. This technology has offered low-cost detection of narcotics, thereby providing a confirmatory platform to complement existing analytical methods. Methods: In this minireview, we define the basic concept of a transducer for immunosensor development that utilizes antibodies and low-molecular-mass hapten (opiate) molecules. Results: This article emphasizes recent advances in immunoanalytical techniques for the monitoring of opiate drugs. Our results demonstrate that high-quality antibodies can be used for immunosensor development against a target analyte with greater sensitivity, specificity and precision than other available analytical methods. Conclusion: In this review we highlight the fundamentals of different transducer technologies and their applications to immunosensor development currently underway in our laboratory, using rapid screening via an immunochromatographic kit and label-free optical detection via enzyme-, fluorescence-, gold nanoparticle- and carbon nanotube-based immunosensing for sensitive and specific monitoring of opiates. PMID:26929925

  1. Electrochemical Analysis of Neurotransmitters

    NASA Astrophysics Data System (ADS)

    Bucher, Elizabeth S.; Wightman, R. Mark

    2015-07-01

    Chemical signaling through the release of neurotransmitters into the extracellular space is the primary means of communication between neurons. More than four decades ago, Ralph Adams and his colleagues realized the utility of electrochemical methods for the study of easily oxidizable neurotransmitters, such as dopamine, norepinephrine, and serotonin and their metabolites. Today, electrochemical techniques are frequently coupled to microelectrodes to enable spatially resolved recordings of rapid neurotransmitter dynamics in a variety of biological preparations spanning from single cells to the intact brain of behaving animals. In this review, we provide a basic overview of the principles underlying constant-potential amperometry and fast-scan cyclic voltammetry, the most commonly employed electrochemical techniques, and the general application of these methods to the study of neurotransmission. We thereafter discuss several recent developments in sensor design and experimental methodology that are challenging the current limitations defining the application of electrochemical methods to neurotransmitter measurements.

  2. Energy loss analysis of an integrated space power distribution system

    NASA Technical Reports Server (NTRS)

    Kankam, M. D.; Ribeiro, P. F.

    1992-01-01

    The results of studies related to conceptual topologies of an integrated utility-like space power system are described. The system topologies are comparatively analyzed by considering their transmission energy losses as functions of, mainly, distribution voltage level and load composition. The analysis is expedited by use of the Distribution System Analysis and Simulation (DSAS) software. This recently developed computer program by the Electric Power Research Institute (EPRI) uses improved load models to solve the power flow within the system. However, present shortcomings of the software with regard to space applications, and incompletely defined characteristics of a space power system, make the results applicable only to the fundamental trends of energy losses of the topologies studied. Accounting for the effects of the various parameters on system performance, as included here, can constitute part of a planning tool for a space power distribution system.
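    The basic relation behind the dependence of transmission loss on distribution voltage is that, for a fixed delivered power, line current and hence resistive loss fall as the voltage is raised. The single-line form below is background only; it is not the DSAS load model.

      % Resistive loss in a feeder of resistance R delivering power P at voltage V and power factor cos(phi):
      \[
      I = \frac{P}{V\cos\varphi},
      \qquad
      P_{\text{loss}} = I^{2}R = \frac{P^{2}R}{V^{2}\cos^{2}\varphi},
      \]
      % so, all else equal, doubling the distribution voltage cuts resistive loss by roughly a factor of four.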

  3. Photovoltaic design optimization for terrestrial applications

    NASA Technical Reports Server (NTRS)

    Ross, R. G., Jr.

    1978-01-01

    As part of the Jet Propulsion Laboratory's Low-Cost Solar Array Project, a comprehensive program of module cost-optimization has been carried out. The objective of these studies has been to define means of reducing the cost and improving the utility and reliability of photovoltaic modules for the broad spectrum of terrestrial applications. This paper describes one of the methods being used for module optimization, including the derivation of specific equations which allow the optimization of various module design features. The method is based on minimizing the life-cycle cost of energy for the complete system. Comparison of the life-cycle energy cost with the marginal cost of energy each year allows the logical plant lifetime to be determined. The equations derived allow the explicit inclusion of design parameters such as tracking, site variability, and module degradation with time. An example problem involving the selection of an optimum module glass substrate is presented.
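    The figure of merit behind the optimization is a life-cycle (levelized) cost of energy. A generic levelized form is shown below for orientation; the JPL study's exact cost model and parameters are not reproduced here.

      % Levelized energy cost over a plant life of N years with discount rate d:
      \[
      \mathrm{LEC} =
      \frac{C_{0} + \sum_{t=1}^{N} \dfrac{O_{t} + M_{t}}{(1+d)^{t}}}
           {\sum_{t=1}^{N} \dfrac{E_{t}}{(1+d)^{t}}}
      \]
      % C_0: initial module/system cost; O_t, M_t: operating and maintenance costs in year t;
      % E_t: energy delivered in year t (reduced by any module degradation).
      % Comparing this levelized cost with the marginal cost of one more year of operation
      % indicates the logical plant lifetime, as described above.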

  4. Performance Evaluation of Phasor Measurement Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Zhenyu; Kasztenny, Bogdan; Madani, Vahid

    2008-07-20

    After two decades of phasor network deployment, phasor measurements are now available at many major substations and power plants. The North American SynchroPhasor Initiative (NASPI), supported by both the US Department of Energy and the North American Electricity Reliability Council (NERC), provides a forum to facilitate the efforts in phasor technology in North America. Phasor applications have been explored and some are in today’s utility practice. IEEE C37.118 Standard is a milestone in standardizing phasor measurements and defining performance requirements. To comply with IEEE C37.118 and to better understand the impact of phasor quality on applications, the NASPI Performance and Standards Task Team (PSTT) initiated and accomplished the development of two important documents to address characterization of PMUs and instrumentation channels, which leverage prior work (esp. in WECC) and international experience. This paper summarizes the accomplished PSTT work and presents the methods for phasor measurement evaluation.

  5. Extreme Programming in a Research Environment

    NASA Technical Reports Server (NTRS)

    Wood, William A.; Kleb, William L.

    2002-01-01

    This article explores the applicability of Extreme Programming in a scientific research context. The cultural environment at a government research center differs from the customer-centric business view. The chief theoretical difficulty lies in defining the customer to developer relationship. Specifically, can Extreme Programming be utilized when the developer and customer are the same person? Eight of Extreme Programming's 12 practices are perceived to be incompatible with the existing research culture. Further, six of the nine 'environments that I know don't do well with XP' apply. A pilot project explores the use of Extreme Programming in scientific research. The applicability issues are addressed and it is concluded that Extreme Programming can function successfully in situations for which it appears to be ill-suited. A strong discipline for mentally separating the customer and developer roles is found to be key for applying Extreme Programming in a field that lacks a clear distinction between the customer and the developer.

  6. Robust and Low-Cost Flame-Treated Wood for High-Performance Solar Steam Generation.

    PubMed

    Xue, Guobin; Liu, Kang; Chen, Qian; Yang, Peihua; Li, Jia; Ding, Tianpeng; Duan, Jiangjiang; Qi, Bei; Zhou, Jun

    2017-05-03

    Solar-enabled steam generation has attracted increasing interest in recent years because of its potential applications in power generation, desalination, and wastewater treatment, among others. Recent studies have reported many strategies for promoting the efficiency of steam generation by employing absorbers based on carbon materials or plasmonic metal nanoparticles with well-defined pores. In this work, we report that natural wood can be utilized as an ideal solar absorber after a simple flame treatment. With ultrahigh solar absorbance (∼99%), low thermal conductivity (0.33 W m⁻¹ K⁻¹), and good hydrophilicity, the flame-treated wood can localize the solar heating at the evaporation surface and enable a solar-thermal efficiency of ∼72% under a solar intensity of 1 kW m⁻², and it thus represents a renewable, scalable, low-cost, and robust material for solar steam applications.

  7. Computational Aerothermodynamics in Aeroassist Applications

    NASA Technical Reports Server (NTRS)

    Gnoffo, Peter A.

    2001-01-01

    Aeroassisted planetary entry uses atmospheric drag to decelerate spacecraft from super-orbital to orbital or suborbital velocities. Numerical simulation of flow fields surrounding these spacecraft during hypersonic atmospheric entry is required to define aerothermal loads. The severe compression in the shock layer in front of the vehicle and subsequent, rapid expansion into the wake are characterized by high temperature, thermo-chemical nonequilibrium processes. Implicit algorithms required for efficient, stable computation of the governing equations involving disparate time scales of convection, diffusion, chemical reactions, and thermal relaxation are discussed. Robust point-implicit strategies are utilized in the initialization phase; less robust but more efficient line-implicit strategies are applied in the endgame. Applications to ballutes (balloon-like decelerators) in the atmospheres of Venus, Mars, Titan, Saturn, and Neptune and a Mars Sample Return Orbiter (MSRO) are featured. Examples are discussed where time-accurate simulation is required to achieve a steady-state solution.

  8. USAF Bioenvironmental Noise Data Handbook. Volume 156. HH-1N In-flight Crew Noise

    NASA Astrophysics Data System (ADS)

    Hille, H. K.

    1982-11-01

    The HH-1N is a USAF multi-purpose utility helicopter providing support for various USAF missions. This report provides measured data defining the bioacoustic environments at flight crew locations inside this helicopter during normal flight operations. Data are reported for two locations in a wide variety of physical and psychoacoustic measures: overall and band sound pressure levels, C-weighted and A-weighted sound levels, preferred speech interference level, perceived noise level, and limiting times for total daily exposure of personnel with and without standard Air Force ear protectors. Refer to Volume 1 of this handbook, USAF Bioenvironmental Noise Data Handbook, Vol. 1: Organization, Content and Application, AMRL-TR-75-50(1) 1975, for discussion of the objective and design of the handbook, the types of data presented, measurement procedures, instrumentation, data processing, definitions of quantities, symbols, equations, applications, limitations, etc.

  9. Systems study of transport aircraft incorporating advanced aluminum alloys

    NASA Technical Reports Server (NTRS)

    Sakata, I. F.

    1982-01-01

    A study was performed to quantify the potential benefits of utilizing advanced aluminum alloys in commercial transport aircraft and to define the effort necessary to develop the alloys fully to a viable commercial production capability. The comprehensive investigation (1) established realistic advanced aluminum alloy property goals to maximize aircraft systems effectiveness, (2) identified performance and economic benefits of incorporating the advanced alloys in future advanced-technology commercial aircraft designs, (3) provided a recommended plan for development and integration of the alloys into commercial aircraft production, (4) provided an indication of the timing and investigation required by the metal-producing industry to support the projected market, and (5) evaluated application of advanced aluminum alloys to other aerospace and transit systems as a secondary objective. The results of the investigation provided a roadmap and identified key issues requiring attention in an advanced aluminum alloy and applications technology development program.

  10. Applications of Endothermic Reaction Technology to the High Speed Civil Transport

    NASA Technical Reports Server (NTRS)

    Glickstein, Marvin R.; Spadaccini, Louis J.

    1998-01-01

    The success of strategies for controlling emissions and enhancing performance in High Speed Research applications may be increased by more effective utilization of the heat sink afforded by the fuel in the vehicle thermal management system. This study quantifies the potential benefits associated with the use of supercritical preheating and endothermic cracking of jet fuel prior to combustion to enhance the thermal management capabilities of the propulsion systems in the High Speed Civil Transport (HSCT). A fuel-cooled thermal management system, consisting of plate-fin heat exchangers and a small auxiliary compressor, is defined for the HSCT and integrated with the engine, and an assessment of the effect on engine performance, weight, and operating cost is performed. The analysis indicates significant savings due to a projected improvement in fuel economy, and the potential for additional benefit if the cycle is modified to take full advantage of all the heat sink available in the fuel.

  11. Psychodynamic psychotherapy: a core conceptual model and its application.

    PubMed

    Corradi, Richard B

    2006-01-01

    Contemporary American psychiatry, influenced by the "biologic revolution" with its emphasis on a brain-disease model of mental illness, and operating in a managed care delivery system, is in danger of relinquishing its listening and talking functions--psychotherapy--in favor of prescribing drugs. However, despite remarkable advances in the neurosciences, there is still no pharmaceutical magic bullet. The author argues for the continued relevancy of psychotherapy and outlines a practical psychodynamic approach that utilizes fundamental analytic concepts. These concepts--transference, the dual theory of drives, the repetition compulsion, and mechanisms of defense--are described and their clinical application is illustrated. This core conceptual model has significant heuristic value in treating patients and in teaching psychotherapy to psychiatric residents. With its emphasis on the power of the doctor-patient relationship, it teaches residents an effective body of knowledge that helps them define their professional identity as psychiatrists whose most effective therapeutic tool is themselves, not the drugs they dispense.

  12. A Robust Compositional Architecture for Autonomous Systems

    NASA Technical Reports Server (NTRS)

    Brat, Guillaume; Deney, Ewen; Farrell, Kimberley; Giannakopoulos, Dimitra; Jonsson, Ari; Frank, Jeremy; Bobby, Mark; Carpenter, Todd; Estlin, Tara

    2006-01-01

    Space exploration applications can benefit greatly from autonomous systems. Great distances, limited communications and high costs make direct operations impossible while mandating operations reliability and efficiency beyond what traditional commanding can provide. Autonomous systems can improve reliability and enhance spacecraft capability significantly. However, there is reluctance to utilize autonomous systems. In part this is due to general hesitation about new technologies, but a more tangible concern is the reliability and predictability of autonomous software. In this paper, we describe ongoing work aimed at increasing the robustness and predictability of autonomous software, with the ultimate goal of building trust in such systems. The work combines state-of-the-art technologies and capabilities in autonomous systems with advanced validation and synthesis techniques. The focus of this paper is on the autonomous system architecture that has been defined, and on how it enables the application of validation techniques for the resulting autonomous systems.

  13. Plasmonic beaming and active control over fluorescent emission.

    PubMed

    Jun, Young Chul; Huang, Kevin C Y; Brongersma, Mark L

    2011-01-01

    Nanometallic optical antennas are rapidly gaining popularity in applications that require exquisite control over light concentration and emission processes. The search is on for high-performance antennas that offer facile integration on chips. Here we demonstrate a new, easily fabricated optical antenna design that achieves an unprecedented level of control over fluorescent emission by combining concepts from plasmonics, radiative decay engineering and optical beaming. The antenna consists of a nanoscale plasmonic cavity filled with quantum dots coupled to a miniature grating structure that can be engineered to produce one or more highly collimated beams. Electromagnetic simulations and confocal microscopy were used to visualize the beaming process. The metals defining the plasmonic cavity can be utilized to electrically control the emission intensity and wavelength. These findings facilitate the realization of a new class of active optical antennas for use in new optical sources and a wide range of nanoscale optical spectroscopy applications.

  14. Electrochemical Analysis of Neurotransmitters

    PubMed Central

    Bucher, Elizabeth S.; Wightman, R. Mark

    2016-01-01

    Chemical signaling through the release of neurotransmitters into the extracellular space is the primary means of communication between neurons. More than four decades ago, Ralph Adams and his colleagues realized the utility of electrochemical methods for the study of easily oxidizable neurotransmitters, such as dopamine, norepinephrine, and serotonin and their metabolites. Today, electrochemical techniques are frequently coupled to microelectrodes to enable spatially resolved recordings of rapid neurotransmitter dynamics in a variety of biological preparations spanning from single cells to the intact brain of behaving animals. In this review, we provide a basic overview of the principles underlying constant-potential amperometry and fast-scan cyclic voltammetry, the most commonly employed electrochemical techniques, and the general application of these methods to the study of neurotransmission. We thereafter discuss several recent developments in sensor design and experimental methodology that are challenging the current limitations defining the application of electrochemical methods to neurotransmitter measurements. PMID:25939038

  15. Utility photovoltaic group: Status report

    NASA Astrophysics Data System (ADS)

    Serfass, Jeffrey A.; Hester, Stephen L.; Wills, Bethany N.

    1996-01-01

    The Utility PhotoVoltaic Group (UPVG) was formed in October of 1992 with a mission to accelerate the use of cost-effective small-scale and emerging grid-connected applications of photovoltaics for the benefit of electric utilities and their customers. The UPVG is now implementing a program to install up to 50 megawatts of photovoltaics in small-scale and grid-connected applications. This program, called TEAM-UP, is a partnership of the U.S. electric utility industry and the U.S. Department of Energy to help develop utility PV markets. TEAM-UP is a utility-directed program to significantly increase utility PV experience by promoting installations of utility PV systems. Two primary program areas are proposed for TEAM-UP: (1) Small-Scale Applications (SSA)—an initiative to aggregate utility purchases of small-scale, grid-independent applications; and (2) Grid-Connected Applications (GCA)—an initiative to identify and competitively award cost-sharing contracts for grid-connected PV systems with high market growth potential, or collective purchase programs involving multiple buyers. This paper describes these programs and outlines the schedule, the procurement status, and the results of the TEAM-UP process.

  16. High-power LED package requirements

    NASA Astrophysics Data System (ADS)

    Wall, Frank; Martin, Paul S.; Harbers, Gerard

    2004-01-01

    Power LEDs have evolved from simple indicators into illumination devices. For general lighting applications, where the objective is to light up an area, white LED arrays have been utilized to serve that function. Cost constraints will soon drive the industry to provide a discrete lighting solution. Early on, that will mean increasing power densities while quantum efficiencies are addressed. For applications such as automotive headlamps and projection, where light needs to be tightly collimated or controlled, arrays of die or LEDs will not be able to satisfy the requirements and limitations defined by etendue. Ultimately, whether a luminaire requires a small source with high luminance or light spread over a general area, economics will force the evolution of the illumination LED into a compact discrete high-power package. How the customer interfaces with this new package should be an important element considered early in the design cycle. If an LED footprint of adequate size is not provided, it may prove impossible for the customer, or end user, to get rid of the heat in a manner sufficient to prevent premature degradation of LED light output. It is therefore critical, for maintaining expected LED lifetime and light output, that thermal performance parameters be defined, by design, at the system level, including heat-sinking methods and interface materials or methodology.
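    The thermal requirement described above is commonly expressed as a series thermal-resistance budget from junction to ambient; the relation below is the generic form, with illustrative node names rather than values from this paper.

      % Steady-state junction temperature of a power LED dissipating P_d watts:
      \[
      T_{j} = T_{a} + P_{d}\,\big( R_{\theta,\mathrm{jc}} + R_{\theta,\mathrm{cb}} + R_{\theta,\mathrm{ba}} \big)
      \]
      % T_a: ambient temperature; R_{theta,jc}, R_{theta,cb}, R_{theta,ba}: junction-to-case,
      % case-to-board (interface material), and board-to-ambient (heat sink) thermal resistances.
      % Keeping T_j below its rated limit is what drives the footprint and heat-sinking
      % requirements discussed above.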

  17. The application of optical coherence tomography angiography in retinal diseases.

    PubMed

    Sambhav, Kumar; Grover, Sandeep; Chalam, Kakarla V

    Optical coherence tomography angiography (OCTA) is a new, noninvasive imaging technique that generates real-time volumetric data on chorioretinal vasculature and its flow pattern. With the advent of high-speed optical coherence tomography, established en face chorioretinal segmentation, and efficient algorithms, OCTA generates images that resemble an angiogram. The principle of OCTA involves determining the change in backscattering between consecutive B-scans and then attributing the differences to the flow of erythrocytes through retinal blood vessels. OCTA has shown promise in the evaluation of common ophthalmologic diseases such as diabetic retinopathy, age-related macular degeneration, and retinal vascular occlusions. It quantifies vascular compromise reflecting the severity of diabetic retinopathy. OCTA detects the presence of choroidal neovascularization in exudative age-related macular degeneration and maps loss of choriocapillaris in nonexudative age-related macular degeneration. We describe the principles of OCTA and its findings in common and some uncommon retinal pathologies. Finally, we summarize its potential future applications. Its current limitations include a relatively small field of view, inability to show leakage, and a tendency for image artifacts. Further, larger studies will define OCTA's utility in clinical settings and establish whether the technology can decrease morbidity through early detection and guide therapeutic interventions in retinal diseases.
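    The backscatter-change principle described above is often implemented as a decorrelation statistic between repeated B-scans. The sketch below uses one widely used decorrelation form (as in split-spectrum amplitude-decorrelation angiography); actual OCTA instruments differ in the exact statistic, spectral splitting, and motion correction, and the synthetic data are purely illustrative.

      import numpy as np

      def decorrelation(bscans):
          """Flow contrast from N repeated B-scans at the same location (amplitude data, shape (N, depth, width))."""
          a, b = bscans[:-1], bscans[1:]                          # consecutive scan pairs
          d = 1.0 - (a * b) / (0.5 * (a ** 2 + b ** 2) + 1e-12)   # pairwise amplitude decorrelation
          return d.mean(axis=0)                                    # average over the N-1 pairs

      # Synthetic example: static tissue speckle barely changes between repeats,
      # while a "flow" region whose speckle decorrelates shows a higher value.
      rng = np.random.default_rng(0)
      static = np.abs(rng.normal(1.0, 0.01, size=(4, 64, 64)))
      flow = static.copy()
      flow[:, 20:40, 20:40] = np.abs(rng.normal(1.0, 0.5, size=(4, 20, 20)))
      print(decorrelation(static).mean(), decorrelation(flow)[25:35, 25:35].mean())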

  18. Pros and cons of conjoint analysis of discrete choice experiments to define classification and response criteria in rheumatology.

    PubMed

    Taylor, William J

    2016-03-01

    Conjoint analysis of choice or preference data has been used in marketing for over 40 years but has appeared in healthcare settings much more recently. It may be a useful technique for applications within the rheumatology field. Conjoint analysis in rheumatology contexts has mainly used the approaches implemented by 1000Minds Ltd (Dunedin, New Zealand) and Sawtooth Software (Orem, UT, USA). Examples include classification criteria, composite response criteria, service prioritization tools and utilities assessment. Limitations imposed by very many attributes can be managed using new techniques. Conjoint analysis studies of classification and response criteria suggest that the assumption of equal weighting of attributes cannot be met, which challenges traditional approaches to composite criteria construction. Weights elicited through choice experiments with experts can derive more accurate classification criteria than unweighted criteria. Studies that find significant variation in attribute weights for composite response criteria for gout make construction of such criteria problematic. Better understanding of various multiattribute phenomena is likely to grow with increased use of conjoint analysis, especially when the attributes concern individual perceptions or opinions. In addition to classification criteria, emerging applications for conjoint analysis in rheumatology include prioritization tools, remission criteria, and utilities for life areas.

  19. 4-acetamido-2,2,6,6-tetramethylpiperidine-1-oxyl as a model organic redox active compound for nonaqueous flow batteries

    NASA Astrophysics Data System (ADS)

    Milshtein, Jarrod D.; Barton, John L.; Darling, Robert M.; Brushett, Fikile R.

    2016-09-01

    Nonaqueous redox flow batteries (NAqRFBs) that utilize redox active organic molecules are an emerging energy storage concept with the possibility of meeting grid storage requirements. Sporadic and uneven advances in molecular discovery and development, however, have stymied efforts to quantify the performance characteristics of nonaqueous redox electrolytes and flow cells. A need exists for archetypal redox couples, with well-defined electrochemical properties, high solubility in relevant electrolytes, and broad availability, to serve as probe molecules. This work investigates the 4-acetamido-2,2,6,6-tetramethylpiperidine-1-oxyl (AcNH-TEMPO) redox pair for such an application. We report the physicochemical and electrochemical properties of the reduced and oxidized compounds at dilute concentrations for electroanalysis, as well as moderate-to-high concentrations for RFB applications. Changes in conductivity, viscosity, and UV-vis absorbance as a function of state-of-charge are quantified. Cyclic voltammetry investigates the redox potential, reversibility, and diffusion coefficients of dilute solutions, while symmetric flow cell cycling determines the stability of the AcNH-TEMPO redox pair over long experiment times. Finally, single electrolyte flow cell studies demonstrate the utility of this redox couple as a platform chemistry for benchmarking NAqRFB performance.
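    Diffusion coefficients from cyclic voltammetry of a reversible couple such as AcNH-TEMPO are conventionally extracted with the Randles-Sevcik relation; the standard 25 °C form is given below for reference and is not a restatement of the authors' specific analysis.

      % Randles-Sevcik equation for a reversible, diffusion-controlled couple at 25 °C:
      \[
      i_{p} = 2.69 \times 10^{5}\, n^{3/2} A\, C\, D^{1/2} v^{1/2}
      \]
      % i_p: peak current (A); n: electrons transferred; A: electrode area (cm^2);
      % C: bulk concentration (mol cm^{-3}); D: diffusion coefficient (cm^2 s^{-1}); v: scan rate (V s^{-1}).
      % A linear fit of i_p versus v^{1/2} yields D for the dilute solutions described above.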

  20. The Utility of the OMI HCHO/NO2 in Air Quality Decision-Making Activities

    NASA Technical Reports Server (NTRS)

    Duncan, Bryan

    2010-01-01

    I will discuss a novel and practical application of the OMI HCHO and NO2 data products to the "weight of evidence" in the air quality decision-making process (e.g., State Implementation Plan (SIP)) for a city, region, or state to demonstrate that it is making progress toward attainment of the National Ambient Air Quality Standard (NAAQS) for ozone. Any trend, or lack thereof, in the observed OMI HCHO/NO2 may support that an emission control strategy implemented to reduce ozone is or is not working for a metropolitan area. In addition, the observed OMI HCHO/NO2 may be used to define new emission control strategies as the photochemical environments of urban areas evolve over time. I will demonstrate the utility of the OMI HCHO/NO2 over the U.S. for air quality applications with support from simulations with both a regional model and a photochemical box model. These results support mission planning of an OMI-like instrument for the proposed GEO-CAPE satellite, which has as one of its objectives to study air quality from space. However, I am attending the meeting as the Aura Deputy Project Scientist, so I do not technically need to present anything to justify the travel.

  1. Engineering peptide ligase specificity by proteomic identification of ligation sites.

    PubMed

    Weeks, Amy M; Wells, James A

    2018-01-01

    Enzyme-catalyzed peptide ligation is a powerful tool for site-specific protein bioconjugation, but stringent enzyme-substrate specificity limits its utility. We developed an approach for comprehensively characterizing peptide ligase specificity for N termini using proteome-derived peptide libraries. We used this strategy to characterize the ligation efficiency for >25,000 enzyme-substrate pairs in the context of the engineered peptide ligase subtiligase and identified a family of 72 mutant subtiligases with activity toward N-terminal sequences that were previously recalcitrant to modification. We applied these mutants individually for site-specific bioconjugation of purified proteins, including antibodies, and in algorithmically selected combinations for sequencing of the cellular N terminome with reduced sequence bias. We also developed a web application to enable algorithmic selection of the most efficient subtiligase variant(s) for bioconjugation to user-defined sequences. Our methods provide a new toolbox of enzymes for site-specific protein modification and a general approach for rapidly defining and engineering peptide ligase specificity.
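    The "algorithmically selected combinations" of subtiligase variants suggest a coverage-style selection problem (pick a small panel of variants that together efficiently ligate as many N-terminal sequences as possible). The greedy set-cover sketch below illustrates that idea only; the variant names, substrate profiles, and scoring are hypothetical, and this is not the authors' published selection algorithm or web application.

      def greedy_variant_selection(variant_substrates, k):
          """Pick up to k variants that together cover the most N-terminal sequences.

          variant_substrates: dict mapping variant name -> set of substrate sequences it ligates efficiently.
          """
          covered, chosen = set(), []
          for _ in range(k):
              best = max(variant_substrates, key=lambda v: len(variant_substrates[v] - covered))
              gain = variant_substrates[best] - covered
              if not gain:
                  break                     # no variant adds new coverage; stop early
              chosen.append(best)
              covered |= gain
          return chosen, covered

      # Toy example with made-up variants and two-residue N-terminal sequences.
      profiles = {
          "WT":   {"AY", "AF", "SY"},
          "mutA": {"DY", "DE", "AY"},
          "mutB": {"PK", "PR"},
      }
      print(greedy_variant_selection(profiles, k=2))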

  2. Onboard Inert Gas Generation System/Onboard Oxygen Gas Generation System (OBIGGS/OBOGS) Study. Part 1; Aircraft System Requirements

    NASA Technical Reports Server (NTRS)

    Reynolds, Thomas L.; Bailey, Delbert B.; Lewinski, Daniel F.; Roseburg, Conrad M.; Palaszewski, Bryan (Technical Monitor)

    2001-01-01

    The purpose of this technology assessment is to define a multiphase research study program investigating Onboard Inert Gas Generation Systems (OBIGGS) and Onboard Oxygen Generation Systems (OBOGS) that would identify current airplane systems design and certification requirements (Subtask 1); explore state-of-the-art technology (Subtask 2); develop systems specifications (Subtask 3); and develop an initial system design (Subtask 4). If feasible, consideration may be given to the development of a prototype laboratory test system that could potentially be used in commercial transport aircraft (Subtask 5). These systems should be capable of providing inert nitrogen gas for improved cargo compartment fire suppression and fuel tank inerting, and emergency oxygen for crew and passenger use. Subtask 1 of this research study, presented herein, defines current production aircraft certification requirements and design objectives necessary to meet mandatory FAA certification requirements and Boeing design and performance specifications. These requirements will be utilized as baseline comparisons for subsequent OBIGGS/OBOGS application evaluations and assessments.

  3. Versatile alignment layer method for new types of liquid crystal photonic devices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Finnemeyer, V.; Bryant, D.; Lu, L.

    2015-07-21

    Liquid crystal photonic devices are becoming increasingly popular. These devices often present a challenge when it comes to creating a robust alignment layer in pre-assembled cells. In this paper, we describe a method of infusing a dye into a microcavity to produce an effective photo-definable alignment layer. However, previous research on such alignment layers has shown that they have limited stability, particularly against subsequent light exposure. As such, we further describe a method of utilizing a pre-polymer, infused into the microcavity along with the liquid crystal, to provide photostability. We demonstrate that the polymer layer, formed under ultraviolet irradiation of liquid crystal cells, has been effectively localized to a thin region near the substrate surface and provides a significant improvement in the photostability of the liquid crystal alignment. This versatile alignment layer method, capable of being utilized in devices from the described microcavities to displays, offers significant promise for new photonics applications.

  4. Advanced System Design Requirements for Large and Small Fixed-wing Aerial Application Systems for Agriculture

    NASA Technical Reports Server (NTRS)

    Hinely, J. T., Jr.; Boyles, R. Q., Jr.

    1979-01-01

    Several candidate aircraft configurations were defined over the range of 1000 to 10,000 pounds payload and evaluated over a broad spectrum of agricultural missions. From these studies, baseline design points were selected at 3200 pounds payload for the small aircraft and 7500 pounds for the large aircraft. The small baseline aircraft utilizes a single turboprop powerplant while the large aircraft utilizes two turboprop powerplants. These configurations were optimized for wing loading, aspect ratio, and power loading to provide the best mission economics in representative missions. Wing loading of 20 lb/sq ft was selected for the small aircraft and 25 lb/sq ft for the large aircraft. Aspect ratio of 8 was selected for both aircraft. It was found that a 10% reduction in engine power from the original configurations provided improved mission economics for both aircraft by reducing the cost of the turboprop. Refined configurations incorporate a 675 HP engine in the small aircraft and two 688 HP engines in the large aircraft.

  5. A semi-automatic method for left ventricle volume estimate: an in vivo validation study

    NASA Technical Reports Server (NTRS)

    Corsi, C.; Lamberti, C.; Sarti, A.; Saracino, G.; Shiota, T.; Thomas, J. D.

    2001-01-01

    This study aims at validating the left ventricular (LV) volume estimates obtained by processing volumetric data utilizing a segmentation model based on the level set technique. The validation has been performed by comparing real-time volumetric echo data (RT3DE) and magnetic resonance imaging (MRI) data. A validation protocol was defined and applied to twenty-four estimates (range 61-467 ml) obtained from normal and pathologic subjects who underwent both RT3DE and MRI. A statistical analysis was performed on each estimate and on clinical parameters such as stroke volume (SV) and ejection fraction (EF). Assuming MRI estimates (x) as a reference, an excellent correlation was found with the volumes measured by utilizing the segmentation procedure (y) (y=0.89x + 13.78, r=0.98). The mean error on SV was 8 ml and the mean error on EF was 2%. This study demonstrated that the segmentation technique is reliably applicable to human hearts in clinical practice.
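    The reported comparison (linear regression of segmentation-derived volumes against MRI plus mean errors) can be reproduced generically as below; the arrays are placeholder values in the study's reported range, not the actual measurements, and the helper function is not part of the study's software.

      import numpy as np

      def compare_volumes(mri_ml, seg_ml):
          """Least-squares regression and mean bias between two sets of LV volume estimates (ml)."""
          slope, intercept = np.polyfit(mri_ml, seg_ml, 1)   # y = slope*x + intercept
          r = np.corrcoef(mri_ml, seg_ml)[0, 1]              # Pearson correlation
          bias = float(np.mean(seg_ml - mri_ml))             # mean error (ml)
          return slope, intercept, r, bias

      # Placeholder data only (range 61-467 ml as in the study); not the actual measurements.
      mri = np.array([70.0, 120.0, 210.0, 300.0, 450.0])
      seg = np.array([76.0, 118.0, 205.0, 290.0, 430.0])
      print("y = %.2fx + %.2f, r = %.2f, bias = %.1f ml" % compare_volumes(mri, seg))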

  6. One-side forward-backward asymmetry at the LHC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang Youkai; Xiao Bo; Zhu Shouhua

    2011-01-01

    Forward-backward asymmetry A_FB is an essential observable for studying the nature of couplings in the standard model and in physics beyond the standard model, as shown at LEP and the Tevatron. As a proton-proton collider, the LHC does not have a preferred direction, contrary to its counterparts LEP and the Tevatron; therefore, A_FB is not directly applicable at the LHC. However, for the proton the momentum of a valence quark is usually larger than that of a sea quark. Utilizing this feature, we defined a so-called one-side forward-backward asymmetry A_OFB for top quark pair production at the LHC in previous work. In this paper we extend our studies to charged leptons and bottom quarks as the final states. Our numerical results show that at the LHC A_OFB can be utilized to study the nature of the couplings once enough events are collected.
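    For orientation, the conventional forward-backward asymmetry counts events by the sign of a direction-defining variable such as the scattering angle or rapidity; the expression below is the generic definition, not the paper's exact A_OFB, whose one-side construction additionally uses the event kinematics (e.g. the pair's longitudinal boost) to tag the likely valence-quark direction.

      % Generic forward-backward asymmetry in terms of forward (N_F) and backward (N_B) event counts:
      \[
      A_{FB} = \frac{N_{F} - N_{B}}{N_{F} + N_{B}},
      \qquad
      N_{F} = N(\cos\theta > 0), \quad N_{B} = N(\cos\theta < 0).
      \]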

  7. Advances in the Study of Heart Development and Disease Using Zebrafish

    PubMed Central

    Brown, Daniel R.; Samsa, Leigh Ann; Qian, Li; Liu, Jiandong

    2016-01-01

    Animal models of cardiovascular disease are key players in the translational medicine pipeline used to define the conserved genetic and molecular basis of disease. Congenital heart diseases (CHDs) are the most common type of human birth defect and feature structural abnormalities that arise during cardiac development and maturation. The zebrafish, Danio rerio, is a valuable vertebrate model organism, offering advantages over traditional mammalian models. These advantages include the rapid, stereotyped and external development of transparent embryos produced in large numbers from inexpensively housed adults, vast capacity for genetic manipulation, and amenability to high-throughput screening. With the help of modern genetics and a sequenced genome, zebrafish have led to insights in cardiovascular diseases ranging from CHDs to arrhythmia and cardiomyopathy. Here, we discuss the utility of zebrafish as a model system and summarize zebrafish cardiac morphogenesis with emphasis on parallels to human heart diseases. Additionally, we discuss the specific tools and experimental platforms utilized in the zebrafish model including forward screens, functional characterization of candidate genes, and high throughput applications. PMID:27335817

  8. Compendium of information on identification and testing of materials for plastic solar thermal collectors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McGinniss, V.D.; Sliemers, F.A.; Landstrom, D.K.

    1980-07-31

    This report is intended to organize and summarize prior and current literature concerning the weathering, aging, durability, degradation, and testing methodologies as applied to materials for plastic solar thermal collectors. Topics covered include (1) rate of aging of polymeric materials; (2) environmental factors affecting performance; (3) evaluation and prediction of service life; (4) measurement of physical and chemical properties; (5) discussion of evaluation techniques and specific instrumentation; (6) degradation reactions and mechanisms; (7) weathering of specific polymeric materials; and (8) exposure testing methodology. Major emphasis has been placed on defining the current state of the art in plastics degradation and on identifying information that can be utilized in applying appropriate and effective aging tests for use in projecting service life of plastic solar thermal collectors. This information will also be of value where polymeric components are utilized in the construction of conventional solar collectors or any application where plastic degradation and weathering are prime factors in material selection.

  9. Negotiating on location, timing, duration, and participant in agent-mediated joint activity-travel scheduling

    NASA Astrophysics Data System (ADS)

    Ma, Huiye; Ronald, Nicole; Arentze, Theo A.; Timmermans, Harry J. P.

    2013-10-01

    Agent-based simulation has become an important modeling approach in activity-travel analysis. Social activities account for a large amount of travel and have an important effect on activity-travel scheduling. Participants in joint activities usually have various options regarding location, participants, and timing and take different approaches to make their decisions. In this context, joint activity participation requires negotiation among agents involved, so that conflicts among the agents can be addressed. Existing mechanisms do not fully provide a solution when utility functions of agents are nonlinear and non-monotonic. Considering activity-travel scheduling in time and space as an application, we propose a novel negotiation approach, which takes into account these properties, such as continuous and discrete issues, and nonlinear and non-monotonic utility functions, by defining a concession strategy and a search mechanism. The results of experiments show that agents having these properties can negotiate efficiently. Furthermore, the negotiation procedure affects individuals’ choices of location, timing, duration, and participants.
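    To make the combination of a concession strategy and a search mechanism concrete, the sketch below implements a generic alternating-offers negotiation over one continuous issue (activity start time) with nonlinear, non-monotonic (single-peaked) utilities. The utility shapes, grid, and concession schedule are illustrative assumptions and do not reproduce the authors' protocol.

      import numpy as np

      def utility(start_hour, preferred, tolerance):
          """Nonlinear, non-monotonic utility: Gaussian peak at the agent's preferred start time."""
          return float(np.exp(-((start_hour - preferred) / tolerance) ** 2))

      def best_offer_at_target(u_self, target, grid):
          """Search mechanism: among offers meeting the current aspiration, propose the one the agent likes most."""
          feasible = [x for x in grid if u_self(x) >= target]
          return max(feasible, key=u_self) if feasible else None

      def negotiate(u_a, u_b, rounds=20, grid=np.linspace(8.0, 22.0, 141)):
          """Alternating offers with a time-dependent concession of the aspiration level."""
          for t in range(rounds):
              target = 1.0 - 0.9 * t / (rounds - 1)            # aspiration decays from 1.0 to 0.1
              proposer, responder = (u_a, u_b) if t % 2 == 0 else (u_b, u_a)
              offer = best_offer_at_target(proposer, target, grid)
              if offer is not None and responder(offer) >= target:
                  return offer, t                               # responder accepts the proposed start time
          return None, rounds                                   # no agreement within the deadline

      u_a = lambda x: utility(x, preferred=18.0, tolerance=2.0)  # agent A prefers an 18:00 start
      u_b = lambda x: utility(x, preferred=20.0, tolerance=1.5)  # agent B prefers a 20:00 start
      print(negotiate(u_a, u_b))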

  10. Lipid cross-linking of nanolipoprotein particles substantially enhances serum stability and cellular uptake [Lipid crosslinking enhances the stability of nanolipoprotein particles in serum by multiple orders of magnitude

    DOE PAGES

    Gilmore, Sean F.; Blanchette, Craig D.; Scharadin, Tiffany M.; ...

    2016-07-13

    Nanolipoprotein particles (NLPs) consist of a discoidal phospholipid lipid bilayer confined by an apolipoprotein belt. NLPs are a promising platform for a variety of biomedical applications due to their biocompatibility, size, definable composition, and amphipathic characteristics. However, poor serum stability hampers the use of NLPs for in vivo applications such as drug formulation. In this study, NLP stability was enhanced upon the incorporation and subsequent UV-mediated intermolecular cross-linking of photoactive DiynePC phospholipids in the lipid bilayer, forming cross-linked nanoparticles (X-NLPs). Both the concentration of DiynePC in the bilayer and UV exposure time significantly affected the resulting X-NLP stability in 100% serum, as assessed by size exclusion chromatography (SEC) of fluorescently labeled particles. Cross-linking did not significantly impact the size of X-NLPs as determined by dynamic light scattering and SEC. X-NLPs had essentially no degradation over 48 h in 100% serum, which is a drastic improvement compared to non-cross-linked NLPs (50% degradation by ~10 min). X-NLPs had greater uptake into the human ATCC 5637 bladder cancer cell line compared to non-cross-linked particles, indicating their potential utility for targeted drug delivery. X-NLPs also exhibited enhanced stability following intravenous administration in mice. Lastly, these results collectively support the potential utility of X-NLPs for a variety of in vivo applications.

  11. Lipid cross-linking of nanolipoprotein particles substantially enhances serum stability and cellular uptake [Lipid crosslinking enhances the stability of nanolipoprotein particles in serum by multiple orders of magnitude

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilmore, Sean F.; Blanchette, Craig D.; Scharadin, Tiffany M.

    Nanolipoprotein particles (NLPs) consist of a discoidal phospholipid lipid bilayer confined by an apolipoprotein belt. NLPs are a promising platform for a variety of biomedical applications due to their biocompatibility, size, definable composition, and amphipathic characteristics. However, poor serum stability hampers the use of NLPs for in vivo applications such as drug formulation. In this study, NLP stability was enhanced upon the incorporation and subsequent UV-mediated intermolecular cross-linking of photoactive DiynePC phospholipids in the lipid bilayer, forming cross-linked nanoparticles (X-NLPs). Both the concentration of DiynePC in the bilayer and UV exposure time significantly affected the resulting X-NLP stability in 100% serum, as assessed by size exclusion chromatography (SEC) of fluorescently labeled particles. Cross-linking did not significantly impact the size of X-NLPs as determined by dynamic light scattering and SEC. X-NLPs had essentially no degradation over 48 h in 100% serum, which is a drastic improvement compared to non-cross-linked NLPs (50% degradation by ~10 min). X-NLPs had greater uptake into the human ATCC 5637 bladder cancer cell line compared to non-cross-linked particles, indicating their potential utility for targeted drug delivery. X-NLPs also exhibited enhanced stability following intravenous administration in mice. Lastly, these results collectively support the potential utility of X-NLPs for a variety of in vivo applications.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ennis, G.; Lala, T.K.

    This document presents the results of a study undertaken by First Pacific Networks as part of EPRI Project RP-3567-01 regarding the support of broadcast services within the EPRI Utility Communications Architecture (UCA) protocols and the use of such services by UCA applications. This report has focused on the requirements and architectural implications of broadcast within UCA. A subsequent phase of this project is to develop specific recommendations for extending UCA so as to support broadcast. The conclusions of this report are presented in Section 5. The authors summarize the major conclusions as follows: broadcast and multicast support would be very useful within UCA, not only for utility-specific applications but also simply to support the network engineering of a large-scale communications system; in this regard, UCA is no different from other large network systems which have found broadcast and multicast to be of substantial benefit for a variety of system management purposes; the primary architectural impact of broadcast and multicast falls on the UCA network level (which would need to be enhanced) and the UCA application level (which would be the user of broadcast); there is a useful subset of MMS services which could take advantage of broadcast; and the UCA network level would need to be enhanced both in the areas of addressing and routing so as to properly support broadcast. A subsequent analysis will be required to define the specific enhancements to UCA required to support broadcast and multicast.

  13. Epitaxial growth of silicon for layer transfer

    DOEpatents

    Teplin, Charles; Branz, Howard M

    2015-03-24

    Methods of preparing a thin crystalline silicon film for transfer and devices utilizing a transferred crystalline silicon film are disclosed. The methods include preparing a silicon growth substrate which has an interface defining substance associated with an exterior surface. The methods further include depositing an epitaxial layer of silicon on the silicon growth substrate at the surface and separating the epitaxial layer from the substrate substantially along the plane or other surface defined by the interface defining substance. The epitaxial layer may be utilized as a thin film of crystalline silicon in any type of semiconductor device which requires a crystalline silicon layer. In use, the epitaxial transfer layer may be associated with a secondary substrate.

  14. Light collection optics for measuring flux and spectrum from light-emitting devices

    DOEpatents

    McCord, Mark A.; DiRegolo, Joseph A.; Gluszczak, Michael R.

    2016-05-24

    Systems and methods for accurately measuring the luminous flux and color (spectra) from light-emitting devices are disclosed. An integrating sphere may be utilized to directly receive a first portion of light emitted by a light-emitting device through an opening defined on the integrating sphere. A light collector may be utilized to collect a second portion of light emitted by the light-emitting device and direct the second portion of light into the integrating sphere through the opening defined on the integrating sphere. A spectrometer may be utilized to measure at least one property of the first portion and the second portion of light received by the integrating sphere.

  15. Japanese plan for SSF utilization

    NASA Technical Reports Server (NTRS)

    Mizuno, Toshio

    1992-01-01

    The Japanese Experiment Module (JEM) program has made significant progress. The JEM preliminary design review was completed in July 1992; construction of JEM operation facilities has begun; and the micro-G airplane, drop shaft, and micro-G experiment rocket are all operational. The national policy for JEM utilization was also established. The Space Experiment Laboratory (SEL) opened in June '92 and will function as a user support center. Eight JEM multiuser facilities are in phase B, and scientific requirements are being defined for 17 candidate multiuser facilities. The National Joint Research Program is about to start. Precursor missions and early Space Station utilization activities are being defined. This paper summarizes the program in outline and graphic form.

  16. Antiandrogenic steroidal sulfonyl heterocycles. Utility of electrostatic complementarity in defining bioisosteric sulfonyl heterocycles.

    PubMed

    Mallamo, J P; Pilling, G M; Wetzel, J R; Kowalczyk, P J; Bell, M R; Kullnig, R K; Batzold, F H; Juniewicz, P E; Winneker, R C; Luss, H R

    1992-05-15

    Complementarity of electrostatic potential surface maps was utilized in defining bioisosteric steroidal androgen receptor antagonists. Semiempirical and ab initio level calculations performed on a series of methanesulfonyl heterocycles indicated the requirement for a partial negative charge at the heteroatom attached to C-3 of the steroid nucleus to attain androgen receptor affinity. Synthesis and testing of six heterocycle A-ring-fused dihydroethisterone derivatives support this hypothesis, and we have identified two new androgen receptor antagonists of this class.

  17. 26 CFR 1.42-10 - Utility allowances.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... into account, among other things, local utility rates, property type, climate and degree-day variables... which the building containing the units is located. (c) Changes in applicable utility allowance—(1) In...)), the applicable utility allowance for units changes, the new utility allowance must be used to compute...

  18. 26 CFR 1.42-10 - Utility allowances.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... into account, among other things, local utility rates, property type, climate and degree-day variables... which the building containing the units is located. (c) Changes in applicable utility allowance—(1) In...)), the applicable utility allowance for units changes, the new utility allowance must be used to compute...

  19. 26 CFR 1.42-10 - Utility allowances.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... into account, among other things, local utility rates, property type, climate and degree-day variables... which the building containing the units is located. (c) Changes in applicable utility allowance—(1) In...)), the applicable utility allowance for units changes, the new utility allowance must be used to compute...

  20. 26 CFR 1.42-10 - Utility allowances.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... into account, among other things, local utility rates, property type, climate and degree-day variables... which the building containing the units is located. (c) Changes in applicable utility allowance—(1) In...)), the applicable utility allowance for units changes, the new utility allowance must be used to compute...

  1. 26 CFR 1.42-10 - Utility allowances.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... into account, among other things, local utility rates, property type, climate and degree-day variables... which the building containing the units is located. (c) Changes in applicable utility allowance—(1) In...)), the applicable utility allowance for units changes, the new utility allowance must be used to compute...

  2. Automated Quantification of Gradient Defined Features

    DTIC Science & Technology

    2008-09-01

    defined features in submarine environments. The technique utilizes MATLAB scripts to convert bathymetry data into a gradient dataset, produce gradient...maps, and most importantly, automate the process of defining and characterizing gradient defined features such as flows, faults, landslide scarps, folds...convergent plate margin hosts a series of large serpentinite mud volcanoes (Fig. 1). One of the largest of these active mud volcanoes is Big Blue
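
    The original work uses MATLAB scripts for this step; as a reader-added illustration only, the short Python sketch below shows the same basic idea of converting a bathymetry grid into a gradient dataset and flagging steep, gradient-defined cells for later characterization. The grid, cell size and threshold are hypothetical.

      import numpy as np

      def slope_magnitude(bathymetry, cell_size):
          """Return the slope magnitude (rise/run) of a gridded bathymetry surface."""
          # np.gradient returns per-axis first differences scaled by the cell size
          dz_dy, dz_dx = np.gradient(bathymetry, cell_size)
          return np.hypot(dz_dx, dz_dy)

      # Hypothetical 100 m resolution depth grid (metres, negative down)
      depth = np.random.default_rng(0).normal(-3000.0, 50.0, size=(200, 200))
      slope = slope_magnitude(depth, cell_size=100.0)

      # Flag cells whose gradient exceeds an (arbitrary) threshold as candidate
      # scarps, flows or fault traces for later characterization
      candidates = slope > 0.5
      print(f"{candidates.mean():.1%} of cells exceed the gradient threshold")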

  3. Application-Defined Decentralized Access Control

    PubMed Central

    Xu, Yuanzhong; Dunn, Alan M.; Hofmann, Owen S.; Lee, Michael Z.; Mehdi, Syed Akbar; Witchel, Emmett

    2014-01-01

    DCAC is a practical OS-level access control system that supports application-defined principals. It allows normal users to perform administrative operations within their privilege, enabling isolation and privilege separation for applications. It does not require centralized policy specification or management, giving applications freedom to manage their principals while the policies are still enforced by the OS. DCAC uses hierarchically-named attributes as a generic framework for user-defined policies such as groups defined by normal users. For both local and networked file systems, its execution time overhead is between 0%–9% on file system microbenchmarks, and under 1% on applications. This paper shows the design and implementation of DCAC, as well as several real-world use cases, including sandboxing applications, enforcing server applications’ security policies, supporting NFS, and authenticating user-defined sub-principals in SSH, all with minimal code changes. PMID:25426493
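
    To make the idea of hierarchically-named attributes concrete, here is a minimal reader-added toy model in Python. It is not DCAC's kernel implementation or API; the dotted attribute names and the rule that an ancestor attribute satisfies a descendant entry on an ACL are illustrative assumptions based only on the abstract's description.

      # Toy model of hierarchically-named attributes in the spirit of DCAC.
      # This is NOT the real DCAC implementation; the names and the ancestor
      # rule below are illustrative assumptions.

      def is_ancestor(attr: str, other: str) -> bool:
          """True if `attr` equals `other` or is a dotted-name prefix of it."""
          return other == attr or other.startswith(attr + ".")

      def may_access(process_attrs: set[str], object_acl: set[str]) -> bool:
          """Grant access when the process holds an attribute that equals or is
          an ancestor of some attribute on the object's ACL (assumed rule)."""
          return any(is_ancestor(p, o) for p in process_attrs for o in object_acl)

      # A user-defined group ".u.alice.photos" managed entirely by alice:
      alice_editor = {".u.alice"}                 # alice's own processes
      photo_viewer = {".u.alice.photos.viewers"}  # a sub-principal alice created
      photo_acl = {".u.alice.photos"}

      print(may_access(alice_editor, photo_acl))  # True  (ancestor attribute)
      print(may_access(photo_viewer, photo_acl))  # False (descendant, not ancestor)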

  4. 76 FR 13125 - Announcement of Grant Application Deadlines and Funding Levels; Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-10

    ... DEPARTMENT OF AGRICULTURE Rural Utilities Service Announcement of Grant Application Deadlines and Funding Levels; Correction AGENCY: Rural Utilities Service, USDA. ACTION: Notice of Solicitation of Applications; correction. SUMMARY: The United States Department of Agriculture's (USDA) Rural Utilities Service...

  5. 76 FR 12017 - Announcement of Grant Application Deadlines and Funding Levels

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-04

    ... DEPARTMENT OF AGRICULTURE Rural Utilities Service Announcement of Grant Application Deadlines and Funding Levels AGENCY: Rural Utilities Service, USDA. ACTION: Notice of solicitation of applications. SUMMARY: The United States Department of Agriculture's (USDA) Rural Utilities Service (RUS) announces the...

  6. 16 CFR 238.0 - Bait advertising defined. 1

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 16 Commercial Practices 1 2010-01-01 2010-01-01 false Bait advertising defined. 1 238.0 Section... BAIT ADVERTISING § 238.0 Bait advertising defined. 1 1 For the purpose of this part “advertising” includes any form of public notice however disseminated or utilized. Bait advertising is an alluring but...

  7. 16 CFR 238.0 - Bait advertising defined. 1

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 16 Commercial Practices 1 2011-01-01 2011-01-01 false Bait advertising defined. 1 238.0 Section... BAIT ADVERTISING § 238.0 Bait advertising defined. 1 1 For the purpose of this part “advertising” includes any form of public notice however disseminated or utilized. Bait advertising is an alluring but...

  8. 16 CFR 238.0 - Bait advertising defined. 1

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 16 Commercial Practices 1 2014-01-01 2014-01-01 false Bait advertising defined. 1 238.0 Section... BAIT ADVERTISING § 238.0 Bait advertising defined. 1 1 For the purpose of this part “advertising” includes any form of public notice however disseminated or utilized. Bait advertising is an alluring but...

  9. 16 CFR 238.0 - Bait advertising defined. 1

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 16 Commercial Practices 1 2013-01-01 2013-01-01 false Bait advertising defined. 1 238.0 Section... BAIT ADVERTISING § 238.0 Bait advertising defined. 1 1 For the purpose of this part “advertising” includes any form of public notice however disseminated or utilized. Bait advertising is an alluring but...

  10. 16 CFR 238.0 - Bait advertising defined. 1

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 16 Commercial Practices 1 2012-01-01 2012-01-01 false Bait advertising defined. 1 238.0 Section... BAIT ADVERTISING § 238.0 Bait advertising defined. 1 1 For the purpose of this part “advertising” includes any form of public notice however disseminated or utilized. Bait advertising is an alluring but...

  11. Estimating the Integrated Information Measure Phi from High-Density Electroencephalography during States of Consciousness in Humans

    PubMed Central

    Kim, Hyoungkyu; Hudetz, Anthony G.; Lee, Joseph; Mashour, George A.; Lee, UnCheol; Avidan, Michael S.

    2018-01-01

    The integrated information theory (IIT) proposes a quantitative measure, denoted as Φ, of the amount of integrated information in a physical system, which is postulated to have an identity relationship with consciousness. IIT predicts that the value of Φ estimated from brain activities represents the level of consciousness across phylogeny and functional states. Practical limitations, such as the explosive computational demands required to estimate Φ for real systems, have hindered its application to the brain and raised questions about the utility of IIT in general. To achieve practical relevance for studying the human brain, it will be beneficial to establish the reliable estimation of Φ from multichannel electroencephalogram (EEG) and define the relationship of Φ to EEG properties conventionally used to define states of consciousness. In this study, we introduce a practical method to estimate Φ from high-density (128-channel) EEG and determine the contribution of each channel to Φ. We examine the correlation of power, frequency, functional connectivity, and modularity of EEG with regional Φ in various states of consciousness as modulated by diverse anesthetics. We find that our approximation of Φ alone is insufficient to discriminate certain states of anesthesia. However, a multi-dimensional parameter space extended by four parameters related to Φ and EEG connectivity is able to differentiate all states of consciousness. The association of Φ with EEG connectivity during clinically defined anesthetic states represents a new practical approach to the application of IIT, which may be used to characterize various physiological (sleep), pharmacological (anesthesia), and pathological (coma) states of consciousness in the human brain. PMID:29503611

  12. Estimating the Integrated Information Measure Phi from High-Density Electroencephalography during States of Consciousness in Humans.

    PubMed

    Kim, Hyoungkyu; Hudetz, Anthony G; Lee, Joseph; Mashour, George A; Lee, UnCheol

    2018-01-01

    The integrated information theory (IIT) proposes a quantitative measure, denoted as Φ, of the amount of integrated information in a physical system, which is postulated to have an identity relationship with consciousness. IIT predicts that the value of Φ estimated from brain activities represents the level of consciousness across phylogeny and functional states. Practical limitations, such as the explosive computational demands required to estimate Φ for real systems, have hindered its application to the brain and raised questions about the utility of IIT in general. To achieve practical relevance for studying the human brain, it will be beneficial to establish the reliable estimation of Φ from multichannel electroencephalogram (EEG) and define the relationship of Φ to EEG properties conventionally used to define states of consciousness. In this study, we introduce a practical method to estimate Φ from high-density (128-channel) EEG and determine the contribution of each channel to Φ. We examine the correlation of power, frequency, functional connectivity, and modularity of EEG with regional Φ in various states of consciousness as modulated by diverse anesthetics. We find that our approximation of Φ alone is insufficient to discriminate certain states of anesthesia. However, a multi-dimensional parameter space extended by four parameters related to Φ and EEG connectivity is able to differentiate all states of consciousness. The association of Φ with EEG connectivity during clinically defined anesthetic states represents a new practical approach to the application of IIT, which may be used to characterize various physiological (sleep), pharmacological (anesthesia), and pathological (coma) states of consciousness in the human brain.

  13. A fully defined and scalable 3D culture system for human pluripotent stem cell expansion and differentiation

    NASA Astrophysics Data System (ADS)

    Lei, Yuguo; Schaffer, David V.

    2013-12-01

    Human pluripotent stem cells (hPSCs), including human embryonic stem cells and induced pluripotent stem cells, are promising for numerous biomedical applications, such as cell replacement therapies, tissue and whole-organ engineering, and high-throughput pharmacology and toxicology screening. Each of these applications requires large numbers of cells of high quality; however, the scalable expansion and differentiation of hPSCs, especially for clinical utilization, remains a challenge. We report a simple, defined, efficient, scalable, and good manufacturing practice-compatible 3D culture system for hPSC expansion and differentiation. It employs a thermoresponsive hydrogel that combines easy manipulation and completely defined conditions, free of any human- or animal-derived factors, and entailing only recombinant protein factors. Under an optimized protocol, the 3D system enables long-term, serial expansion of multiple hPSC lines with a high expansion rate (∼20-fold per 5-d passage, for a ∼10^72-fold expansion over 280 d), yield (∼2.0 × 10^7 cells per mL of hydrogel), and purity (∼95% Oct4+), even with single-cell inoculation, all of which offer considerable advantages relative to current approaches. Moreover, the system enabled 3D directed differentiation of hPSCs into multiple lineages, including dopaminergic neuron progenitors with a yield of ∼8 × 10^7 dopaminergic progenitors per mL of hydrogel and ∼80-fold expansion by the end of a 15-d derivation. This versatile system may be useful at numerous scales, from basic biological investigation to clinical development.
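
    As a quick reader-added check of the cumulative figure quoted above (not part of the original abstract), a ∼20-fold expansion at every 5-day passage sustained for 280 days compounds as

      \[
        \frac{280\ \text{d}}{5\ \text{d per passage}} = 56\ \text{passages}
        \quad\Rightarrow\quad
        20^{56} = 10^{\,56\log_{10}20} \approx 10^{72.8}\text{-fold expansion.}
      \]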

  14. PMR Graphite Engine Duct Development

    NASA Technical Reports Server (NTRS)

    Stotler, C. L.; Yokel, S. A.

    1989-01-01

    The objective was to demonstrate the cost and weight advantages that could be obtained by utilizing the graphite/PMR15 material system to replace titanium in selected turbofan engine applications. The first component to be selected as a basis for evaluation was the outer bypass duct of the General Electric F404 engine. The operating environment of this duct was defined and then an extensive mechanical and physical property test program was conducted using material made by processing techniques which were also established by this program. Based on these properties, design concepts to fabricate a composite version of the duct were established and two complete ducts fabricated. One of these ducts was proof pressure tested and then run successfully on a factory test engine for over 1900 hours. The second duct was static tested to 210 percent design limit load without failure. An improved design was then developed which utilized integral composite end flanges. A complete duct was fabricated and successfully proof pressure tested. The net results of this effort showed that a composite version of the outer duct would be 14 percent lighter and 30 percent less expensive that the titanium duct. The other type of structure chosen for investigation was the F404 fan stator assembly, including the fan stator vanes. It was concluded that it was feasible to utilize composite materials for this type structure but that the requirements imposed by replacing an existing metal design resulted in an inefficient composite design. It was concluded that if composites were to be effectively used in this type structure, the design must be tailored for composite application from the outset.

  15. The Application of Chinese High-Spatial Remote Sensing Satellite Image in Land Law Enforcement Information Extraction

    NASA Astrophysics Data System (ADS)

    Wang, N.; Yang, R.

    2018-04-01

    Chinese high -resolution (HR) remote sensing satellites have made huge leap in the past decade. Commercial satellite datasets, such as GF-1, GF-2 and ZY-3 images, the panchromatic images (PAN) resolution of them are 2 m, 1 m and 2.1 m and the multispectral images (MS) resolution are 8 m, 4 m, 5.8 m respectively have been emerged in recent years. Chinese HR satellite imagery has been free downloaded for public welfare purposes using. Local government began to employ more professional technician to improve traditional land management technology. This paper focused on analysing the actual requirements of the applications in government land law enforcement in Guangxi Autonomous Region. 66 counties in Guangxi Autonomous Region were selected for illegal land utilization spot extraction with fusion Chinese HR images. The procedure contains: A. Defines illegal land utilization spot type. B. Data collection, GF-1, GF-2, and ZY-3 datasets were acquired in the first half year of 2016 and other auxiliary data were collected in 2015. C. Batch process, HR images were collected for batch preprocessing through ENVI/IDL tool. D. Illegal land utilization spot extraction by visual interpretation. E. Obtaining attribute data with ArcGIS Geoprocessor (GP) model. F. Thematic mapping and surveying. Through analysing 42 counties results, law enforcement officials found 1092 illegal land using spots and 16 suspicious illegal mining spots. The results show that Chinese HR satellite images have great potential for feature information extraction and the processing procedure appears robust.

  16. Generalized vector calculus on convex domain

    NASA Astrophysics Data System (ADS)

    Agrawal, Om P.; Xu, Yufeng

    2015-06-01

    In this paper, we apply recently proposed generalized integral and differential operators to develop generalized vector calculus and generalized variational calculus for problems defined over a convex domain. In particular, we present some generalization of Green's and Gauss divergence theorems involving some new operators, and apply these theorems to generalized variational calculus. For fractional power kernels, the formulation leads to fractional vector calculus and fractional variational calculus for problems defined over a convex domain. In special cases, when certain parameters take integer values, we obtain formulations for integer order problems. Two examples are presented to demonstrate applications of the generalized variational calculus which utilize the generalized vector calculus developed in the paper. The first example leads to a generalized partial differential equation and the second example leads to a generalized eigenvalue problem, both in two dimensional convex domains. We solve the generalized partial differential equation by using polynomial approximation. A special case of the second example is a generalized isoperimetric problem. We find an approximate solution to this problem. Many physical problems containing integer order integrals and derivatives are defined over arbitrary domains. We speculate that future problems containing fractional and generalized integrals and derivatives in fractional mechanics will be defined over arbitrary domains, and therefore, a general variational calculus incorporating a general vector calculus will be needed for these problems. This research is our first attempt in that direction.
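
    The generalized operators themselves are defined in the paper and are not reproduced here; as a reader-added reference point, the classical (integer-order) Gauss divergence theorem that must be recovered when the parameters take integer values is

      \[
        \int_{\Omega} \nabla \cdot \mathbf{F}\, dV \;=\; \oint_{\partial\Omega} \mathbf{F} \cdot \mathbf{n}\, dS ,
      \]

    with the paper's generalized (and, for power-law kernels, fractional) operators replacing the ordinary divergence and boundary flux.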

  17. Socioeconomic impact of asthma, chronic obstructive pulmonary disease and asthma-COPD overlap syndrome.

    PubMed

    Kim, Jinhee; Kim, Young Sam; Kim, Kyungjoo; Oh, Yeon-Mok; Yoo, Kwang Ha; Rhee, Chin Kook; Lee, Jin Hwa

    2017-06-01

    Asthma-chronic obstructive pulmonary disease (COPD) overlap syndrome (ACOS) is defined as having features of both asthma and COPD, namely airway hyper-responsiveness and incompletely reversible airway obstruction. However, the socioeconomic impact of ACOS has not been well appreciated. Adults with an available wheezing history and acceptable spirometry were selected from the fourth Korean National Health and Nutrition Examination Survey (KNHANES IV) in 2007-2009. Their data were merged with Korean National Health Insurance claim data. The 'asthma group' was defined as having a self-reported wheezing history and FEV1/FVC ≥0.7, the 'COPD group' as having FEV1/FVC <0.7 and no wheezing, the 'ACOS group' as having both wheezing and FEV1/FVC <0.7, and the 'no airway disease (NAD) group' as having no wheezing and FEV1/FVC ≥0.7. Among a total of 11,656 subjects, ACOS comprised 2.2%; COPD, 8.4%; asthma, 5.8%; and NAD, 83.6%. The total length of healthcare utilization and the medical costs of the ACOS group were the highest among the four groups (P<0.001), though inpatient medical costs were highest in the COPD group (P=0.025). Multiple linear regression analyses showed that the ACOS group (β=12.63, P<0.001) and asthma group (β=6.14, P<0.001) were significantly associated with a longer duration of healthcare utilization, and that the ACOS group (β=350,475.88, P=0.008) and asthma group (β=386,876.81, P<0.001) were associated with higher medical costs. This study demonstrated that ACOS independently influences healthcare utilization after adjusting for several factors. In order to utilize limited medical resources efficiently, it may be necessary to identify and manage ACOS patients.
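
    The four-group assignment rule stated in the abstract is simple enough to express directly; the Python sketch below is a reader-added illustration of that rule (the function and variable names are not from the study).

      def classify_airway_disease(wheezing: bool, fev1_fvc: float) -> str:
          """Assign the four groups defined in the abstract (asthma, COPD, ACOS, NAD)
          from self-reported wheezing history and the FEV1/FVC ratio."""
          obstructed = fev1_fvc < 0.7
          if wheezing and not obstructed:
              return "asthma"
          if not wheezing and obstructed:
              return "COPD"
          if wheezing and obstructed:
              return "ACOS"
          return "NAD"

      print(classify_airway_disease(wheezing=True, fev1_fvc=0.65))  # ACOS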

  18. Exploring the cost-utility of stratified primary care management for low back pain compared with current best practice within risk-defined subgroups.

    PubMed

    Whitehurst, David G T; Bryan, Stirling; Lewis, Martyn; Hill, Jonathan; Hay, Elaine M

    2012-11-01

    Stratified management for low back pain according to patients' prognosis and matched care pathways has been shown to be an effective treatment approach in primary care. The aim of this within-trial study was to determine the economic implications of providing such an intervention, compared with non-stratified current best practice, within specific risk-defined subgroups (low-risk, medium-risk and high-risk). Within a cost-utility framework, the base-case analysis estimated the incremental healthcare cost per additional quality-adjusted life year (QALY), using the EQ-5D to generate QALYs, for each risk-defined subgroup. Uncertainty was explored with cost-utility planes and acceptability curves. Sensitivity analyses were performed to consider alternative costing methodologies, including the assessment of societal loss relating to work absence and the incorporation of generic (ie, non-back pain) healthcare utilisation. The stratified management approach was a cost-effective treatment strategy compared with current best practice within each risk-defined subgroup, exhibiting dominance (greater benefit and lower costs) for medium-risk patients and acceptable incremental cost to utility ratios for low-risk and high-risk patients. The likelihood that stratified care provides a cost-effective use of resources exceeds 90% at willingness-to-pay thresholds of £4000 (≈ €4500; $6500) per additional QALY for the medium-risk and high-risk groups. Patients receiving stratified care also reported fewer back pain-related days off work in all three subgroups. Compared with current best practice, stratified primary care management for low back pain provides a highly cost-effective use of resources across all risk-defined subgroups.
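
    For readers unfamiliar with the cost-utility framework, the sketch below shows the incremental cost-effectiveness calculation that underlies such an analysis (extra cost divided by extra QALYs, with dominance when the new strategy is cheaper and at least as effective). It is a reader-added illustration; the per-patient figures are hypothetical and are not results from this trial.

      def icer(cost_new, qaly_new, cost_ref, qaly_ref):
          """Incremental cost-effectiveness ratio: extra cost per extra QALY."""
          d_cost, d_qaly = cost_new - cost_ref, qaly_new - qaly_ref
          if d_cost <= 0 and d_qaly >= 0:
              return "dominant (cheaper and at least as effective)"
          return d_cost / d_qaly

      # Hypothetical per-patient figures for one risk subgroup (NOT trial data)
      print(icer(cost_new=320.0, qaly_new=0.71, cost_ref=300.0, qaly_ref=0.69))
      # -> 1000.0 per additional QALY, to be compared against a
      #    willingness-to-pay threshold such as 4000 per QALY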

  19. Bio-inspired synthesis of hybrid silica nanoparticles templated from elastin-like polypeptide micelles

    NASA Astrophysics Data System (ADS)

    Han, Wei; MacEwan, Sarah R.; Chilkoti, Ashutosh; López, Gabriel P.

    2015-07-01

    The programmed self-assembly of block copolymers into higher order nanoscale structures offers many attractive attributes for the development of new nanomaterials for numerous applications including drug delivery and biosensing. The incorporation of biomimetic silaffin peptides in these block copolymers enables the formation of hybrid organic-inorganic materials, which can potentially enhance the utility and stability of self-assembled nanostructures. We demonstrate the design, synthesis and characterization of amphiphilic elastin-like polypeptide (ELP) diblock copolymers that undergo temperature-triggered self-assembly into well-defined spherical micelles. Genetically encoded incorporation of the silaffin R5 peptide at the hydrophilic terminus of the diblock ELP leads to presentation of the silaffin R5 peptide on the coronae of the micelles, which results in localized condensation of silica and the formation of near-monodisperse, discrete, sub-100 nm diameter hybrid ELP-silica particles. This synthesis method can be carried out under mild reaction conditions suitable for bioactive materials, and will serve as the basis for the development and application of functional nanomaterials. Beyond silicification, the general strategies described herein may also be adapted for the synthesis of other biohybrid nanomaterials as well. Electronic supplementary information (ESI) available. See DOI: 10.1039/c5nr01407g

  20. A differential approach to microcomputer test battery development and implementation

    NASA Technical Reports Server (NTRS)

    Kennedy, R. S.; Baltzley, D. R.; Osteen, M. K.; Turnage, J. J.

    1988-01-01

    The present microcomputer-based performance test battery emphasizes psychometric theory and utility for repeated-measures applications during extended exposure to various environmental stressors. In the menu that has been defined at the current state of this system's development, there are more than 30 'qualified' mental tests which stabilize in less than 10 min and possess test-retest reliabilities greater than 0.7 for a three-minute test/work period. The battery encompasses tests of cognition, information processing, psychomotor skill, memory, mood, etc. Several of the tests have demonstrated sensitivity to chemoradiotherapy, sleep loss, hypoxia, amphetamines, thermal stress, sensory deprivation, altitude, fatigue, and alcohol use. Recommendations are presented for 6-, 12-, and 22-min batteries.

  1. Classification with spatio-temporal interpixel class dependency contexts

    NASA Technical Reports Server (NTRS)

    Jeon, Byeungwoo; Landgrebe, David A.

    1992-01-01

    A contextual classifier which can utilize both spatial and temporal interpixel dependency contexts is investigated. After spatial and temporal neighbors are defined, a general form of maximum a posteriori spatiotemporal contextual classifier is derived. This contextual classifier is simplified under several assumptions. Joint prior probabilities of the classes of each pixel and its spatial neighbors are modeled by the Gibbs random field. The classification is performed in a recursive manner to allow a computationally efficient contextual classification. Experimental results with bitemporal TM data show significant improvement of classification accuracy over noncontextual pixelwise classifiers. This spatiotemporal contextual classifier should find use in many applications of remote sensing, especially when the classification accuracy is important.
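
    As a reader-added illustration of the spatial part of this idea (not the authors' exact recursive formulation, and omitting the temporal context), the Python sketch below approximates MAP classification under a Potts-type Gibbs prior with iterated conditional modes: each pixel's label is repeatedly updated to maximize its class log-likelihood plus a smoothness bonus from agreeing spatial neighbors. The inputs are hypothetical.

      import numpy as np

      def icm_contextual_classify(log_likelihood, beta=1.0, n_iter=5):
          """Approximate MAP classification with a spatial Potts (Gibbs) prior.

          log_likelihood : (H, W, K) array of per-pixel class log-likelihoods.
          beta           : strength of the interpixel class-dependency context.
          Returns an (H, W) array of class labels.
          """
          labels = log_likelihood.argmax(axis=2)          # non-contextual start
          H, W, K = log_likelihood.shape
          for _ in range(n_iter):
              for i in range(H):
                  for j in range(W):
                      # count 4-connected spatial neighbours in each class
                      votes = np.zeros(K)
                      for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                          ni, nj = i + di, j + dj
                          if 0 <= ni < H and 0 <= nj < W:
                              votes[labels[ni, nj]] += 1
                      labels[i, j] = np.argmax(log_likelihood[i, j] + beta * votes)
          return labels

      # Hypothetical two-class example on a 32 x 32 scene
      rng = np.random.default_rng(1)
      loglik = rng.normal(size=(32, 32, 2))
      print(icm_contextual_classify(loglik, beta=1.5).shape)  # (32, 32)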

  2. Model authoring system for fail safe analysis

    NASA Technical Reports Server (NTRS)

    Sikora, Scott E.

    1990-01-01

    The Model Authoring System is a prototype software application for generating fault tree analyses and failure mode and effects analyses for circuit designs. Utilizing established artificial intelligence and expert system techniques, the circuits are modeled as a frame-based knowledge base in an expert system shell, which allows the use of object oriented programming and an inference engine. The behavior of the circuit is then captured through IF-THEN rules, which then are searched to generate either a graphical fault tree analysis or failure modes and effects analysis. Sophisticated authoring techniques allow the circuit to be easily modeled, permit its behavior to be quickly defined, and provide abstraction features to deal with complexity.

  3. Metalloproteomics: Forward and Reverse Approaches in Metalloprotein Structural and Functional Characterization

    PubMed Central

    Shi, Wuxian; Chance, Mark R.

    2010-01-01

    About one-third of all proteins are associated with a metal. Metalloproteomics is defined as the structural and functional characterization of metalloproteins on a genome-wide scale. The methodologies utilized in metalloproteomics, including both forward (bottom-up) and reverse (top-down) technologies, to provide information on the identity, quantity and function of metalloproteins are discussed. Important techniques frequently employed in metalloproteomics include classical proteomics tools such as mass spectrometry and 2-D gels, immobilized-metal affinity chromatography, bioinformatics sequence analysis and homology modeling, X-ray absorption spectroscopy and other synchrotron radiation based tools. Combinative applications of these techniques provide a powerful approach to understand the function of metalloproteins. PMID:21130021

  4. Bioprinting for stem cell research

    PubMed Central

    Tasoglu, Savas; Demirci, Utkan

    2012-01-01

    Recently, there has been a growing interest to apply bioprinting techniques to stem cell research. Several bioprinting methods have been developed utilizing acoustics, piezoelectricity, and lasers to deposit living cells onto receiving substrates. Using these technologies, spatially defined gradients of immobilized proteins can be engineered to direct stem cell differentiation into multiple subpopulations of different lineages. Stem cells can also be patterned in a high-throughput manner onto flexible implementation patches for tissue regeneration or onto substrates with the goal of accessing encapsulated stem cell of interest for genomic analysis. Here, we review recent achievements with bioprinting technologies in stem cell research, and identify future challenges and potential applications including tissue engineering and regenerative medicine, wound healing, and genomics. PMID:23260439

  5. Retroviral DNA Integration

    PubMed Central

    2016-01-01

    The integration of a DNA copy of the viral RNA genome into host chromatin is the defining step of retroviral replication. This enzymatic process is catalyzed by the virus-encoded integrase protein, which is conserved among retroviruses and LTR-retrotransposons. Retroviral integration proceeds via two integrase activities: 3′-processing of the viral DNA ends, followed by the strand transfer of the processed ends into host cell chromosomal DNA. Herein we review the molecular mechanism of retroviral DNA integration, with an emphasis on reaction chemistries and architectures of the nucleoprotein complexes involved. We additionally discuss the latest advances on anti-integrase drug development for the treatment of AIDS and the utility of integrating retroviral vectors in gene therapy applications. PMID:27198982

  6. Engineering test facility design definition

    NASA Technical Reports Server (NTRS)

    Bercaw, R. W.; Seikel, G. R.

    1980-01-01

    The Engineering Test Facility (ETF) is the major focus of the Department of Energy (DOE) Magnetohydrodynamics (MHD) Program to facilitate commercialization and to demonstrate the commercial operability of MHD/steam electric power. The ETF will be a fully integrated commercial prototype MHD power plant with a nominal output of 200 MWe. Performance of this plant is expected to meet or surpass existing utility standards for fuel, maintenance, and operating costs; plant availability; load following; safety; and durability. It is expected to meet all applicable environmental regulations. The current design concept conforming to this general definition, the basis for its selection, and the process to be followed in further defining and updating the conceptual design are described.

  7. Development problem analysis of correlation leak detector’s software

    NASA Astrophysics Data System (ADS)

    Faerman, V. A.; Avramchuk, V. S.; Marukyan, V. M.

    2018-05-01

    The article examines the practical application and structure of correlation leak detector software and analyzes the task of designing it. The first part of the paper shows why developing correlation leak detectors is worthwhile for improving the operating efficiency of public utility networks; the functional structure of correlation leak detectors is analyzed and the tasks of their software are defined. The second part examines several steps in developing the software package – requirements definition, specification of the program structure and creation of the software concept – in the context of experience gained with a hardware-software prototype of a correlation leak detector.

  8. Optofluidic platforms based on surface-enhanced Raman scattering.

    PubMed

    Lim, Chaesung; Hong, Jongin; Chung, Bong Geun; deMello, Andrew J; Choo, Jaebum

    2010-05-01

    We report recent progress in the development of surface-enhanced Raman scattering (SERS)-based optofluidic platforms for the fast and sensitive detection of chemical and biological analytes. In the current context, a SERS-based optofluidic platform is defined as an integrated analytical device composed of a microfluidic element and a sensitive Raman spectrometer. Optofluidic devices for SERS detection normally involve nanocolloid-based microfluidic systems or metal nanostructure-embedded microfluidic systems. In the current review, recent advances in both approaches are surveyed and assessed. Additionally, integrated real-time sensing systems that combine portable Raman spectrometers with microfluidic devices are also reviewed. Such real-time sensing systems have significant utility in environmental monitoring, forensic science and homeland defense applications.

  9. Utilization of LANDSAT data for water quality surveys in the Choptank River

    NASA Technical Reports Server (NTRS)

    Johnson, J. M.; Cressy, P.; Dallam, W. C.

    1975-01-01

    Computer processing of LANDSAT-1 multispectral digital data demonstrated the applicability of remotely sensed data to water quality surveys in the Choptank River. Water classes derived by automated analysis correlate with river nuisance levels of chlorophyll a and sediment loading as defined by the Maryland Department of Water Resources and the U.S. Corps of Engineers. Results indicate that, relative to MSS band 5, an increase in chlorophyll a concentration corresponds to decreases in band 4 and increases in band 6, compared with the trends observed with increasing sediment load. It appears that, for the purpose of water quality analysis under favorable atmospheric conditions, only MSS bands 4, 5 and 6 are necessary.

  10. Bindings and RESTlets: A Novel Set of CoAP-Based Application Enablers to Build IoT Applications.

    PubMed

    Teklemariam, Girum Ketema; Van Den Abeele, Floris; Moerman, Ingrid; Demeester, Piet; Hoebeke, Jeroen

    2016-08-02

    Sensors and actuators are becoming important components of Internet of Things (IoT) applications. Today, several approaches exist to facilitate communication of sensors and actuators in IoT applications. Most communications go through often proprietary gateways requiring availability of the gateway for each and every interaction between sensors and actuators. Sometimes, the gateway does some processing of the sensor data before triggering actuators. Other approaches put this processing logic further in the cloud. These approaches introduce significant latencies and increased number of packets. In this paper, we introduce a CoAP-based mechanism for direct binding of sensors and actuators. This flexible binding solution is utilized further to build IoT applications through RESTlets. RESTlets are defined to accept inputs and produce outputs after performing some processing tasks. Sensors and actuators could be associated with RESTlets (which can be hosted on any device) through the flexible binding mechanism we introduced. This approach facilitates decentralized IoT application development by placing all or part of the processing logic in Low power and Lossy Networks (LLNs). We run several tests to compare the performance of our solution with existing solutions and found out that our solution reduces communication delay and number of packets in the LLN.
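
    To make the binding idea tangible, the short Python sketch below is a reader-added, transport-agnostic toy model of a sensor bound directly to a RESTlet whose output drives an actuator, with no gateway hop in between. It is not the authors' CoAP implementation; all class and method names are illustrative.

      # Toy sketch of the sensor -> RESTlet -> actuator binding concept.
      # The real system binds CoAP resources; everything here is illustrative.

      class Actuator:
          def __init__(self, name):
              self.name = name
          def put(self, value):
              print(f"{self.name} <- {value}")

      class RESTlet:
          """Accepts an input, runs a small processing task, pushes the output."""
          def __init__(self, process, output: Actuator):
              self.process, self.output = process, output
          def post(self, value):
              self.output.put(self.process(value))

      class Sensor:
          def __init__(self):
              self.bindings = []
          def bind(self, restlet: RESTlet):
              self.bindings.append(restlet)      # direct binding, no gateway hop
          def observe(self, reading):
              for restlet in self.bindings:
                  restlet.post(reading)

      # Turn a heater on whenever the observed temperature drops below 18 °C
      heater = Actuator("heater")
      thermostat = RESTlet(lambda t: "on" if t < 18.0 else "off", heater)
      temperature_sensor = Sensor()
      temperature_sensor.bind(thermostat)
      temperature_sensor.observe(16.5)   # heater <- on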

  11. Bindings and RESTlets: A Novel Set of CoAP-Based Application Enablers to Build IoT Applications

    PubMed Central

    Teklemariam, Girum Ketema; Van Den Abeele, Floris; Moerman, Ingrid; Demeester, Piet; Hoebeke, Jeroen

    2016-01-01

    Sensors and actuators are becoming important components of Internet of Things (IoT) applications. Today, several approaches exist to facilitate communication of sensors and actuators in IoT applications. Most communications go through often proprietary gateways requiring availability of the gateway for each and every interaction between sensors and actuators. Sometimes, the gateway does some processing of the sensor data before triggering actuators. Other approaches put this processing logic further in the cloud. These approaches introduce significant latencies and increased number of packets. In this paper, we introduce a CoAP-based mechanism for direct binding of sensors and actuators. This flexible binding solution is utilized further to build IoT applications through RESTlets. RESTlets are defined to accept inputs and produce outputs after performing some processing tasks. Sensors and actuators could be associated with RESTlets (which can be hosted on any device) through the flexible binding mechanism we introduced. This approach facilitates decentralized IoT application development by placing all or part of the processing logic in Low power and Lossy Networks (LLNs). We run several tests to compare the performance of our solution with existing solutions and found out that our solution reduces communication delay and number of packets in the LLN. PMID:27490554

  12. Resource Management for Real-Time Adaptive Agents

    NASA Technical Reports Server (NTRS)

    Welch, Lonnie; Chelberg, David; Pfarr, Barbara; Fleeman, David; Parrott, David; Tan, Zhen-Yu; Jain, Shikha; Drews, Frank; Bruggeman, Carl; Shuler, Chris

    2003-01-01

    Increased autonomy and automation in onboard flight systems offer numerous potential benefits, including cost reduction and greater flexibility. The existence of generic mechanisms for automation is critical for handling unanticipated science events and anomalies where limitations in traditional control software with fixed, predetermined algorithms can mean loss of science data and missed opportunities for observing important terrestrial events. We have developed such a mechanism by adding a Hierarchical Agent-based Real-Time technology (HART) extension to our Dynamic Resource Management (DRM) middleware. Traditional DRM provides mechanisms to monitor the real-time performance of distributed applications and to move applications among processors to improve real-time performance. In the HART project we have designed and implemented a performance adaptation mechanism to improve real-time performance. To use this mechanism, applications are developed that can run at various levels of quality. The DRM can choose a setting for the quality level of an application dynamically at run-time in order to manage satellite resource usage more effectively. A ground-based prototype of a satellite system that captures and processes images has also been developed as part of this project to be used as a benchmark for evaluating the resource management framework. A significant enhancement of this generic mission-independent framework allows scientists to specify the utility, or "scientific benefit," of science observations under various conditions like cloud cover and compression method. The resource manager then uses these benefit tables to determine in real time how to set the quality levels for applications to maximize overall system utility as defined by the scientists running the mission. We also show how maintenance functions like health and safety data can be integrated into the utility framework. Once this framework has been certified for missions and successfully flight tested, it can be reused with little development overhead for other missions. In contrast, current space missions like Swift manage similar types of resource trade-offs completely within the scientific application code itself, and such code must be re-certified and tested for each mission even if a large portion of the code base is shared. This final report discusses some of the major issues motivating this research effort, provides a literature review of the related work, discusses the resource management framework and ground-based satellite system prototype that has been developed, indicates what work is yet to be performed, and provides a list of publications resulting from this work.
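
    The kind of trade-off the resource manager makes with scientist-supplied benefit tables can be illustrated with a small reader-added Python sketch: pick one quality level per application so that total utility is maximized while the summed resource share stays within a budget. The tables, budget and brute-force search below are hypothetical and are not the HART middleware code.

      from itertools import product

      # Hypothetical benefit tables: application -> {quality level: (CPU share, utility)}
      benefit_tables = {
          "image_capture": {"low": (0.2, 3.0), "high": (0.5, 8.0)},
          "compression":   {"lossy": (0.1, 2.0), "lossless": (0.3, 5.0)},
          "health_safety": {"basic": (0.1, 4.0), "full": (0.2, 6.0)},
      }
      CPU_BUDGET = 0.8

      def best_quality_settings(tables, budget):
          """Brute-force the quality-level assignment with maximum total utility
          whose summed CPU share fits within the budget."""
          apps = list(tables)
          best, best_utility = None, float("-inf")
          for choice in product(*(tables[a] for a in apps)):
              cpu = sum(tables[a][lvl][0] for a, lvl in zip(apps, choice))
              util = sum(tables[a][lvl][1] for a, lvl in zip(apps, choice))
              if cpu <= budget and util > best_utility:
                  best, best_utility = dict(zip(apps, choice)), util
          return best, best_utility

      # e.g. high image quality, lossy compression, full health monitoring
      print(best_quality_settings(benefit_tables, CPU_BUDGET))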

  13. Applications of harvesting system simulation to timber management and utilization analyses

    Treesearch

    John E. Baumgras; Chris B. LeDoux

    1990-01-01

    Applications of timber harvesting system simulation to the economic analysis of forest management and wood utilization practices are presented. These applications include estimating thinning revenue by stand age, estimating impacts of minimum merchantable tree diameter on harvesting revenue, and evaluating wood utilization alternatives relative to pulpwood quotas and...

  14. 77 FR 66607 - Northern Wasco County People's Utility District; Notice of Preliminary Permit Application...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-06

    ... County People's Utility District; Notice of Preliminary Permit Application Accepted for Filing and... County People's Utility District (Northern Wasco) filed an application for a preliminary permit, pursuant...; (2) two five-foot by five-foot sluice gates connecting the new intake channel structure to the...

  15. Potential Re-utilization of Composted Mangrove Litters for Pond Environment Quality Improvement

    NASA Astrophysics Data System (ADS)

    Dwi Hastuti, Endah; Budi Hastuti, Rini; Hariyati, Riche

    2018-05-01

    The production of mangrove litter from pruning and thinning activities is a potential source of organic material which could be re-utilized to improve pond environment quality and fertility. This research aimed to analyze the nutrient composition of compost produced from mangrove litter and to describe the effect of compost application on pond quality. The research was conducted in two phases: a composting trial and a trial application of the compost in a pond. Composting was conducted for 45-60 days on mangrove litter obtained from pruning activities in the silvofishery pond using a composting container, while the compost was applied by pouring 2 kg of compost into a 25 m2 pond. The products comprised solid compost and liquid compost. Nutrient concentrations of the solid compost ranged from 0.47-0.52% for N, 0.36-0.44% for P, and 5.45-6.39% for organic C, while the liquid compost provided 0.62-0.69%, 0.24-0.32%, and 3.98-4.45%, respectively, for N, P and organic C. The C/N ratio ranged from 11.60-12.78 for the solid compost and 5.77-7.18 for the liquid compost. The N, P and C/N ratio of the solid compost fulfilled the criteria defined by the Indonesian National Standard for compost. The observed impacts of compost application on pond water quality were improved water clarity and an increasing abundance of klekap (lab-lab). This shows that mangrove litter can be converted into more productive material to enhance pond environment quality and productivity, decrease management cost and increase benefit. Scheduled fertilization with compost is suggested to provide the best benefit in silvofishery management.

  16. Planetary Data Systems (PDS) Imaging Node Atlas II

    NASA Technical Reports Server (NTRS)

    Stanboli, Alice; McAuley, James M.

    2013-01-01

    The Planetary Image Atlas (PIA) is a Rich Internet Application (RIA) that serves planetary imaging data to the science community and the general public. PIA also utilizes the USGS Unified Planetary Coordinate system (UPC) and the on-Mars map server. The Atlas was designed to provide the ability to search and filter through greater than 8 million planetary image files. This software is a three-tier Web application that contains a search engine backend (MySQL, JAVA), Web service interface (SOAP) between server and client, and a GWT Google Maps API client front end. This application allows for the search, retrieval, and download of planetary images and associated meta-data from the following missions: 2001 Mars Odyssey, Cassini, Galileo, LCROSS, Lunar Reconnaissance Orbiter, Mars Exploration Rover, Mars Express, Magellan, Mars Global Surveyor, Mars Pathfinder, Mars Reconnaissance Orbiter, MESSENGER, Phoenix, Viking Lander, Viking Orbiter, and Voyager. The Atlas utilizes the UPC to translate mission-specific coordinate systems into a unified coordinate system, allowing the end user to query across missions of similar targets. If desired, the end user can also use a mission-specific view of the Atlas. The mission-specific views rely on the same code base. This application is a major improvement over the initial version of the Planetary Image Atlas. It is a multi-mission search engine. This tool includes both basic and advanced search capabilities, providing a product search tool to interrogate the collection of planetary images. This tool lets the end user query information about each image, and ignores the data that the user has no interest in. Users can reduce the number of images to look at by defining an area of interest with latitude and longitude ranges.

  17. Central station applications planning activities and supporting studies. [application of photovoltaic technology to power generation plants

    NASA Technical Reports Server (NTRS)

    Leonard, S. L.; Siegel, B.

    1980-01-01

    The application of photovoltaic technology in central station (utility) power generation plants is considered. A program of data collection and analysis designed to provide additional information about the subset of the utility market that was identified as the initial target for photovoltaic penetration, the oil-dependent utilities (especially municipals) of the U.S. Sunbelt, is described along with a series of interviews designed to ascertain utility industry opinions about the National Photovoltaic Program as it relates to central station applications.

  18. An Investigation of Data Privacy and Utility Using Machine Learning as a Gauge

    ERIC Educational Resources Information Center

    Mivule, Kato

    2014-01-01

    The purpose of this investigation is to study and pursue a user-defined approach in preserving data privacy while maintaining an acceptable level of data utility using machine learning classification techniques as a gauge in the generation of synthetic data sets. This dissertation will deal with data privacy, data utility, machine learning…

  19. Fifteen-foot diameter modular space station Kennedy Space Center launch site support definition (space station program Phase B extension definition)

    NASA Technical Reports Server (NTRS)

    Bjorn, L. C.; Martin, M. L.; Murphy, C. W.; Niebla, J. F., V

    1971-01-01

    This document defines the facilities, equipment, and operational plans required to support the MSS Program at KSC. Included is an analysis of KSC operations, a definition of flow plans, facility utilization and modifications, test plans and concepts, activation, and tradeoff studies. Existing GSE and facilities that have a potential utilization are identified, and new items are defined where possible. The study concludes that the existing facilities are suitable for use in the space station program without major modification from the Saturn-Apollo configuration.

  20. MIRIADS: miniature infrared imaging applications development system description and operation

    NASA Astrophysics Data System (ADS)

    Baxter, Christopher R.; Massie, Mark A.; McCarley, Paul L.; Couture, Michael E.

    2001-10-01

    A cooperative effort between the U.S. Air Force Research Laboratory, Nova Research, Inc., the Raytheon Infrared Operations (RIO) and Optics 1, Inc. has successfully produced a miniature infrared camera system that offers significant real-time signal and image processing capabilities by virtue of its modular design. This paper will present an operational overview of the system as well as results from initial testing of the 'Modular Infrared Imaging Applications Development System' (MIRIADS) configured as a missile early-warning detection system. The MIRIADS device can operate virtually any infrared focal plane array (FPA) that currently exists. Programmable on-board logic applies user-defined processing functions to the real-time digital image data for a variety of functions. Daughterboards may be plugged onto the system to expand the digital and analog processing capabilities of the system. A unique full hemispherical infrared fisheye optical system designed and produced by Optics 1, Inc. is utilized by the MIRIADS in a missile warning application to demonstrate the flexibility of the overall system to be applied to a variety of current and future AFRL missions.

  1. Employing WebGL to develop interactive stereoscopic 3D content for use in biomedical visualization

    NASA Astrophysics Data System (ADS)

    Johnston, Semay; Renambot, Luc; Sauter, Daniel

    2013-03-01

    Web Graphics Library (WebGL), the forthcoming web standard for rendering native 3D graphics in a browser, represents an important addition to the biomedical visualization toolset. It is projected to become a mainstream method of delivering 3D online content due to shrinking support for third-party plug-ins. Additionally, it provides a virtual reality (VR) experience to web users accommodated by the growing availability of stereoscopic displays (3D TV, desktop, and mobile). WebGL's value in biomedical visualization has been demonstrated by applications for interactive anatomical models, chemical and molecular visualization, and web-based volume rendering. However, a lack of instructional literature specific to the field prevents many from utilizing this technology. This project defines a WebGL design methodology for a target audience of biomedical artists with a basic understanding of web languages and 3D graphics. The methodology was informed by the development of an interactive web application depicting the anatomy and various pathologies of the human eye. The application supports several modes of stereoscopic displays for a better understanding of 3D anatomical structures.

  2. Photovoltaic village power application: assessment of the near-term market

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rosenblum, L.; Bifano, W.J.; Poley, W.A.

    1978-01-01

    A preliminary assessment of the near-term market for photovoltaic village power applications is presented. One of the objectives of the Department of Energy's (DOE) National Photovoltaic Program is to stimulate the demand for photovoltaic power systems so that appropriate markets will be developed in the near-term to support the increasing photovoltaic production capacity also being developed by DOE. The village power application represents such a potential market for photovoltaics. The price of energy for photovoltaic systems is compared to that of utility line extensions and diesel generators. The potential 'domestic' demand (including the 50 states of the union plus the areas under legal control of the U.S. government) is defined in both the government and commercial sectors. The foreign demand and sources of funding for village power systems in the developing countries are also discussed briefly. It is concluded that a near-term domestic market of at least 12 MW (peak) and a foreign market of about 10 GW (peak) exists and that significant market penetration should be possible beginning in the 1981-82 period.

  3. Analyzing Spacecraft Telecommunication Systems

    NASA Technical Reports Server (NTRS)

    Kordon, Mark; Hanks, David; Gladden, Roy; Wood, Eric

    2004-01-01

    Multi-Mission Telecom Analysis Tool (MMTAT) is a C-language computer program for analyzing proposed spacecraft telecommunication systems. MMTAT utilizes parameterized input and computational models that can be run on standard desktop computers to perform fast and accurate analyses of telecommunication links. MMTAT is easy to use and can easily be integrated with other software applications and run as part of almost any computational simulation. It is distributed as either a stand-alone application program with a graphical user interface or a linkable library with a well-defined set of application programming interface (API) calls. As a stand-alone program, MMTAT provides both textual and graphical output. The graphs make it possible to understand, quickly and easily, how telecommunication performance varies with variations in input parameters. A delimited text file that can be read by any spreadsheet program is generated at the end of each run. The API in the linkable-library form of MMTAT enables the user to control simulation software and to change parameters during a simulation run. Results can be retrieved either at the end of a run or by use of a function call at any time step.
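
    MMTAT's actual API is not documented in this record, so the reader-added Python sketch below only illustrates the generic kind of link-budget arithmetic such a tool automates: received power from EIRP, receive gain and free-space path loss, compared against a required power. All numbers are hypothetical and the functions are not MMTAT calls.

      import math

      def free_space_path_loss_db(distance_km: float, freq_ghz: float) -> float:
          """Free-space path loss in dB for a distance in km and frequency in GHz."""
          return 20 * math.log10(distance_km) + 20 * math.log10(freq_ghz) + 92.45

      def link_margin_db(eirp_dbw, rx_gain_dbi, distance_km, freq_ghz, required_dbw):
          """Received power (dBW) minus the required power at the receiver."""
          received = eirp_dbw + rx_gain_dbi - free_space_path_loss_db(distance_km, freq_ghz)
          return received - required_dbw

      # Hypothetical X-band downlink: 20 dBW EIRP, 40 dBi receive antenna,
      # 40,000 km range at 8.4 GHz, receiver requiring -150 dBW
      print(f"margin = {link_margin_db(20, 40, 40000, 8.4, -150):.1f} dB")  # ~7 dB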

  4. Cogeneration Technology Alternatives Study (CTAS) Volume 5: Analytical approach and results

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Data and information in the area of advanced energy conversion systems for industrial cogeneration applications in the 1985 to 2000 time period are provided. Six current and thirty-six advanced energy conversion systems were defined and combined with appropriate balance of plant equipment. Twenty-six industrial processes were selected from among the high energy consuming industries to serve as a framework for the study. Each conversion system was analyzed as a cogenerator with each industrial plant. Fuel consumption, costs, and environmental intrusion were evaluated and compared to corresponding traditional values. Various cogeneration strategies were analyzed and both topping and bottoming (using industrial by-product heat) applications were included. The advanced energy conversion technologies indicated reduced fuel consumption, costs, and emissions. Typically fuel energy savings of 10 to 25 percent were predicted compared to traditional on site furnaces and utility electricity. Gas turbines and combined cycles indicated high overall annual cost savings. Steam turbines and gas turbines produced high estimated returns. In some applications, diesels were most efficient. The advanced technologies used coal derived fuels, or coal with advanced fluid bed combustion or on site gasification systems.

  5. Design, implementation, use, and preliminary evaluation of SEBASTIAN, a standards-based Web service for clinical decision support.

    PubMed

    Kawamoto, Kensaku; Lobach, David F

    2005-01-01

    Despite their demonstrated ability to improve care quality, clinical decision support systems are not widely used. In part, this limited use is due to the difficulty of sharing medical knowledge in a machine-executable format. To address this problem, we developed a decision support Web service known as SEBASTIAN. In SEBASTIAN, individual knowledge modules define the data requirements for assessing a patient, the conclusions that can be drawn using that data, and instructions on how to generate those conclusions. Using standards-based XML messages transmitted over HTTP, client decision support applications provide patient data to SEBASTIAN and receive patient-specific assessments and recommendations. SEBASTIAN has been used to implement four distinct decision support systems; an architectural overview is provided for one of these systems. Preliminary assessments indicate that SEBASTIAN fulfills all original design objectives, including the re-use of executable medical knowledge across diverse applications and care settings, the straightforward authoring of knowledge modules, and use of the framework to implement decision support applications with significant clinical utility.

  6. Decision curve analysis revisited: overall net benefit, relationships to ROC curve analysis, and application to case-control studies.

    PubMed

    Rousson, Valentin; Zumbrunn, Thomas

    2011-06-22

    Decision curve analysis has been introduced as a method to evaluate prediction models in terms of their clinical consequences if used for a binary classification of subjects into a group who should and into a group who should not be treated. The key concept for this type of evaluation is the "net benefit", a concept borrowed from utility theory. We recall the foundations of decision curve analysis and discuss some new aspects. First, we stress the formal distinction between the net benefit for the treated and for the untreated and define the concept of the "overall net benefit". Next, we revisit the important distinction between the concept of accuracy, as typically assessed using the Youden index and a receiver operating characteristic (ROC) analysis, and the concept of utility of a prediction model, as assessed using decision curve analysis. Finally, we provide an explicit implementation of decision curve analysis to be applied in the context of case-control studies. We show that the overall net benefit, which combines the net benefit for the treated and the untreated, is a natural alternative to the benefit achieved by a model, being invariant with respect to the coding of the outcome, and conveying a more comprehensive picture of the situation. Further, within the framework of decision curve analysis, we illustrate the important difference between the accuracy and the utility of a model, demonstrating how poor an accurate model may be in terms of its net benefit. Eventually, we expose that the application of decision curve analysis to case-control studies, where an accurate estimate of the true prevalence of a disease cannot be obtained from the data, is achieved with a few modifications to the original calculation procedure. We present several interrelated extensions to decision curve analysis that will both facilitate its interpretation and broaden its potential area of application.
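
    For reference, the quantities behind a decision curve can be written down in a few lines. The sketch below uses the standard net-benefit definitions at a threshold probability; treating the overall net benefit as the simple sum of the treated and untreated terms is shown only as an illustration, and the counts are made up.

      def net_benefit(tp, fp, fn, tn, p_t):
          """Standard decision-curve net benefits at threshold probability p_t.

          Returns (nb_treated, nb_untreated). The paper's 'overall net benefit'
          combines the two; a simple sum is used below for illustration only.
          """
          n = tp + fp + fn + tn
          w = p_t / (1.0 - p_t)                         # harm-to-benefit weight
          nb_treated = tp / n - w * (fp / n)            # benefit of treating model-positives
          nb_untreated = tn / n - (1.0 / w) * (fn / n)  # benefit of sparing model-negatives
          return nb_treated, nb_untreated

      nb_t, nb_u = net_benefit(tp=80, fp=60, fn=20, tn=340, p_t=0.2)
      print(round(nb_t, 3), round(nb_u, 3), round(nb_t + nb_u, 3))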

  7. Decision curve analysis revisited: overall net benefit, relationships to ROC curve analysis, and application to case-control studies

    PubMed Central

    2011-01-01

    Background Decision curve analysis has been introduced as a method to evaluate prediction models in terms of their clinical consequences if used for a binary classification of subjects into a group who should and into a group who should not be treated. The key concept for this type of evaluation is the "net benefit", a concept borrowed from utility theory. Methods We recall the foundations of decision curve analysis and discuss some new aspects. First, we stress the formal distinction between the net benefit for the treated and for the untreated and define the concept of the "overall net benefit". Next, we revisit the important distinction between the concept of accuracy, as typically assessed using the Youden index and a receiver operating characteristic (ROC) analysis, and the concept of utility of a prediction model, as assessed using decision curve analysis. Finally, we provide an explicit implementation of decision curve analysis to be applied in the context of case-control studies. Results We show that the overall net benefit, which combines the net benefit for the treated and the untreated, is a natural alternative to the benefit achieved by a model, being invariant with respect to the coding of the outcome, and conveying a more comprehensive picture of the situation. Further, within the framework of decision curve analysis, we illustrate the important difference between the accuracy and the utility of a model, demonstrating how poor an accurate model may be in terms of its net benefit. Eventually, we expose that the application of decision curve analysis to case-control studies, where an accurate estimate of the true prevalence of a disease cannot be obtained from the data, is achieved with a few modifications to the original calculation procedure. Conclusions We present several interrelated extensions to decision curve analysis that will both facilitate its interpretation and broaden its potential area of application. PMID:21696604

  8. Clinical Laboratory Practice Recommendations for the Use of Cardiac Troponin in Acute Coronary Syndrome: Expert Opinion from the Academy of the American Association for Clinical Chemistry and the Task Force on Clinical Applications of Cardiac Bio-Markers of the International Federation of Clinical Chemistry and Laboratory Medicine.

    PubMed

    Wu, Alan H B; Christenson, Robert H; Greene, Dina N; Jaffe, Allan S; Kavsak, Peter A; Ordonez-Llanos, Jordi; Apple, Fred S

    2018-04-01

    This document is an essential companion to the third iteration of the National Academy of Clinical Biochemistry [NACB, now the American Association for Clinical Chemistry (AACC) Academy] Laboratory Medicine Practice Guidelines (LMPG) on cardiac markers. The expert consensus recommendations were drafted in collaboration with the International Federation of Clinical Chemistry and Laboratory Medicine Task Force on Clinical Applications of Bio-Markers (IFCC TF-CB). We determined that there is sufficient clinical guidance on the use of cardiac troponin (cTn) testing from clinical practice groups. Thus, in this expert consensus document, we focused on clinical laboratory practice recommendations for high-sensitivity (hs)-cTn assays. This document utilized the expert opinion class of evidence to focus on the following 10 topics: (a) quality control (QC) utilization, (b) validation of the lower reportable analytical limits, (c) units to be used in reporting measurable concentrations for patients and QC materials, (d) 99th percentile sex-specific upper reference limits to define the reference interval, (e) criteria required to define hs-cTn assays, (f) communication with clinicians and the laboratory's role in educating clinicians regarding the influence of preanalytic and analytic problems that can confound assay results, (g) studies on hs-cTn assays and how authors need to document preanalytical and analytical variables, (h) harmonizing and standardizing assay results and the role of commutable materials, (i) time to reporting of results from sample receipt and sample collection, and (j) changes in hs-cTn concentrations over time and the role of both analytical and biological variabilities in interpreting results of serial blood collections. © 2017 American Association for Clinical Chemistry.
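
    As a minimal illustration of item (d), a sex-specific 99th-percentile upper reference limit can be derived nonparametrically from a healthy reference cohort; the simulated concentrations and the nonparametric percentile choice below are assumptions for illustration only.

      import numpy as np

      def sex_specific_99th_urls(values_ng_l, sexes):
          """Nonparametric 99th-percentile upper reference limits by sex (illustrative)."""
          values = np.asarray(values_ng_l, dtype=float)
          sexes = np.asarray(sexes)
          return {s: float(np.percentile(values[sexes == s], 99)) for s in np.unique(sexes)}

      rng = np.random.default_rng(0)
      hs_ctn = np.concatenate([rng.lognormal(1.0, 0.6, 400),   # simulated healthy women
                               rng.lognormal(1.3, 0.6, 400)])  # simulated healthy men
      sex = np.array(["F"] * 400 + ["M"] * 400)
      print(sex_specific_99th_urls(hs_ctn, sex))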

  9. Development of a web-based application and multicountry analysis framework for assessing interdicted infections and cost-utility of screening donated blood for HIV, HCV and HBV.

    PubMed

    Custer, B; Janssen, M P; Hubben, G; Vermeulen, M; van Hulst, M

    2017-08-01

    Most countries test donations for HIV, HCV and HBV using serology with or without nucleic acid testing (NAT). Cost-utility analyses provide information on the relative value of different screening options. The aim of this project was to develop an open-access risk assessment and cost-utility analysis web-tool for assessing HIV, HCV and HBV screening options (http://www.isbtweb.org/working-parties/transfusion-transmitted-infectious-diseases/). An analysis for six countries (Brazil, Ghana, the Netherlands, South Africa, Thailand and USA) was conducted. Four strategies can be evaluated using the tool: (1) antibody assays (Abs) for HIV and HCV + HBsAg, (2) antibody assays that include antigens for HIV and HCV (Combo) + HBsAg, (3) NAT in minipools of variable size (MP NAT), and (4) individual donation (ID) NAT. Country-specific data on donors, donation testing results, recipient outcomes and costs are entered using the online interface. Results obtained include the number of infections interdicted under each screening option and the (incremental and average) cost-utility of the options. In each of the six countries evaluated, the use of antibody assays is cost-effective or even cost saving. NAT has varying cost-utility depending on the setting, and where adopted, the incremental cost-utility exceeds any previously defined or proposed threshold in each country. The web-tool allows an assessment of infectious units interdicted and value for money of different testing strategies. Regardless of gross national income (GNI) per capita, countries appear willing to dedicate healthcare resources to blood supply safety in excess of that for other sectors of health care. © 2017 International Society of Blood Transfusion.
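
    The incremental and average cost-utility figures such a web-tool reports follow the standard definitions; a minimal sketch, with purely illustrative costs and QALY totals rather than values from the six-country analysis:

      def average_cost_utility(cost, qalys_gained):
          """Average cost per QALY gained for a single screening strategy."""
          return cost / qalys_gained

      def incremental_cost_utility(cost_new, qalys_new, cost_ref, qalys_ref):
          """ICER: extra cost per extra QALY of a strategy versus its comparator."""
          return (cost_new - cost_ref) / (qalys_new - qalys_ref)

      # Illustrative only: antibody screening alone vs. adding individual-donation NAT
      print(average_cost_utility(1_200_000.0, 150.0))                               # $/QALY
      print(incremental_cost_utility(4_500_000.0, 155.0, 1_200_000.0, 150.0))       # incremental $/QALY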

  10. Development of a preference-based index from the National Eye Institute Visual Function Questionnaire-25.

    PubMed

    Rentz, Anne M; Kowalski, Jonathan W; Walt, John G; Hays, Ron D; Brazier, John E; Yu, Ren; Lee, Paul; Bressler, Neil; Revicki, Dennis A

    2014-03-01

    Understanding how individuals value health states is central to patient-centered care and to health policy decision making. Generic preference-based measures of health may not effectively capture the impact of ocular diseases. Recently, 6 items from the National Eye Institute Visual Function Questionnaire-25 were used to develop the Visual Function Questionnaire-Utility Index health state classification, which defines visual function health states. To describe elicitation of preferences for health states generated from the Visual Function Questionnaire-Utility Index health state classification and development of an algorithm to estimate health preference scores for any health state. Nonintervention, cross-sectional study of the general community in 4 countries (Australia, Canada, United Kingdom, and United States). A total of 607 adult participants were recruited from local newspaper advertisements. In the United Kingdom, an existing database of participants from previous studies was used for recruitment. Eight of 15,625 possible health states from the Visual Function Questionnaire-Utility Index were valued using time trade-off technique. A θ severity score was calculated for Visual Function Questionnaire-Utility Index-defined health states using item response theory analysis. Regression models were then used to develop an algorithm to assign health state preference values for all potential health states defined by the Visual Function Questionnaire-Utility Index. Health state preference values for the 8 states ranged from a mean (SD) of 0.343 (0.395) to 0.956 (0.124). As expected, preference values declined with worsening visual function. Results indicate that the Visual Function Questionnaire-Utility Index describes states that participants view as spanning most of the continuum from full health to dead. Visual Function Questionnaire-Utility Index health state classification produces health preference scores that can be estimated in vision-related studies that include the National Eye Institute Visual Function Questionnaire-25. These preference scores may be of value for estimating utilities in economic and health policy analyses.
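
    The time trade-off technique values a health state as x/t when a respondent is indifferent between t years in that state and x years in full health; a minimal sketch with illustrative responses, not study data:

      import statistics

      def tto_value(years_full_health: float, years_in_state: float) -> float:
          """Time trade-off value at the indifference point: x years in full health vs. t years in the state."""
          return years_full_health / years_in_state

      # Illustrative: indifferent between 10 years in an impaired-vision state and 4 years in full health
      print(tto_value(4.0, 10.0))   # 0.4

      # Responses across participants would then be summarized per health state as mean (SD)
      values = [0.35, 0.42, 0.30, 0.55, 0.28]
      print(round(statistics.mean(values), 3), round(statistics.stdev(values), 3))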

  11. PV water pumping: NEOS Corporation recent PV water pumping activities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lane, C.

    1995-11-01

    NEOS Corporation has been very active in PV-powered water pumping, particularly with respect to electric utilities. Most of the recent activity has been through the Photovoltaic Services Network (PSN). The PSN is an independent, not-for-profit organization comprised of all types of electric utilities: rural electric co-ops, public power districts, investor-owned utilities, and power marketing agencies. The PSN's mission is to work pro-actively to promote utility involvement in PV through education and training. PV information is distributed by the PSN in three primary forms: (1) consultation with PSN technical service representatives; (2) literature generated by the PSN; and (3) literature published by other organizations. The PSN can also provide assistance to members in developing PV customer service programs. The PSN's product support activities include consolidation of information on existing packaged PV systems and facilitation of the development of new PV product packages that meet utility-defined specifications for cost, performance, and reliability. The PSN's initial product support efforts will be focused on commercially available packaged PV systems for a variety of off-grid applications. In parallel with this effort, if no products exist that meet the PSN's functional specifications, the PSN will initiate the second phase of the product development support process by encouraging the development of new packaged systems. Through these services and product support activities, the PSN anticipates engaging all segments of the PV industry, thus providing benefits to PV systems suppliers as well as local PV service contractors. This paper describes field testing of PV power systems for water pumping.

  12. Inter-comparison of weather and circulation type classifications for hydrological drought development

    NASA Astrophysics Data System (ADS)

    Fleig, Anne K.; Tallaksen, Lena M.; Hisdal, Hege; Stahl, Kerstin; Hannah, David M.

    Classifications of weather and circulation patterns are often applied in research seeking to relate atmospheric state to surface environmental phenomena. However, numerous procedures have been applied to define the patterns, thus limiting comparability between studies. The COST733 Action “Harmonisation and Applications of Weather Type Classifications for European regions” tests 73 different weather type classifications (WTCs) and their associated weather types (WTs) and compares the WTCs’ utility for various applications. The objective of this study is to evaluate the potential of these WTCs for analysis of regional hydrological drought development in north-western Europe. Hydrological drought is defined in terms of a Regional Drought Area Index (RDAI), which is based on deficits derived from daily river flow series. RDAI series (1964-2001) were calculated for four homogeneous regions in Great Britain and two in Denmark. For each region, WTs associated with hydrological drought development were identified based on antecedent and concurrent WT-frequencies for major drought events. The utility of the different WTCs for the study of hydrological drought development was evaluated, and the influence of WTC attributes, i.e. input variables, number of defined WTs and general classification concept, on WTC performance was assessed. The objective Grosswetterlagen (OGWL), the objective Second-Generation Lamb Weather Type Classification (LWT2) with 18 WTs and two implementations of the objective Wetterlagenklassifikation (WLK; with 40 and 28 WTs) outperformed all other WTCs. In general, WTCs with more WTs (⩾27) were found to perform better than WTCs with fewer (⩽18) WTs. The influence of input variables was not consistent across the different classification procedures, and the performance of a WTC was determined primarily by the classification procedure itself. Overall, classification procedures following the relatively simple general classification concept of predefining WTs based on thresholds performed better than those based on more sophisticated classification concepts such as deriving WTs by cluster analysis or artificial neural networks. In particular, PCA-based WTCs with 9 WTs and automated WTCs with a high number of predefined WTs (subjectively and threshold based) performed well. It is suggested that the explicit consideration of the air flow characteristics of meridionality, zonality and cyclonicity in the definition of WTs is a useful feature for a WTC when analysing regional hydrological drought development.
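
    The exact RDAI construction is not reproduced here, but one common ingredient of regional drought-area indices is the threshold-level method applied to daily streamflow at several sites; the sketch below, with simulated flows and a Q70 threshold, is illustrative only.

      import numpy as np

      def in_drought(flow, threshold):
          """Boolean mask of days with streamflow below the drought threshold (threshold-level method)."""
          return np.asarray(flow) < np.asarray(threshold)

      def regional_drought_area(flows_by_site, thresholds_by_site):
          """Fraction of sites in drought on each day: one simple regional drought-area series."""
          masks = np.array([in_drought(f, t) for f, t in zip(flows_by_site, thresholds_by_site)])
          return masks.mean(axis=0)

      rng = np.random.default_rng(1)
      flows = [rng.gamma(2.0, 5.0, 365) for _ in range(4)]    # four illustrative gauges, one year of daily flow
      q70 = [np.percentile(f, 30) for f in flows]             # Q70: flow exceeded 70% of the time, per gauge
      print(regional_drought_area(flows, q70)[:10])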

  13. Method for photolithographic definition of recessed features on a semiconductor wafer utilizing auto-focusing alignment

    DOEpatents

    Farino, A.J.; Montague, S.; Sniegowski, J.J.; Smith, J.H.; McWhorter, P.J.

    1998-07-21

    A method is disclosed for photolithographically defining device features up to the resolution limit of an auto-focusing projection stepper when the device features are to be formed in a wafer cavity at a depth exceeding the depth of focus of the stepper. The method uses a focusing cavity located in a die field at the position of a focusing light beam from the auto-focusing projection stepper, with the focusing cavity being of the same depth as one or more adjacent cavities wherein a semiconductor device is to be formed. The focusing cavity provides a bottom surface for referencing the focusing light beam and focusing the stepper at a predetermined depth below the surface of the wafer, whereat the device features are to be defined. As material layers are deposited in each device cavity to build up a semiconductor structure such as a microelectromechanical system (MEMS) device, the same material layers are deposited in the focusing cavity, raising the bottom surface and re-focusing the stepper for accurately defining additional device features in each succeeding material layer. The method is especially applicable for forming MEMS devices within a cavity or trench and integrating the MEMS devices with electronic circuitry fabricated on the wafer surface. 15 figs.

  14. Method for photolithographic definition of recessed features on a semiconductor wafer utilizing auto-focusing alignment

    DOEpatents

    Farino, Anthony J.; Montague, Stephen; Sniegowski, Jeffry J.; Smith, James H.; McWhorter, Paul J.

    1998-01-01

    A method is disclosed for photolithographically defining device features up to the resolution limit of an auto-focusing projection stepper when the device features are to be formed in a wafer cavity at a depth exceeding the depth of focus of the stepper. The method uses a focusing cavity located in a die field at the position of a focusing light beam from the auto-focusing projection stepper, with the focusing cavity being of the same depth as one or more adjacent cavities wherein a semiconductor device is to be formed. The focusing cavity provides a bottom surface for referencing the focusing light beam and focusing the stepper at a predetermined depth below the surface of the wafer, whereat the device features are to be defined. As material layers are deposited in each device cavity to build up a semiconductor structure such as a microelectromechanical system (MEMS) device, the same material layers are deposited in the focusing cavity, raising the bottom surface and re-focusing the stepper for accurately defining additional device features in each succeeding material layer. The method is especially applicable for forming MEMS devices within a cavity or trench and integrating the MEMS devices with electronic circuitry fabricated on the wafer surface.

  15. Single x-ray transmission system for bone mineral density determination

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jimenez-Mendoza, Daniel; Vargas-Vazquez, Damian; Espinosa-Arbelaez, Diego G.

    2011-12-15

    Bones are the support of the body. They are composed of many inorganic compounds and other organic materials that all together can be used to determine the mineral density of the bones. The bone mineral density is a measure index that is widely used as an indicator of the health of the bone. A typical manner to evaluate the quality of the bone is a densitometry study; a dual x-ray absorptiometry system based study that has been widely used to assess the mineral density of some animals' bones. However, despite the success stories of utilizing these systems in many different applications, it is a very expensive method that requires frequent calibration processes to work properly. Moreover, its usage in small species applications (e.g., rodents) has not been quite demonstrated yet. Following this argument, it is suggested that there is a need for an instrument that would perform such a task in a more reliable and economical manner. Therefore, in this paper we explore the possibility to develop a new, affordable, and reliable single x-ray absorptiometry system. The method consists of utilizing a single x-ray source, an x-ray image sensor, and a computer platform that all together, as a whole, will allow us to calculate the mineral density of the bone. Utilizing an x-ray transmission theory modified through a version of the Lambert-Beer law equation, a law that expresses the relationship among the energy absorbed, the thickness, and the absorption coefficient of the sample at the x-rays wavelength to calculate the mineral density of the bone can be advantageous. Having determined the parameter equation that defines the ratio of the pixels in radiographies and the bone mineral density [measured in mass per unit of area (g/cm²)], we demonstrated the utility of our novel methodology by calculating the mineral density of Wistar rats' femur bones.

  16. Single x-ray transmission system for bone mineral density determination

    NASA Astrophysics Data System (ADS)

    Jimenez-Mendoza, Daniel; Espinosa-Arbelaez, Diego G.; Giraldo-Betancur, Astrid L.; Hernandez-Urbiola, Margarita I.; Vargas-Vazquez, Damian; Rodriguez-Garcia, Mario E.

    2011-12-01

    Bones are the support of the body. They are composed of many inorganic compounds and other organic materials that all together can be used to determine the mineral density of the bones. The bone mineral density is a measure index that is widely used as an indicator of the health of the bone. A typical manner to evaluate the quality of the bone is a densitometry study; a dual x-ray absorptiometry system based study that has been widely used to assess the mineral density of some animals' bones. However, despite the success stories of utilizing these systems in many different applications, it is a very expensive method that requires frequent calibration processes to work properly. Moreover, its usage in small species applications (e.g., rodents) has not been quite demonstrated yet. Following this argument, it is suggested that there is a need for an instrument that would perform such a task in a more reliable and economical manner. Therefore, in this paper we explore the possibility to develop a new, affordable, and reliable single x-ray absorptiometry system. The method consists of utilizing a single x-ray source, an x-ray image sensor, and a computer platform that all together, as a whole, will allow us to calculate the mineral density of the bone. Utilizing an x-ray transmission theory modified through a version of the Lambert-Beer law equation, a law that expresses the relationship among the energy absorbed, the thickness, and the absorption coefficient of the sample at the x-rays wavelength to calculate the mineral density of the bone can be advantageous. Having determined the parameter equation that defines the ratio of the pixels in radiographies and the bone mineral density [measured in mass per unit of area (g/cm2)], we demonstrated the utility of our novel methodology by calculating the mineral density of Wistar rats' femur bones.
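
    A minimal sketch of the underlying Lambert-Beer relation, I = I0 * exp(-mu_m * rho_A), inverted for areal density; the mass attenuation coefficient and detector counts below are illustrative assumptions, not the calibration used by the authors.

      import numpy as np

      def areal_density_g_per_cm2(i_transmitted, i_incident, mass_atten_cm2_per_g):
          """Lambert-Beer: I = I0 * exp(-mu_m * rho_A)  =>  rho_A = ln(I0/I) / mu_m."""
          return np.log(np.asarray(i_incident) / np.asarray(i_transmitted)) / mass_atten_cm2_per_g

      # Illustrative pixel intensities from a radiograph and an assumed effective
      # mass attenuation coefficient for bone mineral at the tube energy used.
      mu_m = 0.35                       # cm^2/g, assumed
      i0, i = 4095.0, 3200.0            # incident vs. transmitted detector counts
      print(round(float(areal_density_g_per_cm2(i, i0, mu_m)), 3))  # areal density in g/cm^2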

  17. Defining personal utility in genomics: A Delphi study.

    PubMed

    Kohler, J N; Turbitt, E; Lewis, K L; Wilfond, B S; Jamal, L; Peay, H L; Biesecker, L G; Biesecker, B B

    2017-09-01

    Individual genome sequencing results are valued by patients in ways distinct from clinical utility. Such outcomes have been described as components of "personal utility," a concept that broadly encompasses patient-endorsed benefits and that is operationally defined as non-clinical outcomes. No empirical delineation of these outcomes has been reported. To address this gap, we administered a Delphi survey to adult participants in a National Institutes of Health (NIH) clinical exome study to extract the most highly endorsed outcomes constituting personal utility. Forty research participants responded to a Delphi survey to rate 35 items identified by a systematic literature review of personal utility. Two rounds of ranking resulted in 24 items that represented 14 distinct elements of personal utility. Elements most highly endorsed by participants were: increased self-knowledge, knowledge of "the condition," altruism, and anticipated coping. Our findings represent the first systematic effort to delineate elements of personal utility that may be used to anticipate participant expectation and inform genetic counseling prior to sequencing. The 24 items reported need to be studied further in additional clinical genome sequencing studies to assess generalizability in other populations. Further research will help to understand motivations and to predict the meaning and use of results. Published 2017. This article is a U.S. Government work and is in the public domain in the USA.

  18. Some controversial multiple testing problems in regulatory applications.

    PubMed

    Hung, H M James; Wang, Sue-Jane

    2009-01-01

    Multiple testing problems in regulatory applications are often more challenging than the problems of handling a set of mathematical symbols representing multiple null hypotheses under testing. In the union-intersection setting, it is important to define a family of null hypotheses relevant to the clinical questions at issue. The distinction between primary endpoint and secondary endpoint needs to be considered properly in different clinical applications. Without proper consideration, the widely used sequential gate keeping strategies often impose too many logical restrictions to make sense, particularly to deal with the problem of testing multiple doses and multiple endpoints, the problem of testing a composite endpoint and its component endpoints, and the problem of testing superiority and noninferiority in the presence of multiple endpoints. Partitioning the null hypotheses involved in closed testing into clinical relevant orderings or sets can be a viable alternative to resolving the illogical problems requiring more attention from clinical trialists in defining the clinical hypotheses or clinical question(s) at the design stage. In the intersection-union setting there is little room for alleviating the stringency of the requirement that each endpoint must meet the same intended alpha level, unless the parameter space under the null hypothesis can be substantially restricted. Such restriction often requires insurmountable justification and usually cannot be supported by the internal data. Thus, a possible remedial approach to alleviate the possible conservatism as a result of this requirement is a group-sequential design strategy that starts with a conservative sample size planning and then utilizes an alpha spending function to possibly reach the conclusion early.
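
    As one concrete instance of the sequential gatekeeping idea discussed here, a fixed-sequence procedure tests hypotheses in a prespecified order, each at the full alpha, and stops at the first failure; the endpoints and p-values below are illustrative.

      def fixed_sequence_test(p_values_in_order, alpha=0.025):
          """Fixed-sequence testing: reject in the prespecified order until the first p-value exceeds alpha."""
          rejected = []
          for name, p in p_values_in_order:
              if p <= alpha:
                  rejected.append(name)
              else:
                  break                      # gate closes: later hypotheses cannot be tested
          return rejected

      ordered = [("primary endpoint", 0.004),
                 ("key secondary, high dose", 0.018),
                 ("key secondary, low dose", 0.060),   # gate closes here
                 ("tertiary endpoint", 0.001)]
      print(fixed_sequence_test(ordered))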

  19. Subtyping of Salmonella enterica Serovar Newport Outbreak Isolates by CRISPR-MVLST and Determination of the Relationship between CRISPR-MVLST and PFGE Results

    PubMed Central

    Shariat, Nikki; Kirchner, Margaret K.; Sandt, Carol H.; Trees, Eija; Barrangou, Rodolphe

    2013-01-01

    Salmonella enterica subsp. enterica serovar Newport (S. Newport) is the third most prevalent cause of food-borne salmonellosis. Rapid, efficient, and accurate methods for identification are required to track specific strains of S. Newport during outbreaks. By exploiting the hypervariable nature of virulence genes and clustered regularly interspaced short palindromic repeats (CRISPRs), we previously developed a sequence-based subtyping approach, designated CRISPR–multi-virulence-locus sequence typing (CRISPR-MVLST). To demonstrate the applicability of this approach, we analyzed a broad set of S. Newport isolates collected over a 5-year period by using CRISPR-MVLST and pulsed-field gel electrophoresis (PFGE). Among 84 isolates, we defined 38 S. Newport sequence types (NSTs), all of which were novel compared to our previous analyses, and 62 different PFGE patterns. Our data suggest that both subtyping approaches have high discriminatory abilities (>0.95) with a potential for clustering cases with common exposures. Importantly, we found that isolates from closely related NSTs were often similar by PFGE profile as well, further corroborating the applicability of CRISPR-MVLST. In the first full application of CRISPR-MVLST, we analyzed isolates from a recent S. Newport outbreak. In this blinded study, we confirmed the utility of CRISPR-MVLST and were able to distinguish the 10 outbreak isolates, as defined by PFGE and epidemiological data, from a collection of 20 S. Newport isolates. Together, our data show that CRISPR-MVLST could be a complementary approach to PFGE subtyping for S. Newport. PMID:23678062
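
    Discriminatory ability of a subtyping method is commonly quantified with Simpson's index of diversity (the Hunter-Gaston index); a minimal sketch with a hypothetical distribution of isolates over sequence types:

      from collections import Counter

      def simpsons_diversity(type_labels):
          """Simpson's index of diversity (Hunter-Gaston discriminatory index)."""
          counts = Counter(type_labels)
          n = sum(counts.values())
          return 1.0 - sum(c * (c - 1) for c in counts.values()) / (n * (n - 1))

      # Illustrative only: 84 isolates spread over hypothetical sequence types
      labels = ["NST%02d" % (i % 38) for i in range(84)]
      print(round(simpsons_diversity(labels), 3))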

  20. Computational medicinal chemistry in fragment-based drug discovery: what, how and when.

    PubMed

    Rabal, Obdulia; Urbano-Cuadrado, Manuel; Oyarzabal, Julen

    2011-01-01

    The use of fragment-based drug discovery (FBDD) has increased in the last decade due to the encouraging results obtained to date. In this scenario, computational approaches, together with experimental information, play an important role to guide and speed up the process. By default, FBDD is generally considered as a constructive approach. However, such additive behavior is not always present, therefore, simple fragment maturation will not always deliver the expected results. In this review, computational approaches utilized in FBDD are reported together with real case studies, where applicability domains are exemplified, in order to analyze them, and then, maximize their performance and reliability. Thus, a proper use of these computational tools can minimize misleading conclusions, keeping the credit on FBDD strategy, as well as achieve higher impact in the drug-discovery process. FBDD goes one step beyond a simple constructive approach. A broad set of computational tools: docking, R group quantitative structure-activity relationship, fragmentation tools, fragments management tools, patents analysis and fragment-hopping, for example, can be utilized in FBDD, providing a clear positive impact if they are utilized in the proper scenario - what, how and when. An initial assessment of additive/non-additive behavior is a critical point to define the most convenient approach for fragments elaboration.

  1. Factors Affecting Radiologist's PACS Usage.

    PubMed

    Forsberg, Daniel; Rosipko, Beverly; Sunshine, Jeffrey L

    2016-12-01

    The purpose of this study was to determine if any of the factors (radiologist, examination category, time of week, and week) affect PACS usage, with PACS usage defined as the sequential order of computer commands issued by a radiologist in a PACS during interpretation and dictation. We initially hypothesized that only radiologist and examination category would have significant effects on PACS usage. Command logs covering 8 weeks of PACS usage were analyzed. For each command trace (describing performed activities of an attending radiologist interpreting a single examination), the PACS usage variables number of commands, number of command classes, bigram repetitiveness, and time to read were extracted. Generalized linear models were used to determine the significance of the factors on the PACS usage variables. The statistical results confirmed the initial hypothesis that radiologist and examination category affect PACS usage and that the factors week and time of week to a large extent have no significant effect. As such, this work provides direction for continued efforts to analyze system data to better understand PACS utilization, which in turn can provide input to enable optimal utilization and configuration of corresponding systems. These continued efforts were, in this work, exemplified by a more detailed analysis using PACS usage profiles, which revealed insights directly applicable to improve PACS utilization through modified system configuration.
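
    The paper's exact definition of bigram repetitiveness is not given in this record; the sketch below assumes one plausible definition, the fraction of adjacent command pairs already seen earlier in the same trace, and uses a made-up command trace.

      def bigram_repetitiveness(commands):
          """Assumed definition: fraction of adjacent command pairs that occurred earlier in the same trace."""
          bigrams = list(zip(commands, commands[1:]))
          seen, repeats = set(), 0
          for bg in bigrams:
              if bg in seen:
                  repeats += 1
              else:
                  seen.add(bg)
          return repeats / len(bigrams) if bigrams else 0.0

      trace = ["open", "next_image", "zoom", "next_image", "zoom", "next_image", "zoom", "dictate", "close"]
      print(round(bigram_repetitiveness(trace), 2))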

  2. Initiatives to improve prescribing efficiency for drugs to treat Parkinson's disease in Croatia: influence and future directions.

    PubMed

    Brkicic, Ljiljana Sovic; Godman, Brian; Voncina, Luka; Sovic, Slavica; Relja, Maja

    2012-06-01

    Parkinson's disease (PD) is the second most common neurological disease affecting older adults. Consequently, this disease should be a focus among payers, with increasing utilization of newer premium-priced patent-protected add-on therapies to stabilize or even improve motor function over time. However, expenditure can be moderated by reforms. Consequently, there is a need to assess the influence of these reforms on the prescribing efficiency for drugs to treat PD in Croatia before proposing additional measures. Prescribing efficiency is defined as increasing the use of add-on therapies for similar expenditure. An observational retrospective study of the Croatian Institute for Health Insurance database of drugs to treat patients with PD in Croatia from 2000 to 2010 was carried out, with utilization measured in defined daily doses (defined as the average maintenance dose of a drug when used in its major indication in adults). The study years were chosen to reflect recent reforms. Only reimbursed expenditure is measured from a health insurance perspective. Utilization of drugs to treat PD increased by 218% between 2000 and 2010. Reimbursed expenditure increased by 360%, principally driven by increasing utilization of premium-priced patent-protected add-on therapies, including ropinirole and pramipexole. However, following recent reforms, reducing expenditure/defined daily dose for the different drugs, as well as overall expenditure, stabilized reimbursed expenditure between 2005 and 2010. Treatment of PD is complex, and add-on therapies are needed to improve care. Reimbursed expenditure should now fall following stabilization, despite increasing volumes, as successive add-on therapies lose their patents, further increasing prescribing efficiency.
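
    Utilization in defined daily doses is obtained by dividing the quantity dispensed by the WHO-assigned DDD for the drug; the drug quantity, DDD value, and population in this sketch are illustrative assumptions, not the Croatian figures.

      def total_ddds(dispensed_mg: float, who_ddd_mg: float) -> float:
          """Utilization in defined daily doses: total amount dispensed / WHO DDD."""
          return dispensed_mg / who_ddd_mg

      def ddd_per_1000_inhabitants_per_day(ddds: float, population: int, days: int = 365) -> float:
          """Common reporting unit for drug utilization studies."""
          return ddds * 1000.0 / (population * days)

      # Illustrative: a ropinirole-style add-on therapy with an assumed WHO DDD of 6 mg
      ddds = total_ddds(dispensed_mg=9_000_000.0, who_ddd_mg=6.0)
      print(round(ddds), round(ddd_per_1000_inhabitants_per_day(ddds, population=4_300_000), 3))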

  3. The Jet Propulsion Laboratory low-cost solar array project, 1974-1986

    NASA Technical Reports Server (NTRS)

    Maycock, P. D.

    1986-01-01

    The overall objective of the photovoltaic program is to ensure that photovoltaic conversion systems play a significant role in the nation's energy supply by stimulating an industry capable of providing approximately 50 GWe of installed electricity generating capacity by the year 2000. In order to achieve this overall objective, several time-phased program goals have been defined. Near-term goals are to achieve photovoltaic flat-plate module or concentrator array prices of $2 per peak watt (1975 dollars) at an annual production rate of 20 peak megawatts in 1982. At this price level, energy costs should range from 100 to 200 mills/kWh. Mid-term goals are to achieve photovoltaic flat-plate module or concentrator array prices of $0.50 per peak watt (in 1975 dollars), and an annual production rate of 500 peak megawatts in 1986. Studies project that photovoltaic systems will begin to compete for both distributed and larger load-center utility-type applications and thereby open up significant markets for large-scale photovoltaic systems. Far-term goals are to achieve the photovoltaic flat-plate module or concentrator array price goal of $0.10 to $0.30 per peak watt in 1990 (in 1975 dollars), and an annual production rate of 10 to 20 peak gigawatts in 2000. At this price range, energy cost should be in the range of 40 to 60 mills/kWh and be cost-effective for utility applications. Achievement of these goals can make photovoltaic systems economically competitive with other energy sources for dispersed on-site applications as well as for central power generation.

  4. Microfluidic process monitor for industrial solvent extraction system

    DOEpatents

    Gelis, Artem; Pereira, Candido; Nichols, Kevin Paul Flood

    2016-01-12

    The present invention provides a system for solvent extraction utilizing a first electrode with a raised area formed on its surface, which defines a portion of a microfluidic channel; a second electrode with a flat surface, defining another portion of the microfluidic channel that opposes the raised area of the first electrode; a reversibly deformable substrate disposed between the first electrode and second electrode, adapted to accommodate the raised area of the first electrode and having a portion that extends beyond the raised area of the first electrode, that portion defining the remaining portions of the microfluidic channel; and an electrolyte of at least two immiscible liquids that flows through the microfluidic channel. Also provided is a system for performing multiple solvent extractions utilizing several microfluidic chips or unit operations connected in series.

  5. Reference Models for Structural Technology Assessment and Weight Estimation

    NASA Technical Reports Server (NTRS)

    Cerro, Jeff; Martinovic, Zoran; Eldred, Lloyd

    2005-01-01

    Previously, the Exploration Concepts Branch of NASA Langley Research Center has developed techniques for automating the preliminary design level of launch vehicle airframe structural analysis for purposes of enhancing historical regression-based mass estimating relationships. This past work was useful and greatly reduced design time; however, its application area was very narrow in terms of being able to handle a large variety in structural and vehicle general arrangement alternatives. Implementation of the analysis approach presented herein also incorporates some newly developed computer programs. Loft is a program developed to create analysis meshes and simultaneously define structural element design regions. A simple component-defining ASCII file is read by Loft to begin the design process. HSLoad is a Visual Basic implementation of the HyperSizer Application Programming Interface, which automates the structural element design process. Details of these two programs and their use are explained in this paper. A feature which falls naturally out of the above analysis paradigm is the concept of "reference models". The flexibility of the FEA-based JAVA processing procedures and associated process control classes coupled with the general utility of Loft and HSLoad make it possible to create generic program template files for analysis of components ranging from something as simple as a stiffened flat panel, to curved panels, fuselage and cryogenic tank components, flight control surfaces, wings, through full air and space vehicle general arrangements.

  6. Advanced Life Support Project Plan

    NASA Technical Reports Server (NTRS)

    2002-01-01

    Life support systems are an enabling technology and have become integral to the success of living and working in space. As NASA embarks on human exploration and development of space to open the space frontier by exploring, using and enabling the development of space and to expand the human experience into the far reaches of space, it becomes imperative, for considerations of safety, cost, and crew health, to minimize consumables and increase the autonomy of the life support system. Utilizing advanced life support technologies increases this autonomy by reducing mass, power, and volume necessary for human support, thus permitting larger payload allocations for science and exploration. Two basic classes of life support systems must be developed, those directed toward applications on transportation/habitation vehicles (e.g., Space Shuttle, International Space Station (ISS), next generation launch vehicles, crew-tended stations/observatories, planetary transit spacecraft, etc.) and those directed toward applications on the planetary surfaces (e.g., lunar or Martian landing spacecraft, planetary habitats and facilities, etc.). In general, it can be viewed as those systems compatible with microgravity and those compatible with hypogravity environments. Part B of the Appendix defines the technology development 'Roadmap' to be followed in providing the necessary systems for these missions. The purpose of this Project Plan is to define the Project objectives, Project-level requirements, the management organizations responsible for the Project throughout its life cycle, and Project-level resources, schedules and controls.

  7. Secure ICCP Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rice, Mark J.; Bonebrake, Christopher A.; Dayley, Greg K.

    Inter-Control Center Communications Protocol (ICCP), defined by the IEC 60870-6 TASE.2 standard, was developed to enable data exchange over wide area networks between electric system entities, including utility control centers, Independent System Operators (ISOs), Regional Transmission Operators (RTOs) and Independent Power Producers (IPP) also known as Non-Utility Generators (NUG). ICCP is an unprotected protocol, and as a result is vulnerable to such actions as integrity violation, interception or alteration, spoofing, and eavesdropping. Because of these vulnerabilities with unprotected ICCP communication, security enhancements, referred to as Secure ICCP, have been added and are included in the ICCP products that utilities have received since 2003 when the standard was defined. This has resulted in an ICCP product whose communication can be encrypted and authenticated to address these vulnerabilities.

  8. Venus Quadrangle Geological Mapping: Use of Geoscience Data Visualization Systems in Mapping and Training

    NASA Technical Reports Server (NTRS)

    Head, James W.; Huffman, J. N.; Forsberg, A. S.; Hurwitz, D. M.; Basilevsky, A. T.; Ivanov, M. A.; Dickson, J. L.; Kumar, P. Senthil

    2008-01-01

    We are currently investigating new technological developments in computer visualization and analysis in order to assess their importance and utility in planetary geological analysis and mapping [1,2]. Last year we reported on the range of technologies available and on our application of these to various problems in planetary mapping [3]. In this contribution we focus on the application of these techniques and tools to Venus geological mapping at the 1:5M quadrangle scale. In our current Venus mapping projects we have utilized and tested the various platforms to understand their capabilities and assess their usefulness in defining units, establishing stratigraphic relationships, mapping structures, reaching consensus on interpretations and producing map products. We are specifically assessing how computer visualization display qualities (e.g., level of immersion, stereoscopic vs. monoscopic viewing, field of view, large vs. small display size, etc.) influence performance on scientific analysis and geological mapping. We have been exploring four different environments: 1) conventional desktops (DT), 2) semi-immersive Fishtank VR (FT) (i.e., a conventional desktop with head-tracked stereo and 6DOF input), 3) tiled wall displays (TW), and 4) fully immersive virtual reality (IVR) (e.g., "Cave Automatic Virtual Environment," or Cave system). Formal studies demonstrate that fully immersive Cave environments are superior to desktop systems for many tasks [e.g., 4].

  9. 36 CFR 14.21 - Form.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... been utilized without authority prior to the time the application is made, the application must state the date such utilization commenced and by whom, and the date the applicant alleges he obtained...

  10. 42 CFR 456.51 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL ASSISTANCE PROGRAMS UTILIZATION CONTROL Utilization Control: Hospitals § 456.51 Definitions. As used in this... institution for mental disease, as defined in § 440.10; (2) [Reserved] (3) Services provided in specialty...

  11. 42 CFR 456.51 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL ASSISTANCE PROGRAMS UTILIZATION CONTROL Utilization Control: Hospitals § 456.51 Definitions. As used in this... institution for mental disease, as defined in § 440.10; (2) [Reserved] (3) Services provided in specialty...

  12. 42 CFR 456.51 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL ASSISTANCE PROGRAMS UTILIZATION CONTROL Utilization Control: Hospitals § 456.51 Definitions. As used in this... institution for mental disease, as defined in § 440.10; (2) [Reserved] (3) Services provided in specialty...

  13. 42 CFR 456.51 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL ASSISTANCE PROGRAMS UTILIZATION CONTROL Utilization Control: Hospitals § 456.51 Definitions. As used in this... institution for mental disease, as defined in § 440.10; (2) [Reserved] (3) Services provided in specialty...

  14. 24 CFR 578.89 - Limitation on use of grant funds to serve persons defined as homeless under other federal laws.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... serve persons defined as homeless under other federal laws. 578.89 Section 578.89 Housing and Urban... persons defined as homeless under other federal laws. (a) Application requirement. Applicants that intend... federal laws in paragraph (3) of the homeless definition in § 576.2 must demonstrate in their application...

  15. 24 CFR 578.89 - Limitation on use of grant funds to serve persons defined as homeless under other federal laws.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... serve persons defined as homeless under other federal laws. 578.89 Section 578.89 Housing and Urban... persons defined as homeless under other federal laws. (a) Application requirement. Applicants that intend... federal laws in paragraph (3) of the homeless definition in § 576.2 must demonstrate in their application...

  16. MediaNet: a multimedia information network for knowledge representation

    NASA Astrophysics Data System (ADS)

    Benitez, Ana B.; Smith, John R.; Chang, Shih-Fu

    2000-10-01

    In this paper, we present MediaNet, which is a knowledge representation framework that uses multimedia content for representing semantic and perceptual information. The main components of MediaNet include conceptual entities, which correspond to real world objects, and relationships among concepts. MediaNet allows the concepts and relationships to be defined or exemplified by multimedia content such as images, video, audio, graphics, and text. MediaNet models the traditional relationship types such as generalization and aggregation but adds additional functionality by modeling perceptual relationships based on feature similarity. For example, MediaNet allows a concept such as car to be defined as a type of a transportation vehicle, but which is further defined and illustrated through example images, videos and sounds of cars. In constructing the MediaNet framework, we have built on the basic principles of semiotics and semantic networks in addition to utilizing the audio-visual content description framework being developed as part of the MPEG-7 multimedia content description standard. By integrating both conceptual and perceptual representations of knowledge, MediaNet has potential to impact a broad range of applications that deal with multimedia content at the semantic and perceptual levels. In particular, we have found that MediaNet can improve the performance of multimedia retrieval applications by using query expansion, refinement and translation across multiple content modalities. In this paper, we report on experiments that use MediaNet in searching for images. We construct the MediaNet knowledge base using both WordNet and an image network built from multiple example images and extracted color and texture descriptors. Initial experimental results demonstrate improved retrieval effectiveness using MediaNet in a content-based retrieval system.
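
    As an illustration only (not the actual MediaNet implementation), a concept can be represented as a node carrying both example media items and typed relationships, with query expansion following selected relationship types:

      from dataclasses import dataclass, field

      @dataclass
      class Concept:
          name: str
          media_examples: list = field(default_factory=list)     # image/audio/video URIs illustrating the concept
          relations: dict = field(default_factory=dict)          # relation type -> list of related concept names

      def expand_query(terms, concepts, relation_types=("generalization", "perceptual_similarity")):
          """Add concepts reachable through the chosen relationship types (one hop)."""
          expanded = set(terms)
          for term in terms:
              c = concepts.get(term)
              if c:
                  for rel in relation_types:
                      expanded.update(c.relations.get(rel, []))
          return expanded

      concepts = {
          "car": Concept("car", ["car1.jpg", "engine.wav"],
                         {"generalization": ["vehicle"], "perceptual_similarity": ["truck"]}),
          "vehicle": Concept("vehicle"),
          "truck": Concept("truck", ["truck1.jpg"]),
      }
      print(expand_query({"car"}, concepts))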

  17. A sapphire loaded TE011 cavity for surface impedance measurements: design, construction, and commissioning status

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    L. Phillips; G. K. Davis; J. R. Delayen

    2005-07-10

    In order to measure the superconducting surface properties of niobium that are of interest to SRF applications, a facility that utilizes a Nb cavity operating in the TE011 mode at 7.65 GHz and provides a well-defined RF field on a disk-shaped sample has been designed and fabricated. The RF losses due to the sample's surface impedance are determined by using a calorimetric technique. The system has the capability to measure such properties as Rs(T) and penetration depth, which can then be correlated with surface properties and preparation processes. The design, fabrication, and results from initial commissioning operations will be discussed, along with the near-term sample evaluation program.
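
    The calorimetric approach rests on the standard surface-loss relation P = (Rs/2) * integral of |H|^2 over the sample area; a minimal sketch that inverts it for Rs, treating the field integral as a known cavity constant (the numbers are illustrative, not measured values).

      def surface_resistance_ohm(p_dissipated_w: float, h_field_integral_a2: float) -> float:
          """Invert P = (Rs / 2) * integral(|H|^2 dA) for Rs; the integral (in A^2) comes from the cavity field model."""
          return 2.0 * p_dissipated_w / h_field_integral_a2

      # Illustrative numbers: 1 mW measured calorimetrically, field integral taken from an EM simulation
      print(surface_resistance_ohm(1.0e-3, 5.0e3))   # surface resistance in ohms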

  18. Use of coblation in resection of juvenile nasopharyngeal angiofibroma.

    PubMed

    Cannon, Daniel E; Poetker, David M; Loehrl, Todd A; Chun, Robert H

    2013-06-01

    We present a series of 4 patients with juvenile nasopharyngeal angiofibroma (JNA) who underwent Coblation-assisted endoscopic resection after preoperative embolization, and discuss the use and advantages of endoscopic Coblation-assisted resection of JNA. Our limited case series suggests that Coblation may be used in the resection of JNA after embolization in a relatively safe, efficient, and effective manner. Coblation allows for decreased bleeding, less need for instrumentation, and improved visualization. There are limited published data in the literature to date on the use of Coblation in endoscopic JNA resection. We describe its use in a more extensive tumor than those previously reported. Further studies are needed to fully define the safety and utility of Coblation technology for this application.

  19. Wind shear modeling for aircraft hazard definition

    NASA Technical Reports Server (NTRS)

    Frost, W.; Camp, D. W.; Wang, S. T.

    1978-01-01

    Mathematical models of wind profiles were developed for use in fast-time and manned flight simulation studies aimed at defining and eliminating wind shear hazards. A set of wind profiles and associated wind shear characteristics for stable and neutral boundary layers, thunderstorms, and frontal winds potentially encounterable by aircraft in the terminal area are given. Engineering models of wind shear for direct hazard analysis are presented in mathematical formulae, graphs, tables, and computer lookup routines. The wind profile data utilized to establish the models are described as to location, how they were obtained, time of observation, and number of data points up to 500 m. Recommendations, engineering interpretations and guidelines for use of the data are given and the range of applicability of the wind shear models is described.
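
    For the neutral boundary layer, one classic engineering model of the kind described is the logarithmic profile u(z) = (u*/kappa) * ln(z/z0); a minimal sketch with illustrative parameters, including the implied shear between two heights:

      import math

      KAPPA = 0.4   # von Karman constant

      def log_wind_profile(z_m: float, u_star: float, z0_m: float) -> float:
          """Neutral-boundary-layer log law: u(z) = (u*/kappa) * ln(z/z0)."""
          return (u_star / KAPPA) * math.log(z_m / z0_m)

      # Illustrative: friction velocity 0.5 m/s, roughness length 0.1 m
      u_20 = log_wind_profile(20.0, 0.5, 0.1)
      u_200 = log_wind_profile(200.0, 0.5, 0.1)
      print(round(u_20, 1), round(u_200, 1), round((u_200 - u_20) / 180.0, 4))  # wind speeds and shear in (m/s)/m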

  20. LANDSAT imagery analysis: An aid for predicting landslide prone areas for highway construction. [in Arkansas

    NASA Technical Reports Server (NTRS)

    Macdonald, H. C.; Grubbs, R. S.

    1975-01-01

    The most obvious landform features of geologic significance revealed on LANDSAT imagery are linear trends or lineaments. These trends were found to correspond, at least to a large degree, with unmapped faults or complex fracture zones. LANDSAT imagery analysis in northern Arkansas revealed a lineament complex which provides a remarkable correlation with landslide-prone areas along major highway routes. The weathering properties of various rock types, which are considered in designing stable cut slopes and drainage structures, appear to be adversely influenced by the location and trends of LANDSAT defined lineaments. Geologic interpretation of LANDSAT imagery, where applicable and utilized effectively, provides the highway engineer with a tool for predicting and evaluating landslide-prone areas.

  1. Sensored fiber reinforced polymer grate

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ross, Michael P.; Mack, Thomas Kimball

    Various technologies described herein pertain to a sensored grate that can be utilized for various security fencing applications. The sensored grate includes a grate framework and an embedded optical fiber. The grate framework is formed of a molded polymer such as, for instance, molded fiber reinforced polymer. Further, the grate framework includes a set of elongated elements, where the elongated elements are spaced to define apertures through the grate framework. The optical fiber is embedded in the elongated elements of the grate framework. Moreover, bending or breaking of one or more of the elongated elements can be detected based on a change in a characteristic of input light provided to the optical fiber compared to output light received from the optical fiber.

  2. In-situ observations of crack initiation and growth at notches in cast Ti-48Al-2Cr-2Nb

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pollock, T.M.; Muraleedharan, K.; Mumm, D.R.

    1996-12-01

    Gamma titanium aluminide alloys have recently received a great deal of attention due to their demonstrated potential for application in aircraft engines. Although a number of studies have been conducted on the fracture behavior of these alloys under conditions where relatively long pre-existing cracks are present, there is little information on the early stages of crack initiation and growth. The objective of the study reported here was to observe the initial development of cracks at the scale of the microstructure. To confine the process of interest to well-defined regions, notched specimens were utilized. Observations regarding the initiation, growth, and the influence of environment on failure are discussed.

  3. A spatial mark–resight model augmented with telemetry data

    USGS Publications Warehouse

    Sollmann, Rachel; Gardner, Beth; Parsons, Arielle W.; Stocking, Jessica J.; McClintock, Brett T.; Simons, Theodore R.; Pollock, Kenneth H.; O’Connell, Allan F.

    2013-01-01

    Abundance and population density are fundamental pieces of information for population ecology and species conservation, but they are difficult to estimate for rare and elusive species. Mark-resight models are popular for estimating population abundance because they are less invasive and expensive than traditional mark-recapture. However, density estimation using mark-resight is difficult because the area sampled must be explicitly defined, historically using ad hoc approaches. We develop a spatial mark-resight model for estimating population density that combines spatial resighting data and telemetry data. Incorporating telemetry data allows us to inform model parameters related to movement and individual location. Our model also allows […]. The model presented here will have widespread utility in future applications, especially for species that are not naturally marked.

  4. Reply [to “Comment on ‘The Zen of Venn’” by Priestley Toulmin

    NASA Astrophysics Data System (ADS)

    Berkman, Paul Arthur

    While Venn diagrams, “strictly speaking,” may not have been designed for the “peritechnical literature” they certainly provide a symbolic framework for integrating concepts beyond the context of “mathematically defined objects.” It is interesting that Toulmin was offended and compelled to protest the application of Venn diagrams that are not bound by his “valid methodology.” Such disciplinary constraints on creativity appear contrary to the original writings of John Venn who esteemed interdisciplinary approaches and argued fiercely against those who objected to his introducing mathematical symbols into logic [Venn, 1894]. “Symbolic Logic” itself was crafted with a view toward a general utility “in the solution of complicated problems” [Venn, 1894].

  5. Virtual Organizations: Trends and Models

    NASA Astrophysics Data System (ADS)

    Nami, Mohammad Reza; Malekpour, Abbaas

    The use of ICT in business has changed views about traditional business. With a VO, organizations without physical, geographical, or structural constraints can collaborate with one another in order to fulfill customer requests in a networked environment. This idea improves resource utilization, shortens development processes, reduces costs, and saves time. A Virtual Organization (VO) is always a form of partnership, and managing partners and handling partnerships are crucial. Virtual organizations are defined as a temporary collection of enterprises that cooperate and share resources, knowledge, and competencies to better respond to business opportunities. This paper presents an overview of virtual organizations and main issues in collaboration such as security and management. It also presents a number of different model approaches according to their purpose and applications.

  6. A Novel Nanoionics-Based Switch for Microwave Applications

    NASA Technical Reports Server (NTRS)

    Nessel, James A.; Lee, Richard Q.; Mueller, Carl H.; Kozicki, Michael N.; Ren, Minghan; Morse, Jacki

    2008-01-01

    This paper reports the development and characterization of a novel switching device for use in microwave systems. The device utilizes a switching mechanism based on nanoionics, in which mobile ions within a solid electrolyte undergo an electrochemical process to form and remove a conductive metallic "bridge" that defines the change of state. The nanoionics-based switch has demonstrated an insertion loss of approximately 0.5 dB, isolation of >30 dB, low-voltage operation (1 V), low power (approximately microwatts) and low energy (approximately nanojoules) consumption, and excellent linearity up to 6 GHz. The switch requires fewer bias operations (due to its non-volatile nature) and has a simple planar geometry allowing for novel device structures and easy integration into microwave power distribution circuits.

  7. Interdependence and contagion among industry-level US credit markets: An application of wavelet and VMD based copula approaches

    NASA Astrophysics Data System (ADS)

    Shahzad, Syed Jawad Hussain; Nor, Safwan Mohd; Kumar, Ronald Ravinesh; Mensi, Walid

    2017-01-01

    This study examines the interdependence and contagion among US industry-level credit markets. We use daily data for 11 industries from 17 December 2007 to 31 December 2014 for the time-frequency analysis, namely wavelet squared coherence. The empirical analysis reveals that the Basic Materials (Utilities) industry credit market has the highest (lowest) interdependence with other industries. The Basic Materials credit market passes cyclical effects to all other industries. "Shift-contagion", as defined by Forbes and Rigobon (2002), is examined using elliptical and Archimedean copulas on the short-run decomposed series obtained through Variational Mode Decomposition (VMD). The contagion effects between US industry-level credit markets mainly occurred during the global financial crisis of 2007-08.

  8. What is a Trophic Cascade?

    PubMed

    Ripple, William J; Estes, James A; Schmitz, Oswald J; Constant, Vanessa; Kaylor, Matthew J; Lenz, Adam; Motley, Jennifer L; Self, Katharine E; Taylor, David S; Wolf, Christopher

    2016-11-01

    Few concepts in ecology have been as influential as that of the trophic cascade. Since the 1980s, the term has been a central or major theme of more than 2000 scientific articles. Despite this importance and widespread usage, basic questions remain about what constitutes a trophic cascade. Inconsistent usage of language impedes scientific progress and the utility of scientific concepts in management and conservation. Herein, we offer a definition of trophic cascade that is designed to be widely applicable yet explicit enough to exclude extraneous interactions. We discuss our proposed definition and its implications, and define important related terms, thereby providing a common language for scientists, policy makers, conservationists, and other stakeholders with an interest in trophic cascades. Copyright © 2016 Elsevier Ltd. All rights reserved.

  9. Life support approaches for Mars missions

    NASA Astrophysics Data System (ADS)

    Drysdale, A. E.; Ewert, M. K.; Hanford, A. J.

    Life support approaches for Mars missions are evaluated using an equivalent system mass (ESM) approach, in which all significant costs are converted into mass units. The best approach, as defined by the lowest mission ESM, depends on several mission parameters, notably duration, environment and consequent infrastructure costs, and crew size, as well as the characteristics of the technologies which are available. Generally, for the missions under consideration, physicochemical regeneration is most cost effective. However, bioregeneration is likely to be of use for producing salad crops for any mission, for producing staple crops for medium duration missions, and for most food, air and water regeneration for long missions (durations of a decade). Potential applications of in situ resource utilization need to be considered further.

  10. Recovery Act: Oxy-Combustion Technology Development for Industrial-Scale Boiler Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Levasseur, Armand

    2014-04-30

    Alstom Power Inc. (Alstom), under U.S. DOE/NETL Cooperative Agreement No. DE-NT0005290, is conducting a development program to generate detailed technical information needed for application of oxy-combustion technology. The program is designed to provide the necessary information and understanding for the next step of large-scale commercial demonstration of oxy-combustion in tangentially fired boilers and to accelerate the commercialization of this technology. The main project objectives include: • Design and develop an innovative oxyfuel system for existing tangentially fired boiler units that minimizes overall capital investment and operating costs. • Evaluate performance of oxyfuel tangentially fired boiler systems in pilot-scale tests at Alstom's 15 MWth tangentially fired Boiler Simulation Facility (BSF). • Address technical gaps for the design of oxyfuel commercial utility boilers by focused testing and improvement of engineering and simulation tools. • Develop the design, performance and costs for a demonstration-scale oxyfuel boiler and auxiliary systems. • Develop the design and costs for both industrial and utility commercial-scale reference oxyfuel boilers and auxiliary systems that are optimized for overall plant performance and cost. • Define key design considerations and develop general guidelines for application of results to utility and different industrial applications. The project was initiated in October 2008 and the scope extended in 2010 under an ARRA award. The project completion date was April 30, 2014. Central to the project is 15 MWth testing in the BSF, which provided in-depth understanding of oxy-combustion under boiler conditions, detailed data for improvement of design tools, and key information for application to commercial-scale oxy-fired boiler design. Eight comprehensive 15 MWth oxy-fired test campaigns were performed with different coals, providing detailed data on combustion, emissions, and thermal behavior over a matrix of fuels, oxy-process variables and boiler design parameters. Significant improvement of CFD modeling tools and validation against 15 MWth experimental data has been completed. Oxy-boiler demonstration and large reference designs have been developed, supported with the information and knowledge gained from the 15 MWth testing. The results from the 15 MWth testing in the BSF and complementary bench-scale testing are addressed in this volume (Volume II) of the final report. The results of the modeling efforts (Volume III) and the oxy-boiler design efforts (Volume IV) are reported in separate volumes.

  11. OpenMSI Arrayed Analysis Toolkit: Analyzing Spatially Defined Samples Using Mass Spectrometry Imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    de Raad, Markus; de Rond, Tristan; Rübel, Oliver

    Mass spectrometry imaging (MSI) has primarily been applied in localizing biomolecules within biological matrices. Although well suited, the application of MSI for comparing thousands of spatially defined spotted samples has been limited. One reason for this is a lack of suitable and accessible data processing tools for the analysis of large arrayed MSI sample sets. The OpenMSI Arrayed Analysis Toolkit (OMAAT) presented in this paper is a software package that addresses the challenges of analyzing spatially defined samples in MSI data sets. OMAAT is written in Python and is integrated with OpenMSI (http://openmsi.nersc.gov), a platform for storing, sharing, and analyzing MSI data. By using a web-based Python notebook (Jupyter), OMAAT is accessible to anyone without programming experience yet allows experienced users to leverage all features. OMAAT was evaluated by analyzing an MSI data set from a high-throughput glycoside hydrolase activity screen comprising 384 samples arrayed onto a NIMS surface at a 450 μm spacing, decreasing analysis time >100-fold while maintaining robust spot-finding. The utility of OMAAT was demonstrated by screening the metabolic activities of different-sized soil particles, including hydrolysis of sugars, revealing a pattern of size-dependent activities. These results introduce OMAAT as an effective toolkit for analyzing spatially defined samples in MSI. OMAAT runs on all major operating systems, and the source code can be obtained from the following GitHub repository: https://github.com/biorack/omaat.

  12. Implementation of a Shared Resource Financial Management System

    PubMed Central

    Caldwell, T.; Gerlach, R.; Israel, M.; Bobin, S.

    2010-01-01

    CF-6 Norris Cotton Cancer Center (NCCC), an NCI-designated Comprehensive Cancer Center at Dartmouth Medical School, administers 12 Life Sciences Shared Resources. These resources are diverse and offer multiple products and services. Previous methods for tracking resource use, billing, and financial management were time consuming, error prone, and lacked appropriate financial management tools. To address these problems, we developed and implemented a web-based application with a built-in authorization system that uses Perl, ModPerl, Apache2, and Oracle as the software infrastructure. The application uses a role-based system to differentiate administrative users from those requesting services and includes many features requested by users and administrators. To begin development, we chose a resource that had an uncomplicated service, a large number of users, and required the use of all of the application's features. The Molecular Biology Core Facility at NCCC fit these requirements and was used as a model for developing and testing the application. After model development, institution-wide deployment followed a three-stage process. The first stage was to interview the resource manager and staff to understand day-to-day operations. At the second stage, we generated and tested customized forms defining resource services. During the third stage, we added new resource users and administrators to the system before final deployment. Twelve months after deployment, resource administrators reported that the new system performed well for internal and external billing and tracking resource utilization. Users preferred the application's web-based system for distribution of DNA sequencing and other data. The sample tracking features have enhanced day-to-day resource operations, and an on-line scheduling module for shared instruments has proven a much-needed utility. Principal investigators now are able to restrict user spending to specific accounts and have final approval of invoices before billing, which has significantly reduced the number of unpaid invoices.

  13. Spacecraft utility and the development of confidence intervals for criticality of anomalies

    NASA Technical Reports Server (NTRS)

    Williams, R. E.

    1980-01-01

    The concept of spacecraft utility, a measure of its performance in orbit, is discussed and its formulation is described. Performance is defined in terms of the malfunctions that occur and the criticality to the mission of these malfunctions. Different approaches to establishing average or expected values of criticality are discussed and confidence intervals are developed for parameters used in the computation of utility.
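    As a purely illustrative companion to the abstract above, the sketch below computes a mean criticality score for observed anomalies together with a normal-approximation confidence interval. The report's actual utility formulation and interval construction are not reproduced here, and the criticality scale and values are invented.

    ```python
    # Illustrative sketch only: the report's actual utility formulation is not
    # reproduced here. This shows one conventional way to estimate mean anomaly
    # criticality with a normal-approximation confidence interval.
    import math

    def mean_criticality_ci(criticalities, z=1.96):
        """Return (mean, lower, upper) for an approximate 95% CI on mean criticality."""
        n = len(criticalities)
        mean = sum(criticalities) / n
        var = sum((c - mean) ** 2 for c in criticalities) / (n - 1)
        half_width = z * math.sqrt(var / n)
        return mean, mean - half_width, mean + half_width

    # Example: hypothetical criticality scores (1 = negligible ... 5 = mission-critical)
    scores = [1, 2, 1, 3, 5, 2, 1, 4, 2, 3]
    print(mean_criticality_ci(scores))
    ```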

  14. Zero Quantum Coherence in a Series of Covalent Spin-Correlated Radical Pairs.

    PubMed

    Nelson, Jordan N; Krzyaniak, Matthew D; Horwitz, Noah E; Rugg, Brandon K; Phelan, Brian T; Wasielewski, Michael R

    2017-03-23

    Photoinitiated subnanosecond electron transfer within covalently linked electron donor-acceptor molecules can result in the formation of a spin-correlated radical pair (SCRP) with a well-defined initial singlet spin configuration. Subsequent coherent mixing between the SCRP singlet and triplet m_s = 0 spin states, the so-called zero quantum coherence (ZQC), is of potential interest in quantum information processing applications because the ZQC can be probed using pulse electron paramagnetic resonance (pulse-EPR) techniques. Here, pulse-EPR spectroscopy is utilized to examine the ZQC oscillation frequencies and ZQC dephasing in three structurally well-defined D-A systems. While transitions between the singlet and triplet m_s = 0 spin states are formally forbidden (Δm_s = 0), they can be addressed using specific microwave pulse turning angles to map information from the ZQC onto observable single quantum coherences. In addition, by using structural variations to tune the singlet-triplet energy gap, the ZQC frequencies determined for this series of molecules indicate a stronger dependence on the electronic g-factor than on electron-nuclear hyperfine interactions.

  15. Bone Cell Bioenergetics and Skeletal Energy Homeostasis

    PubMed Central

    Riddle, Ryan C.; Clemens, Thomas L.

    2017-01-01

    The rising incidence of metabolic diseases worldwide has prompted renewed interest in the study of intermediary metabolism and cellular bioenergetics. The application of modern biochemical methods for quantitating fuel substrate metabolism with advanced mouse genetic approaches has greatly increased understanding of the mechanisms that integrate energy metabolism in the whole organism. Examination of the intermediary metabolism of skeletal cells has been sparked by a series of unanticipated observations in genetically modified mice that suggest the existence of novel endocrine pathways through which bone cells communicate their energy status to other centers of metabolic control. The recognition of this expanded role of the skeleton has in turn led to new lines of inquiry directed at defining the fuel requirements and bioenergetic properties of bone cells. This article provides a comprehensive review of historical and contemporary studies on the metabolic properties of bone cells and the mechanisms that control energy substrate utilization and bioenergetics. Special attention is devoted to identifying gaps in our current understanding of this new area of skeletal biology that will require additional research to better define the physiological significance of skeletal cell bioenergetics in human health and disease. PMID:28202599

  16. A Single-Cell Approach to the Elusive Latent Human Cytomegalovirus Transcriptome.

    PubMed

    Goodrum, Felicia; McWeeney, Shannon

    2018-06-12

    Herpesvirus latency has been difficult to understand molecularly due to low levels of viral genomes and gene expression. In the case of the betaherpesvirus human cytomegalovirus (HCMV), this is further complicated by the heterogeneity inherent to hematopoietic subpopulations harboring genomes and, as a consequence, the various patterns of infection that simultaneously exist in a host, ranging from latent to lytic. Single-cell RNA sequencing (scRNA-seq) provides tremendous potential in measuring the gene expression profiles of heterogeneous cell populations for a wide range of applications, including in studies of cancer, immunology, and infectious disease. A recent study by Shnayder et al. (mBio 9:e00013-18, 2018, https://doi.org/10.1128/mBio.00013-18) utilized scRNA-seq to define transcriptomal characteristics of HCMV latency. They conclude that latency-associated gene expression is similar to the late lytic viral program but at lower levels of expression. The study highlights the numerous challenges, from the definition of latency to the analysis of scRNA-seq, that exist in defining a latent transcriptome. Copyright © 2018 Goodrum and McWeeney.

  17. The Spiral-Interactive Program Evaluation Model.

    ERIC Educational Resources Information Center

    Khaleel, Ibrahim Adamu

    1988-01-01

    Describes the spiral interactive program evaluation model, which is designed to evaluate vocational-technical education programs in secondary schools in Nigeria. Program evaluation is defined; utility oriented and process oriented models for evaluation are described; and internal and external evaluative factors and variables that define each…

  18. Designing a web-application to support home-based care of childhood CKD stages 3-5: qualitative study of family and professional preferences.

    PubMed

    Swallow, Veronica M; Hall, Andrew G; Carolan, Ian; Santacroce, Sheila; Webb, Nicholas J A; Smith, Trish; Hanif, Noreen

    2014-02-18

    There is a lack of online, evidence-based information and resources to support home-based care of childhood CKD stages 3-5. Qualitative interviews were undertaken with parents, patients and professionals to explore their views on the content of the proposed online parent information and support (OPIS) web-application. Data were analysed using Framework Analysis, guided by the concept of Self-efficacy. 32 parents, 26 patients and 12 professionals were interviewed. All groups wanted an application that explains, demonstrates, and enables parental clinical care-giving, with condition-specific, continuously available, reliable, accessible material and a closed communication system to enable contact between families living with CKD. Professionals advocated a regularly updated application to empower parents to make informed health-care decisions. To address these requirements, key web-application components were defined as: (i) clinical care-giving support (information on treatment regimens, video-learning tools, condition-specific cartoons/puzzles, and a question and answer area) and (ii) psychosocial support for care-giving (social networking, case studies, managing stress, and enhancing families' health-care experiences). Developing a web-application that meets parents' information and support needs will maximise its utility, thereby augmenting parents' self-efficacy for CKD caregiving and optimising outcomes. Self-efficacy theory provides a schema for how parents' self-efficacy beliefs about management of their child's CKD could potentially be promoted by OPIS.

  19. Utilization of Large Scale Surface Models for Detailed Visibility Analyses

    NASA Astrophysics Data System (ADS)

    Caha, J.; Kačmařík, M.

    2017-11-01

    This article demonstrates the utilization of large-scale surface models with small spatial resolution and high accuracy, acquired from Unmanned Aerial Vehicle scanning, for visibility analyses. The importance of large-scale data for visibility analyses on the local scale, where the detail of the surface model is the most defining factor, is described. The focus is not only on the classic Boolean visibility that is usually determined within GIS, but also on so-called extended viewsheds that aim to provide more information about visibility. The case study with examples of visibility analyses was performed on the river Opava, near the city of Ostrava (Czech Republic). The multiple Boolean viewshed analysis and global horizon viewshed were calculated to determine the most prominent features and visibility barriers of the surface. Besides that, the extended viewshed showing the angle difference above the local horizon, which describes the angular height of the target area above the barrier, is shown. The case study proved that large-scale models are an appropriate data source for visibility analyses at the local level. The discussion summarizes possible future applications and further development directions of visibility analyses.
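    As a rough companion to the Boolean and extended viewshed concepts described above, the sketch below evaluates line-of-sight along a single sampled terrain profile and reports the angular difference above the local horizon. It assumes equally spaced samples and ignores Earth curvature and refraction; the elevations are hypothetical and this is not the authors' implementation.

    ```python
    # Minimal sketch (not the authors' implementation): Boolean line-of-sight and
    # angle-above-horizon along a 1-D terrain profile sampled from a surface model.
    # Assumes a flat Earth and equally spaced samples; refraction is ignored.
    import math

    def line_of_sight(profile, spacing, observer_h=1.7, target_h=0.0):
        """Return (visible, angle_above_horizon_deg) for the last point of `profile`.

        profile   : terrain elevations from observer to target (metres)
        spacing   : horizontal distance between samples (metres)
        observer_h: observer eye height above the first sample
        target_h  : target height above the last sample
        """
        eye = profile[0] + observer_h
        target = profile[-1] + target_h
        target_dist = spacing * (len(profile) - 1)
        target_angle = math.atan2(target - eye, target_dist)

        # The local horizon is the steepest angle to any intermediate terrain sample.
        horizon = -math.inf
        for i in range(1, len(profile) - 1):
            horizon = max(horizon, math.atan2(profile[i] - eye, spacing * i))

        visible = target_angle > horizon
        return visible, math.degrees(target_angle - horizon)

    profile = [210.0, 212.5, 218.0, 214.0, 211.0, 216.5]  # hypothetical UAV-derived elevations
    print(line_of_sight(profile, spacing=5.0))
    ```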

  20. A Movement-Assisted Deployment of Collaborating Autonomous Sensors for Indoor and Outdoor Environment Monitoring

    PubMed Central

    Niewiadomska-Szynkiewicz, Ewa; Sikora, Andrzej; Marks, Michał

    2016-01-01

    Using mobile robots or unmanned vehicles to assist optimal wireless sensors deployment in a working space can significantly enhance the capability to investigate unknown environments. This paper addresses the issues of the application of numerical optimization and computer simulation techniques to on-line calculation of a wireless sensor network topology for monitoring and tracking purposes. We focus on the design of a self-organizing and collaborative mobile network that enables a continuous data transmission to the data sink (base station) and automatically adapts its behavior to changes in the environment to achieve a common goal. The pre-defined and self-configuring approaches to the mobile-based deployment of sensors are compared and discussed. A family of novel algorithms for the optimal placement of mobile wireless devices for permanent monitoring of indoor and outdoor dynamic environments is described. They employ a network connectivity-maintaining mobility model utilizing the concept of the virtual potential function for calculating the motion trajectories of platforms carrying sensors. Their quality and utility have been justified through simulation experiments and are discussed in the final part of the paper. PMID:27649186
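    The virtual potential function idea mentioned above can be sketched for a single mobile node: nearby neighbors repel it (spreading coverage), while neighbors approaching the edge of communication range attract it (preserving connectivity to the data sink). The constants, ranges, and force law below are invented for illustration and are not the paper's algorithm.

    ```python
    # Illustrative sketch of a virtual potential function step for one mobile sensor
    # node (not the paper's algorithm): neighbors repel the node when too close and
    # attract it when the link risks breaking. Constants are hypothetical.
    import math

    def potential_step(node, neighbors, comm_range=30.0, k_rep=50.0, k_att=0.05, dt=0.1):
        """Return the node's new (x, y) after one gradient step of the virtual potential."""
        fx = fy = 0.0
        for nx, ny in neighbors:
            dx, dy = node[0] - nx, node[1] - ny
            d = math.hypot(dx, dy) or 1e-9
            if d < 0.5 * comm_range:            # too close: repulsion spreads coverage
                f = k_rep / d ** 2
            elif d > 0.9 * comm_range:          # near link loss: attraction maintains connectivity
                f = -k_att * (d - 0.9 * comm_range)
            else:
                f = 0.0
            fx += f * dx / d
            fy += f * dy / d
        return node[0] + dt * fx, node[1] + dt * fy

    # One neighbor is too close (repels), the other is near link loss (attracts).
    print(potential_step((10.0, 10.0), [(12.0, 10.0), (10.0, 40.0)]))
    ```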

  1. Computational rationality: linking mechanism and behavior through bounded utility maximization.

    PubMed

    Lewis, Richard L; Howes, Andrew; Singh, Satinder

    2014-04-01

    We propose a framework for including information-processing bounds in rational analyses. It is an application of bounded optimality (Russell & Subramanian, 1995) to the challenges of developing theories of mechanism and behavior. The framework is based on the idea that behaviors are generated by cognitive mechanisms that are adapted to the structure of not only the environment but also the mind and brain itself. We call the framework computational rationality to emphasize the incorporation of computational mechanism into the definition of rational action. Theories are specified as optimal program problems, defined by an adaptation environment, a bounded machine, and a utility function. Such theories yield different classes of explanation, depending on the extent to which they emphasize adaptation to bounds, and adaptation to some ecology that differs from the immediate local environment. We illustrate this variation with examples from three domains: visual attention in a linguistic task, manual response ordering, and reasoning. We explore the relation of this framework to existing "levels" approaches to explanation, and to other optimality-based modeling approaches. Copyright © 2014 Cognitive Science Society, Inc.
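    A toy reading of an "optimal program problem" — an adaptation environment, a bounded set of candidate programs, and a utility function over their behavior — is sketched below. The environment, the threshold-style programs, and the payoffs are all invented; the sketch only illustrates selecting a bounded program by expected utility, not the authors' formalism.

    ```python
    # Toy illustration (not the authors' formalism): from a bounded set of candidate
    # programs, pick the one maximizing expected utility over an adaptation environment.
    import random

    random.seed(0)
    environment = [random.gauss(0.0, 1.0) for _ in range(1000)]  # stimuli the agent adapts to

    thresholds = (-1.0, -0.5, 0.0, 0.5, 1.0)      # the bounded "machine": one-parameter programs

    def make_program(t):
        return lambda stimulus: stimulus > t      # respond iff the stimulus exceeds threshold t

    def utility(respond, stimulus):
        # Reward responding to positive stimuli, penalize false alarms, ignore misses.
        return (1.0 if stimulus > 0 else -2.0) if respond else 0.0

    def expected_utility(program):
        return sum(utility(program(s), s) for s in environment) / len(environment)

    best_t = max(thresholds, key=lambda t: expected_utility(make_program(t)))
    print("best bounded program: respond iff stimulus >", best_t,
          "| expected utility:", round(expected_utility(make_program(best_t)), 3))
    ```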

  2. Identification and estimation of the area planted with irrigated rice based on the visual interpretation of LANDSAT MSS data

    NASA Technical Reports Server (NTRS)

    Parada, N. D. J. (Principal Investigator); Moreira, M. A.; Assuncao, G. V.; Novaes, R. A.; Mendoza, A. A. B.; Bauer, C. A.; Ritter, I. T.; Barros, J. A. I.; Perez, J. E.; Thedy, J. L. O.

    1983-01-01

    The objective was to test the feasibility of applying MSS-LANDSAT data to irrigated rice crop identification and area evaluation within four rice-growing regions of the Rio Grande do Sul state, in order to extend the methodology to the whole state. The methodology applied was visual interpretation of the following LANDSAT products: channels 5 and 7 black-and-white imagery and color infrared composite imagery, all at the scale of 1:250,000. For crop identification and evaluation, the multispectral criterion and the seasonal variation were utilized. Based on the results it was possible to conclude that: (1) the satellite data were efficient for crop area identification and evaluation; (2) the multispectral criterion, allied to seasonal variation, allowed rice crop areas to be distinguished from other crops; and (3) the large cloud-cover percentage found in the satellite data made it impossible to carry out spectral monitoring of the rice crop and, therefore, to define the best dates for data acquisition for rice crop assessment.

  3. Evolution of US military space doctrine: precedents, prospects, and challenges

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, D.J.

    This dissertation examines the evolution of US military space doctrine by: (1) defining military doctrine, its importance, and how it should be evaluated; (2) identifying principles of geopolitics, strategy, and war applicable to military space operations; (3) establishing how well Air Force aerospace doctrine treats space issues and requirements for itself and the other Services; (4) identifying future directions for military space doctrine; and (5) postulating what might constitute a US military space doctrine in the future. The approach utilized incorporates analyses of the space environment, geopolitics, strategy, the principles of war, and the development of air power and sea power to provide a framework of constants or invariants within which military space operations must be conducted. It also utilizes a framework of inconstants or variants, consisting of technology impacts and organizational requirements, to which military space doctrine must respond. Other doctrinal requirements are derived from the 1987 DOD space policy, the Strategic Defense Initiative, and international space law. Finally, an assessment is made of future concepts and directions of US military space doctrine.

  4. A detailed study of gold-nanoparticle loaded cells using X-ray based techniques for cell-tracking applications with single-cell sensitivity

    NASA Astrophysics Data System (ADS)

    Astolfo, Alberto; Arfelli, Fulvia; Schültke, Elisabeth; James, Simon; Mancini, Lucia; Menk, Ralf-Hendrik

    2013-03-01

    In the present study complementary high-resolution imaging techniques on different length scales are applied to elucidate a cellular loading protocol of gold nanoparticles and subsequently its impact on long term and high-resolution cell-tracking utilizing X-ray technology. Although demonstrated for malignant cell lines the results can be applied to non-malignant cell lines as well. In particular the accumulation of the gold marker per cell has been assessed quantitatively by virtue of electron microscopy, two-dimensional X-ray fluorescence imaging techniques and X-ray CT with micrometric and sub-micrometric resolution. Moreover, utilizing these techniques the three dimensional distribution of the incorporated nanoparticles, which are sequestered in lysosomes as a permanent marker, could be determined. The latter allowed elucidation of the gold partition during mitosis and the cell size, which subsequently enabled us to define the optimal instrument settings of a compact microCT system to visualize gold loaded cells. The results obtained demonstrate the feasibility of cell-tracking using X-ray CT with compact sources.

  5. Linking Bacillus cereus Genotypes and Carbohydrate Utilization Capacity.

    PubMed

    Warda, Alicja K; Siezen, Roland J; Boekhorst, Jos; Wells-Bennik, Marjon H J; de Jong, Anne; Kuipers, Oscar P; Nierop Groot, Masja N; Abee, Tjakko

    2016-01-01

    We characterised carbohydrate utilisation of 20 newly sequenced Bacillus cereus strains isolated from food products and food processing environments and two laboratory strains, B. cereus ATCC 10987 and B. cereus ATCC 14579. Subsequently, genome sequences of these strains were analysed together with 11 additional B. cereus reference genomes to provide an overview of the different types of carbohydrate transporters and utilization systems found in B. cereus strains. The combined application of API tests, defined growth media experiments and comparative genomics enabled us to link the carbohydrate utilisation capacity of 22 B. cereus strains with their genome content and in some cases to the panC phylogenetic grouping. A core set of carbohydrates including glucose, fructose, maltose, trehalose, N-acetyl-glucosamine, and ribose could be used by all strains, whereas utilisation of other carbohydrates like xylose, galactose, and lactose, and typical host-derived carbohydrates such as fucose, mannose, N-acetyl-galactosamine and inositol is limited to a subset of strains. Finally, the roles of selected carbohydrate transporters and utilisation systems in specific niches such as soil, foods and the human host are discussed.

  6. Potential Lunar In-Situ Resource Utilization Experiments and Mission Scenarios

    NASA Technical Reports Server (NTRS)

    Sanders, Gerald B.

    2010-01-01

    The extraction and use of resources on the Moon, known as In-Situ Resource Utilization (ISRU), can potentially reduce the cost and risk of human lunar exploration while also increasing the science achieved. By not having to bring all of the shielding and mission consumables from Earth and being able to make products on the Moon, missions may require less mass to accomplish the same objectives, carry more science equipment, go to more sites of exploration, and/or provide options to recover from failures not possible with delivery of spares and consumables from Earth alone. While lunar ISRU has significant potential for mass, cost, and risk reduction for human lunar missions, it has never been demonstrated before in space. To demonstrate that ISRU can meet mission needs and to increase confidence in incorporating ISRU capabilities into mission architectures, terrestrial laboratory and analog field testing along with robotic precursor missions are required. A stepwise approach with international collaboration is recommended. This paper will outline the role of ISRU in future lunar missions, and define the approach and possible experiments to increase confidence in ISRU applications for future human lunar exploration.

  7. Linking Bacillus cereus Genotypes and Carbohydrate Utilization Capacity

    PubMed Central

    Warda, Alicja K.; Siezen, Roland J.; Boekhorst, Jos; Wells-Bennik, Marjon H. J.; de Jong, Anne; Kuipers, Oscar P.; Nierop Groot, Masja N.; Abee, Tjakko

    2016-01-01

    We characterised carbohydrate utilisation of 20 newly sequenced Bacillus cereus strains isolated from food products and food processing environments and two laboratory strains, B. cereus ATCC 10987 and B. cereus ATCC 14579. Subsequently, genome sequences of these strains were analysed together with 11 additional B. cereus reference genomes to provide an overview of the different types of carbohydrate transporters and utilization systems found in B. cereus strains. The combined application of API tests, defined growth media experiments and comparative genomics enabled us to link the carbohydrate utilisation capacity of 22 B. cereus strains with their genome content and in some cases to the panC phylogenetic grouping. A core set of carbohydrates including glucose, fructose, maltose, trehalose, N-acetyl-glucosamine, and ribose could be used by all strains, whereas utilisation of other carbohydrates like xylose, galactose, and lactose, and typical host-derived carbohydrates such as fucose, mannose, N-acetyl-galactosamine and inositol is limited to a subset of strains. Finally, the roles of selected carbohydrate transporters and utilisation systems in specific niches such as soil, foods and the human host are discussed. PMID:27272929

  8. Metro passengers’ route choice model and its application considering perceived transfer threshold

    PubMed Central

    Jin, Fanglei; Zhang, Yongsheng; Liu, Shasha

    2017-01-01

    With the rapid development of the Metro network in China, the greatly increased route alternatives make passengers' route choice behavior and passenger flow assignment more complicated, which presents challenges to operation management. In this paper, a path-size logit model is adopted to analyze passengers' route choice preferences considering such parameters as in-vehicle time, number of transfers, and transfer time. Moreover, the "perceived transfer threshold" is defined and included in the utility function to reflect the penalty difference caused by transfer time on passengers' perceived utility under various numbers of transfers. Next, based on the revealed preference data collected in the Guangzhou Metro, the proposed model is calibrated. The appropriate perceived transfer threshold value and the route choice preferences are analyzed. Finally, the model is applied to a personalized route planning case to demonstrate the engineering practicability of route choice behavior analysis. The results show that the introduction of the perceived transfer threshold helps improve the model's explanatory ability. In addition, personalized route planning based on route choice preferences can meet passengers' diversified travel demands. PMID:28957376
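    One way to read the utility specification described above is sketched below: a path-size logit route utility built from in-vehicle time, number of transfers, transfer time, a path-size overlap correction, and an extra penalty once transfer time exceeds a perceived threshold. All coefficients, the threshold value, and the example routes are hypothetical, not the calibrated Guangzhou Metro model.

    ```python
    # Hedged sketch (coefficients and functional form are hypothetical, not the paper's
    # calibrated model): a path-size logit route utility with a "perceived transfer
    # threshold" that adds an extra penalty once transfer time exceeds the threshold.
    import math

    def route_utility(in_vehicle_min, n_transfers, transfer_min, path_size,
                      b_ivt=-0.05, b_tr=-0.6, b_tt=-0.08, b_thr=-0.9,
                      threshold_min=5.0, b_ps=1.0):
        return (b_ivt * in_vehicle_min
                + b_tr * n_transfers
                + b_tt * transfer_min
                + b_thr * max(0.0, transfer_min - threshold_min)  # penalty beyond perceived threshold
                + b_ps * math.log(path_size))                     # path-size correction for overlap

    def choice_probabilities(routes):
        """routes: list of dicts whose keys match route_utility's arguments."""
        utilities = [route_utility(**r) for r in routes]
        exps = [math.exp(u) for u in utilities]
        total = sum(exps)
        return [e / total for e in exps]

    routes = [
        {"in_vehicle_min": 32, "n_transfers": 1, "transfer_min": 4, "path_size": 0.8},
        {"in_vehicle_min": 27, "n_transfers": 2, "transfer_min": 9, "path_size": 0.6},
    ]
    print(choice_probabilities(routes))
    ```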

  9. A Movement-Assisted Deployment of Collaborating Autonomous Sensors for Indoor and Outdoor Environment Monitoring.

    PubMed

    Niewiadomska-Szynkiewicz, Ewa; Sikora, Andrzej; Marks, Michał

    2016-09-14

    Using mobile robots or unmanned vehicles to assist optimal wireless sensors deployment in a working space can significantly enhance the capability to investigate unknown environments. This paper addresses the issues of the application of numerical optimization and computer simulation techniques to on-line calculation of a wireless sensor network topology for monitoring and tracking purposes. We focus on the design of a self-organizing and collaborative mobile network that enables a continuous data transmission to the data sink (base station) and automatically adapts its behavior to changes in the environment to achieve a common goal. The pre-defined and self-configuring approaches to the mobile-based deployment of sensors are compared and discussed. A family of novel algorithms for the optimal placement of mobile wireless devices for permanent monitoring of indoor and outdoor dynamic environments is described. They employ a network connectivity-maintaining mobility model utilizing the concept of the virtual potential function for calculating the motion trajectories of platforms carrying sensors. Their quality and utility have been justified through simulation experiments and are discussed in the final part of the paper.

  10. Assessing student written problem solutions: A problem-solving rubric with application to introductory physics

    NASA Astrophysics Data System (ADS)

    Docktor, Jennifer L.; Dornfeld, Jay; Frodermann, Evan; Heller, Kenneth; Hsu, Leonardo; Jackson, Koblar Alan; Mason, Andrew; Ryan, Qing X.; Yang, Jie

    2016-06-01

    Problem solving is a complex process valuable in everyday life and crucial for learning in the STEM fields. To support the development of problem-solving skills it is important for researchers and curriculum developers to have practical tools that can measure the difference between novice and expert problem-solving performance in authentic classroom work. It is also useful if such tools can be employed by instructors to guide their pedagogy. We describe the design, development, and testing of a simple rubric to assess written solutions to problems given in undergraduate introductory physics courses. In particular, we present evidence for the validity, reliability, and utility of the instrument. The rubric identifies five general problem-solving processes and defines the criteria to attain a score in each: organizing problem information into a Useful Description, selecting appropriate principles (Physics Approach), applying those principles to the specific conditions in the problem (Specific Application of Physics), using Mathematical Procedures appropriately, and displaying evidence of an organized reasoning pattern (Logical Progression).

  11. Accelerating North American rangeland conservation with earth observation data and user driven web applications.

    NASA Astrophysics Data System (ADS)

    Allred, B. W.; Naugle, D.; Donnelly, P.; Tack, J.; Jones, M. O.

    2016-12-01

    In 2010, the USDA Natural Resources Conservation Service (NRCS) launched the Sage Grouse Initiative (SGI) to voluntarily reduce threats facing sage-grouse and rangelands on private lands. Over the past five years, SGI has matured into a primary catalyst for rangeland and wildlife conservation across the North American west, focusing on the shared vision of wildlife conservation through sustainable working landscapes and providing win-win solutions for producers, sage-grouse, and 350 other sagebrush-obligate species. SGI and its partners have invested a total of $750 million into rangeland and wildlife conservation. Moving forward, SGI continues to focus on rangeland conservation. Partnering with Google Earth Engine, SGI has developed outcome monitoring and conservation planning tools at continental scales. The SGI science team is currently developing assessment and monitoring algorithms for key conservation indicators. The SGI web application utilizes Google Earth Engine for user-defined analysis and planning, putting the appropriate information directly into the hands of managers and conservationists.

  12. JAX Colony Management System (JCMS): an extensible colony and phenotype data management system.

    PubMed

    Donnelly, Chuck J; McFarland, Mike; Ames, Abigail; Sundberg, Beth; Springer, Dave; Blauth, Peter; Bult, Carol J

    2010-04-01

    The Jackson Laboratory Colony Management System (JCMS) is a software application for managing data and information related to research mouse colonies, associated biospecimens, and experimental protocols. JCMS runs directly on computers that run one of the PC Windows operating systems, but can be accessed via web browser interfaces from any computer running a Windows, Macintosh, or Linux operating system. JCMS can be configured for a single user or multiple users in small- to medium-size work groups. The target audience for JCMS includes laboratory technicians, animal colony managers, and principal investigators. The application provides operational support for colony management and experimental workflows, sample and data tracking through transaction-based data entry forms, and date-driven work reports. Flexible query forms allow researchers to retrieve database records based on user-defined criteria. Recent advances in handheld computers with integrated barcode readers, middleware technologies, web browsers, and wireless networks add to the utility of JCMS by allowing real-time access to the database from any networked computer.

  13. Carbon Nanofiber Electrode Array for Neurochemical Monitoring

    NASA Technical Reports Server (NTRS)

    Koehne, Jessica E.

    2017-01-01

    A sensor platform based on vertically aligned carbon nanofibers (CNFs) has been developed. Their inherent nanometer scale, high conductivity, wide potential window, good biocompatibility and well-defined surface chemistry make them ideal candidates as biosensor electrodes. Here, we report using vertically aligned CNF as neurotransmitter recording electrodes for application in a smart deep brain stimulation (DBS) device. Our approach combines a multiplexed CNF electrode chip, developed at NASA Ames Research Center, with the Wireless Instantaneous Neurotransmitter Concentration Sensor (WINCS) system, developed at the Mayo Clinic. Preliminary results indicate that the CNF nanoelectrode arrays are easily integrated with WINCS for neurotransmitter detection in a multiplexed array format. In the future, combining CNF based stimulating and recording electrodes with WINCS may lay the foundation for an implantable smart therapeutic system that utilizes neurochemical feedback control while likely resulting in increased DBS application in various neuropsychiatric disorders. In total, our goal is to take advantage of the nanostructure of CNF arrays for biosensing studies requiring ultrahigh sensitivity, high-degree of miniaturization, and selective biofunctionalization.

  14. Regional seismic wavefield computation on a 3-D heterogeneous Earth model by means of coupled traveling wave synthesis

    USGS Publications Warehouse

    Pollitz, F.F.

    2002-01-01

    I present a new algorithm for calculating seismic wave propagation through a three-dimensional heterogeneous medium using the framework of mode coupling theory originally developed to perform very low frequency (f < ~0.01-0.05 Hz) seismic wavefield computation. It is a Green's function approach for multiple scattering within a defined volume and employs a truncated traveling wave basis set using the locked mode approximation. Interactions between incident and scattered wavefields are prescribed by mode coupling theory and account for the coupling among surface waves, body waves, and evanescent waves. The described algorithm is, in principle, applicable to global and regional wave propagation problems, but I focus on higher frequency (typically f ~ 0.25 Hz) applications at regional and local distances where the locked mode approximation is best utilized and which involve wavefields strongly shaped by propagation through a highly heterogeneous crust. Synthetic examples are shown for P-SV-wave propagation through a semi-ellipsoidal basin and SH-wave propagation through a fault zone.

  15. Single cell analysis of normal and leukemic hematopoiesis.

    PubMed

    Povinelli, Benjamin J; Rodriguez-Meira, Alba; Mead, Adam J

    2018-02-01

    The hematopoietic system is well established as a paradigm for the study of cellular hierarchies, their disruption in disease and therapeutic use in regenerative medicine. Traditional approaches to study hematopoiesis involve purification of cell populations based on a small number of surface markers. However, such population-based analysis obscures underlying heterogeneity contained within any phenotypically defined cell population. This heterogeneity can only be resolved through single cell analysis. Recent advances in single cell techniques allow analysis of the genome, transcriptome, epigenome and proteome in single cells at an unprecedented scale. The application of these new single cell methods to investigate the hematopoietic system has led to paradigm shifts in our understanding of cellular heterogeneity in hematopoiesis and how this is disrupted in disease. In this review, we summarize how single cell techniques have been applied to the analysis of hematopoietic stem/progenitor cells in normal and malignant hematopoiesis, with a particular focus on recent advances in single-cell genomics, including how these might be utilized for clinical application. Copyright © 2017. Published by Elsevier Ltd.

  16. Colt: an experiment in wormhole run-time reconfiguration

    NASA Astrophysics Data System (ADS)

    Bittner, Ray; Athanas, Peter M.; Musgrove, Mark

    1996-10-01

    Wormhole run-time reconfiguration (RTR) is an attempt to create a refined computing paradigm for high performance computational tasks. By combining concepts from field programmable gate array (FPGA) technologies with data flow computing, the Colt/Stallion architecture achieves high utilization of hardware resources, and facilitates rapid run-time reconfiguration. Targeted mainly at DSP-type operations, the Colt integrated circuit -- a prototype wormhole RTR device -- compares favorably to contemporary DSP alternatives in terms of silicon area consumed per unit computation and in computing performance. Although emphasis has been placed on signal processing applications, general purpose computation has not been overlooked. Colt is a prototype that defines an architecture not only at the chip level but also in terms of an overall system design. As this system is realized, the concept of wormhole RTR will be applied to numerical computation and DSP applications including those common to image processing, communications systems, digital filters, acoustic processing, real-time control systems and simulation acceleration.

  17. Clinical Applications of Hallucinogens: A Review

    PubMed Central

    Garcia-Romeu, Albert; Kersgaard, Brennan; Addy, Peter H.

    2016-01-01

    Hallucinogens fall into several different classes, as broadly defined by pharmacological mechanism of action, and chemical structure. These include psychedelics, entactogens, dissociatives, and other atypical hallucinogens. Although these classes do not share a common primary mechanism of action, they do exhibit important similarities in their ability to occasion temporary but profound alterations of consciousness, involving acute changes in somatic, perceptual, cognitive, and affective processes. Such effects likely contribute to their recreational use. However, a growing body of evidence indicates that these drugs may have therapeutic applications beyond their potential for abuse. This review will present data on several classes of hallucinogens with a particular focus on psychedelics, entactogens, and dissociatives, for which clinical utility has been most extensively documented. Information on each class is presented in turn, tracing relevant historical insights, highlighting similarities and differences between the classes from the molecular to the behavioral level, and presenting the most up-to-date information on clinically oriented research with these substances, with important ramifications for their potential therapeutic value. PMID:27454674

  18. Material Recovery and Waste Form Development FY 2015 Accomplishments Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Todd, Terry Allen; Braase, Lori Ann

    The Material Recovery and Waste Form Development (MRWFD) Campaign under the U.S. Department of Energy (DOE) Fuel Cycle Technologies (FCT) Program is responsible for developing advanced separation and waste form technologies to support the various fuel cycle options defined in the DOE Nuclear Energy Research and Development Roadmap, Report to Congress, April 2010. The FY 2015 Accomplishments Report highlights the results of the research and development (R&D) efforts performed within the MRWFD Campaign in FY-15. Each section contains a high-level overview of the activities, results, technical point of contact, applicable references, and documents produced during the fiscal year. This report briefly outlines campaign management and integration activities, but primarily focuses on the many technical accomplishments made during FY-15. The campaign continued to utilize an engineering-driven, science-based approach to maintain relevance and focus. There was increased emphasis on development of technologies that support near-term applications that are relevant to the current once-through fuel cycle.

  19. A Development of Lightweight Grid Interface

    NASA Astrophysics Data System (ADS)

    Iwai, G.; Kawai, Y.; Sasaki, T.; Watase, Y.

    2011-12-01

    In order to help rapid development of Grid/Cloud-aware applications, we have developed an API to abstract the distributed computing infrastructures based on SAGA (A Simple API for Grid Applications). SAGA, which is standardized in the OGF (Open Grid Forum), defines API specifications to access distributed computing infrastructures, such as Grid, Cloud and local computing resources. The Universal Grid API (UGAPI), which is a set of command line interfaces (CLI) and APIs, aims to offer a simpler API combining several SAGA interfaces with richer functionalities. These CLIs of the UGAPI offer typical functionalities required by end users for job management and file access on the different distributed computing infrastructures as well as local computing resources. We have also built a web interface for a particle therapy simulation and demonstrated large-scale calculation using the different infrastructures at the same time. In this paper, we present how the web interface based on UGAPI and SAGA achieves more efficient utilization of computing resources over the different infrastructures, with technical details and practical experiences.
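    The kind of abstraction such an API layer provides — user code that submits a job description to a service without knowing whether the backend is local, Grid, or Cloud — can be sketched as below. The class and method names are invented for illustration and are NOT the actual UGAPI or SAGA interfaces.

    ```python
    # Hypothetical sketch of a backend-agnostic job interface of the kind UGAPI/SAGA
    # provide; class and method names are invented and are not the real UGAPI/SAGA API.
    from dataclasses import dataclass, field
    import subprocess

    @dataclass
    class JobDescription:
        executable: str
        arguments: list = field(default_factory=list)

    class LocalBackend:
        """One pluggable backend; a Grid or Cloud backend would expose the same submit()."""
        def submit(self, jd: JobDescription) -> int:
            return subprocess.run([jd.executable, *jd.arguments]).returncode

    class JobService:
        """Front end: user code stays identical whichever backend is plugged in."""
        def __init__(self, backend):
            self.backend = backend
        def run(self, jd: JobDescription) -> int:
            return self.backend.submit(jd)

    service = JobService(LocalBackend())
    print(service.run(JobDescription("/bin/echo", ["particle-therapy-simulation"])))
    ```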

  20. Burdensome and Unnecessary Reporting Requirements of the Public Utility Regulatory Policies Act Need to be Changed.

    DTIC Science & Technology

    1981-09-14

    Commissioners; PURPA: Public Utility Regulatory Policies Act. GLOSSARY: Advertising standard: As defined by PURPA, no electric utility may recover from any person...systems in 4o States, Puerto Rico, Guam, and Virgin Islands. Automatic adjustment clause standard: As defined by PURPA, no electric utility may increase any...Interruptible rate standard: As defined by PURPA, a rate offered to each industrial and commercial electric consumer that shall reflect the cost of

  1. Collective Intelligence. Chapter 17

    NASA Technical Reports Server (NTRS)

    Wolpert, David H.

    2003-01-01

    Many systems of self-interested agents have an associated performance criterion that rates the dynamic behavior of the overall system. This chapter presents an introduction to the science of such systems. Formally, collectives are defined as any system having the following two characteristics: First, the system must contain one or more agents each of which we view as trying to maximize an associated private utility; second, the system must have an associated world utility function that rates the possible behaviors of that overall system. In practice, collectives are often very large, distributed, and support little, if any, centralized communication and control, although those characteristics are not part of their formal definition. A naturally occurring example of a collective is a human economy. One can identify the agents and their private utilities as the human individuals in the economy and the associated personal rewards they are each trying to maximize. One could then identify the world utility as the time average of the gross domestic product. ("World utility" per se is not a construction internal to a human economy, but rather something defined from the outside.) To achieve high world utility it is necessary to avoid having the agents work at cross-purposes, lest phenomena like liquidity traps or the Tragedy of the Commons (TOC) occur, in which the agents' individual pursuit of their private utilities lowers world utility. The obvious way to avoid such phenomena is by modifying the agents' utility functions to be "aligned" with the world utility. This can be done via punitive legislation. A real-world example of an attempt to do this was the creation of antitrust regulations designed to prevent monopolistic practices.
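    The alignment problem described above can be made concrete with a toy best-response simulation: agents that greedily maximize a misaligned private utility drive the world utility down (a Tragedy of the Commons), whereas agents whose private utility equals the world utility would settle at its optimum. The payoff functions and numbers below are invented and are not the chapter's formal construction.

    ```python
    # Toy illustration only (not the chapter's formal construction): agents that
    # greedily maximize a misaligned private utility can lower the world utility.
    N_AGENTS = 10

    def world_utility(usages):
        # Shared resource degrades quadratically with total usage (Tragedy of the Commons flavor).
        total = sum(usages)
        return total - 0.02 * total ** 2

    def private_utility(own_usage, others_total):
        # Each agent sees only its own gain minus a small share of the congestion cost.
        total = own_usage + others_total
        return own_usage - 0.02 * total ** 2 / N_AGENTS

    # Greedy agents each pick usage in {0..10} maximizing their private utility,
    # assuming the others' total usage stays fixed at its current value.
    usages = [0] * N_AGENTS
    for _ in range(20):  # crude best-response iteration
        for i in range(N_AGENTS):
            others = sum(usages) - usages[i]
            usages[i] = max(range(11), key=lambda u: private_utility(u, others))

    print("selfish world utility:", round(world_utility(usages), 2))
    # Agents whose private utility IS the world utility would instead drive total
    # usage to the world optimum (total = 25 here, world utility = 12.5).
    print("aligned optimum:", round(max(world_utility([t]) for t in range(0, 101)), 2))
    ```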

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, Mathew; Bowen, Brian; Coles, Dwight

    The Middleware Automated Deployment Utilities consist of these three components. MAD: a utility designed to automate the deployment of Java applications to multiple Java application servers; the product contains a front-end web utility and back-end deployment scripts. MAR: a web front end to maintain and update the components inside the database. MWR-Encrypt: a web utility to convert a text string to an encrypted string that is used by the Oracle WebLogic application server; the encryption is done using the built-in functions of the Oracle WebLogic product and is mainly used to create an encrypted version of a database password.

  3. A model of cloud application assignments in software-defined storages

    NASA Astrophysics Data System (ADS)

    Bolodurina, Irina P.; Parfenov, Denis I.; Polezhaev, Petr N.; E Shukhman, Alexander

    2017-01-01

    The aim of this study is to analyze the structure and mechanisms of interaction of typical cloud applications and to suggest approaches to optimize their placement in storage systems. In this paper, we describe a generalized model of cloud applications including the three basic layers: a model of the application, a model of the service, and a model of the resource. The distinctive feature of the suggested model is that it analyzes cloud resources from the user's point of view and from the point of view of the software-defined infrastructure of the virtual data center (DC). The innovative character of this model lies in describing the application data placements together with the state of the virtual environment, taking into account the network topology. The model of software-defined storage has been developed as a submodel within the resource model. This model allows implementing the algorithm for controlling cloud application assignments in software-defined storages. Experiments showed that this algorithm decreases cloud application response time and increases the performance of user request processing. The use of software-defined data storages allows a decrease in the number of physical storage devices, which demonstrates the efficiency of our algorithm.
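    A greedy placement heuristic of the general kind suggested by the abstract can be sketched as follows: each application's data are assigned to the software-defined storage node with the lowest estimated response time, modeled crudely as a queueing delay plus a network-hop cost. The node names, capacities, demands, and cost model are invented and do not reproduce the paper's algorithm.

    ```python
    # Illustrative sketch only (not the paper's algorithm): greedily assign each
    # application's data to the software-defined storage node with the lowest
    # estimated response time (queueing load plus network distance). All values invented.

    nodes = {
        # name: capacity in IOPS, current load, and hop counts to each application
        "ssd-pool-1": {"capacity_iops": 20000, "load_iops": 0, "hops": {"app-a": 1, "app-b": 3}},
        "hdd-pool-1": {"capacity_iops": 5000, "load_iops": 0, "hops": {"app-a": 2, "app-b": 1}},
    }
    apps = {"app-a": 4000, "app-b": 1500}  # expected IOPS demand per application

    def estimated_response(node, app, demand, hop_cost_ms=0.5):
        utilization = (node["load_iops"] + demand) / node["capacity_iops"]
        if utilization >= 1.0:
            return float("inf")                      # node would saturate
        service_ms = 1.0 / (1.0 - utilization)       # simple M/M/1-style inflation
        return service_ms + hop_cost_ms * node["hops"][app]

    assignment = {}
    for app, demand in sorted(apps.items(), key=lambda kv: -kv[1]):  # place big consumers first
        best = min(nodes, key=lambda n: estimated_response(nodes[n], app, demand))
        nodes[best]["load_iops"] += demand
        assignment[app] = best

    print(assignment)
    ```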

  4. 10 CFR 50.30 - Filing of application; oath or affirmation.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... operate, or manufacture, a production or utilization facility (including an early site permit, combined.... (e) Filing Fees. Each application for a standard design approval or production or utilization... 50.30 Energy NUCLEAR REGULATORY COMMISSION DOMESTIC LICENSING OF PRODUCTION AND UTILIZATION...

  5. Growth of Azotobacter chroococcum in chemically defined media containing p-hydroxybenzoic acid and protocatechuic acid.

    PubMed

    Juarez, B; Martinez-Toledo, M V; Gonzalez-Lopez, J

    2005-06-01

    Growth and utilization of different phenolic acids present in olive mill wastewater (OMW) by Azotobacter chroococcum were studied in chemically defined media. Growth and utilization of phenolic acids were detected only when the microorganism was cultured on p-hydroxybenzoic acid at concentrations from 0.01% to 0.5% (w/v) or protocatechuic acid at concentrations from 0.01% to 0.3% (w/v) as the sole carbon source, suggesting that only these phenolic compounds could be utilized as a carbon source by A. chroococcum. Moreover, when culture media were supplemented with a mixture of 0.3% protocatechuic acid and 0.3% p-hydroxybenzoic acid, the microorganism degraded protocatechuic acid first and, once the culture medium was depleted of this compound, degradation of p-hydroxybenzoic acid commenced very rapidly.

  6. Data-driven approach for assessing utility of medical tests using electronic medical records.

    PubMed

    Skrøvseth, Stein Olav; Augestad, Knut Magne; Ebadollahi, Shahram

    2015-02-01

    To precisely define the utility of tests in a clinical pathway through data-driven analysis of the electronic medical record (EMR). The information content was defined in terms of the entropy of the expected value of the test related to a given outcome. A kernel density classifier was used to estimate the necessary distributions. To validate the method, we used data from the EMR of the gastrointestinal department at a university hospital. Blood tests from patients undergoing gastrointestinal surgery were analyzed with respect to a second surgery within 30 days of the index surgery. The information content is clearly reflected in the patient pathway for certain combinations of tests and outcomes. C-reactive protein tests coupled to anastomosis leakage, a severe complication, show a clear pattern of information gain through the patient trajectory, where the greatest gain from the test is 3-4 days post index surgery. We have defined the information content in a data-driven and information-theoretic way such that the utility of a test can be precisely defined. The results reflect clinical knowledge. In the cases we studied, the tests carry little negative impact. The general approach can be expanded to cases that carry a substantial negative impact, such as certain radiological techniques. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
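
    As a hedged sketch of the information-theoretic idea described above (my own formulation, not the authors' exact implementation), the code below estimates the class-conditional densities of a test value with kernel density estimation and computes the expected reduction in outcome entropy from observing the test. The data are synthetic.

      import numpy as np
      from scipy.stats import gaussian_kde

      def entropy(p):
          """Shannon entropy (bits) of a binary outcome with probability p."""
          p = np.clip(p, 1e-12, 1 - 1e-12)
          return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

      def information_gain(test_pos, test_neg):
          """Expected reduction in outcome entropy from observing one test value.

          test_pos, test_neg: arrays of test values (e.g. CRP) for patients with and
          without the outcome (e.g. reoperation within 30 days). Illustrative only.
          """
          prior = len(test_pos) / (len(test_pos) + len(test_neg))
          kde_pos, kde_neg = gaussian_kde(test_pos), gaussian_kde(test_neg)

          grid = np.linspace(min(test_pos.min(), test_neg.min()),
                             max(test_pos.max(), test_neg.max()), 512)
          dx = grid[1] - grid[0]
          f_pos, f_neg = kde_pos(grid), kde_neg(grid)
          marginal = prior * f_pos + (1 - prior) * f_neg
          posterior = prior * f_pos / np.maximum(marginal, 1e-12)

          # Expected posterior entropy, weighted by the marginal test-value density.
          h_post = np.sum(entropy(posterior) * marginal) * dx / (np.sum(marginal) * dx)
          return entropy(prior) - h_post

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          crp_with_leak = rng.normal(180, 40, 60)       # synthetic values, illustration only
          crp_without_leak = rng.normal(90, 30, 400)
          print(f"information gain: {information_gain(crp_with_leak, crp_without_leak):.3f} bits")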

  7. Integrated Biorefinery for Biofuels Production

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, Gabriel

    This project has focused on very low grade fats, oils and greases found in municipal, commercial and industrial facilities around the country. These wastes are often disposed of in landfills, wastewater treatment plants or farm fields, or are blended illegally into animal feeds. Using any of these waste fatty materials that are unfit for human or animal nutrition as a clean alternative fuel makes good sense. This project defines the aforementioned wastes in terms of quality and prevalence in the US, then builds on specific promising pathways for utilizing these carbon-neutral wastes. These pathways are discussed and researched at bench-scale and, in one instance, at pilot-scale. The three primary pathways are as follows: the production of Renewable Diesel Oil (RDO) as a stand-alone fuel or blended with standard distillate or residual hydrocarbons; the production of RDO as a platform for the further manufacture of Biodiesel utilizing acid esterification; and the production of RDO as a platform for the manufacture of an ASTM Diesel Fuel using one or more catalysts to effect a decarboxylation of the carboxylics present in RDO. This study shows that Biodiesel and ASTM Diesel produced at bench-scale (utilizing RDO made from grease trap waste as an input) could not meet industry specifications utilizing the technologies that were selected by the investigators. Details of these investigations are discussed in this report and will hopefully provide a starting point for other researchers interested in these pathways in future studies. Although results were inconclusive in finding ways to utilize RDO technology, in effect, as a pretreatment for commonly discussed technologies such as Biodiesel and ASTM Diesel, this study does shed light on the properties, performance and cost of utilizing waste greases directly as a retail liquid fuel (RDO). The utilization of retail RDO as a boiler fuel, or for other such applications, is the most important finding of the study.

  8. Electrical-power-system data base for consumables analysis. Volume 1: Electrical equipment list, activity blocks, and time lines

    NASA Technical Reports Server (NTRS)

    Pipher, M. D.; Green, P. A.; Wolfgram, D. F.

    1975-01-01

    A standardized data base is described which consists of a space shuttle electrical equipment list, activity blocks defining electrical equipment utilization, and activity-block time lines for specific mission analyses. Information is presented to facilitate utilization of the data base, to provide the basis for the electrical equipment utilization, and to enable interpretation of analyses based on the data contained herein.

  9. 10 CFR 780.41 - Contents of application.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... of the Atomic Energy Act of 1954 § 780.41 Contents of application. In addition to the information... production or utilization of special nuclear material or atomic energy; (b) The applicant's contention, with... production or utilization of special nuclear material or atomic energy to which applicant proposes to apply...

  10. 10 CFR 780.41 - Contents of application.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... of the Atomic Energy Act of 1954 § 780.41 Contents of application. In addition to the information... production or utilization of special nuclear material or atomic energy; (b) The applicant's contention, with... production or utilization of special nuclear material or atomic energy to which applicant proposes to apply...

  11. 10 CFR 780.41 - Contents of application.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... of the Atomic Energy Act of 1954 § 780.41 Contents of application. In addition to the information... production or utilization of special nuclear material or atomic energy; (b) The applicant's contention, with... production or utilization of special nuclear material or atomic energy to which applicant proposes to apply...

  12. 10 CFR 780.41 - Contents of application.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... of the Atomic Energy Act of 1954 § 780.41 Contents of application. In addition to the information... production or utilization of special nuclear material or atomic energy; (b) The applicant's contention, with... production or utilization of special nuclear material or atomic energy to which applicant proposes to apply...

  13. 10 CFR 780.41 - Contents of application.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... of the Atomic Energy Act of 1954 § 780.41 Contents of application. In addition to the information... production or utilization of special nuclear material or atomic energy; (b) The applicant's contention, with... production or utilization of special nuclear material or atomic energy to which applicant proposes to apply...

  14. Design study of wind turbines, 50 kW to 3000 kW for electric utility applications: Executive summary

    NASA Technical Reports Server (NTRS)

    1977-01-01

    Preliminary designs of low power (50 to 500 kW) and high power (500 to 3000 kW) wind generator systems (WGS) for electric utility applications were developed. These designs provide the bases for detail design, fabrication, and experimental demonstration testing of these units at selected utility sites. Several feasible WGS configurations were evaluated, and the concept offering the lowest energy cost potential and minimum technical risk for utility applications was selected. The selected concept was optimized utilizing a parametric computer program prepared for this purpose. The utility requirements evaluation task examined the economic, operational and institutional factors affecting the WGS in a utility environment, and provided additional guidance for the preliminary design effort. Results of the conceptual design task indicated that a rotor operating at constant speed, driving an AC generator through a gear transmission is the most cost effective WGS configuration.

  15. Linear Programming for Vocational Education Planning. Interim Report.

    ERIC Educational Resources Information Center

    Young, Robert C.; And Others

    The purpose of the paper is to define for potential users of vocational education management information systems a quantitative analysis technique and its utilization to facilitate more effective planning of vocational education programs. Defining linear programming (LP) as a management technique used to solve complex resource allocation problems…
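
    As a concrete illustration of the technique the record describes, the sketch below solves a small, entirely hypothetical program-planning problem with scipy.optimize.linprog: allocate student places across three vocational programs to maximize expected job placements, subject to budget and instructor-hour limits. All coefficients are invented for illustration.

      from scipy.optimize import linprog

      # Decision variables: student places in three hypothetical programs.
      # Objective: maximize expected job placements (linprog minimizes, so negate).
      placement_rate = [0.70, 0.55, 0.80]        # expected placements per student place
      c = [-r for r in placement_rate]

      # Resource constraints (A_ub @ x <= b_ub): budget ($) and instructor hours.
      cost_per_place = [3000, 1800, 4200]
      hours_per_place = [20, 12, 30]
      A_ub = [cost_per_place, hours_per_place]
      b_ub = [900_000, 6_000]

      # Each program is limited to 400 places.
      bounds = [(0, 400)] * 3

      res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
      print("places per program:", [round(x, 1) for x in res.x])
      print("expected placements:", round(-res.fun, 1))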

  16. Redefining climate regions in the United States of America using satellite remote sensing and machine learning for public health applications.

    PubMed

    Liss, Alexander; Koch, Magaly; Naumova, Elena N

    2014-12-01

    Existing climate classification has not been designed for an efficient handling of public health scenarios. This work aims to design an objective spatial climate regionalization method for assessing health risks in response to extreme weather. Specific climate regions for the conterminous United States of America (USA) were defined using satellite remote sensing (RS) data and compared with the conventional Köppen-Geiger (KG) divisions. Using the nationwide database of hospitalisations among the elderly (≥65 year olds), we examined the utility of a RS-based climate regionalization to assess public health risk due to extreme weather, by comparing the rate of hospitalisations in response to thermal extremes across climatic regions. Satellite image composites from 2002-2012 were aggregated, masked and compiled into a multi-dimensional dataset. The conterminous USA was classified into 8 distinct regions using a stepwise regionalization approach to limit noise and collinearity (LKN), which exhibited a high degree of consistency with the KG regions and a well-defined regional delineation by annual and seasonal temperature and precipitation values. The most populous was a temperate wet region (10.9 million), while the highest rate of hospitalisations due to exposure to heat and cold (9.6 and 17.7 cases per 100,000 persons at risk, respectively) was observed in the relatively warm and humid south-eastern region. RS-based regionalization demonstrates strong potential for assessing the adverse effects of severe weather on human health and for decision support. Its utility in forecasting and mitigating these effects has to be further explored.
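
    The stepwise LKN regionalization used in the study is not given in the record; as a simplified, hedged stand-in, the sketch below clusters per-pixel climate features (for example seasonal temperature and precipitation composites) into eight regions with ordinary k-means. The feature layout and the synthetic data are assumptions for illustration.

      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.preprocessing import StandardScaler

      def regionalize(features, n_regions=8, seed=0):
          """Cluster per-pixel climate features into regions.

          features: array of shape (n_pixels, n_features), e.g. seasonal means of
          land surface temperature and precipitation from satellite composites.
          Returns an integer region label per pixel.
          """
          X = StandardScaler().fit_transform(features)
          return KMeans(n_clusters=n_regions, n_init=10, random_state=seed).fit_predict(X)

      if __name__ == "__main__":
          rng = np.random.default_rng(1)
          # Synthetic stand-in for a (pixels x [T_winter, T_summer, P_winter, P_summer]) table.
          fake = np.column_stack([
              rng.normal(5, 10, 5000), rng.normal(25, 6, 5000),
              rng.gamma(2.0, 30, 5000), rng.gamma(2.0, 40, 5000),
          ])
          labels = regionalize(fake)
          print("pixels per region:", np.bincount(labels))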

  17. Utility functions predict variance and skewness risk preferences in monkeys

    PubMed Central

    Genest, Wilfried; Stauffer, William R.; Schultz, Wolfram

    2016-01-01

    Utility is the fundamental variable thought to underlie economic choices. In particular, utility functions are believed to reflect preferences toward risk, a key decision variable in many real-life situations. To assess the validity of utility representations, it is therefore important to examine risk preferences. In turn, this approach requires formal definitions of risk. A standard approach is to focus on the variance of reward distributions (variance-risk). In this study, we also examined a form of risk related to the skewness of reward distributions (skewness-risk). Thus, we tested the extent to which empirically derived utility functions predicted preferences for variance-risk and skewness-risk in macaques. The expected utilities calculated for various symmetrical and skewed gambles served to define formally the direction of stochastic dominance between gambles. In direct choices, the animals’ preferences followed both second-order (variance) and third-order (skewness) stochastic dominance. Specifically, for gambles with different variance but identical expected values (EVs), the monkeys preferred high-variance gambles at low EVs and low-variance gambles at high EVs; in gambles with different skewness but identical EVs and variances, the animals preferred positively over symmetrical and negatively skewed gambles in a strongly transitive fashion. Thus, the utility functions predicted the animals’ preferences for variance-risk and skewness-risk. Using these well-defined forms of risk, this study shows that monkeys’ choices conform to the internal reward valuations suggested by their utility functions. This result implies a representation of utility in monkeys that accounts for both variance-risk and skewness-risk preferences. PMID:27402743

  18. Utility functions predict variance and skewness risk preferences in monkeys.

    PubMed

    Genest, Wilfried; Stauffer, William R; Schultz, Wolfram

    2016-07-26

    Utility is the fundamental variable thought to underlie economic choices. In particular, utility functions are believed to reflect preferences toward risk, a key decision variable in many real-life situations. To assess the validity of utility representations, it is therefore important to examine risk preferences. In turn, this approach requires formal definitions of risk. A standard approach is to focus on the variance of reward distributions (variance-risk). In this study, we also examined a form of risk related to the skewness of reward distributions (skewness-risk). Thus, we tested the extent to which empirically derived utility functions predicted preferences for variance-risk and skewness-risk in macaques. The expected utilities calculated for various symmetrical and skewed gambles served to define formally the direction of stochastic dominance between gambles. In direct choices, the animals' preferences followed both second-order (variance) and third-order (skewness) stochastic dominance. Specifically, for gambles with different variance but identical expected values (EVs), the monkeys preferred high-variance gambles at low EVs and low-variance gambles at high EVs; in gambles with different skewness but identical EVs and variances, the animals preferred positively over symmetrical and negatively skewed gambles in a strongly transitive fashion. Thus, the utility functions predicted the animals' preferences for variance-risk and skewness-risk. Using these well-defined forms of risk, this study shows that monkeys' choices conform to the internal reward valuations suggested by their utility functions. This result implies a representation of utility in monkeys that accounts for both variance-risk and skewness-risk preferences.
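
    A minimal sketch of the expected-utility comparison described above, assuming a hypothetical S-shaped (convex-then-concave) utility function rather than the empirically fitted monkey utilities: gambles with identical expected value but different variance are ranked by expected utility, reproducing qualitatively the low-EV/high-EV reversal reported in the record.

      import numpy as np

      def utility(x):
          # Hypothetical S-shaped utility over juice volume (ml); the paper's utility
          # functions were estimated empirically, this logistic form is an assumption.
          return 1.0 / (1.0 + np.exp(-8.0 * (x - 0.5)))

      def expected_utility(outcomes, probs):
          return float(np.dot(probs, utility(np.asarray(outcomes))))

      # Pairs of gambles with identical expected value but different variance.
      gambles = {
          "low EV, low variance":   ([0.20, 0.30], [0.5, 0.5]),
          "low EV, high variance":  ([0.05, 0.45], [0.5, 0.5]),
          "high EV, low variance":  ([0.70, 0.80], [0.5, 0.5]),
          "high EV, high variance": ([0.55, 0.95], [0.5, 0.5]),
      }
      for name, (outcomes, probs) in gambles.items():
          ev = float(np.dot(probs, outcomes))
          print(f"{name:24s} EV={ev:.2f}  EU={expected_utility(outcomes, probs):.3f}")
      # With a convex-then-concave utility, the high-variance gamble has the larger EU
      # at low EV and the smaller EU at high EV, mirroring the preferences reported above.
      # Skewness-risk can be examined the same way with gambles matched in EV and variance.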

  19. Three-Dimensional Polypeptide Architectures Through Tandem Catalysis and Click Chemistry

    NASA Astrophysics Data System (ADS)

    Rhodes, Allison Jane

    Rapid renal clearance, liver accumulation, proteolytic degradation and non-specificity are challenges that small molecule drugs, peptides, proteins and nucleic acid therapeutics encounter en route to their intended destination within the body. Nanocarriers (i.e. dendritic polymers, vesicles, and micelles) of approximately 100 nm in diameter shuttle small molecule drugs to their desired location through passive (EPR effect) and active (ligand-mediated) targeting, maximizing therapeutic efficiency. Polypeptide-based polymers are water-soluble, biocompatible, non-toxic and are therefore excellent candidates for nanocarriers. Dendritic polymers, including dendrimers, cylindrical brushes, and star polymers, are the newest class of nanomedicine drug delivery vehicles. The synthesis and characterization of dendritic polymers is challenging, with tedious and costly procedures. Dendritic polymers possess peripheral pendent functional groups that can potentially be used in ligand-mediated drug delivery vehicles and bioimaging applications. More specifically, cylindrical brushes are dendritic polymers where a single linear polymer (primary chain) has polymer chains (secondary chains) grafted to it. Recently, research groups have shown that cylindrical brush polymers are capable of nanoparticle and supramolecular structure self-assembly. The facile preparation of high-density brush copolypeptides by the "grafting from" approach will be discussed. This approach utilizes a novel, tandem catalytic methodology where alloc-alpha-aminoamide groups are installed within the side-chains of the alpha-amino-N-carboxyanhydride (NCA) monomer serving as masked initiators. These groups are inert during cobalt-initiated NCA polymerization, and give alloc-alpha-aminoamide substituted polypeptide main-chains. The alloc-alpha-aminoamide groups are activated in situ using nickel to generate initiators for growth of side-chain brush segments. This method proves to be efficient, yielding well-defined, high-density brushes for applications in drug delivery and imaging. Here, we also report a method for the synthesis of soluble, well-defined, azido functionalized polypeptides in a straightforward, 3-step synthesis. Homo and diblock azidopolypeptides were prepared with controlled segment lengths via living polymerization using Co(PMe3)4 initiator. Through copper-catalyzed azide-alkyne click chemistry (CuAAC) in organic solvent, azidopolypeptides were regioselectively and quantitatively modified with carboxylic acid (pH-responsive), amino acid and sugar functional groups. Finally, the advances towards well-defined hyperbranched polypeptides through alpha-amino-acid-N-thiocarboxyanhydrides (NTAs) will be discussed. Within the past 10 years, controlled NCA (alpha-amino acid-N-carboxyanhydride) ring-opening polymerization (ROP) has emerged, expanding the application of copolypeptide polymers in various drug delivery and tissue engineering motifs. Modification of NCA monomers to the corresponding alpha-amino-acid-N-thiocarboxyanhydride (NTA) will diversify ROP reactions, leading to more complex polypeptides (such as hyperbranched polymers), in addition to the possibility of performing these polymerizations under ambient conditions, which would greatly expand their potential utility. The project focuses on the preparation of hyperbranched polypeptides with well-defined architectures and controlled branching density in a one-pot reaction. This will be accomplished by taking advantage of the different selectivities of Co(PMe3)4 and depeNi(COD) polymerization initiators, and by exploiting the reactivity difference between NCA and the more stable NTA monomers.

  20. Development and application of a real-time testbed for multiagent system interoperability: A case study on hierarchical microgrid control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cintuglu, Mehmet Hazar; Youssef, Tarek; Mohammed, Osama A.

    This article presents the development and application of a real-time testbed for multiagent system interoperability. As utility-independent private microgrids are constantly being installed, standardized interoperability frameworks are required to define behavioral models of the individual agents for expandability and plug-and-play operation. In this paper, we propose a comprehensive hybrid agent framework combining the foundation for intelligent physical agents (FIPA), IEC 61850, and data distribution service (DDS) standards. The IEC 61850 logical node concept is extended using FIPA-based agent communication language (ACL) with application-specific attributes and deliberative behavior modeling capability. The DDS middleware is adopted to enable a real-time publisher-subscriber interoperability mechanism between platforms. The proposed multi-agent framework was validated in a laboratory-based testbed involving developed intelligent electronic device (IED) prototypes and actual microgrid setups. Experimental results were demonstrated for both decentralized and distributed control approaches. Secondary and tertiary control levels of a microgrid were demonstrated for the decentralized hierarchical control case study. A consensus-based economic dispatch case study was demonstrated as a distributed control example. Finally, it was shown that the developed agent platform is industrially applicable for actual smart grid field deployment.

  1. An Intracranial Electroencephalography (iEEG) Brain Function Mapping Tool with an Application to Epilepsy Surgery Evaluation.

    PubMed

    Wang, Yinghua; Yan, Jiaqing; Wen, Jianbin; Yu, Tao; Li, Xiaoli

    2016-01-01

    Before epilepsy surgeries, intracranial electroencephalography (iEEG) is often employed in function mapping and epileptogenic foci localization. Although the implanted electrodes provide crucial information for epileptogenic zone resection, a convenient clinical tool for electrode position registration and Brain Function Mapping (BFM) visualization is still lacking. In this study, we developed a BFM Tool, which facilitates electrode position registration and BFM visualization, with an application to epilepsy surgeries. The BFM Tool mainly performs electrode location registration and function mapping based on pre-defined brain models from other software. In addition, the electrode node and mapping properties, such as node size/color, edge color/thickness, and mapping method, can be adjusted easily using the settings panel. Moreover, users may manually import/export location and connectivity data to generate figures for further application. The role of this software is demonstrated by a clinical study of language area localization. The BFM Tool helps clinical doctors and researchers visualize implanted electrodes and brain functions in an easy, quick and flexible manner. Our tool provides convenient electrode registration, easy brain function visualization, and good performance. It is clinically oriented and easy to deploy and use. The BFM tool is suitable for epilepsy and other clinical iEEG applications.

  2. An Intracranial Electroencephalography (iEEG) Brain Function Mapping Tool with an Application to Epilepsy Surgery Evaluation

    PubMed Central

    Wang, Yinghua; Yan, Jiaqing; Wen, Jianbin; Yu, Tao; Li, Xiaoli

    2016-01-01

    Objects: Before epilepsy surgeries, intracranial electroencephalography (iEEG) is often employed in function mapping and epileptogenic foci localization. Although the implanted electrodes provide crucial information for epileptogenic zone resection, a convenient clinical tool for electrode position registration and Brain Function Mapping (BFM) visualization is still lacking. In this study, we developed a BFM Tool, which facilitates electrode position registration and BFM visualization, with an application to epilepsy surgeries. Methods: The BFM Tool mainly performs electrode location registration and function mapping based on pre-defined brain models from other software. In addition, the electrode node and mapping properties, such as node size/color, edge color/thickness, and mapping method, can be adjusted easily using the settings panel. Moreover, users may manually import/export location and connectivity data to generate figures for further application. The role of this software is demonstrated by a clinical study of language area localization. Results: The BFM Tool helps clinical doctors and researchers visualize implanted electrodes and brain functions in an easy, quick and flexible manner. Conclusions: Our tool provides convenient electrode registration, easy brain function visualization, and good performance. It is clinically oriented and easy to deploy and use. The BFM tool is suitable for epilepsy and other clinical iEEG applications. PMID:27199729

  3. Cogeneration technology alternatives study. Volume 1: Summary report

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Data and information in the area of advanced energy conversion systems for industrial cogeneration applications in the 1985-2000 time period were studied. Six current and thirty-one advanced energy conversion systems were defined and combined with appropriate balance-of-plant equipment. Twenty-six industrial processes were selected from among the high energy consuming industries to serve as a framework for the study. Each conversion system was analyzed as a cogenerator with each industrial plant. Fuel consumption, costs, and environmental intrusion were evaluated and compared to corresponding traditional values. Various cogeneration strategies were analyzed and both topping and bottoming (using industrial by-product heat) applications were included. The advanced energy conversion technologies indicated reduced fuel consumption, costs, and emissions. Typically, fuel energy savings of 10 to 25 percent were predicted compared to traditional on-site furnaces and utility electricity. With the variety of industrial requirements, each advanced technology had attractive applications. Overall, fuel cells indicated the greatest fuel energy savings and emission reductions. Gas turbines and combined cycles indicated high overall annual cost savings. Steam turbines and gas turbines produced high estimated returns. In some applications, diesels were most efficient. The advanced technologies used coal-derived fuels, or coal with advanced fluid bed combustion or on-site gasification systems.

  4. Development and application of a real-time testbed for multiagent system interoperability: A case study on hierarchical microgrid control

    DOE PAGES

    Cintuglu, Mehmet Hazar; Youssef, Tarek; Mohammed, Osama A.

    2016-08-10

    This article presents the development and application of a real-time testbed for multiagent system interoperability. As utility-independent private microgrids are constantly being installed, standardized interoperability frameworks are required to define behavioral models of the individual agents for expandability and plug-and-play operation. In this paper, we propose a comprehensive hybrid agent framework combining the foundation for intelligent physical agents (FIPA), IEC 61850, and data distribution service (DDS) standards. The IEC 61850 logical node concept is extended using FIPA-based agent communication language (ACL) with application-specific attributes and deliberative behavior modeling capability. The DDS middleware is adopted to enable a real-time publisher-subscriber interoperability mechanism between platforms. The proposed multi-agent framework was validated in a laboratory-based testbed involving developed intelligent electronic device (IED) prototypes and actual microgrid setups. Experimental results were demonstrated for both decentralized and distributed control approaches. Secondary and tertiary control levels of a microgrid were demonstrated for the decentralized hierarchical control case study. A consensus-based economic dispatch case study was demonstrated as a distributed control example. Finally, it was shown that the developed agent platform is industrially applicable for actual smart grid field deployment.
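
    The DDS publisher-subscriber mechanism is only named in the record; the sketch below is a deliberately simplified, in-process stand-in (not the DDS API and not the authors' framework) showing the pattern by which a hypothetical secondary-control agent subscribes to a measurement topic and publishes setpoints back to a device agent. Topic names and control gains are invented.

      from collections import defaultdict
      from typing import Callable, Dict, List

      class TopicBus:
          """Minimal in-process topic-based publish-subscribe bus (a stand-in for DDS)."""
          def __init__(self):
              self._subscribers: Dict[str, List[Callable]] = defaultdict(list)

          def subscribe(self, topic: str, handler: Callable) -> None:
              self._subscribers[topic].append(handler)

          def publish(self, topic: str, message: dict) -> None:
              for handler in self._subscribers[topic]:
                  handler(message)

      if __name__ == "__main__":
          bus = TopicBus()

          # A hypothetical secondary-control agent reacts to frequency measurements
          # and publishes a power setpoint correction to a device agent.
          def secondary_controller(msg):
              deviation = 60.0 - msg["frequency_hz"]
              bus.publish("setpoints/der1", {"delta_p_kw": 100.0 * deviation})

          def der_agent(msg):
              print("DER1 adjusting output by", round(msg["delta_p_kw"], 2), "kW")

          bus.subscribe("measurements/pcc", secondary_controller)
          bus.subscribe("setpoints/der1", der_agent)

          bus.publish("measurements/pcc", {"frequency_hz": 59.95})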

  5. Carbon Nanofiber Nanoelectrodes for Biosensing Applications

    NASA Technical Reports Server (NTRS)

    Koehne, Jessica Erin

    2014-01-01

    A sensor platform based on vertically aligned carbon nanofibers (CNFs) has been developed. Their inherent nanometer scale, high conductivity, wide potential window, good biocompatibility and well-defined surface chemistry make them ideal candidates as biosensor electrodes. Here, we report two studies using vertically aligned CNF nanoelectrodes for biomedical applications. CNF arrays are investigated as neural stimulation and neurotransmitter recording electrodes for application in deep brain stimulation (DBS). Polypyrrole coated CNF nanoelectrodes have shown great promise as stimulating electrodes due to their large surface area, low impedance, biocompatibility and capacity for highly localized stimulation. CNFs embedded in SiO2 have been used as sensing electrodes for neurotransmitter detection. Our approach combines a multiplexed CNF electrode chip, developed at NASA Ames Research Center, with the Wireless Instantaneous Neurotransmitter Concentration Sensor (WINCS) system, developed at the Mayo Clinic. Preliminary results indicate that the CNF nanoelectrode arrays are easily integrated with WINCS for neurotransmitter detection in a multiplexed array format. In the future, combining CNF based stimulating and recording electrodes with WINCS may lay the foundation for an implantable smart therapeutic system that utilizes neurochemical feedback control while likely resulting in increased DBS application in various neuropsychiatric disorders. In total, our goal is to take advantage of the nanostructure of CNF arrays for biosensing studies requiring ultrahigh sensitivity, high-degree of miniaturization, and selective biofunctionalization.

  6. The Japanese Surgical Reimbursement System Fails to Reflect Resource Utilization.

    PubMed

    Nakata, Yoshinori; Watanabe, Yuichi; Otake, Hiroshi; Nakamura, Toshihito; Oiso, Giichiro; Sawa, Tomohiro

    2015-01-01

    The goal of this study was to examine the current Japanese surgical payment system from the viewpoint of resource utilization. We collected data from surgical records in Teikyo University's electronic medical record system from April 1 through September 30, 2013. We defined the decision-making unit as a surgeon with the highest academic rank in the surgery. Inputs were defined as: 1) the number of medical doctors who assisted surgery and 2) the time of operation from skin incision to closure. An output was defined as the surgical fee. We calculated each surgeon's efficiency score using the output-oriented Banker-Charnes-Cooper model of data envelopment analysis. We compared the efficiency scores of each surgical specialty using the Kruskal-Wallis and Steel methods. We analyzed 2,825 surgical procedures performed by 103 surgeons. The difference in efficiency scores was significant (P = 0.0001). The thoracic surgeons were the most efficient and were more efficient than plastic, obstetric and gynecologic, urologic, otorhinolaryngologic, orthopedic, general, and emergency surgeons (P < 0.05). We demonstrated that surgeons' efficiency in operating rooms was significantly different among surgical specialties. This suggests that the Japanese surgical reimbursement scale fails to reflect resource utilization. © The Author(s) 2015.
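
    The record names the output-oriented Banker-Charnes-Cooper (BCC) model of data envelopment analysis without showing it. Below is a hedged sketch of that model formulated as one linear program per decision-making unit and solved with scipy.optimize.linprog; the surgeon inputs and outputs in the example are invented.

      import numpy as np
      from scipy.optimize import linprog

      def bcc_output_efficiency(inputs, outputs):
          """Output-oriented BCC (variable returns to scale) DEA scores.

          inputs:  (n_units, n_inputs)  e.g. [assisting doctors, operating minutes]
          outputs: (n_units, n_outputs) e.g. [surgical fee]
          Returns phi >= 1 per unit; phi = 1 means the unit lies on the frontier.
          """
          X, Y = np.asarray(inputs, float), np.asarray(outputs, float)
          n, m = X.shape
          s = Y.shape[1]
          scores = []
          for o in range(n):
              # Decision vector z = [phi, lambda_1..lambda_n]; maximize phi.
              c = np.zeros(n + 1)
              c[0] = -1.0
              # Input constraints: sum_j lambda_j * x_ij <= x_io
              A_in = np.hstack([np.zeros((m, 1)), X.T])
              b_in = X[o]
              # Output constraints: phi * y_ro - sum_j lambda_j * y_rj <= 0
              A_out = np.hstack([Y[o].reshape(-1, 1), -Y.T])
              b_out = np.zeros(s)
              A_ub = np.vstack([A_in, A_out])
              b_ub = np.concatenate([b_in, b_out])
              # Convexity constraint (variable returns to scale): sum_j lambda_j = 1
              A_eq = np.hstack([[0.0], np.ones(n)]).reshape(1, -1)
              res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                            bounds=[(0, None)] * (n + 1), method="highs")
              scores.append(res.x[0])
          return np.array(scores)

      if __name__ == "__main__":
          # Hypothetical surgeons: inputs = (assisting doctors, minutes), output = fee (points).
          inputs = [[3, 240], [2, 150], [4, 300], [2, 200]]
          outputs = [[5200], [4100], [6000], [3000]]
          print(np.round(bcc_output_efficiency(inputs, outputs), 3))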

  7. A Custom Approach for a Flexible, Real-Time and Reliable Software Defined Utility.

    PubMed

    Zaballos, Agustín; Navarro, Joan; Martín De Pozuelo, Ramon

    2018-02-28

    Information and communication technologies (ICTs) have enabled the evolution of traditional electric power distribution networks towards a new paradigm referred to as the smart grid. However, the different elements that compose the ICT plane of a smart grid are usually conceived as isolated systems that typically result in rigid hardware architectures, which are hard to interoperate, manage and adapt to new situations. In recent years, software-defined systems that take advantage of software and high-speed data network infrastructures have emerged as a promising alternative to classic ad hoc approaches in terms of integration, automation, real-time reconfiguration and resource reusability. The purpose of this paper is to propose the usage of software-defined utilities (SDUs) to address the latent deployment and management limitations of smart grids. More specifically, the implementation of a smart grid's data storage and management system prototype by means of SDUs is introduced, which exhibits the feasibility of this alternative approach. This system features a hybrid cloud architecture able to meet the data storage requirements of electric utilities and adapt itself to their ever-evolving needs. The experiments conducted endorse the feasibility of this solution and encourage practitioners to focus their efforts in this direction.

  8. A Custom Approach for a Flexible, Real-Time and Reliable Software Defined Utility

    PubMed Central

    2018-01-01

    Information and communication technologies (ICTs) have enabled the evolution of traditional electric power distribution networks towards a new paradigm referred to as the smart grid. However, the different elements that compose the ICT plane of a smart grid are usually conceived as isolated systems that typically result in rigid hardware architectures, which are hard to interoperate, manage and adapt to new situations. In recent years, software-defined systems that take advantage of software and high-speed data network infrastructures have emerged as a promising alternative to classic ad hoc approaches in terms of integration, automation, real-time reconfiguration and resource reusability. The purpose of this paper is to propose the usage of software-defined utilities (SDUs) to address the latent deployment and management limitations of smart grids. More specifically, the implementation of a smart grid’s data storage and management system prototype by means of SDUs is introduced, which exhibits the feasibility of this alternative approach. This system features a hybrid cloud architecture able to meet the data storage requirements of electric utilities and adapt itself to their ever-evolving needs. The experiments conducted endorse the feasibility of this solution and encourage practitioners to focus their efforts in this direction. PMID:29495599

  9. Synoptic typing: interdisciplinary application methods with three practical hydroclimatological examples

    NASA Astrophysics Data System (ADS)

    Siegert, C. M.; Leathers, D. J.; Levia, D. F.

    2017-05-01

    Synoptic classification is a methodology that represents diverse atmospheric variables and allows researchers to relate large-scale atmospheric circulation patterns to regional- and small-scale terrestrial processes. Synoptic classification has often been applied to questions concerning the surface environment. However, full applicability has been under-utilized to date, especially in disciplines such as hydroclimatology, which are intimately linked to atmospheric inputs. This paper aims to (1) outline the development of a daily synoptic calendar for the Mid-Atlantic (USA), (2) define seasonal synoptic patterns occurring in the region, and (3) provide hydroclimatological examples whereby the cascading response of precipitation characteristics, soil moisture, and streamflow are explained by synoptic classification. Together, achievement of these objectives serves as a guide for development and use of a synoptic calendar for hydroclimatological studies. In total 22 unique synoptic types were identified, derived from a combination of 12 types occurring in the winter (DJF), 13 in spring (MAM), 9 in summer (JJA), and 11 in autumn (SON). This includes six low pressure systems, four high pressure systems, one cold front, three north/northwest flow regimes, three south/southwest flow regimes, and five weakly defined regimes. Pairwise comparisons indicated that 84.3 % had significantly different rainfall magnitudes, 86.4 % had different rainfall durations, and 84.7 % had different rainfall intensities. The largest precipitation-producing classifications were not restricted to low pressure systems, but rather to patterns with access to moisture sources from the Atlantic Ocean and easterly (on-shore) winds, which transport moisture inland. These same classifications resulted in comparable rates of soil moisture recharge and streamflow discharge, illustrating the applicability of synoptic classification for a range of hydroclimatological research objectives.

  10. The Specific Features of design and process engineering in branch of industrial enterprise

    NASA Astrophysics Data System (ADS)

    Sosedko, V. V.; Yanishevskaya, A. G.

    2017-06-01

    The production output of an industrial enterprise is organized through well-established working mechanisms at each stage of the product's life cycle, from initial design documentation to the finished product and its final utilization. The topic of this article is a mathematical model of the design and process engineering system in a branch of an industrial enterprise, statistical processing of the estimated results of implementing the developed model in the branch, and a demonstration of the advantages of its application at this enterprise. During the creation of the model, the flows of information, orders, parts and modules among the branch's groups of divisions were classified. Based on the analysis of division activity, data flows, parts and documents, a state graph of design and process engineering was constructed, the transitions were described and coefficients were assigned. For each state of the constructed state graph, the corresponding limiting state probabilities were defined and Kolmogorov's equations were worked out. By integrating the set of Kolmogorov equations, the probability that the specified divisions and production are active is defined as a function of time at each instant. On the basis of the developed mathematical model of a unified system of design, process engineering and manufacture, and of the state graph, the authors carried out statistical processing of the model's application results and showed the advantages of its application at this enterprise. Studies of the loading probability of branch services and third-party contractors (based on the orders received by the branch within a month) were conducted. The developed mathematical model of the design, process engineering and manufacturing system can be applied to determining the activity state probability of divisions and manufacture as a function of time at each instant, which will allow accounting for the workload in the branches of the enterprise.
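
    The enterprise's actual state graph is not given in the record, so the sketch below uses a small hypothetical four-state workflow instead: it integrates the forward Kolmogorov equations dp/dt = pQ of a continuous-time Markov chain with scipy and also solves for the limiting state probabilities, the two computations the abstract describes. The states and rates are invented.

      import numpy as np
      from scipy.integrate import solve_ivp

      # Hypothetical 4-state workflow for one order in a branch:
      # 0 = design, 1 = process engineering, 2 = manufacture, 3 = rework.
      # Q is the generator matrix (transition rates per day); each row sums to zero.
      Q = np.array([
          [-1.0,  1.0,  0.0,  0.0],
          [ 0.0, -0.8,  0.6,  0.2],
          [ 0.3,  0.0, -0.5,  0.2],
          [ 0.0,  0.5,  0.5, -1.0],
      ])

      def kolmogorov_forward(t, p):
          # dp/dt = p Q  (forward Kolmogorov equations for the state probabilities)
          return p @ Q

      p0 = np.array([1.0, 0.0, 0.0, 0.0])          # work starts in the design state
      sol = solve_ivp(kolmogorov_forward, (0.0, 30.0), p0, t_eval=np.linspace(0, 30, 7))
      for t, p in zip(sol.t, sol.y.T):
          print(f"t={t:5.1f} d  p={np.round(p, 3)}")

      # The limiting (stationary) probabilities solve pi Q = 0 with sum(pi) = 1.
      A = np.vstack([Q.T, np.ones(4)])
      b = np.concatenate([np.zeros(4), [1.0]])
      pi, *_ = np.linalg.lstsq(A, b, rcond=None)
      print("limiting probabilities:", np.round(pi, 3))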

  11. An Application of Six Sigma to Reduce Supplier Quality Cost

    NASA Astrophysics Data System (ADS)

    Gaikwad, Lokpriya Mohanrao; Teli, Shivagond Nagappa; Majali, Vijay Shashikant; Bhushi, Umesh Mahadevappa

    2016-01-01

    This article presents an application of Six Sigma to reduce supplier quality cost in the manufacturing industry. Although there is wider acceptance of Six Sigma in many organizations today, there is still a lack of in-depth case studies of Six Sigma. For the present research the case study methodology was used. The company decided to reduce quality cost and improve selected processes using Six Sigma methodologies. Given the lack of case studies dealing with Six Sigma, especially in individual manufacturing organizations, this article could also be of great importance for practitioners. This paper discusses the quality and productivity improvement in a supplier enterprise through a case study. The paper deals with an application of the Six Sigma define-measure-analyze-improve-control methodology in an industry; it provides a framework to identify, quantify and eliminate sources of variation in the operational process in question, to optimize the operation variables, and to improve and sustain performance, viz. process yield, with well-executed control plans. Six Sigma improves the process performance (process yield) of the critical operational process, leading to better utilization of resources, decreased variation and consistent quality of the process output.

  12. [Service quality in health care: the application of the results of marketing research].

    PubMed

    Verheggen, F W; Harteloh, P P

    1993-01-01

    This paper deals with quality assurance in health care and its relation to quality assurance in trade and industry. We present the service quality model--a model of quality from marketing research--and discuss how it can be applied to health care. Traditional quality assurance appears to have serious flaws: it lacks a general theory of the sources of hazards in the complex process of patient care and tends to stagnate, since no real improvement takes place. Departing from this criticism, modern quality assurance in health care is marked by: defining quality in a preferential sense as "fitness for use"; the use of theories and models from trade and industry (process control); an emphasis on analyzing the process instead of merely inspecting it; use of the Deming problem-solving technique (plan, do, check, act); and improvement of the process of care by altering the perceptions of the parties involved. We describe the application and utilization of this method at the University Hospital Maastricht, The Netherlands. The successful application of this model requires a favorable corporate culture and motivated health care workers. This model provides a useful framework for improving the traditional approach to quality assurance in health care.

  13. The Development of Novel Recombinant Human Gelatins as Replacements for Animal-Derived Gelatin in Pharmaceutical Applications

    NASA Astrophysics Data System (ADS)

    Olsen, David; Chang, Robert; Williams, Kim E.; Polarek, James W.

    We have developed a recombinant expression system to produce a series of novel recombinant human gelatins that can substitute for animal sourced gelatin preparations currently used in pharmaceutical and nutraceutical applications. This system allows the production of human sequence gelatins, or, if desired, gelatins from any other species depending on the availability of the cloned gene. The gelatins produced with this recombinant system are of defined molecular weight, unlike the animal-sourced gelatins, which consist of numerous polypeptides of varying size. The fermentation and purification process used to prepare these recombinant gelatins does not use any human- or animal-derived components and thus this recombinant material should be free from viruses and agents that cause transmissible spongiform encephalopathies. The recombinant gelatins exhibit lot-to-lot reproducibility and we have performed extensive analytical testing on them. We have demonstrated the utility of these novel gelatins as biological stabilizers and plasma expanders, and we have shown they possess qualities that are important in applications where gel formation is critical. Finally, we provide examples of how our system allows the engineering of these recombinant gelatins to optimize the production process.

  14. The Environment for Application Software Integration and Execution (EASIE), version 1.0. Volume 2: Program integration guide

    NASA Technical Reports Server (NTRS)

    Jones, Kennie H.; Randall, Donald P.; Stallcup, Scott S.; Rowell, Lawrence F.

    1988-01-01

    The Environment for Application Software Integration and Execution, EASIE, provides a methodology and a set of software utility programs to ease the task of coordinating engineering design and analysis codes. EASIE was designed to meet the needs of conceptual design engineers who face the task of integrating many stand-alone engineering analysis programs. Using EASIE, programs are integrated through a relational data base management system. In volume 2, a SYSTEM LIBRARY PROCESSOR is used to construct a DATA DICTIONARY describing all relations defined in the data base, and a TEMPLATE LIBRARY. A TEMPLATE is a description of all subsets of relations (including conditional selection criteria and sorting specifications) to be accessed as input or output for a given application. Together, these form the SYSTEM LIBRARY, which is used to automatically produce the data base schema, FORTRAN subroutines to retrieve/store data from/to the data base, and instructions to a generic REVIEWER program providing review/modification of data for a given template. Automation of these functions eliminates much of the tedious, error-prone work required by the usual approach to data base integration.
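
    EASIE generates FORTRAN accessor subroutines from templates; as a loose, modern analogue (not EASIE's actual mechanism), the sketch below expresses a TEMPLATE as a small dictionary naming a relation, the attributes to retrieve, a conditional selection criterion and a sort order, and generates the retrieval query from it. The relation and attribute names are invented.

      import sqlite3

      # A "template" in the EASIE sense: which relation, which attributes, which
      # selection criteria and sort order an application needs. Names are invented.
      WING_GEOMETRY_TEMPLATE = {
          "relation": "wing_panels",
          "attributes": ["panel_id", "span_station", "chord", "thickness"],
          "where": "thickness > :min_thickness",
          "order_by": ["span_station"],
      }

      def build_query(template):
          """Generate the retrieval statement an application-specific accessor would use."""
          sql = f'SELECT {", ".join(template["attributes"])} FROM {template["relation"]}'
          if template.get("where"):
              sql += f' WHERE {template["where"]}'
          if template.get("order_by"):
              sql += f' ORDER BY {", ".join(template["order_by"])}'
          return sql

      if __name__ == "__main__":
          db = sqlite3.connect(":memory:")
          db.execute("CREATE TABLE wing_panels (panel_id, span_station, chord, thickness)")
          db.executemany("INSERT INTO wing_panels VALUES (?, ?, ?, ?)",
                         [(1, 0.0, 3.2, 0.14), (2, 2.5, 2.8, 0.11), (3, 5.0, 2.1, 0.02)])
          rows = db.execute(build_query(WING_GEOMETRY_TEMPLATE), {"min_thickness": 0.05}).fetchall()
          print(rows)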

  15. Biomarkers in Prodromal Parkinson Disease: a Qualitative Review.

    PubMed

    Cooper, Christine A; Chahine, Lama M

    2016-11-01

    Over the past several years, the concept of prodromal Parkinson disease (PD) has been increasingly recognized. This term refers to individuals who do not fulfill motor diagnostic criteria for PD, but who have clinical, genetic, or biomarker characteristics suggesting risk of developing PD in the future. Clinical diagnosis of prodromal PD has low specificity, prompting the need for objective biomarkers with higher specificity. In this qualitative review, we discuss objectively defined putative biomarkers for PD and prodromal PD. We searched Pubmed and Embase for articles pertaining to objective biomarkers for PD and their application in prodromal cohorts. Articles were selected based on relevance and methodology. Objective biomarkers of demonstrated utility in prodromal PD include ligand-based imaging and transcranial sonography. Development of serum, cerebrospinal fluid, and tissue-based biomarkers is underway, but their application in prodromal PD has yet to meaningfully occur. Combining objective biomarkers with clinical or genetic prodromal features increases the sensitivity and specificity for identifying prodromal PD. Several objective biomarkers for prodromal PD show promise but require further study, including their application to and validation in prodromal cohorts followed longitudinally. Accurate identification of prodromal PD will likely require a multimodal approach. (JINS, 2016, 22, 956-967).

  16. Cloudbus Toolkit for Market-Oriented Cloud Computing

    NASA Astrophysics Data System (ADS)

    Buyya, Rajkumar; Pandey, Suraj; Vecchiola, Christian

    This keynote paper: (1) presents the 21st century vision of computing and identifies various IT paradigms promising to deliver computing as a utility; (2) defines the architecture for creating market-oriented Clouds and computing atmosphere by leveraging technologies such as virtual machines; (3) provides thoughts on market-based resource management strategies that encompass both customer-driven service management and computational risk management to sustain SLA-oriented resource allocation; (4) presents the work carried out as part of our new Cloud Computing initiative, called Cloudbus: (i) Aneka, a Platform as a Service software system containing SDK (Software Development Kit) for construction of Cloud applications and deployment on private or public Clouds, in addition to supporting market-oriented resource management; (ii) internetworking of Clouds for dynamic creation of federated computing environments for scaling of elastic applications; (iii) creation of 3rd party Cloud brokering services for building content delivery networks and e-Science applications and their deployment on capabilities of IaaS providers such as Amazon along with Grid mashups; (iv) CloudSim supporting modelling and simulation of Clouds for performance studies; (v) Energy Efficient Resource Allocation Mechanisms and Techniques for creation and management of Green Clouds; and (vi) pathways for future research.

  17. Equivalent damage: A critical assessment

    NASA Technical Reports Server (NTRS)

    Laflen, J. R.; Cook, T. S.

    1982-01-01

    Concepts in equivalent damage were evaluated to determine their applicability to the life prediction of hot path components of aircraft gas turbine engines. Equivalent damage was defined as being those effects which influence the crack initiation life-time beyond the damage that is measured in uniaxial, fully-reversed sinusoidal and isothermal experiments at low homologous temperatures. Three areas of equivalent damage were examined: mean stress, cumulative damage, and multiaxiality. For each area, a literature survey was conducted to aid in selecting the most appropriate theories. Where possible, data correlations were also used in the evaluation process. A set of criteria was developed for ranking the theories in each equivalent damage regime. These criteria considered aspects of engine utilization as well as the theoretical basis and correlative ability of each theory. In addition, consideration was given to the complex nature of the loading cycle at fatigue critical locations of hot path components; this loading includes non-proportional multiaxial stressing, combined temperature and strain fluctuations, and general creep-fatigue interactions. Through applications of selected equivalent damage theories to some suitable data sets it was found that there is insufficient data to allow specific recommendations of preferred theories for general applications. A series of experiments and areas of further investigations were identified.

  18. Dynamic Distribution and Layouting of Model-Based User Interfaces in Smart Environments

    NASA Astrophysics Data System (ADS)

    Roscher, Dirk; Lehmann, Grzegorz; Schwartze, Veit; Blumendorf, Marco; Albayrak, Sahin

    The developments in computer technology in the last decade have changed the way computers are used. The emerging smart environments make it possible to build ubiquitous applications that assist users during their everyday life, at any time, in any context. But the variety of contexts-of-use (user, platform and environment) makes the development of such ubiquitous applications for smart environments, and especially of their user interfaces, a challenging and time-consuming task. We propose a model-based approach, which allows adapting the user interface at runtime to numerous (also unknown) contexts-of-use. Based on a user interface modelling language, defining the fundamentals and constraints of the user interface, a runtime architecture exploits the description to adapt the user interface to the current context-of-use. The architecture provides automatic distribution and layout algorithms for adapting the applications also to contexts unforeseen at design time. Designers do not specify predefined adaptations for each specific situation, but adaptation constraints and guidelines. Furthermore, users are provided with a meta user interface to influence the adaptations according to their needs. A smart home energy management system serves as a running example to illustrate the approach.

  19. Role of satellite remote sensing in the geographic information economics in France

    NASA Astrophysics Data System (ADS)

    Denégre, Jean

    In national and international economics, geographic information plays a role which is generally acknowledged to be important but which is, however, difficult to assess quantitatively, its applications being rather miscellaneous and indirect. Computer graphics and telecommunications increase that importance still more and justify many investments and research into new cartographic forms. As part of its responsibility for participating in the promotion of those developments, by taking into account needs expressed by public or private users, the National Council for Geographic Information (C.N.I.G.) has undertaken a general evaluation of the economic and social utility of geographic information in France. The study involves an estimation of the cost of production and research activities, which are probably about 0.1% of the Gross National Product, similar to many other countries. It also devised a method of estimating "cost/advantage" ratios applicable to these "intangible" benefits. Within that framework, remote sensing emphasizes particular aspects related both to the increase of economic performance in cartographic production and to the advent of new products and new ways of utilization. A review of some significant sectors shows effective earnings of about 10-20%, or even 50% or 100% of the costs, and these are doubtless much greater for the efficacy in the exploitation of products. Finally, many entirely new applications result from extensions in various fields which would have been impossible without remote sensing: here the "cost/advantage" ratio cannot even be compared with previous processes. Studies were undertaken in parallel to define different types of products derived from satellite imagery, as well as those domains where development effort is required in order to make new advances.

  20. CMEIAS color segmentation: an improved computing technology to process color images for quantitative microbial ecology studies at single-cell resolution.

    PubMed

    Gross, Colin A; Reddy, Chandan K; Dazzo, Frank B

    2010-02-01

    Quantitative microscopy and digital image analysis are underutilized in microbial ecology largely because of the laborious task of segmenting foreground object pixels from background, especially in complex color micrographs of environmental samples. In this paper, we describe an improved computing technology developed to alleviate this limitation. The system's uniqueness is its ability to edit digital images accurately when presented with the difficult yet commonplace challenge of removing background pixels whose three-dimensional color space overlaps the range that defines foreground objects. Image segmentation is accomplished by utilizing algorithms that address color and spatial relationships of user-selected foreground object pixels. Performance of the color segmentation algorithm evaluated on 26 complex micrographs at single pixel resolution had an overall pixel classification accuracy of 99+%. Several applications illustrate how this improved computing technology can successfully resolve numerous challenges of complex color segmentation in order to produce images from which quantitative information can be accurately extracted, thereby gaining new perspectives on the in situ ecology of microorganisms. Examples include improvements in the quantitative analysis of (1) microbial abundance and phylotype diversity of single cells classified by their discriminating color within heterogeneous communities, (2) cell viability, (3) spatial relationships and intensity of bacterial gene expression involved in cellular communication between individual cells within rhizoplane biofilms, and (4) biofilm ecophysiology based on ribotype-differentiated radioactive substrate utilization. The stand-alone executable file plus user manual and tutorial images for this color segmentation computing application are freely available at http://cme.msu.edu/cmeias/. This improved computing technology opens new opportunities of imaging applications where discriminating colors really matter most, thereby strengthening quantitative microscopy-based approaches to advance microbial ecology in situ at individual single-cell resolution.
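
    The CMEIAS color segmentation algorithm itself is not reproduced in the record; the sketch below is a much-simplified, hedged illustration of color-based foreground/background classification, labelling each pixel by its nearest user-selected sample color. It ignores the spatial relationships the actual system exploits, and the synthetic image is an assumption.

      import numpy as np

      def segment_by_color(image, fg_samples, bg_samples):
          """Label each pixel as foreground (1) or background (0) by nearest sample color.

          image:      (H, W, 3) float RGB array
          fg_samples: (Nf, 3) colors picked by the user from foreground objects
          bg_samples: (Nb, 3) colors picked from the background
          """
          pixels = image.reshape(-1, 3)
          d_fg = np.linalg.norm(pixels[:, None, :] - fg_samples[None, :, :], axis=2).min(axis=1)
          d_bg = np.linalg.norm(pixels[:, None, :] - bg_samples[None, :, :], axis=2).min(axis=1)
          return (d_fg < d_bg).astype(np.uint8).reshape(image.shape[:2])

      if __name__ == "__main__":
          rng = np.random.default_rng(3)
          # Synthetic micrograph: greenish cells on a brownish background whose color
          # range can approach the cell colors, the hard case discussed above.
          img = rng.normal([0.45, 0.35, 0.25], 0.05, (64, 64, 3))
          img[20:40, 20:40] = rng.normal([0.30, 0.55, 0.30], 0.05, (20, 20, 3))
          fg = np.array([[0.30, 0.55, 0.30], [0.32, 0.50, 0.28]])
          bg = np.array([[0.45, 0.35, 0.25], [0.50, 0.40, 0.30]])
          mask = segment_by_color(img, fg, bg)
          print("foreground pixel fraction:", mask.mean())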

  1. Analysis of Department of Defense Organic Depot Maintenance Capacity Management and Facility Utilization Factors

    DTIC Science & Technology

    1991-09-01

    System ( CAPMS ) in lieu of using DODI 4151.15H. Facility utilization rate computation is not explicitly defined; it is merely identified as a ratio of...front of a bottleneck buffers the critical resource and protects against disruption of the system. This approach optimizes facility utilization by...run titled BUFFERED BASELINE. Three different levels of inventory were used to evaluate the effect of increasing the inventory level on critical

  2. A new approach to global control of redundant manipulators

    NASA Technical Reports Server (NTRS)

    Seraji, Homayoun

    1989-01-01

    A new and simple approach to configuration control of redundant manipulators is presented. In this approach, the redundancy is utilized to control the manipulator configuration directly in task space, where the task will be performed. A number of kinematic functions are defined to reflect the desirable configuration that will be achieved for a given end-effector position. The user-defined kinematic functions and the end-effector Cartesian coordinates are combined to form a set of task-related configuration variables as generalized coordinates for the manipulator. An adaptive scheme is then utilized to globally control the configuration variables so as to achieve tracking of some desired reference trajectories. This accomplishes the basic task of desired end-effector motion, while utilizing the redundancy to achieve any additional task through the desired time variation of the kinematic functions. The control law is simple and computationally very fast, and does not require the complex manipulator dynamic model.
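
    A hedged, non-adaptive sketch of the configuration-control idea described above: the end-effector position task of a planar three-link arm is augmented with one user-defined kinematic function (here the elbow height, a hypothetical choice), and joint angles are iterated through the pseudoinverse of the augmented Jacobian toward the desired configuration variables. The paper's actual scheme is adaptive and model-free; this shows only the kinematic core, with invented link lengths and targets.

      import numpy as np

      L = np.array([0.5, 0.4, 0.3])          # link lengths of a planar 3-link arm (m)

      def fk(q):
          """End-effector position (x, y) plus a user-defined kinematic function:
          here the elbow (joint-2) height, chosen to shape the arm's configuration."""
          c = np.cumsum(q)
          x = np.sum(L * np.cos(c))
          y = np.sum(L * np.sin(c))
          elbow_y = L[0] * np.sin(q[0])
          return np.array([x, y, elbow_y])

      def jacobian(q, eps=1e-6):
          """Numerical Jacobian of the augmented task vector (3 tasks x 3 joints)."""
          J = np.zeros((3, 3))
          for i in range(3):
              dq = np.zeros(3)
              dq[i] = eps
              J[:, i] = (fk(q + dq) - fk(q - dq)) / (2 * eps)
          return J

      def track(q, target, steps=200, gain=0.5):
          """Drive the augmented configuration variables toward the desired values."""
          for _ in range(steps):
              err = target - fk(q)
              q = q + gain * (np.linalg.pinv(jacobian(q)) @ err)
          return q

      if __name__ == "__main__":
          q0 = np.array([0.3, 0.6, -0.4])
          # Desired end-effector position and desired elbow height (all hypothetical).
          target = np.array([0.6, 0.5, 0.35])
          q = track(q0, target)
          print("final configuration variables:", np.round(fk(q), 3), "target:", target)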

  3. Protein molecular data from ancient (>1 million years old) fossil material: pitfalls, possibilities and grand challenges.

    PubMed

    Schweitzer, Mary Higby; Schroeter, Elena R; Goshe, Michael B

    2014-07-15

    Advances in resolution and sensitivity of analytical techniques have provided novel applications, including the analyses of fossil material. However, the recovery of original proteinaceous components from very old fossil samples (defined as >1 million years (1 Ma) from previously named limits in the literature) is far from trivial. Here, we discuss the challenges to recovery of proteinaceous components from fossils, and the need for new sample preparation techniques, analytical methods, and bioinformatics to optimize and fully utilize the great potential of information locked in the fossil record. We present evidence for survival of original components across geological time, and discuss the potential benefits of recovery, analyses, and interpretation of fossil materials older than 1 Ma, both within and outside of the fields of evolutionary biology.

  4. Defining the system of care concept and philosophy: to update or not to update?

    PubMed

    Stroul, Beth A; Blau, Gary M

    2010-02-01

    This commentary considers the task of updating the system of care concept and philosophy within its historical context, reviewing the original intent of the definition and clarifying misconceptions about its meaning. The authors identify the aspects of the concept and philosophy that should be updated based on the latest thinking, experience, and data, such as incorporating applicability to a broader range of populations, increasing the emphasis on the core values, specifying desired outcomes, and adding accountability as a critical element. An updated definition and values and principles are proposed, and the importance of always presenting the definition along with the accompanying specification of the philosophy is emphasized in order to increase its utility in assisting the field to move from theory to practice.

  5. Adaptive Peircean decision aid project summary assessments.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Senglaub, Michael E.

    2007-01-01

    This effort's objective was to identify and hybridize a suite of technologies enabling the development of predictive decision aids for use principally in combat environments, but also in any complex information terrain. The required technologies included formal concept analysis for knowledge representation and information operations, Peircean reasoning to support hypothesis generation, Mill's canons to begin defining information operators that support the first two technologies, and co-evolutionary game theory to provide the environment/domain for assessing predictions from the reasoning engines. The intended application domain is the IED problem because of its inherent evolutionary nature. While a fully functioning integrated algorithm was not achieved, the hybridization and demonstration of the technologies was accomplished, and utility was demonstrated for a number of ancillary queries.

  6. Payload crew activity planning integration. Task 2: Inflight operations and training for payloads

    NASA Technical Reports Server (NTRS)

    Hitz, F. R.

    1976-01-01

    The primary objectives of the Payload Crew Activity Planning Integration task were to: (1) Determine feasible, cost-effective payload crew activity planning integration methods. (2) Develop an implementation plan and guidelines for payload crew activity plan (CAP) integration between the JSC Orbiter planners and the Payload Centers. Subtask objectives and study activities were defined as: (1) Determine Crew Activity Planning Interfaces. (2) Determine Crew Activity Plan Type and Content. (3) Evaluate Automated Scheduling Tools. (4) Develop a draft Implementation Plan for Crew Activity Planning Integration. The basic guidelines were to develop a plan applicable to the Shuttle operations timeframe, utilize existing center resources and expertise as much as possible, and minimize unnecessary data exchange not directly productive in the development of the end-product timelines.

  7. Genomes of diverse isolates of the marine cyanobacterium Prochlorococcus

    PubMed Central

    Biller, Steven J.; Berube, Paul M.; Berta-Thompson, Jessie W.; Kelly, Libusha; Roggensack, Sara E.; Awad, Lana; Roache-Johnson, Kathryn H.; Ding, Huiming; Giovannoni, Stephen J.; Rocap, Gabrielle; Moore, Lisa R.; Chisholm, Sallie W.

    2014-01-01

    The marine cyanobacterium Prochlorococcus is the numerically dominant photosynthetic organism in the oligotrophic oceans, and a model system in marine microbial ecology. Here we report 27 new whole genome sequences (2 complete and closed; 25 of draft quality) of cultured isolates, representing five major phylogenetic clades of Prochlorococcus. The sequenced strains were isolated from diverse regions of the oceans, facilitating studies of the drivers of microbial diversity—both in the lab and in the field. To improve the utility of these genomes for comparative genomics, we also define pre-computed clusters of orthologous groups of proteins (COGs), indicating how genes are distributed among these and other publicly available Prochlorococcus genomes. These data represent a significant expansion of Prochlorococcus reference genomes that are useful for numerous applications in microbial ecology, evolution and oceanography. PMID:25977791

  8. Fabrication of a wettability-gradient surface on copper by screen-printing techniques

    NASA Astrophysics Data System (ADS)

    Huang, Ding-Jun; Leu, Tzong-Shyng

    2015-08-01

    In this study, a screen-printing technique is utilized to fabricate a wettability-gradient surface on a copper substrate. The pattern definitions on the copper surface were freely fabricated to define the regions with different wettabilities, for which the printing definition technique was developed as an alternative to the existing costly photolithography techniques. This fabrication process using screen printing in tandem with chemical modification methods can easily realize an excellent wettability-gradient surface with superhydrophobicity and superhydrophilicity. Surface analyses were performed to characterize conditions in some fabrication steps. A water droplet movement sequence is provided to clearly demonstrate the droplet-driving effectiveness of the fabricated gradient surface. The droplet-driving efficiency offers a promising solution for condensation heat transfer applications in the foreseeable future.

  9. Prevention concept in industry: improvement in occupational safety and health protection--an empirical study.

    PubMed

    Ramsauer, F

    2001-12-01

    This prevention concept offers a contribution to the expansion of the set of instruments for occupational safety and health protection within workplace prevention. The concept involves the multilateral analysis of work conditions. The utilized instruments include a strategy group, a survey, a health issue round table, and an analysis of work demands, and lead to synergy effects at the results level. Employees are drawn into the analysis of work conditions and workplace design solutions for the improvement of the work situation. The prevention concept was tested in a large company and its application established in practice. It was accepted by all participants, and the comparison with the previous situation (defined only through the analysis of work demands) demonstrated a significant improvement in health protection.

  10. Development and mechanical properties of structural materials from lunar simulants

    NASA Technical Reports Server (NTRS)

    Desai, Chandra S.; Girdner, K.; Saadatmanesh, H.; Allen, T.

    1991-01-01

    Development of the technologies for manufacture of structural and construction materials on the Moon, utilizing local lunar soil (regolith), without the use of water, is an important element for habitats and explorations in space. Here, it is vital that the mechanical behavior such as strength and flexural properties, fracture toughness, ductility and deformation characteristics be defined toward establishment of the ranges of engineering applications of the materials developed. The objective is to describe the research results in two areas for the above goal: (1) liquefaction of lunar simulant (at about 100 C) with different additives (fibers, powders, etc.); and (2) development and use of a new triaxial test device in which lunar simulants are first compressed under cycles of loading, and then tested with different vacuums and initial confining or in situ stress.

  11. A class-based link prediction using Distance Dependent Chinese Restaurant Process

    NASA Astrophysics Data System (ADS)

    Andalib, Azam; Babamir, Seyed Morteza

    2016-08-01

    Link prediction is an important task in relational data analysis that has been applied successfully in many domains such as bioinformatics and information retrieval. It is defined as predicting the existence or absence of edges between nodes of a network. In this paper, we propose a novel method for link prediction based on the Distance Dependent Chinese Restaurant Process (DDCRP) model, which enables us to utilize information about the topological structure of the network, such as shortest paths and the connectivity of nodes. We also propose a new Gibbs sampling algorithm for computing the posterior distribution of the hidden variables based on the training data. Experimental results on three real-world datasets show the superiority of the proposed method over other probabilistic models for the link prediction problem.
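    The DDCRP prior at the heart of the method assigns each node a "customer link" with probability decaying in the pairwise distance (here, for example, shortest-path length); a minimal sketch with an assumed exponential decay function follows.

    ```python
    import numpy as np

    def ddcrp_link_probabilities(dist, alpha=1.0, decay=1.0):
        """Prior probability that node i links to node j under a distance dependent CRP.
        dist: (n, n) pairwise distances (e.g., shortest-path lengths); alpha: self-link
        concentration; decay: rate of the exponential decay function (assumed here)."""
        f = np.exp(-decay * np.asarray(dist, dtype=float))   # decay function f(d_ij)
        np.fill_diagonal(f, alpha)                           # weight of linking to itself
        return f / f.sum(axis=1, keepdims=True)              # row-normalized probabilities
    ```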

  12. Controller–Pilot Data Link Communication Security

    PubMed Central

    Polishchuk, Tatiana; Wernberg, Max

    2018-01-01

    The increased utilization of the new types of cockpit communications, including controller–pilot data link communications (CPDLC), puts the airplane at higher risk of hacking or interference than ever before. We review the technological characteristics and properties of the CPDLC and construct the corresponding threat model. Based on the limitations imposed by the system parameters, we propose several solutions for the improved security of the data messaging communication used in air traffic management (ATM). We discuss the applicability of elliptical curve cryptography (ECC), protected aircraft communications addressing and reporting systems (PACARs) and the Host Identity Protocol (HIP) as possible countermeasures to the identified security threats. In addition, we consider identity-defined networking (IDN) as an example of a genuine security solution which implies global changes in the whole air traffic communication system. PMID:29783791

  13. Life support approaches for Mars missions

    NASA Technical Reports Server (NTRS)

    Drysdale, A. E.; Ewert, M. K.; Hanford, A. J.

    2003-01-01

    Life support approaches for Mars missions are evaluated using an equivalent system mass (ESM) approach, in which all significant costs are converted into mass units. The best approach, as defined by the lowest mission ESM, depends on several mission parameters, notably duration, environment and consequent infrastructure costs, and crew size, as well as the characteristics of the technologies which are available. Generally, for the missions under consideration, physicochemical regeneration is most cost effective. However, bioregeneration is likely to be of use for producing salad crops for any mission, for producing staple crops for medium duration missions, and for most food, air and water regeneration for long missions (durations of a decade). Potential applications of in situ resource utilization need to be considered further. © 2002 Published by Elsevier Science Ltd on behalf of COSPAR.
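    The ESM idea of folding volume, power, cooling, and crew time into mass units is commonly written as ESM = M + V·Veq + P·Peq + C·Ceq + CT·D·CTeq; the sketch below assumes that form and leaves the mission-specific equivalency factors as inputs, since their values are not given in this record.

    ```python
    def equivalent_system_mass(mass_kg, volume_m3, power_kw, cooling_kw,
                               crewtime_hr_per_yr, duration_yr,
                               v_eq, p_eq, c_eq, ct_eq):
        """Equivalent system mass (kg): ESM = M + V*Veq + P*Peq + C*Ceq + CT*D*CTeq.
        The *_eq factors are mission-specific mass equivalencies (kg/m^3, kg/kW,
        kg/kW, kg/crew-hour) and must be supplied for the mission in question."""
        return (mass_kg
                + volume_m3 * v_eq
                + power_kw * p_eq
                + cooling_kw * c_eq
                + crewtime_hr_per_yr * duration_yr * ct_eq)
    ```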

  14. The algebra of healthcare reform: hospital-physician economic alignment.

    PubMed

    Goodroe, J H; Murphy, D A

    1999-01-01

    In summary, the tertiary care programs in this nation are trapped in a difficult dilemma. On one side is the ongoing reduction in provider revenue driven by real and powerful market forces. On the other side is a traditional payment system governed by necessary laws that inhibit meaningful re-engineering of tertiary care delivery. If a remedy to this situation cannot be created, then it is very likely that all aspects of quality, as defined earlier, will suffer. It is our hope that by very careful construction of a relationship, with attention to applicable statutes and careful measurement of utilization and quality, a limited business alignment of a hospital and a group of tertiary physicians can be approved in the care of Medicare, Medicaid, and all federally funded patients.

  15. A novel quantum LSB-based steganography method using the Gray code for colored quantum images

    NASA Astrophysics Data System (ADS)

    Heidari, Shahrokh; Farzadnia, Ehsan

    2017-10-01

    As one of the prevalent data-hiding techniques, steganography is defined as the act of imperceptibly concealing secret information in a cover medium, encompassing text, image, video, and audio, so that nobody except the intended receiver can recover the secret data. In this paper, a quantum LSB-based steganography method utilizing the Gray code for quantum RGB images is investigated. This method uses the Gray code to accommodate two secret qubits in the 3 LSBs of each pixel simultaneously, according to reference tables. Experimental results, analyzed in the MATLAB environment, show that the present scheme performs well and is more secure and applicable than the previous scheme found in the literature.
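    As a purely classical illustration of pairing a 2-bit Gray code with pixel LSBs (the quantum circuits and reference tables of the record's scheme are not reproduced here), one hypothetical embedding rule might look like this.

    ```python
    def gray_encode(bits2):
        """2-bit binary to Gray code: 00->00, 01->01, 10->11, 11->10."""
        return bits2 ^ (bits2 >> 1)

    def embed_pixel(r, g, b, secret2):
        """Hypothetical classical analogue: put the Gray-coded 2-bit secret into the
        LSBs of R and G, and keep their parity in the LSB of B as a simple check."""
        code = gray_encode(secret2 & 0b11)
        hi, lo = (code >> 1) & 1, code & 1
        return (r & ~1) | hi, (g & ~1) | lo, (b & ~1) | (hi ^ lo)
    ```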

  16. Controller–Pilot Data Link Communication Security.

    PubMed

    Gurtov, Andrei; Polishchuk, Tatiana; Wernberg, Max

    2018-05-20

    The increased utilization of the new types of cockpit communications, including controller–pilot data link communications (CPDLC), puts the airplane at higher risk of hacking or interference than ever before. We review the technological characteristics and properties of the CPDLC and construct the corresponding threat model. Based on the limitations imposed by the system parameters, we propose several solutions for the improved security of the data messaging communication used in air traffic management (ATM). We discuss the applicability of elliptical curve cryptography (ECC), protected aircraft communications addressing and reporting systems (PACARs) and the Host Identity Protocol (HIP) as possible countermeasures to the identified security threats. In addition, we consider identity-defined networking (IDN) as an example of a genuine security solution which implies global changes in the whole air traffic communication system.

  17. Health lifestyle theory and the convergence of agency and structure.

    PubMed

    Cockerham, William C

    2005-03-01

    This article utilizes the agency-structure debate as a framework for constructing a health lifestyle theory. No such theory currently exists, yet the need for one is underscored by the fact that many daily lifestyle practices involve considerations of health outcomes. An individualist paradigm has influenced concepts of health lifestyles in several disciplines, but this approach neglects the structural dimensions of such lifestyles and has limited applicability to the empirical world. The direction of this article is to present a theory of health lifestyles that includes considerations of both agency and structure, with an emphasis upon restoring structure to its appropriate position. The article begins by defining agency and structure, followed by presentation of a health lifestyle model and the theoretical and empirical studies that support it.

  18. Life support approaches for Mars missions.

    PubMed

    Drysdale, A E; Ewert, M K; Hanford, A J

    2003-01-01

    Life support approaches for Mars missions are evaluated using an equivalent system mass (ESM) approach, in which all significant costs are converted into mass units. The best approach, as defined by the lowest mission ESM, depends on several mission parameters, notably duration, environment and consequent infrastructure costs, and crew size, as well as the characteristics of the technologies which are available. Generally, for the missions under consideration, physicochemical regeneration is most cost effective. However, bioregeneration is likely to be of use for producing salad crops for any mission, for producing staple crops for medium duration missions, and for most food, air and water regeneration for long missions (durations of a decade). Potential applications of in situ resource utilization need to be considered further. © 2002 Published by Elsevier Science Ltd on behalf of COSPAR.

  19. Extension of TOPAS for the simulation of proton radiation effects considering molecular and cellular endpoints

    NASA Astrophysics Data System (ADS)

    Polster, Lisa; Schuemann, Jan; Rinaldi, Ilaria; Burigo, Lucas; McNamara, Aimee L.; Stewart, Robert D.; Attili, Andrea; Carlson, David J.; Sato, Tatsuhiko; Ramos Méndez, José; Faddegon, Bruce; Perl, Joseph; Paganetti, Harald

    2015-07-01

    The aim of this work is to extend a widely used proton Monte Carlo tool, TOPAS, towards the modeling of relative biological effect (RBE) distributions in experimental arrangements as well as patients. TOPAS provides a software core which users configure by writing parameter files to, for instance, define application specific geometries and scoring conditions. Expert users may further extend TOPAS scoring capabilities by plugging in their own additional C++ code. This structure was utilized for the implementation of eight biophysical models suited to calculate proton RBE. As far as physics parameters are concerned, four of these models are based on the proton linear energy transfer, while the others are based on DNA double strand break induction and the frequency-mean specific energy, lineal energy, or delta electron generated track structure. The biological input parameters for all models are typically inferred from fits of the models to radiobiological experiments. The model structures have been implemented in a coherent way within the TOPAS architecture. Their performance was validated against measured experimental data on proton RBE in a spread-out Bragg peak using V79 Chinese Hamster cells. This work is an important step in bringing biologically optimized treatment planning for proton therapy closer to the clinical practice as it will allow researchers to refine and compare pre-defined as well as user-defined models.
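    For orientation only, the LET-based family of models mentioned above can be caricatured by an RBE expression that grows linearly with dose-averaged LET; the coefficient and functional form below are illustrative assumptions, not one of the eight models implemented in the TOPAS extension.

    ```python
    def rbe_weighted_dose(dose_gy, let_d_kev_um, alpha_beta_gy, c=0.04):
        """Toy LET-linear model: RBE = 1 + c * LET_d / (alpha/beta); returns RBE * dose.
        The coefficient c (Gy*um/keV) and the functional form are placeholders."""
        rbe = 1.0 + c * let_d_kev_um / alpha_beta_gy
        return rbe * dose_gy
    ```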

  20. Extension of TOPAS for the simulation of proton radiation effects considering molecular and cellular endpoints

    PubMed Central

    Polster, Lisa; Schuemann, Jan; Rinaldi, Ilaria; Burigo, Lucas; McNamara, Aimee L.; Stewart, Robert D.; Attili, Andrea; Carlson, David J.; Sato, Tatsuhiko; Méndez, José Ramos; Faddegon, Bruce; Perl, Joseph; Paganetti, Harald

    2015-01-01

    The aim of this work is to extend a widely used proton Monte Carlo tool, TOPAS, towards the modeling of relative biological effect (RBE) distributions in experimental arrangements as well as patients. TOPAS provides a software core which users configure by writing parameter files to, for instance, define application specific geometries and scoring conditions. Expert users may further extend TOPAS scoring capabilities by plugging in their own additional C++ code. This structure was utilized for the implementation of eight biophysical models suited to calculate proton RBE. As far as physics parameters are concerned, four of these models are based on the proton linear energy transfer (LET), while the others are based on DNA Double Strand Break (DSB) induction and the frequency-mean specific energy, lineal energy, or delta electron generated track structure. The biological input parameters for all models are typically inferred from fits of the models to radiobiological experiments. The model structures have been implemented in a coherent way within the TOPAS architecture. Their performance was validated against measured experimental data on proton RBE in a spread-out Bragg peak using V79 Chinese Hamster cells. This work is an important step in bringing biologically optimized treatment planning for proton therapy closer to the clinical practice as it will allow researchers to refine and compare pre-defined as well as user-defined models. PMID:26061666

  1. 18 CFR 4.60 - Applicability and notice to agencies.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... DETERMINATION OF PROJECT COSTS Application for License for Minor Water Power Projects and Major Water Power Projects 5 Megawatts or Less § 4.60 Applicability and notice to agencies. (a) Applicability. The provisions... water power project, as defined in § 4.30(b)(17); (2) Any major project—existing dam, as defined in § 4...

  2. 18 CFR 4.60 - Applicability and notice to agencies.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... DETERMINATION OF PROJECT COSTS Application for License for Minor Water Power Projects and Major Water Power Projects 5 Megawatts or Less § 4.60 Applicability and notice to agencies. (a) Applicability. The provisions... water power project, as defined in § 4.30(b)(17); (2) Any major project—existing dam, as defined in § 4...

  3. 18 CFR 4.60 - Applicability and notice to agencies.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... DETERMINATION OF PROJECT COSTS Application for License for Minor Water Power Projects and Major Water Power Projects 5 Megawatts or Less § 4.60 Applicability and notice to agencies. (a) Applicability. The provisions... water power project, as defined in § 4.30(b)(17); (2) Any major project—existing dam, as defined in § 4...

  4. 18 CFR 4.60 - Applicability and notice to agencies.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... DETERMINATION OF PROJECT COSTS Application for License for Minor Water Power Projects and Major Water Power Projects 5 Megawatts or Less § 4.60 Applicability and notice to agencies. (a) Applicability. The provisions... water power project, as defined in § 4.30(b)(17); (2) Any major project—existing dam, as defined in § 4...

  5. 18 CFR 4.60 - Applicability and notice to agencies.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... DETERMINATION OF PROJECT COSTS Application for License for Minor Water Power Projects and Major Water Power Projects 5 Megawatts or Less § 4.60 Applicability and notice to agencies. (a) Applicability. The provisions... water power project, as defined in § 4.30(b)(17); (2) Any major project—existing dam, as defined in § 4...

  6. Dynamic shear deformation in high purity Fe

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cerreta, Ellen K; Bingert, John F; Trujillo, Carl P

    2009-01-01

    The forced shear test specimen, first developed by Meyer et al. [Meyer L. et al., Critical Adiabatic Shear Strength of Low Alloyed Steel Under Compressive Loading, Metallurgical Applications of Shock Wave and High Strain Rate Phenomena (Marcel Dekker, 1986), 657; Hartmann K. et al., Metallurgical Effects on Impact Loaded Materials, Shock Waves and High Strain Rate Phenomena in Metals (Plenum, 1981), 325-337], has been utilized in a number of studies. While the geometry of this specimen does not allow the microstructure to exactly define the location of shear band formation, and the overall mechanical response of a specimen is highly sensitive to the geometry utilized, the forced shear specimen is useful for characterizing the influence of parameters such as strain rate, temperature, strain, and load on the microstructural evolution within a shear band. Additionally, many studies have utilized this geometry to advance the understanding of shear band development. In this study, by varying the geometry, specifically the ratio of the inner hole diameter to the outer hat diameter, the dynamic shear localization response of high-purity Fe was examined. Post mortem characterization was performed to quantify the width of the localizations and examine the microstructural and textural evolution of shear deformation in a bcc metal. Increased instability in mechanical response is strongly linked with the development of enhanced intergranular misorientations, high-angle boundaries, and classical shear textures characterized through orientation distribution functions.

  7. Utilization of Parenteral Morphine by Application of ATC/DDD Methodology: Retrospective Study in the Referral Teaching Hospital.

    PubMed

    Dragojevic-Simic, Viktorija; Rancic, Nemanja; Stamenkovic, Dusica; Simic, Radoje

    2017-01-01

    Few studies have analyzed the pattern of opioid analgesic utilization in hospital settings. The aim of this study was to determine the consumption pattern of parenteral morphine in patients hospitalized in a Serbian referral teaching hospital and to compare it with utilization at the national and international levels. In this retrospective study, the required data were extracted from the medical records of surgical patients who received parenteral morphine in the 5-year period from 2011 to 2015. We used the Anatomical Therapeutic Chemical Classification/Defined Daily Doses (DDD) international system for consumption evaluation. While the number of surgical procedures performed in our hospital steadily increased from 2011 to 2015, the number of inpatient bed-days decreased from 2012. The consumption of parenteral morphine varied and did not exceed 0.867 DDD/100 bed-days in the observed period. Based on the available data, parenteral morphine consumption in our hospital was lower than reported internationally. The low level of morphine use in the hospital was in accordance with national data; compared with other countries, morphine consumption for medical indications in Serbia was low. Adequate legal provisions to ensure the availability of opioids, better education and training of medical personnel, and a multidisciplinary approach should enable more rational and individualized pain management in the future, not only within hospitals.
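    The ATC/DDD metric used above reduces to a simple normalization; the sketch below assumes the WHO defined daily dose for parenteral morphine of 30 mg (verify against the current ATC index before reuse).

    ```python
    def ddd_per_100_bed_days(total_mg_administered, bed_days, ddd_mg=30.0):
        """Drug utilization expressed as defined daily doses per 100 bed-days.
        ddd_mg defaults to 30 mg, the WHO DDD assigned to parenteral morphine."""
        return (total_mg_administered / ddd_mg) / bed_days * 100.0
    ```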

  8. The choice of primary energy source including PV installation for providing electric energy to a public utility building - a case study

    NASA Astrophysics Data System (ADS)

    Radomski, Bartosz; Ćwiek, Barbara; Mróz, Tomasz M.

    2017-11-01

    The paper presents a multicriteria decision aid analysis of the choice of a PV installation providing electric energy to a public utility building. From the energy management point of view, electricity obtained from solar radiation has become a crucial renewable energy source. The application of PV installations may prove a profitable solution from the energy, economic, and ecological points of view for both existing and newly erected buildings. The featured variants of PV installations have been assessed by multicriteria analysis based on the ANP (Analytic Network Process) method. Technical, economic, energy, and environmental criteria have been identified as the main decision criteria. The defined set of decision criteria is open and can be modified in a dialog between the decision-maker and the expert - in the present case, an expert in planning the development of energy supply systems. The proposed approach has been used to evaluate three variants of PV installation acceptable for an existing educational building located in Poznań, Poland - the building of the Faculty of Chemical Technology, Poznań University of Technology. Multicriteria analysis based on the ANP method and the Super Decisions calculation software has proven to be an effective tool for energy planning, leading to the indication of the recommended variant of PV installation in existing and newly erected public buildings. The results show the prospects and possibilities of rational renewable energy use as a comprehensive solution for public utility buildings.
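    The local-priority step shared by AHP and ANP (the full ANP used in the study additionally assembles these priorities into a weighted supermatrix, as the Super Decisions software does) can be sketched as a principal-eigenvector calculation over a reciprocal pairwise-comparison matrix; this is a simplified fragment, not the study's complete network model, and the example judgments are hypothetical.

    ```python
    import numpy as np

    def priority_vector(pairwise):
        """Local priorities from a reciprocal pairwise-comparison matrix via the
        principal eigenvector (normalized to sum to 1)."""
        a = np.asarray(pairwise, dtype=float)
        vals, vecs = np.linalg.eig(a)
        w = np.abs(vecs[:, np.argmax(vals.real)].real)
        return w / w.sum()

    # Example: three PV variants compared on a single criterion (hypothetical judgments).
    print(priority_vector([[1, 3, 5], [1/3, 1, 2], [1/5, 1/2, 1]]))
    ```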

  9. Developing a stochastic conflict resolution model for urban runoff quality management: Application of info-gap and bargaining theories

    NASA Astrophysics Data System (ADS)

    Ghodsi, Seyed Hamed; Kerachian, Reza; Estalaki, Siamak Malakpour; Nikoo, Mohammad Reza; Zahmatkesh, Zahra

    2016-02-01

    In this paper, two deterministic and stochastic multilateral, multi-issue, non-cooperative bargaining methodologies are proposed for urban runoff quality management. In the proposed methodologies, a calibrated Storm Water Management Model (SWMM) is used to simulate stormwater runoff quantity and quality for different urban stormwater runoff management scenarios, which have been defined considering several Low Impact Development (LID) techniques. In the deterministic methodology, the best management scenario, representing location and area of LID controls, is identified using the bargaining model. In the stochastic methodology, uncertainties of some key parameters of SWMM are analyzed using the info-gap theory. For each water quality management scenario, robustness and opportuneness criteria are determined based on utility functions of different stakeholders. Then, to find the best solution, the bargaining model is performed considering a combination of robustness and opportuneness criteria for each scenario based on utility function of each stakeholder. The results of applying the proposed methodology in the Velenjak urban watershed located in the northeastern part of Tehran, the capital city of Iran, illustrate its practical utility for conflict resolution in urban water quantity and quality management. It is shown that the solution obtained using the deterministic model cannot outperform the result of the stochastic model considering the robustness and opportuneness criteria. Therefore, it can be concluded that the stochastic model, which incorporates the main uncertainties, could provide more reliable results.
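    The info-gap robustness criterion computed for each scenario can be illustrated for a single uncertain parameter with a fractional-error uncertainty model; the interface below (a stakeholder utility function `perf` and a critical requirement `r_crit`) is an assumption for illustration, not the coupled SWMM/bargaining implementation.

    ```python
    def robustness(u_nominal, perf, r_crit, alphas):
        """Largest horizon of uncertainty alpha (from the candidate list `alphas`)
        such that the worst-case performance still meets the requirement, under the
        fractional-error model |u - u_nominal| <= alpha * u_nominal. Assumes `perf`
        is monotone on each interval so the worst case sits at an endpoint."""
        best = 0.0
        for a in sorted(alphas):
            lo, hi = u_nominal * (1 - a), u_nominal * (1 + a)
            if min(perf(lo), perf(hi)) >= r_crit:
                best = a
            else:
                break
        return best
    ```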

  10. Validation of high throughput sequencing and microbial forensics applications

    PubMed Central

    2014-01-01

    High throughput sequencing (HTS) generates large amounts of high quality sequence data for microbial genomics. The value of HTS for microbial forensics is the speed at which evidence can be collected and the power to characterize microbial-related evidence to solve biocrimes and bioterrorist events. As HTS technologies continue to improve, they provide increasingly powerful sets of tools to support the entire field of microbial forensics. Accurate, credible results allow analysis and interpretation, significantly influencing the course and/or focus of an investigation, and can impact the response of the government to an attack having individual, political, economic or military consequences. Interpretation of the results of microbial forensic analyses relies on understanding the performance and limitations of HTS methods, including analytical processes, assays and data interpretation. The utility of HTS must be defined carefully within established operating conditions and tolerances. Validation is essential in the development and implementation of microbial forensics methods used for formulating investigative leads attribution. HTS strategies vary, requiring guiding principles for HTS system validation. Three initial aspects of HTS, irrespective of chemistry, instrumentation or software are: 1) sample preparation, 2) sequencing, and 3) data analysis. Criteria that should be considered for HTS validation for microbial forensics are presented here. Validation should be defined in terms of specific application and the criteria described here comprise a foundation for investigators to establish, validate and implement HTS as a tool in microbial forensics, enhancing public safety and national security. PMID:25101166

  11. Validation of high throughput sequencing and microbial forensics applications.

    PubMed

    Budowle, Bruce; Connell, Nancy D; Bielecka-Oder, Anna; Colwell, Rita R; Corbett, Cindi R; Fletcher, Jacqueline; Forsman, Mats; Kadavy, Dana R; Markotic, Alemka; Morse, Stephen A; Murch, Randall S; Sajantila, Antti; Schmedes, Sarah E; Ternus, Krista L; Turner, Stephen D; Minot, Samuel

    2014-01-01

    High throughput sequencing (HTS) generates large amounts of high quality sequence data for microbial genomics. The value of HTS for microbial forensics is the speed at which evidence can be collected and the power to characterize microbial-related evidence to solve biocrimes and bioterrorist events. As HTS technologies continue to improve, they provide increasingly powerful sets of tools to support the entire field of microbial forensics. Accurate, credible results allow analysis and interpretation, significantly influencing the course and/or focus of an investigation, and can impact the response of the government to an attack having individual, political, economic or military consequences. Interpretation of the results of microbial forensic analyses relies on understanding the performance and limitations of HTS methods, including analytical processes, assays and data interpretation. The utility of HTS must be defined carefully within established operating conditions and tolerances. Validation is essential in the development and implementation of microbial forensics methods used for formulating investigative leads attribution. HTS strategies vary, requiring guiding principles for HTS system validation. Three initial aspects of HTS, irrespective of chemistry, instrumentation or software are: 1) sample preparation, 2) sequencing, and 3) data analysis. Criteria that should be considered for HTS validation for microbial forensics are presented here. Validation should be defined in terms of specific application and the criteria described here comprise a foundation for investigators to establish, validate and implement HTS as a tool in microbial forensics, enhancing public safety and national security.

  12. Class D Management Implementation Approach of the First Orbital Mission of the Earth Venture Series

    NASA Technical Reports Server (NTRS)

    Wells, James E.; Scherrer, John; Law, Richard; Bonniksen, Chris

    2013-01-01

    A key element of the National Research Council's Earth Science and Applications Decadal Survey called for the creation of the Venture Class line of low-cost research and application missions within NASA (National Aeronautics and Space Administration). One key component of the architecture chosen by NASA within the Earth Venture line is a series of self-contained stand-alone spaceflight science missions called "EV-Mission". The first mission chosen for this competitively selected, cost and schedule capped, Principal Investigator-led opportunity is the CYclone Global Navigation Satellite System (CYGNSS). As specified in the defining Announcement of Opportunity, the Principal Investigator is held responsible for successfully achieving the science objectives of the selected mission and the management approach that he/she chooses to obtain those results has a significant amount of freedom as long as it meets the intent of key NASA guidance like NPR 7120.5 and 7123. CYGNSS is classified under NPR 7120.5E guidance as a Category 3 (low priority, low cost) mission and carries a Class D risk classification (low priority, high risk) per NPR 8705.4. As defined in the NPR guidance, Class D risk classification allows for a relatively broad range of implementation strategies. The management approach that will be utilized on CYGNSS is a streamlined implementation that starts with a higher risk tolerance posture at NASA and that philosophy flows all the way down to the individual part level.

  13. Athletic Departments' Operating Expenses as a Predictor of Their Directors' Cup Standing

    ERIC Educational Resources Information Center

    Magner, Amber

    2014-01-01

    The NACDA Directors' Cup is a competition utilizing an unbiased scoring system that encourages a broad based athletic department as the standard for defining intercollegiate athletic success. Therefore, for NCAA DI athletic administrators the Directors' Cup should be the standard for defining intercollegiate athletic success. The purpose of this…

  14. 18 CFR 368.2 - General instructions.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... transaction. (2) Company means a service company or a holding company as defined in § 367.1 of this chapter... utility, licensee, or natural gas company, as defined in the Federal Power Act (16 U.S.C. §§ 824 et seq... officials to supervise the preservation or authorized destruction of its records. (c) Protection and storage...

  15. 18 CFR 368.2 - General instructions.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... transaction. (2) Company means a service company or a holding company as defined in § 367.1 of this chapter... utility, licensee, or natural gas company, as defined in the Federal Power Act (16 U.S.C. §§ 824 et seq... officials to supervise the preservation or authorized destruction of its records. (c) Protection and storage...

  16. 18 CFR 368.2 - General instructions.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... transaction. (2) Company means a service company or a holding company as defined in § 367.1 of this chapter... utility, licensee, or natural gas company, as defined in the Federal Power Act (16 U.S.C. §§ 824 et seq... officials to supervise the preservation or authorized destruction of its records. (c) Protection and storage...

  17. 18 CFR 368.2 - General instructions.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... transaction. (2) Company means a service company or a holding company as defined in § 367.1 of this chapter... utility, licensee, or natural gas company, as defined in the Federal Power Act (16 U.S.C. §§ 824 et seq... officials to supervise the preservation or authorized destruction of its records. (c) Protection and storage...

  18. The Relationship between Kolb's Learning Styles and StrengthsFinder's Talent Themes

    ERIC Educational Resources Information Center

    Caldwell, Adonna B.

    2009-01-01

    The purpose of this study was to investigate whether there were relationships between college students' talent themes, as defined by the Clifton StrengthsFinder(TM) Instrument, and their learning styles, as defined by the Kolb Learning Styles Inventory. Logistic regression methodology was utilized to assess the relationship between learning styles and talent…

  19. The once and future application of cost-effectiveness analysis.

    PubMed

    Berger, M L

    1999-09-01

    Cost-effectiveness analysis (CEA) is used by payers to make coverage decisions, by providers to make formulary decisions, and by large purchasers/employers and policymakers to choose health care performance measures. However, it continues to be poorly utilized in the marketplace because of overriding financial imperatives to control costs and a low apparent willingness to pay for quality. There is no obvious relationship between the cost-effectiveness of life-saving interventions and their application. Health care decision makers consider financial impact, safety, and effectiveness before cost-effectiveness. WHY IS CEA NOT MORE WIDELY APPLIED? Most health care providers have a short-term parochial financial perspective, whereas CEA takes a long-term view that captures all costs, benefits, and hazards, regardless of to whom they accrue. In addition, a history of poor standardization of methods, unrealistic expectations that CEA could answer fundamental ethical and political issues, and society's failure to accept the need for allocating scarce resources more judiciously, have contributed to relatively little use of the method by decision makers. HOW WILL CEA FIND GREATER UTILITY IN THE FUTURE? As decision makers take a longer-term view and understand that CEA can provide a quantitative perspective on important resource allocation decisions, including the distributional consequences of alternative choices, CEA is likely to find greater use. However, it must be embedded within a framework that promotes confidence in the social justice of health care decision making through ongoing dialogue about how the value of health and health care are defined.

  20. Utilizing toxicogenomic data to understand chemical mechanism of action in risk assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilson, Vickie S., E-mail: wilson.vickie@epa.gov; Keshava, Nagalakshmi; Hester, Susan

    2013-09-15

    The predominant role of toxicogenomic data in risk assessment, thus far, has been one of augmentation of more traditional in vitro and in vivo toxicology data. This article focuses on the currently available examples of instances where toxicogenomic data has been evaluated in human health risk assessment (e.g., acetochlor and arsenicals), which have been limited to the application of toxicogenomic data to inform mechanism of action. This article reviews the regulatory policy backdrop and highlights important efforts to ultimately achieve regulatory acceptance. A number of research efforts on specific chemicals that were designed for risk assessment purposes have employed mechanism or mode of action hypothesis testing and generating strategies. The strides made by large scale efforts to utilize toxicogenomic data in screening, testing, and risk assessment are also discussed. These efforts include both the refinement of methodologies for performing toxicogenomics studies and analysis of the resultant data sets. The current issues limiting the application of toxicogenomics to define mode or mechanism of action in risk assessment are discussed together with interrelated research needs. In summary, as chemical risk assessment moves away from a single mechanism of action approach toward a toxicity pathway-based paradigm, we envision that toxicogenomic data from multiple technologies (e.g., proteomics, metabolomics, transcriptomics, supportive RT-PCR studies) can be used in conjunction with one another to understand the complexities of multiple, and possibly interacting, pathways affected by chemicals, which will impact human health risk assessment.

  1. Endothelial cell culture in microfluidic devices for investigating microvascular processes.

    PubMed

    Mannino, Robert G; Qiu, Yongzhi; Lam, Wilbur A

    2018-07-01

    Numerous conditions and disease states such as sickle cell disease, malaria, thrombotic microangiopathy, and stroke significantly impact the microvasculature function and its role in disease progression. Understanding the role of cellular interactions and microvascular hemodynamic forces in the context of disease is crucial to understanding disease pathophysiology. In vivo models of microvascular disease using animal models often coupled with intravital microscopy have long been utilized to investigate microvascular phenomena. However, these methods suffer from some major drawbacks, including the inability to tightly and quantitatively control experimental conditions, the difficulty of imaging multiple microvascular beds within a living organism, and the inability to isolate specific microvascular geometries such as bifurcations. Thus, there exists a need for in vitro microvascular models that can mitigate the drawbacks associated with in vivo systems. To that end, microfluidics has been widely used to develop such models, as it allows for tight control of system inputs, facile imaging, and the ability to develop robust and repeatable systems with well-defined geometries. Incorporating endothelial cells to branching microfluidic models allows for the development of "endothelialized" systems that accurately recapitulate physiological microvessels. In this review, we summarize the field of endothelialized microfluidics, specifically focusing on fabrication methods, limitations, and applications of these systems. We then speculate on future directions and applications of these cutting edge technologies. We believe that this review of the field is of importance to vascular biologists and bioengineers who aim to utilize microfluidic technologies to solve vascular problems.

  2. Critical Uses of College Resources. Part I: Personnel Utilization System.

    ERIC Educational Resources Information Center

    Vlahos, Mantha

    A Personnel Utilization System has been designed at Broward Community College, which combines payroll, personnel, course, and function information in order to determine the actual duties performed by personnel for the amount of remuneration received. Objectives of the system are (1) to define the tasks being performed by faculty, staff, and…

  3. Electric deregulation: Defining and ensuring fair competition

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kahn, A.E.

    1998-04-01

    Regulation has several important duties in the transition of the electricity industry to competition. But in fulfilling these responsibilities, regulators must refrain from policies pressed upon them by consumer representatives, on one side, and would-be rivals of utility companies, on the other, that would artificially handicap utilities and blunt the salutary forces of competition.

  4. 26 CFR 1.199-7 - Expanded affiliated groups.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... in § 1.199-3(h)), a qualified film (as defined in § 1.199-3(k)), or electricity, natural gas, or... disposing member disposes of the QPP, qualified film, or utilities, then the disposing member is treated as... QPP, qualified film, or utilities in determining whether its gross receipts are domestic production...

  5. Metagenome Sequencing of a Coastal Marine Microbial Community from Monterey Bay, California

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mueller, Ryan S.; Bryson, Sam; Kieft, Brandon

    Heterotrophic microbes are critical components of aquatic food webs. Linkages between populations and the substrates they utilize are not well defined. Here we present the metagenome of microbial communities from the coastal Pacific Ocean exposed to various nutrient additions in order to better understand substrate utilization and partitioning in this environment.

  6. Metagenome Sequencing of a Coastal Marine Microbial Community from Monterey Bay, California

    DOE PAGES

    Mueller, Ryan S.; Bryson, Sam; Kieft, Brandon; ...

    2015-04-30

    Heterotrophic microbes are critical components of aquatic food webs. Linkages between populations and the substrates they utilize are not well defined. Here we present the metagenome of microbial communities from the coastal Pacific Ocean exposed to various nutrient additions in order to better understand substrate utilization and partitioning in this environment.

  7. A Software Defined Integrated T1 Digital Network for Voice, Data and Video.

    ERIC Educational Resources Information Center

    Hill, James R.

    The Dallas County Community College District developed and implemented a strategic plan for communications that utilizes a county-wide integrated network to carry voice, data, and video information to nine locations within the district. The network, which was installed and operational by March 1987, utilizes microwave, fiber optics, digital cross…

  8. Sociotechnical Systems Approach: An Internal Assessment of a Blended Doctoral Program

    ERIC Educational Resources Information Center

    Erichsen, Elizabeth Anne; DeLorme, Lyn; Connelley, Rosalinda; Okurut-Ibore, Christine; McNamara, Lisa; Aljohani, Obaidalah

    2013-01-01

    An internal assessment was conducted utilizing a sociotechnical systems approach and cultural lens as a means of exploring the dynamics of a blended doctoral program. Blended learning environments were conceived of as sociotechnical systems, and blended programs were defined as programs that utilize multimodal means for the mediation of…

  9. Screening for Social, Emotional, and Behavioral Problems at Kindergarten Entry: Utility and Incremental Validity of Parent Report

    ERIC Educational Resources Information Center

    Owens, Julie Sarno; Storer, Jennifer; Holdaway, Alex S.; Serrano, Verenea J.; Watabe, Yuko; Himawan, Lina K.; Krelko, Rebecca E.; Vause, Katherine J.; Girio-Herrera, Erin; Andrews, Nina

    2015-01-01

    The current study examined the utility and incremental validity of parent ratings on the Strengths and Difficulties Questionnaire and Disruptive Behavior Disorders rating scale completed at kindergarten registration in identifying risk status as defined by important criterion variables (teacher ratings, daily behavioral performance, and quarterly…

  10. 10 CFR 766.2 - Applicability.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 4 2012-01-01 2012-01-01 false Applicability. 766.2 Section 766.2 Energy DEPARTMENT OF ENERGY URANIUM ENRICHMENT DECONTAMINATION AND DECOMMISSIONING FUND; PROCEDURES FOR SPECIAL ASSESSMENT OF DOMESTIC UTILITIES General § 766.2 Applicability. This part applies to all domestic utilities in the United...

  11. 10 CFR 766.2 - Applicability.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 4 2014-01-01 2014-01-01 false Applicability. 766.2 Section 766.2 Energy DEPARTMENT OF ENERGY URANIUM ENRICHMENT DECONTAMINATION AND DECOMMISSIONING FUND; PROCEDURES FOR SPECIAL ASSESSMENT OF DOMESTIC UTILITIES General § 766.2 Applicability. This part applies to all domestic utilities in the United...

  12. 10 CFR 766.2 - Applicability.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 4 2013-01-01 2013-01-01 false Applicability. 766.2 Section 766.2 Energy DEPARTMENT OF ENERGY URANIUM ENRICHMENT DECONTAMINATION AND DECOMMISSIONING FUND; PROCEDURES FOR SPECIAL ASSESSMENT OF DOMESTIC UTILITIES General § 766.2 Applicability. This part applies to all domestic utilities in the United...

  13. 10 CFR 766.2 - Applicability.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 4 2011-01-01 2011-01-01 false Applicability. 766.2 Section 766.2 Energy DEPARTMENT OF ENERGY URANIUM ENRICHMENT DECONTAMINATION AND DECOMMISSIONING FUND; PROCEDURES FOR SPECIAL ASSESSMENT OF DOMESTIC UTILITIES General § 766.2 Applicability. This part applies to all domestic utilities in the United...

  14. 10 CFR 766.2 - Applicability.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 4 2010-01-01 2010-01-01 false Applicability. 766.2 Section 766.2 Energy DEPARTMENT OF ENERGY URANIUM ENRICHMENT DECONTAMINATION AND DECOMMISSIONING FUND; PROCEDURES FOR SPECIAL ASSESSMENT OF DOMESTIC UTILITIES General § 766.2 Applicability. This part applies to all domestic utilities in the United...

  15. 42 CFR 456.505 - Applicability of waiver.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... (CONTINUED) MEDICAL ASSISTANCE PROGRAMS UTILIZATION CONTROL Utilization Review Plans: FFP, Waivers, and Variances for Hospitals and Mental Hospitals Ur Plan: Waiver of Requirements § 456.505 Applicability of...

  16. Decision analysis defining optimal management of clinical stage 1 high-risk nonseminomatous germ cell testicular cancer with lymphovascular invasion.

    PubMed

    Avulova, Svetlana; Allen, Clayton; Morgans, Alicia; Moses, Kelvin A

    2018-05-10

    The risk of recurrent disease after orchiectomy for men with clinical stage 1 high-risk nonseminomatous germ cell testicular cancer (CS1 NSGCT) with lymphovascular invasion (LVI) is 50%, and the current treatment options (surveillance [S], retroperitoneal lymph node dissection [RPLND], or 1 cycle of BEP [BEP ×1]) are all associated with a 99% disease-specific survival; practice patterns therefore vary. We performed a decision analysis using updated data on long-term complications for men with CS1 NSGCT with LVI to quantify and assess relative treatment values. The decision analysis included previously defined utilities (via standard gamble) for posttreatment states of living, from 0 (death from disease) to 1 (alive in perfect health), and updated morbidity probabilities. We quantified the values of S, RPLND, and BEP ×1 via the rollback method. Sensitivity analyses covering a range of orchiectomy cure rates and utility values were performed. Estimated probabilities favoring treatment with RPLND (0.97) or BEP ×1 (0.97) were equivalent and superior to surveillance (0.88). Sensitivity analysis of orchiectomy cure rates (50%-100%) failed to find a cure rate that favored S over BEP ×1 or RPLND. Varying utility values for cure after S from 0.92 (previously defined utility) to 1 (perfect health) failed to find a viable utility state favoring S over BEP ×1 or RPLND. An orchiectomy cure rate of ≥82% would be required for S to equal either treatment. We demonstrate that for surveillance to be superior to treatment with BEP ×1 or RPLND, the orchiectomy cure rate must be at least 82%, which is not expected in a patient population with high-risk CS1 NSGCT. Copyright © 2018 Elsevier Inc. All rights reserved.
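    The rollback method referred to above is an expected-utility calculation at each chance node; the numbers in the snippet are hypothetical placeholders (only the 50% relapse risk and the 0.92 utility for cure after surveillance appear in the record), so it shows the mechanics rather than the published analysis.

    ```python
    def expected_utility(outcomes):
        """Roll back one chance node: outcomes is a list of (probability, utility) pairs."""
        assert abs(sum(p for p, _ in outcomes) - 1.0) < 1e-9
        return sum(p * u for p, u in outcomes)

    # Hypothetical illustration: surveillance with a 50% relapse risk, utility 0.92 for
    # cure on surveillance alone, and an assumed utility of 0.85 after salvage treatment.
    eu_surveillance = expected_utility([(0.50, 0.92), (0.50, 0.85)])
    ```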

  17. Rapid Induction of Cerebral Organoids From Human Induced Pluripotent Stem Cells Using a Chemically Defined Hydrogel and Defined Cell Culture Medium.

    PubMed

    Lindborg, Beth A; Brekke, John H; Vegoe, Amanda L; Ulrich, Connor B; Haider, Kerri T; Subramaniam, Sandhya; Venhuizen, Scott L; Eide, Cindy R; Orchard, Paul J; Chen, Weili; Wang, Qi; Pelaez, Francisco; Scott, Carolyn M; Kokkoli, Efrosini; Keirstead, Susan A; Dutton, James R; Tolar, Jakub; O'Brien, Timothy D

    2016-07-01

    Tissue organoids are a promising technology that may accelerate development of the societal and NIH mandate for precision medicine. Here we describe a robust and simple method for generating cerebral organoids (cOrgs) from human pluripotent stem cells by using a chemically defined hydrogel material and chemically defined culture medium. By using no additional neural induction components, cOrgs appeared on the hydrogel surface within 10-14 days, and under static culture conditions, they attained sizes up to 3 mm in greatest dimension by day 28. Histologically, the organoids showed neural rosette and neural tube-like structures and evidence of early corticogenesis. Immunostaining and quantitative reverse-transcription polymerase chain reaction demonstrated protein and gene expression representative of forebrain, midbrain, and hindbrain development. Physiologic studies showed responses to glutamate and depolarization in many cells, consistent with neural behavior. The method of cerebral organoid generation described here facilitates access to this technology, enables scalable applications, and provides a potential pathway to translational applications where defined components are desirable. Tissue organoids are a promising technology with many potential applications, such as pharmaceutical screens and development of in vitro disease models, particularly for human polygenic conditions where animal models are insufficient. This work describes a robust and simple method for generating cerebral organoids from human induced pluripotent stem cells by using a chemically defined hydrogel material and chemically defined culture medium. This method, by virtue of its simplicity and use of defined materials, greatly facilitates access to cerebral organoid technology, enables scalable applications, and provides a potential pathway to translational applications where defined components are desirable. ©AlphaMed Press.

  18. An IEEE 1451.1 Architecture for ISHM Applications

    NASA Technical Reports Server (NTRS)

    Morris, Jon A.; Turowski, Mark; Schmalzel, John L.; Figueroa, Jorge F.

    2007-01-01

    The IEEE 1451.1 Standard for a Smart Transducer Interface defines a common network information model for connecting and managing smart elements in control and data acquisition networks using network-capable application processors (NCAPs). The Standard is a network-neutral design model that is easily ported across operating systems and physical networks for implementing complex acquisition and control applications by simply plugging in the appropriate network level drivers. To simplify configuration and tracking of transducer and actuator details, the family of 1451 standards defines a Transducer Electronic Data Sheet (TEDS) that is associated with each physical element. The TEDS contains all of the pertinent information about the physical operations of a transducer (such as operating regions, calibration tables, and manufacturer information), which the NCAP uses to configure the system to support a specific transducer. The Integrated Systems Health Management (ISHM) group at NASA's John C. Stennis Space Center (SSC) has been developing an ISHM architecture that utilizes IEEE 1451.1 as the primary configuration and data acquisition mechanism for managing and collecting information from a network of distributed intelligent sensing elements. This work has involved collaboration with other NASA centers, universities and aerospace industries to develop IEEE 1451.1 compliant sensors and interfaces tailored to support health assessment of complex systems. This paper and presentation describe the development and implementation of an interface for the configuration, management and communication of data, information and knowledge generated by a distributed system of IEEE 1451.1 intelligent elements monitoring a rocket engine test system. In this context, an intelligent element is defined as one incorporating support for the IEEE 1451.x standards and additional ISHM functions. Our implementation supports real-time collection of both measurement data (raw ADC counts and converted engineering units) and health statistics produced by each intelligent element. The handling of configuration, calibration and health information is automated by using the TEDS in combination with other electronic data sheets extensions to convey health parameters. By integrating the IEEE 1451.1 Standard for a Smart Transducer Interface with ISHM technologies, each element within a complex system becomes a highly flexible computation engine capable of self-validation and performing other measures of the quality of information it is producing.
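    To make the role of the TEDS concrete, the sketch below models the kind of information it carries (identification, range, calibration) as a plain data structure with a piecewise-linear conversion from raw counts to engineering units; the real IEEE 1451.x TEDS is a standardized binary block, so this class is only an illustrative stand-in.

    ```python
    from dataclasses import dataclass, field
    from typing import List, Tuple

    @dataclass
    class TransducerEDS:
        """Illustrative stand-in for TEDS content; not the IEEE 1451.x binary format."""
        manufacturer: str
        model: str
        serial_number: str
        units: str
        operating_range: Tuple[float, float]                                  # (min, max) in `units`
        calibration: List[Tuple[float, float]] = field(default_factory=list)  # (counts, value) pairs

        def to_engineering_units(self, raw_counts: float) -> float:
            """Piecewise-linear interpolation over the calibration table."""
            pts = sorted(self.calibration)
            for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
                if x0 <= raw_counts <= x1:
                    return y0 + (y1 - y0) * (raw_counts - x0) / (x1 - x0)
            raise ValueError("raw_counts outside the calibrated range")
    ```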

  19. Consensus Statement on Electronic Health Predictive Analytics: A Guiding Framework to Address Challenges

    PubMed Central

    Amarasingham, Ruben; Audet, Anne-Marie J.; Bates, David W.; Glenn Cohen, I.; Entwistle, Martin; Escobar, G. J.; Liu, Vincent; Etheredge, Lynn; Lo, Bernard; Ohno-Machado, Lucila; Ram, Sudha; Saria, Suchi; Schilling, Lisa M.; Shahi, Anand; Stewart, Walter F.; Steyerberg, Ewout W.; Xie, Bin

    2016-01-01

    Context: The recent explosion in available electronic health record (EHR) data is motivating a rapid expansion of electronic health care predictive analytic (e-HPA) applications, defined as the use of electronic algorithms that forecast clinical events in real time with the intent to improve patient outcomes and reduce costs. There is an urgent need for a systematic framework to guide the development and application of e-HPA so that the field develops in a scientifically sound, ethical, and efficient manner. Objectives: Building upon earlier frameworks of model development and utilization, we identify the emerging opportunities and challenges of e-HPA, propose a framework for realizing these opportunities and addressing these challenges, and aim to motivate e-HPA stakeholders to both adopt and continuously refine the framework as applications of e-HPA emerge. Methods: To achieve these objectives, 17 experts with diverse expertise, including methodology, ethics, law, regulation, and health care delivery systems, were assembled to identify the emerging opportunities and challenges of e-HPA and to propose a framework to guide its development and application. Findings: The framework proposed by the panel includes three key domains where e-HPA differs qualitatively from earlier generations of models and algorithms (Data Barriers, Transparency, and Ethics) and two areas where current frameworks are insufficient to address the emerging opportunities and challenges of e-HPA (Regulation and Certification; Education and Training). The panel's recommendations are summarized as follows:
    - Data Barriers: Establish mechanisms within the scientific community to support data sharing for predictive model development and testing.
    - Transparency: Set standards around e-HPA validation based on principles of scientific transparency and reproducibility.
    - Ethics: Develop both individual-centered and society-centered risk-benefit approaches to evaluate e-HPA.
    - Regulation and Certification: Construct a self-regulation and certification framework within e-HPA.
    - Education and Training: Make significant changes to medical, nursing, and paraprofessional curricula by including training for understanding, evaluating, and utilizing predictive models.
    PMID:27141516
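
    As a hedged illustration of the Transparency recommendation above (not a procedure prescribed by the panel), the sketch below shows one way to validate a predictive model reproducibly: a fixed random seed, a strictly held-out test set, and reporting of both discrimination (AUROC) and calibration (Brier score). The data are synthetic placeholders, and the use of scikit-learn and logistic regression is an assumption made only for illustration.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import brier_score_loss, roc_auc_score
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(42)   # fixed seed so the run is reproducible

        # Synthetic stand-in for an EHR-derived feature matrix and outcome label.
        X = rng.normal(size=(5000, 10))
        y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=5000) > 0).astype(int)

        # Hold out a test set that is never touched during model development.
        X_train, X_test, y_train, y_test = train_test_split(
            X, y, test_size=0.3, random_state=42, stratify=y)

        model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
        p_test = model.predict_proba(X_test)[:, 1]

        # Report both discrimination and calibration, not discrimination alone.
        print("AUROC:", round(roc_auc_score(y_test, p_test), 3))
        print("Brier score:", round(brier_score_loss(y_test, p_test), 3))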

  20. Consensus Statement on Electronic Health Predictive Analytics: A Guiding Framework to Address Challenges.

    PubMed

    Amarasingham, Ruben; Audet, Anne-Marie J; Bates, David W; Glenn Cohen, I; Entwistle, Martin; Escobar, G J; Liu, Vincent; Etheredge, Lynn; Lo, Bernard; Ohno-Machado, Lucila; Ram, Sudha; Saria, Suchi; Schilling, Lisa M; Shahi, Anand; Stewart, Walter F; Steyerberg, Ewout W; Xie, Bin

    2016-01-01

    The recent explosion in available electronic health record (EHR) data is motivating a rapid expansion of electronic health care predictive analytic (e-HPA) applications, defined as the use of electronic algorithms that forecast clinical events in real time with the intent to improve patient outcomes and reduce costs. There is an urgent need for a systematic framework to guide the development and application of e-HPA so that the field develops in a scientifically sound, ethical, and efficient manner. Building upon earlier frameworks of model development and utilization, we identify the emerging opportunities and challenges of e-HPA, propose a framework for realizing these opportunities and addressing these challenges, and aim to motivate e-HPA stakeholders to both adopt and continuously refine the framework as applications of e-HPA emerge. To achieve these objectives, 17 experts with diverse expertise, including methodology, ethics, law, regulation, and health care delivery systems, were assembled to identify the emerging opportunities and challenges of e-HPA and to propose a framework to guide its development and application. The framework proposed by the panel includes three key domains where e-HPA differs qualitatively from earlier generations of models and algorithms (Data Barriers, Transparency, and Ethics) and two areas where current frameworks are insufficient to address the emerging opportunities and challenges of e-HPA (Regulation and Certification; Education and Training). The panel's recommendations are summarized as follows:
    - Data Barriers: Establish mechanisms within the scientific community to support data sharing for predictive model development and testing.
    - Transparency: Set standards around e-HPA validation based on principles of scientific transparency and reproducibility.
    - Ethics: Develop both individual-centered and society-centered risk-benefit approaches to evaluate e-HPA.
    - Regulation and Certification: Construct a self-regulation and certification framework within e-HPA.
    - Education and Training: Make significant changes to medical, nursing, and paraprofessional curricula by including training for understanding, evaluating, and utilizing predictive models.
