EPA FRS Facilities State Single File CSV Download
This page provides state comma separated value (CSV) files containing key information of all facilities and sites within the Facility Registry System (FRS). Each state zip file contains a single CSV file of key facility-level information.
This page is the starting point for EZ Query. It describes how to select key data elements from EPA's Facility Information Database and Geospatial Reference Database to build a tabular report or a comma-separated value (CSV) file for download.
ERIC Educational Resources Information Center
Manpower Administration (DOL), Washington, DC. Job Corps.
This self-study program for the high-school level contains lessons in the following subjects: Spelling Endings Added to e; Capitalization; Question Marks and Exclamation Points; Quotation Marks; Spelling Double Letter Demons; Colons and Dashes; Punctuating Series with Commas and Semicolons; More Confusing Word Pairs; Separating Sentence Parts with…
Storage and Database Management for Big Data
2015-07-27
and value), each cell is actually a seven-tuple where the column is broken into three parts, and there is an additional field for a timestamp as seen...questions require a careful understanding of the technology field in addition to the types of problems that are being solved. This chapter aims to address...formats such as comma separated values (CSV), JavaScript Object Notation (JSON) [21], or other proprietary sensor formats. Most often, this raw data
UAV Swarm Tactics: An Agent-Based Simulation and Markov Process Analysis
2013-06-01
CRN Common Random Numbers CSV Comma Separated Values DoE Design of Experiment GLM Generalized Linear Model HVT High Value Target JAR Java ARchive JMF... Java Media Framework JRE Java runtime environment Mason Multi-Agent Simulator Of Networks MOE Measure Of Effectiveness MOP Measures Of Performance...with every set several times, and to write a CSV file with the results. Rather than scripting the agent behavior deterministically, the agents should
A Virtual Environment for Resilient Infrastructure Modeling and Design
2015-09-01
Security CI Critical Infrastructure CID Center for Infrastructure Defense CSV Comma Separated Value DAD Defender-Attacker-Defender DHS Department...responses to disruptive events (e.g., cascading failure behavior) in a context-rich, controlled environment for exercises, education, and training...The general attacker-defender (AD) and defender-attacker-defender (DAD) models for CI are defined in Brown et al. (2006). These models help
Generating Ship-to-Shore Bulk Fuel Delivery Schedules for the Marine Expeditionary Unit
2017-06-01
Amphibious Ready Group (9); 2.2 Amphibious Connectors (11); 2.3 Fuel Containers...ARG Amphibious Ready Group BLT Battalion Landing Team COMPHIBRON Commander, Amphibious Squadron CSV Comma Separated Values LCAC Landing Craft Air...in the world. The MEU and the Amphibious Ready Group (ARG) create a highly capable amphibious force able to strike and conduct operations from the sea
DOE Office of Scientific and Technical Information (OSTI.GOV)
The software processes recorded thermal video and detects the flight tracks of birds and bats that passed through the camera's field of view. The output is a set of images that show complete flight tracks for any detections, with the direction of travel indicated and the thermal image of the animal delineated. A report of the descriptive features of each detected track is also output in the form of a comma-separated value text file.
Network Science Research Laboratory (NSRL) Telemetry Warehouse
2016-06-01
Functionality and architecture of the NSRL Telemetry Warehouse are also described as well as the web interface, data structure, security aspects, and...Experiment Controller 6 4.5 Telemetry Sensors 7 4.6 Custom Data Processing Nodes 7 5. Web Interface 8 6. Data Structure 8 6.1 Measurements 8...telemetry in comma-separated value (CSV) format from the web interface or via custom applications developed by researchers using the client application
Supporting Marine Corps Enhanced Company Operations: A Quantitative Analysis
2010-06-01
by decomposition into simple independent parts. o Agents interact with each other in non-linear ways, and “adapt” to their local environment. (p...Center Co Company CoLT Company Landing Team CAS Complex Adaptive Systems CSV Comma-separated Value DO Distributed Operations DODIC Department...SUMMARY The modern irregular warfare environment has dramatically impacted the battle space assignments and mission scope of tactical units that now
Transferable Output ASCII Data (TOAD) gateway: Version 1.0 user's guide
NASA Technical Reports Server (NTRS)
Bingel, Bradford D.
1991-01-01
The Transferable Output ASCII Data (TOAD) Gateway, release 1.0 is described. This is a software tool for converting tabular data from one format into another via the TOAD format. This initial release of the Gateway allows free data interchange among the following file formats: TOAD; Standard Interface File (SIF); Program to Optimize Simulated Trajectories (POST) input; Comma Separated Value (CSV); and a general free-form file format. As required, additional formats can be accommodated quickly and easily.
DOE Office of Scientific and Technical Information (OSTI.GOV)
North, Michael J.
SchemaOnRead provides tools for implementing schema-on-read including a single function call (e.g., schemaOnRead("filename")) that reads text (TXT), comma separated value (CSV), raster image (BMP, PNG, GIF, TIFF, and JPG), R data (RDS), HDF5, NetCDF, spreadsheet (XLS, XLSX, ODS, and DIF), Weka Attribute-Relation File Format (ARFF), Epi Info (REC), Pajek network (PAJ), R network (NET), Hypertext Markup Language (HTML), SPSS (SAV), Systat (SYS), and Stata (DTA) files. It also recursively reads folders (e.g., schemaOnRead("folder")), returning a nested list of the contained elements.
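The schema-on-read dispatch that SchemaOnRead describes (the package itself is written in R) can be illustrated with a minimal Python sketch: the reader is chosen from the file extension at read time, and folders are read recursively into a nested structure. The reader table below covers only three formats and is a hypothetical simplification, not the package's actual API.

```python
import csv
import json
import os

def _read_csv(path):
    # Parse a CSV file into a list of per-row dicts keyed by the header row.
    with open(path, newline="") as f:
        return [dict(row) for row in csv.DictReader(f)]

def _read_json(path):
    with open(path) as f:
        return json.load(f)

def _read_text(path):
    with open(path) as f:
        return f.read()

# Hypothetical, deliberately small reader table; the real package handles
# many more formats (images, HDF5, NetCDF, spreadsheets, ...).
READERS = {".csv": _read_csv, ".json": _read_json, ".txt": _read_text}

def schema_on_read(path):
    """Infer how to read `path` from its extension at read time.

    Folders are read recursively, returning a nested dict of their
    contents; unknown extensions yield None.
    """
    if os.path.isdir(path):
        return {name: schema_on_read(os.path.join(path, name))
                for name in sorted(os.listdir(path))}
    reader = READERS.get(os.path.splitext(path)[1].lower())
    return reader(path) if reader else None
```

The design point is that no schema is declared up front; the structure of the returned value is determined entirely by what is found on disk.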
Analysis of pendulum period with an iPod touch/iPhone
NASA Astrophysics Data System (ADS)
Briggle, Justin
2013-05-01
We describe the use of Apple’s iPod touch/iPhone, acting as the pendulum bob, as a means of measuring pendulum period, making use of the device’s three-axis digital accelerometer and the freely available SPARKvue app from PASCO scientific. The method can be readily incorporated into an introductory physics laboratory experiment. Moreover, the principles described may be carried out with any number of smartphone devices containing an integrated accelerometer and paired with an appropriate application for collecting and sending accelerometer data as a comma-separated value file.
Wadeable Streams Assessment Data
The Wadeable Streams Assessment (WSA) is the first statistically valid survey of the biological condition of small streams throughout the U.S. The U.S. Environmental Protection Agency (EPA) worked with the states to conduct the assessment in 2004-2005. Data for each parameter sampled in the Wadeable Streams Assessment (WSA) are available for downloading in a series of files as comma separated values (*.csv). Each *.csv data file has a companion text file (*.txt) that lists a dataset label and individual descriptions for each variable. Users should view the *.txt files first to help guide their understanding and use of the data.
TRAP: automated classification, quantification and annotation of tandemly repeated sequences.
Sobreira, Tiago José P; Durham, Alan M; Gruber, Arthur
2006-02-01
TRAP, the Tandem Repeats Analysis Program, is a Perl program that provides a unified set of analyses for the selection, classification, quantification and automated annotation of tandemly repeated sequences. TRAP uses the results of the Tandem Repeats Finder program to perform a global analysis of the satellite content of DNA sequences, permitting researchers to easily assess the tandem repeat content for both individual sequences and whole genomes. The results can be generated in convenient formats such as HTML and comma-separated values. TRAP can also be used to automatically generate annotation data in the format of feature table and GFF files.
Hyper-Fractal Analysis: A visual tool for estimating the fractal dimension of 4D objects
NASA Astrophysics Data System (ADS)
Grossu, I. V.; Grossu, I.; Felea, D.; Besliu, C.; Jipa, Al.; Esanu, T.; Bordeianu, C. C.; Stan, E.
2013-04-01
This work presents a new version of a Visual Basic 6.0 application for estimating the fractal dimension of images and 3D objects (Grossu et al. (2010) [1]). The program was extended to work with four-dimensional objects stored in comma-separated values files. This might be of interest in biomedicine, for analyzing the evolution in time of three-dimensional images.
New version program summary:
Program title: Hyper-Fractal Analysis (Fractal Analysis v03)
Catalogue identifier: AEEG_v3_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEEG_v3_0.html
Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland
Licensing provisions: Standard CPC license, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 745761
No. of bytes in distributed program, including test data, etc.: 12544491
Distribution format: tar.gz
Programming language: MS Visual Basic 6.0
Computer: PC
Operating system: MS Windows 98 or later
RAM: 100M
Classification: 14
Catalogue identifier of previous version: AEEG_v2_0
Journal reference of previous version: Comput. Phys. Comm. 181 (2010) 831-832
Does the new version supersede the previous version?: Yes
Nature of problem: Estimating the fractal dimension of 4D images.
Solution method: Optimized implementation of the 4D box-counting algorithm.
Reasons for new version: Inspired by existing applications of 3D fractals in biomedicine [3], we extended the optimized version of the box-counting algorithm [1,2] to the four-dimensional case. This might be of interest in analyzing the evolution in time of 3D images. The box-counting algorithm was extended to support 4D objects stored in comma-separated values files. A new form was added for generating 2D, 3D, and 4D test data. The application was tested on 4D objects with known dimension, e.g. the Sierpinski hypertetrahedron gasket, Df=ln(5)/ln(2) (Fig. 1).
The algorithm could be extended, with minimal effort, to a higher number of dimensions. Integration with other applications is straightforward because multi-dimensional images are stored in the very simple comma-separated values file format. A χ2 test is implemented as a criterion for deciding whether an object is fractal or not. The program offers a user-friendly graphical interface. Hyper-Fractal Analysis was tested on the Sierpinski hypertetrahedron 4D gasket (Df=ln(5)/ln(2)≅2.32). Running time: to a first approximation, the algorithm is linear [2]. References: [1] I.V. Grossu, D. Felea, C. Besliu, Al. Jipa, C.C. Bordeianu, E. Stan, T. Esanu, Computer Physics Communications 181 (2010) 831-832. [2] I.V. Grossu, C. Besliu, M.V. Rusu, Al. Jipa, C.C. Bordeianu, D. Felea, Computer Physics Communications 180 (2009) 1999-2001. [3] J. Ruiz de Miras, J. Navas, P. Villoslada, F.J. Esteban, Computer Methods and Programs in Biomedicine 104(3) (2011) 452-460.
Elements of a next generation time-series ASCII data file format for Earth Sciences
NASA Astrophysics Data System (ADS)
Webster, C. J.
2015-12-01
Data in ASCII comma separated value (CSV) format are recognized as the most simple, straightforward and readable type of data present in the geosciences. Many scientific workflows developed over the years rely on data using this simple format. However, there is a need for a lightweight ASCII header format standard that is easy to create and easy to work with. Current OGC grade XML standards are complex and difficult to implement for researchers with few resources. Ideally, such a format should provide the data in CSV for easy consumption by generic applications such as spreadsheets. The format should use an existing time standard. The header should be easily human readable as well as machine parsable. The metadata format should be extendable to allow vocabularies to be adopted as they are created by external standards bodies. The creation of such a format will increase the productivity of software engineers and scientists because fewer translators and checkers would be required.
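One hypothetical shape for the lightweight header this abstract calls for is a block of `#`-prefixed `key: value` lines in front of a plain CSV body: the header stays human readable and machine parsable, while generic CSV tools can simply skip the comment lines. The sketch below is an illustration of that convention, not the standard the abstract proposes; the metadata keys are invented for the example.

```python
import csv

def write_csv_with_header(f, metadata, fieldnames, rows):
    # Emit '#'-prefixed key: value header lines, then an ordinary CSV body.
    for key, value in metadata.items():
        f.write(f"# {key}: {value}\n")
    writer = csv.DictWriter(f, fieldnames=fieldnames)
    writer.writeheader()
    writer.writerows(rows)

def read_csv_with_header(f):
    # Collect header lines into a metadata dict; everything else is the
    # CSV body, handed to csv.DictReader unchanged.
    metadata, body = {}, []
    for line in f:
        if line.startswith("#"):
            key, _, value = line[1:].partition(":")
            metadata[key.strip()] = value.strip()
        else:
            body.append(line)
    return metadata, list(csv.DictReader(body))
```

A spreadsheet that ignores `#` comment lines reads the file as ordinary CSV, which is the "easy consumption by generic applications" property the abstract asks for.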
Niikuni, Keiyu; Muramoto, Toshiaki
2014-06-01
This study explored the effects of a comma on the processing of structurally ambiguous Japanese sentences with a semantic bias. A previous study has shown that a comma which is incompatible with an ambiguous sentence's semantic bias affects the processing of the sentence, but the effects of a comma that is compatible with the bias are unclear. In the present study, we examined the role of a comma compatible with the sentence's semantic bias using the self-paced reading method, which enabled us to determine the reading times for the region of the sentence where readers would be expected to solve the ambiguity using semantic information (the "target region"). The results show that a comma significantly increases the reading time of the punctuated word but decreases the reading time in the target region. We concluded that even if the semantic information provided might be sufficient for disambiguation, the insertion of a comma would affect the processing cost of the ambiguity, indicating that readers use both the comma and semantic information in parallel for sentence processing.
Stern, Michelle A.; Anderson, Frank A.; Flint, Lorraine E.; Flint, Alan L.
2018-05-03
In situ soil moisture datasets are important inputs used to calibrate and validate watershed, regional, or statewide modeled and satellite-based soil moisture estimates. The soil moisture dataset presented in this report includes hourly time series of the following: soil temperature, volumetric water content, water potential, and total soil water content. Data were collected by the U.S. Geological Survey at five locations in California: three sites in the central Sierra Nevada and two sites in the northern Coast Ranges. This report provides a description of each of the study areas, procedures and equipment used, processing steps, and time series data from each site in the form of comma-separated values (.csv) tables.
The new on-line Czech Food Composition Database.
Machackova, Marie; Holasova, Marie; Maskova, Eva
2013-10-01
The new on-line Czech Food Composition Database (FCDB) was launched at http://www.czfcdb.cz in December 2010 as the main freely available channel for dissemination of Czech food composition data. The application is based on a compiled FCDB documented according to the EuroFIR standardised procedure for full value documentation, with foods indexed by the LanguaL™ Thesaurus. A content management system was implemented for administration of the website and for data export (comma-separated values or EuroFIR XML transport package formats) by a compiler. References are provided for each published value, with links to freely accessible on-line sources of data (e.g. full texts, the EuroFIR Document Repository, on-line national FCDBs). LanguaL™ codes are displayed within each food record as searchable keywords of the database. A photo (or a photo gallery) is used as a visual descriptor of each food item. The database is searchable by food, component, food group, and alphabet, and offers a multi-field advanced search.
Modeling Healthcare Processes Using Commitments: An Empirical Evaluation.
Telang, Pankaj R; Kalia, Anup K; Singh, Munindar P
2015-01-01
The two primary objectives of this paper are: (a) to demonstrate how Comma, a business modeling methodology based on commitments, can be applied in healthcare process modeling, and (b) to evaluate the effectiveness of such an approach in producing healthcare process models. We apply the Comma approach to a breast cancer diagnosis process adapted from an HHS committee report and present the results of an empirical study that compares Comma with a traditional approach based on the HL7 Messaging Standard (Traditional-HL7). Our empirical study involved 47 subjects and two phases. In the first phase, we partitioned the subjects into two approximately equal groups. We gave each group the same requirements based on a process scenario for breast cancer diagnosis. Members of one group first applied Traditional-HL7 and then Comma, whereas members of the second group first applied Comma and then Traditional-HL7, each on the above-mentioned requirements. Thus, each subject produced two models, each model being a set of UML Sequence Diagrams. In the second phase, we repartitioned the subjects into two groups with approximately equal distributions from both original groups. We developed exemplar Traditional-HL7 and Comma models; we gave one repartitioned group our Traditional-HL7 model and the other repartitioned group our Comma model. We provided the same changed set of requirements to all subjects and asked them to modify the provided exemplar model to satisfy the new requirements. We assessed solutions produced by subjects in both phases with respect to measures of flexibility, time, difficulty, objective quality, and subjective quality. Our study found that Comma is superior to Traditional-HL7 in flexibility and objective quality as validated via Student's t-test to the 10% level of significance. Comma is a promising new approach for modeling healthcare processes. Further gains could be made through improved tooling and enhanced training of modeling personnel.
George Washington: A Grounded Leader
2011-04-08
not well educated in a formal sense, George Washington was highly intellectual. His commitment to self-improvement, coupled with native abilities, and...and additional qualifiers separated by commas, e.g. Smith, Richard, Jr. 7. PERFORMING ORGANIZATION NAME(S) AND ADDRESS(ES). Self-explanatory. 8...PERMITTED PROVIDED PROPER ACKNOWLEDGEMENT IS MADE. ii Acknowledgements In the beginning was Dr. Donald F. Bittner. Dr. Bittner was my esteemed faculty
Endo, K; Yamanaka, A; Mitsumasu, K; Sakurama, T; Tanaka, D
1997-02-21
A neuropeptide from brain-suboesophageal ganglion (Br-SG) complexes of the silkmoth, Bombyx mori, shows summer-morph-producing hormone (SMPH) activity in the Asian comma butterfly, P. c-aureum. The SMPH-active peptide was extracted and shown to be almost the same molecular size as bombyxin (4-5 kD), a neuropeptide which shows prothoracicotropic hormone (PTTH) activity when assayed in vitro with prothoracic glands (PGs) of 4th-instar B. mori larvae. A Sephadex G-50 fraction of 3-8 kD molecules prepared from Br-SG complexes of B. mori adults was applied to CM-, SP-, DEAE- or QAE-Toyopearl columns at pH 5.6 (or pH 6.9). The SMPH activity could be separated from the PTTH activity (or bombyxin) by subjecting a SMPH- and PTTH-active preparation of B. mori to anion-exchange chromatography at pH 6.9. By reversed-phase HPLC following anion-exchange chromatography, SMPH activity was recovered in two fractions at 40-45% acetonitrile. The results demonstrate that the B. mori peptide showing SMPH activity in P. c-aureum is a different molecule from bombyxin.
2016-02-02
understanding is the experimental verification of a new model of light-induced loss spectra, employing continuum-dressed basis states, which agrees in...and additional qualifiers separated by commas, e.g. Smith, Richard, J, Jr. 7. PERFORMING ORGANIZATION NAME(S) AND ADDRESS(ES). Self-explanatory...verification of a new model of light-induced loss spectra, employing continuum-dressed basis states, which agrees in shape and magnitude with all of our
Performance Characterization of Polyimide-Carbon Fiber Composites for Future Hypersonic Vehicles
2010-08-01
already being processed for leading edge primary structures and engine components for present and future stealth aircraft. In addition to describing our...The form of entry is the last name, first name, middle initial, and additional qualifiers separated by commas, e.g. Smith, Richard, Jr. 7...availability or distribution limitations of the report. If additional limitations/restrictions or special markings are indicated, follow agency
LBA-ECO TG-07 Soil Trace Gas Flux and Root Mortality, Tapajos National Forest
R.K. Varner; M.M. Keller
2009-01-01
This data set reports the results of an experiment that tested the short-term effects of root mortality on the soil-atmosphere fluxes of nitrous oxide, nitric oxide, methane, and carbon dioxide in a tropical evergreen forest. Weekly trace gas fluxes are provided for treatment and control plots on sand and clay tropical forest soils in two comma-separated ASCII files.
NASA Astrophysics Data System (ADS)
Eremina, Svetlana V.; Gureyev, Vladimir V.
2005-06-01
Use of the comma is one of the most problematic areas of English for Russian users. Some rules on comma usage do not work, and others are incorrectly formulated. Lists of its uses run from a dozen to thirty cases, and intuition is not something to rely on. The material for the case analysis has been taken from the most recent grammar manuals compiled by English-speaking linguists.
Punctuation and Intonation Effects on Clause and Sentence Wrap-Up: Evidence from Eye Movements
ERIC Educational Resources Information Center
Hirotani, Masako; Frazier, Lyn; Rayner, Keith
2006-01-01
Three eye movement studies examined the role of punctuation in reading. In Experiment 1, although a comma at the end of a clause facilitated overall reading times for the sentence, first pass times were longer at the end of comma-marked clauses than clauses without a comma (or the same material in clause medial position). The data supported the…
A Method for Correcting Broken Hyphenations in Noisy English Text
2012-04-01
words, such as a frequency list. An algorithm that would make use of word validation, taking into account the various usages of hyphens in English, is...commas, and question marks from the surrounding words. The British National Corpus (2) (BNC) frequency list was used to perform the validation...rather than a separate spell checking program. This was primarily because implementation of the algorithm using a frequency list was quite trivial
Testing a Low-Interaction Honeypot against Live Cyber Attackers
2011-09-01
to run Snort were the Sourcefire Vulnerability Research Team (VRT) rules, which are the official rules available for the program. We used the...latest VRT rules that were available free to registered users, rules an average of 30 days old when released. The software provides a detailed alert log...although it is a production system. On the Windows machine we installed Snort 2.9 with the VRT rules. Snort was configured to log comma-separated
CentiServer: A Comprehensive Resource, Web-Based Application and R Package for Centrality Analysis.
Jalili, Mahdi; Salehzadeh-Yazdi, Ali; Asgari, Yazdan; Arab, Seyed Shahriar; Yaghmaie, Marjan; Ghavamzadeh, Ardeshir; Alimoghaddam, Kamran
2015-01-01
Essentiality is one of the most noteworthy questions and most broadly used concepts in biology, and various disciplines are trying to address it. Centrality is a primary index and a promising method for identifying essential nodes, particularly in biological networks. The newly created CentiServer is a comprehensive online resource that provides over 110 definitions of different centrality indices, their computational methods, and algorithms in the form of an encyclopedia. In addition, CentiServer allows users to calculate 55 centralities with the help of an interactive web-based application tool and provides numerical results in comma separated value (CSV) file format or a mapped graphical format as a graph modeling language (GML) file. The standalone version of this application has been developed in the form of an R package. The web-based application (CentiServer) and R package (centiserve) are freely available at http://www.centiserver.org/.
jsNMR: an embedded platform-independent NMR spectrum viewer.
Vosegaard, Thomas
2015-04-01
jsNMR is a lightweight NMR spectrum viewer written in JavaScript/HyperText Markup Language (HTML), which provides a cross-platform spectrum visualizer that runs on all computer architectures including mobile devices. Experimental (and simulated) datasets are easily opened in jsNMR by (i) drag and drop on a jsNMR browser window, (ii) by preparing a jsNMR file from the jsNMR web site, or (iii) by mailing the raw data to the jsNMR web portal. jsNMR embeds the original data in the HTML file, so a jsNMR file is a self-transforming dataset that may be exported to various formats, e.g. comma-separated values. The main applications of jsNMR are to provide easy access to NMR data without the need for dedicated software installed and to provide the possibility to visualize NMR spectra on web sites.
Users' Manual and Installation Guide for the EverVIEW Slice and Dice Tool (Version 1.0 Beta)
Roszell, Dustin; Conzelmann, Craig; Chimmula, Sumani; Chandrasekaran, Anuradha; Hunnicut, Christina
2009-01-01
Network Common Data Form (NetCDF) is a self-describing, machine-independent file format for storing array-oriented scientific data. Over the past few years, there has been a growing movement within the community of natural resource managers in The Everglades, Fla., to use NetCDF as the standard data container for datasets based on multidimensional arrays. As a consequence, a need arose for additional tools to view and manipulate NetCDF datasets, specifically to create subsets of large NetCDF files. To address this need, we created the EverVIEW Slice and Dice Tool to allow users to create subsets of grid-based NetCDF files. The major functions of this tool are (1) to subset NetCDF files both spatially and temporally; (2) to view the NetCDF data in table form; and (3) to export filtered data to a comma-separated value file format.
MetaQuant: a tool for the automatic quantification of GC/MS-based metabolome data.
Bunk, Boyke; Kucklick, Martin; Jonas, Rochus; Münch, Richard; Schobert, Max; Jahn, Dieter; Hiller, Karsten
2006-12-01
MetaQuant is a Java-based program for the automatic and accurate quantification of GC/MS-based metabolome data. In contrast to other programs, MetaQuant is able to quantify hundreds of substances simultaneously with minimal manual intervention. An integrated self-acting calibration function allows fast, parallel calibration for several metabolites. Finally, MetaQuant is able to import GC/MS data in the common NetCDF format and to export the results of the quantification in Systems Biology Markup Language (SBML), Comma Separated Values (CSV) or Microsoft Excel (XLS) format. MetaQuant is written in Java and is available under an open source license. Precompiled packages for installation on Windows or Linux operating systems are freely available for download. The source code as well as the installation packages are available at http://bioinformatics.org/metaquant
MOST: a software environment for constraint-based metabolic modeling and strain design.
Kelley, James J; Lane, Anatoliy; Li, Xiaowei; Mutthoju, Brahmaji; Maor, Shay; Egen, Dennis; Lun, Desmond S
2015-02-15
MOST (Metabolic Optimization and Simulation Tool) is a software package that implements GDBB (genetic design through branch and bound) in an intuitive, user-friendly interface with Excel-like editing functionality; it also implements FBA (flux balance analysis) and supports Systems Biology Markup Language and comma-separated values files. GDBB is currently the fastest algorithm for finding gene knockouts predicted by FBA to increase production of desired products, but until the release of MOST it was available only through a command-line interface, which is difficult to use for those without programming knowledge. MOST is distributed for free under the GNU General Public License. The software and full documentation are available at http://most.ccib.rutgers.edu/. dslun@rutgers.edu.
Definition and maintenance of a telemetry database dictionary
NASA Technical Reports Server (NTRS)
Knopf, William P. (Inventor)
2007-01-01
A telemetry dictionary database includes a component for receiving spreadsheet workbooks of telemetry data over a web-based interface from other computer devices. Another component routes the spreadsheet workbooks to a specified directory on the host processing device. A process then checks the received spreadsheet workbooks for errors and, if no errors are detected, the spreadsheet workbooks are routed to another directory to await initiation of a remote database loading process. The loading process first converts the spreadsheet workbooks to comma separated value (CSV) files. Next, a network connection with the computer system that hosts the telemetry dictionary database is established and the CSV files are ported to the computer system that hosts the telemetry dictionary database. This is followed by a remote initiation of a database loading program. Upon completion of loading, a flatfile generation program is manually initiated to generate a flatfile to be used in a mission operations environment by the core ground system.
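The workbook-to-CSV conversion step in this loading process can be sketched in Python. This is a minimal illustration only: the worksheet rows, mnemonics, and file name below are invented, and the real system reads actual submitted spreadsheet workbooks rather than hard-coded data.

```python
import csv
import os
import tempfile

# Hypothetical rows extracted from one worksheet of a telemetry-definition
# workbook (in the real system these come from the submitted spreadsheets).
worksheet_rows = [
    ["mnemonic", "description", "units"],
    ["BATT_V", "Battery bus voltage", "V"],
    ["TANK_P", "Propellant tank pressure", "kPa"],
]

def worksheet_to_csv(rows, path):
    """Write one worksheet's rows out as a comma separated value file,
    ready to be ported to the database host and bulk-loaded."""
    with open(path, "w", newline="") as f:
        csv.writer(f).writerows(rows)

csv_path = os.path.join(tempfile.gettempdir(), "telemetry_dictionary.csv")
worksheet_to_csv(worksheet_rows, csv_path)
```

Each worksheet becomes one flat CSV file, which matches the described flow of porting simple text files to the database host before initiating the loader.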
DOE Office of Scientific and Technical Information (OSTI.GOV)
Medina, D.; Oborn, C.J.; Li, M.L.
1987-09-01
The COMMA-D mammary cell line exhibits mammary-specific functional differentiation under appropriate conditions in cell culture. The cytologically heterogeneous COMMA-D parental line and the clonal lines DB-1, TA-5, and FA-1 derived from the COMMA-D parent were examined for similar properties of functional differentiation. In monolayer cell culture, the cell lines DB-1, TA-5, FA-1, and MA-4 were examined for expression of mammary-specific and epithelial-specific proteins by an indirect immunofluorescence assay. The clonal cell lines were relatively homogeneous in their respective staining properties and seemed to represent three subpopulations found in the heterogeneous parental COMMA-D lines. None of the four clonal lines appeared to represent myoepithelial cells. The cell lines were examined for expression of β-casein mRNA in the presence or absence of prolactin. The inducibility of β-casein in the COMMA-D cell line was further enhanced by a reconstituted basement membrane preparation enriched in laminin, collagen IV, and proteoglycans. These results support the hypothesis that the functional response of inducible mammary cell populations is a result of interaction among hormones, multiple extracellular matrix components, and specific cell types.
Introduccion al estudio de la coma (Final) [Introduction to the Study of the Coma (Final Part)].
ERIC Educational Resources Information Center
Amilcar Cipriano, Nestor
1979-01-01
This concluding article in a series concerning the use of the comma in Spanish gives specific examples of its use from Spanish literature. Concluding remarks outline the major purposes of the comma. (NCR)
Auxiliary Library Explorer (ALEX) Development
2016-02-01
non-empty cells. This is a laborious manual task and could probably have been avoided by using Java code to read the data directly from Excel. In fact... it might be even easier to leave the data as a comma separated values (CSV) file and read the data in with Java, although this could create other... This is first implemented using the MakeFullDatabaseapp Java project, which performs an SQL query on the DSpace data to return a list of items for which
Synthetic Vision Technology Demonstration. Volume 3. Flight Tests
1993-12-01
diameter increments for the FSSP and secondary PMS probes (PMS2). For each approach recorded for the sortie, general information recorded in the file... (PMS2). This final file was given a suffix of "PM2". Each of these files contained multiple data sets separated by commas rather than spaces. Since the...
[Figure residue: integrated profiles of PMS2 rain rate and probe LWC, flight 819092, Approach #7]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eisenstein, R.S.; Rosen, J.M.
The mechanism by which individual peptide and steroid hormones and cell-substratum interactions regulate milk protein gene expression has been studied in the COMMA-D mammary epithelial cell line. In the presence of insulin, hydrocortisone, and prolactin, growth of COMMA-D cells on floating collagen gels in comparison with that on a plastic substratum resulted in a 2.5- to 3-fold increase in the relative rate of β-casein gene transcription but a 37-fold increase in β-casein mRNA accumulation. In contrast, whey acidic protein gene transcription was constitutive in COMMA-D cells grown on either substratum, but its mRNA was unstable and little intact mature mRNA was detected. Culturing COMMA-D cells on collagen also promoted increased expression of other genes expressed in differentiated mammary epithelial cells, including those encoding α- and γ-casein, transferrin, malic enzyme, and phosphoenolpyruvate carboxykinase, but decreased the expression of actin and histone genes. Using COMMA-D cells, the authors defined further the role of individual hormones in influencing β-casein gene transcription. With insulin alone, a basal level of β-casein gene transcription was detected in COMMA-D cells grown on floating collagen gels. Addition of prolactin but not hydrocortisone resulted in a 2.5- to 3.0-fold increase in β-casein gene transcription, but both hormones were required to elicit the maximal 73-fold induction in mRNA accumulation. The posttranscriptional effect of hormones on casein mRNA accumulation preceded any detectable changes in the relative rate of transcription. Thus, regulation by both hormones and cell substratum of casein gene expression is exerted primarily at the posttranscriptional level.
EPA Enforcement and Compliance History Online
The Environmental Protection Agency's Enforcement and Compliance History Online (ECHO) website provides customizable and downloadable information about environmental inspections, violations, and enforcement actions for EPA-regulated facilities related to the Clean Air Act, Clean Water Act, Resource Conservation and Recovery Act, and Safe Drinking Water Act. These data are updated weekly as part of the ECHO data refresh, and ECHO offers many user-friendly options to explore data, including:
- Facility Search: ECHO information is searchable by varied criteria, including location, facility type, and compliance status. Search results are customizable and downloadable.
- Comparative Maps and State Dashboards: These tools offer aggregated information about facility compliance status, regulatory agency compliance monitoring, and enforcement activity at the national and state level.
- Bulk Data Downloads: One of ECHO's most popular features is the ability to work offline by downloading large data sets. Users can take advantage of the ECHO Exporter, which provides summary information about each facility in comma-separated values (csv) file format, or download data sets by program as zip files.
MODIS Interactive Subsetting Tool (MIST)
NASA Astrophysics Data System (ADS)
McAllister, M.; Duerr, R.; Haran, T.; Khalsa, S. S.; Miller, D.
2008-12-01
In response to requests from the user community, NSIDC has teamed with the Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC) and the Moderate Resolution Data Center (MrDC) to provide time series subsets of satellite data covering stations in the Greenland Climate Network (GC-Net) and the International Arctic Systems for Observing the Atmosphere (IASOA) network. To serve these data NSIDC created the MODIS Interactive Subsetting Tool (MIST). MIST works with 7 km by 7 km subset time series of certain Version 5 (V005) MODIS products over GC-Net and IASOA stations. User-selected data are delivered in a text Comma Separated Value (CSV) file format. MIST also provides online analysis capabilities that include generating time series and scatter plots. Currently, MIST is a Beta prototype and NSIDC intends that user requests will drive future development of the tool. The intent of this poster is to introduce MIST to the MODIS data user audience and illustrate some of the online analysis capabilities.
Converting CSV Files to RKSML Files
NASA Technical Reports Server (NTRS)
Trebi-Ollennu, Ashitey; Liebersbach, Robert
2009-01-01
A computer program converts, into a format suitable for processing on Earth, files of downlinked telemetric data pertaining to the operation of the Instrument Deployment Device (IDD), which is a robot arm on either of the Mars Exploration Rovers (MERs). The raw downlinked data files are in comma-separated-value (CSV) format. The present program converts the files into Rover Kinematics State Markup Language (RKSML), which is an Extensible Markup Language (XML) format that facilitates representation of operations of the IDD and enables analysis of the operations by means of the Rover Sequencing Validation Program (RSVP), which is used to build sequences of commanded operations for the MERs. After conversion by means of the present program, the downlinked data can be processed by RSVP, enabling the MER downlink operations team to play back the actual IDD activity represented by the telemetric data against the planned IDD activity. Thus, the present program enhances the diagnosis of anomalies that manifest themselves as differences between actual and planned IDD activities.
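The general shape of such a CSV-to-XML conversion can be sketched with the Python standard library. This is a hedged illustration only: the element names, field names, and sample values below are invented for the sketch and are not the actual RKSML schema, which the abstract does not describe.

```python
import csv
import io
import xml.etree.ElementTree as ET

# Hypothetical downlinked CSV telemetry: one row per arm-state sample.
raw_csv = """time,joint1,joint2
100.0,0.52,1.04
100.5,0.53,1.07
"""

def csv_to_xml(text, root_tag="ArmStates", row_tag="State"):
    """Convert CSV rows into a simple XML document, one element per row,
    with one child element per CSV column."""
    root = ET.Element(root_tag)
    for row in csv.DictReader(io.StringIO(text)):
        elem = ET.SubElement(root, row_tag)
        for field, value in row.items():
            ET.SubElement(elem, field).text = value
    return ET.tostring(root, encoding="unicode")

xml_doc = csv_to_xml(raw_csv)
```

The real converter additionally maps CSV columns onto the specific RKSML vocabulary that RSVP consumes; the sketch shows only the mechanical row-to-element transformation.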
NASA Technical Reports Server (NTRS)
Carroll, Mark L.; Brown, Molly E.; Wooten, Margaret R.; Donham, Joel E.; Hubbard, Alfred B.; Ridenhour, William B.
2016-01-01
As our climate changes through time there is an ever-increasing need to quantify how and where it is changing so that mitigation strategies can be implemented. Urban areas have a disproportionate amount of warming due, in part, to the conductive properties of concrete and asphalt surfaces, surface albedo, heat capacity, lack of water, etc. that make up an urban environment. The NASA Climate Adaptation Science Investigation working group at Goddard Space Flight Center in Greenbelt, MD, conducted a study to collect temperature and humidity data at 15 min intervals from 12 sites at the center. These sites represent the major surface types at the center: asphalt, building roof, grass field, forest, and rain garden. The data show a strong distinction in the thermal properties of these surfaces at the center and the difference between the average values for the center compared to a local meteorological station. The data have been submitted to the Oak Ridge National Laboratory Distributed Active Archive Center (ORNL-DAAC) for archival in comma separated value (csv) file format (Carroll et al., 2016) and can be found by following this link: http://daac.ornl.gov/cgi-bin/dsviewer.pl?ds_id=1319.
Gandy, Lisa M; Gumm, Jordan; Fertig, Benjamin; Thessen, Anne; Kennish, Michael J; Chavan, Sameer; Marchionni, Luigi; Xia, Xiaoxin; Shankrit, Shambhavi; Fertig, Elana J
2017-01-01
Scientists have unprecedented access to a wide variety of high-quality datasets. These datasets, which are often independently curated, commonly use unstructured spreadsheets to store their data. Standardized annotations are essential to perform synthesis studies across investigators, but are often not used in practice. Therefore, accurately combining records in spreadsheets from differing studies requires tedious and error-prone human curation. These efforts result in a significant time and cost barrier to synthesis research. We propose an information retrieval inspired algorithm, Synthesize, that merges unstructured data automatically based on both column labels and values. Application of the Synthesize algorithm to cancer and ecological datasets had high accuracy (on the order of 85-100%). We further implement Synthesize in an open source web application, Synthesizer (https://github.com/lisagandy/synthesizer). The software accepts input as spreadsheets in comma separated value (CSV) format, visualizes the merged data, and outputs the results as a new spreadsheet. Synthesizer includes an easy to use graphical user interface, which enables the user to finish combining data and obtain perfect accuracy. Future work will allow detection of units to automatically merge continuous data and application of the algorithm to other data formats, including databases.
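The core idea of merging tables by matching column labels can be illustrated with a much simpler stdlib sketch. This is not the Synthesize algorithm itself (which also matches on values and is information-retrieval inspired); it only demonstrates label-similarity matching with `difflib`, and the study names, column labels, and values are invented.

```python
import difflib

# Two hypothetical spreadsheets with differently labeled columns.
study_a = {"Site Name": ["Barnegat", "Raritan"], "Salinity (ppt)": [28, 28]}
study_b = {"site": ["Mullica"], "salinity": [25]}

def match_columns(cols_a, cols_b, cutoff=0.4):
    """Pair each column label of table B with its closest label in table A."""
    cols_a = list(cols_a)
    mapping = {}
    for col in cols_b:
        close = difflib.get_close_matches(
            col.lower(), [c.lower() for c in cols_a], n=1, cutoff=cutoff)
        if close:
            # Recover the original-case label in table A.
            mapping[col] = next(c for c in cols_a if c.lower() == close[0])
    return mapping

def merge(a, b):
    """Append B's rows to A under A's column labels."""
    merged = {c: list(v) for c, v in a.items()}
    for col_b, col_a in match_columns(a.keys(), b.keys()).items():
        merged[col_a].extend(b[col_b])
    return merged

result = merge(study_a, study_b)
```

A production merger, as the abstract notes, also needs value-based evidence and human review to reach perfect accuracy; label similarity alone misfires on short or generic headers.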
A new version of Visual tool for estimating the fractal dimension of images
NASA Astrophysics Data System (ADS)
Grossu, I. V.; Felea, D.; Besliu, C.; Jipa, Al.; Bordeianu, C. C.; Stan, E.; Esanu, T.
2010-04-01
This work presents a new version of a Visual Basic 6.0 application for estimating the fractal dimension of images (Grossu et al., 2009 [1]). The earlier version was limited to bi-dimensional sets of points stored in bitmap files. The application was extended to work also with comma separated values files and three-dimensional images.
New version program summary
Program title: Fractal Analysis v02
Catalogue identifier: AEEG_v2_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEEG_v2_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 9999
No. of bytes in distributed program, including test data, etc.: 4 366 783
Distribution format: tar.gz
Programming language: MS Visual Basic 6.0
Computer: PC
Operating system: MS Windows 98 or later
RAM: 30 M
Classification: 14
Catalogue identifier of previous version: AEEG_v1_0
Journal reference of previous version: Comput. Phys. Comm. 180 (2009) 1999
Does the new version supersede the previous version?: Yes
Nature of problem: Estimating the fractal dimension of 2D and 3D images.
Solution method: Optimized implementation of the box-counting algorithm.
Reasons for new version: The previous version was limited to bitmap image files. The new application was extended in order to work with objects stored in comma separated values (csv) files. The main advantages are: easier integration with other applications (csv is a widely used, simple text file format); fewer resources consumed and improved performance (only the information of interest, the "black points", is stored); higher resolution (the point coordinates are loaded into Visual Basic double variables [2]); and the possibility of storing three-dimensional objects (e.g. the 3D Sierpinski gasket).
In this version the optimized box-counting algorithm [1] was extended to the three-dimensional case.
Summary of revisions: The application interface was changed from SDI (single document interface) to MDI (multi-document interface). One form was added in order to provide a graphical user interface for the new functionalities (fractal analysis of 2D and 3D images stored in csv files).
Additional comments: User-friendly graphical interface; easy deployment mechanism.
Running time: In the first approximation, the algorithm is linear.
References:
[1] I.V. Grossu, C. Besliu, M.V. Rusu, Al. Jipa, C.C. Bordeianu, D. Felea, Comput. Phys. Comm. 180 (2009) 1999-2001.
[2] F. Balena, Programming Microsoft Visual Basic 6.0, Microsoft Press, US, 1999.
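The box-counting idea behind the program can be sketched in a few lines of Python (a plain, unoptimized illustration of the method, not the paper's Visual Basic implementation): cover the point set with boxes of shrinking size s and fit the slope of log N(s) against log (1/s).

```python
import math

def box_count_dimension(points, exponents=range(1, 7)):
    """Estimate the fractal dimension of a point set in the unit
    square/cube: count occupied boxes at box sizes 1/2, 1/4, ..., and
    fit the slope of log N(s) versus log (1/s) by least squares."""
    xs, ys = [], []
    for k in exponents:
        n = 2 ** k  # n boxes per axis, each of side 1/n
        occupied = {tuple(min(int(c * n), n - 1) for c in p) for p in points}
        xs.append(math.log(n))
        ys.append(math.log(len(occupied)))
    m = len(xs)
    mean_x, mean_y = sum(xs) / m, sum(ys) / m
    return (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
            / sum((x - mean_x) ** 2 for x in xs))

# A densely sampled diagonal line should have dimension near 1;
# such points could equally be loaded from a two-column csv file.
line = [(i / 5000, i / 5000) for i in range(5000)]
dim = box_count_dimension(line)
```

The csv extension described above amounts to reading the "black point" coordinates from a text file into exactly this kind of point list, which also generalizes directly to 3-tuples for three-dimensional objects.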
Geospatial database for regional environmental assessment of central Colorado.
Church, Stan E.; San Juan, Carma A.; Fey, David L.; Schmidt, Travis S.; Klein, Terry L.; DeWitt, Ed H.; Wanty, Richard B.; Verplanck, Philip L.; Mitchell, Katharine A.; Adams, Monique G.; Choate, LaDonna M.; Todorov, Todor I.; Rockwell, Barnaby W.; McEachron, Luke; Anthony, Michael W.
2012-01-01
In conjunction with the future planning needs of the U.S. Department of Agriculture, Forest Service, the U.S. Geological Survey conducted a detailed environmental assessment of the effects of historical mining on Forest Service lands in central Colorado. Stream sediment, macroinvertebrate, and various filtered and unfiltered water quality samples were collected during low flow over a four-year period, from 2004 to 2007. This report summarizes the sampling strategy, data collection, and analyses performed on these samples. The data are presented in Geographic Information System, Microsoft Excel, and comma-delimited formats. Reports on data interpretation are being prepared separately.
Drury, John E; Baum, Shari R; Valeriote, Hope; Steinhauer, Karsten
2016-01-01
This study presents the first two ERP reading studies of comma-induced effects of covert (implicit) prosody on syntactic parsing decisions in English. The first experiment used a balanced 2 × 2 design in which the presence/absence of commas determined plausibility (e.g., John, said Mary, was the nicest boy at the party vs. John said Mary was the nicest boy at the party). The second reading experiment replicated a previous auditory study investigating the role of overt prosodic boundaries in closure ambiguities (Pauker et al., 2011). In both experiments, commas reliably elicited CPS components and generally played a dominant role in determining parsing decisions in the face of input ambiguity. The combined set of findings provides further evidence supporting the claim that mechanisms subserving speech processing play an active role during silent reading.
A study of two cases of comma-cloud cyclogenesis using a semigeostrophic model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Craig, G.C.; Cho, Hanru
1992-12-01
The linear stability of two atmospheric flows is studied, with basic-state data taken from environments where comma clouds are observed to form. Each basic state features a baroclinic zone associated with an upper-level jet, with conditional instability on the north side. The semigeostrophic approximation is utilized, along with a simple parameterization for cumulus heating, and the eigenvalue problem is solved employing a Chebyshev spectral technique. 47 refs.
Data from the National Aquatic Resource Surveys: The following data are available for download as comma separated values (.csv) files. Sort the table using the pull-down menus or headers to more easily locate the data. Right click on the file name and select Save Link As to save the file to your computer. Make sure to also download the companion metadata file (.txt) for the list of field labels. See the survey technical document for more information on the data analyses. This dataset is associated with the following publications: Yurista, P., J. Kelly, and J. Scharold. Great Lakes nearshore-offshore: Distinct water quality regions. JOURNAL OF GREAT LAKES RESEARCH. International Association for Great Lakes Research, Ann Arbor, MI, USA, 42: 375-385, (2016). Kelly, J., P. Yurista, M. Starry, J. Scharold, W. Bartsch, and A. Cotter. The first US National Coastal Condition Assessment survey in the Great Lakes: Development of the GIS frame and exploration of spatial variation in nearshore water quality results. JOURNAL OF GREAT LAKES RESEARCH. International Association for Great Lakes Research, Ann Arbor, MI, USA, 41: 1060-1074, (2015).
Marshall, Garrett J; Thompson, Scott M; Shamsaei, Nima
2016-06-01
An OPTOMEC Laser Engineered Net Shaping (LENS(™)) 750 system was retrofitted with a melt pool pyrometer and in-chamber infrared (IR) camera for nondestructive thermal inspection of the blown-powder, direct laser deposition (DLD) process. Data indicative of temperature and heat transfer within the melt pool and heat affected zone atop a thin-walled structure of Ti-6Al-4V during its additive manufacture are provided. Melt pool temperature data were collected via the dual-wavelength pyrometer while the dynamic, bulk part temperature distribution was collected using the IR camera. Such data are provided in Comma Separated Values (CSV) file format, containing a 752×480 matrix and a 320×240 matrix of temperatures corresponding to individual pixels of the pyrometer and IR camera, respectively. The IR camera and pyrometer temperature data are provided in blackbody-calibrated, raw forms. Provided thermal data can aid in generating and refining process-property-performance relationships between laser manufacturing and its fabricated materials.
In this study, modeled gas- and aerosol-phase ammonia, nitric acid, and hydrogen chloride are compared to measurements taken during a field campaign conducted in northern Colorado in February and March 2011. We compare the modeled and observed gas-particle partitioning, and assess potential reasons for discrepancies between the model and measurements. This data set contains scripts and data used for each figure in the associated manuscript. Figures are generated using the R project statistical programming language. Data files are in either comma-separated value (CSV) format or netCDF, a standard self-describing binary data format commonly used in the earth and atmospheric sciences. This dataset is associated with the following publication: Kelly, J., K. Baker, C. Nolte, S. Napelenok, W.C. Keene, and A.A.P. Pszenny. Simulating the phase partitioning of NH3, HNO3, and HCl with size-resolved particles over northern Colorado in winter. ATMOSPHERIC ENVIRONMENT. Elsevier Science Ltd, New York, NY, USA, 131: 67-77, (2016).
The sedimentological characteristics and geochronology of the marshes of Dauphin Island, Alabama
Ellis, Alisha M.; Smith, Christopher G.; Marot, Marci E.
2018-03-22
In August 2015, scientists from the U.S. Geological Survey, St. Petersburg Coastal and Marine Science Center collected 11 push cores from the marshes of Dauphin Island and Little Dauphin Island, Alabama. Sample site environments included high marshes, low salt marshes, and salt flats, and varied in distance from the shoreline. The sampling efforts were part of a larger study to assess the feasibility and sustainability of proposed restoration efforts for Dauphin Island, Alabama, and to identify trends in shoreline erosion and accretion. The data presented in this publication can provide a basis for assessing organic and inorganic sediment accumulation rates and temporal changes in accumulation rates over multiple decades at multiple locations across the island. This study was funded by the National Fish and Wildlife Foundation, via the Gulf Environmental Benefit Fund. This report serves as an archive for the sedimentological and geochemical data derived from the marsh cores. Downloadable data are available and include Microsoft Excel spreadsheets (.xlsx), comma-separated values (.csv) text files, JPEG files, and formal Federal Geographic Data Committee metadata in a U.S. Geological Survey data release.
PuffinPlot: A versatile, user-friendly program for paleomagnetic analysis
NASA Astrophysics Data System (ADS)
Lurcock, P. C.; Wilson, G. S.
2012-06-01
PuffinPlot is a user-friendly desktop application for analysis of paleomagnetic data, offering a unique combination of features. It runs on several operating systems, including Windows, Mac OS X, and Linux; supports both discrete and long core data; and facilitates analysis of very weakly magnetic samples. As well as interactive graphical operation, PuffinPlot offers batch analysis for large volumes of data, and a Python scripting interface for programmatic control of its features. Available data displays include demagnetization/intensity, Zijderveld, equal-area (for sample, site, and suite level demagnetization data, and for magnetic susceptibility anisotropy data), a demagnetization data table, and a natural remanent magnetization intensity histogram. Analysis types include principal component analysis, Fisherian statistics, and great-circle path intersections. The results of calculations can be exported as CSV (comma-separated value) files; graphs can be printed, and can also be saved as publication-quality vector files in SVG or PDF format. PuffinPlot is free, and the program, user manual, and fully documented source code may be downloaded from http://code.google.com/p/puffinplot/.
Internet Distribution of Spacecraft Telemetry Data
NASA Technical Reports Server (NTRS)
Specht, Ted; Noble, David
2006-01-01
Remote Access Multi-mission Processing and Analysis Ground Environment (RAMPAGE) is a Java-language server computer program that enables near-real-time display of spacecraft telemetry data on any authorized client computer that has access to the Internet and is equipped with Web-browser software. In addition to providing a variety of displays of the latest available telemetry data, RAMPAGE can deliver notification of an alarm by electronic mail. Subscribers can then use RAMPAGE displays to determine the state of the spacecraft and formulate a response to the alarm, if necessary. A user can query spacecraft mission data in either binary or comma-separated-value format by use of a Web form or a Practical Extraction and Reporting Language (PERL) script to automate the query process. RAMPAGE runs on Linux and Solaris server computers in the Ground Data System (GDS) of NASA's Jet Propulsion Laboratory and includes components designed specifically to make it compatible with legacy GDS software. The client/server architecture of RAMPAGE and the use of the Java programming language make it possible to utilize a variety of competitive server and client computers, thereby also helping to minimize costs.
SchemaOnRead: A Package for Schema-on-Read in R
DOE Office of Scientific and Technical Information (OSTI.GOV)
North, Michael J.
Schema-on-read is an agile approach to data storage and retrieval that defers investments in data organization until production queries need to be run, by working with data directly in native form. Schema-on-read functions have been implemented in a wide range of analytical systems, most notably Hadoop. SchemaOnRead is a CRAN package that uses R’s flexible data representations to provide transparent and convenient support for the schema-on-read paradigm in R. The schema-on-read tools within the package include a single function call that recursively reads folders with text, comma separated value, raster image, R data, HDF5, NetCDF, spreadsheet, Weka, Epi Info, Pajek network, R network, HTML, SPSS, Systat, and Stata files. The provided tools can be used as-is or easily adapted to implement customized schema-on-read tool chains in R. This paper’s contribution is that it introduces and describes SchemaOnRead, the first R package specifically focused on providing explicit schema-on-read support in R.
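The recursive, type-dispatching reader the package provides can be sketched as a minimal Python analogue (the package itself is an R library; this sketch only shows the schema-on-read pattern of dispatching readers by file extension and deferring any further interpretation, and covers just three text formats):

```python
import csv
import json
import os

def read_csv(path):
    with open(path, newline="") as f:
        return list(csv.reader(f))

def read_json(path):
    with open(path) as f:
        return json.load(f)

def read_text(path):
    with open(path) as f:
        return f.read()

# Map file extensions to readers; unrecognized files are simply skipped,
# leaving their interpretation deferred in schema-on-read style.
READERS = {".csv": read_csv, ".json": read_json, ".txt": read_text}

def schema_on_read(folder):
    """Recursively read every recognized file under a folder into native
    data structures, keyed by path relative to the folder."""
    results = {}
    for dirpath, _dirs, files in os.walk(folder):
        for name in files:
            reader = READERS.get(os.path.splitext(name)[1].lower())
            if reader:
                path = os.path.join(dirpath, name)
                results[os.path.relpath(path, folder)] = reader(path)
    return results
```

The single entry point mirrors the package's one-call design; customized tool chains correspond to swapping entries in the reader table.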
Automatic and efficient methods applied to the binarization of a subway map
NASA Astrophysics Data System (ADS)
Durand, Philippe; Ghorbanzadeh, Dariush; Jaupi, Luan
2015-12-01
The purpose of this paper is the study of efficient methods for image binarization. The objective of the work is the binarization of metro maps: the goal is to binarize while preventing noise from disturbing the reading of subway stations. Different methods have been tested; among them, the method given by Otsu gives particularly interesting results. The difficulty of binarization is the choice of the threshold that reconstructs an image as close as possible to reality. Vectorization is a step subsequent to binarization. It consists of retrieving the coordinates of the points containing information and storing them in two matrices X and Y. Subsequently, these matrices can be exported to a 'CSV' (Comma Separated Value) file format, enabling us to process them in a variety of software, including Excel. The algorithm requires considerable computation time in Matlab because it is composed of two nested "for" loops, and "for" loops are poorly supported by Matlab, especially when nested. This penalizes the computation time, but it seems the only way to do this.
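Otsu's global threshold, the method the paper finds most effective, can be sketched in pure Python (an illustration of the standard algorithm, not the authors' Matlab code; the "image" is an invented bimodal pixel list): pick the gray level that maximizes the between-class variance of the two resulting pixel classes.

```python
def otsu_threshold(pixels, levels=256):
    """Return the threshold maximizing between-class variance (Otsu).
    Pixels at or below the returned value form the 'dark' class."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    total_sum = sum(i * h for i, h in enumerate(hist))
    best_t, best_var = 0, -1.0
    w0 = sum0 = 0
    for t in range(levels):
        w0 += hist[t]                  # pixel count of the dark class
        if w0 == 0 or w0 == total:
            continue
        sum0 += t * hist[t]
        mu0 = sum0 / w0                # mean of the dark class
        mu1 = (total_sum - sum0) / (total - w0)  # mean of the light class
        var_between = w0 * (total - w0) * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# A bimodal "image": dark station symbols on a light map background.
image = [20] * 300 + [30] * 100 + [200] * 600
t = otsu_threshold(image)
```

Binarizing against this threshold separates station symbols from background; the subsequent vectorization step then collects the coordinates of the below-threshold pixels into the X and Y matrices for CSV export.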
NASA Astrophysics Data System (ADS)
Fröhlich, K.; Schmidt, T.; Ern, M.; Preusse, P.; de La Torre, A.; Wickert, J.; Jacobi, Ch.
2007-12-01
Five years of global temperatures retrieved from radio occultations measured by Champ (Challenging Minisatellite Payload) and SAC-C (Satelite de Aplicaciones Cientificas-C) are analyzed for gravity waves (GWs). In order to separate GWs from other atmospheric variations, a high-pass filter was applied on the vertical profile. Resulting temperature fluctuations correspond to vertical wavelengths between 400 m (instrumental resolution) and 10 km (limit of the high-pass filter). The temperature fluctuations can be converted into GW potential energy, but for comparison with parameterization schemes GW momentum flux is required. We therefore used representative values for the vertical and horizontal wavelength to infer GW momentum flux from the GPS measurements. The vertical wavelength value is determined by high-pass filtering, the horizontal wavelength is adopted from a latitude-dependent climatology. The obtained momentum flux distributions agree well, both in global distribution and in absolute values, with simulations using the Warner and McIntyre parameterization (WM) scheme. However, discrepancies are found in the annual cycle. Online simulations, implementing the WM scheme in the mechanistic COMMA-LIM (Cologne Model of the Middle Atmosphere—Leipzig Institute for Meteorology) general circulation model (GCM), do not converge, demonstrating that a good representation of GWs in a GCM requires both a realistic launch distribution and an adequate representation of GW breaking and momentum transfer.
Shinozaki, Masafumi; Muramatsu, Yoshihisa; Sasaki, Toru
2014-01-01
A new technical standard for X-ray computed tomography (CT) has been published by the National Electrical Manufacturers Association (NEMA) that allows the Alert Value and Notification Value for cumulative dose to be configured by CT system operators in conjunction with the XR-25 (Dose Check) standard. In this study, a method for deciding the Notification Values so as to reduce the radiation dose was examined using a dose index registry (DIR) system over 122 continuous days, from August 1, 2012 to November 30, 2012. CT images were obtained using the Discovery CT750 HD (GE Healthcare) and the dose indices were calculated using the DoseWatch DIR system. The volume CT dose index (CTDIvol) and dose-length product (DLP) were output from the DIR system in comma-separated value (CSV) file format for each examination protocol. All data were shown as schematic boxplots using statistical processing software. The CTDIvol of a routine chest examination showed the following values (maximum: 23.84 mGy; minimum: 2.55 mGy; median: 7.60 mGy; 75th percentile: 10.01 mGy; 25th percentile: 6.54 mGy). DLP showed the following values (maximum: 944.56 mGy·cm; minimum: 97.25 mGy·cm; median: 307.35 mGy·cm; 75th percentile: 406.87 mGy·cm; 25th percentile: 255.75 mGy·cm). These results indicate that the 75th percentile of CTDIvol and DLP as an initial value proved to be safe and efficient for CT examination and operation. We have thus established one way of determining the Notification Value from the output of a DIR system. Transferring the values back to the CT protocol and automated processing of each numeric value in the DIR system are desired.
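The boxplot statistics quoted above (minimum, 25th percentile, median, 75th percentile, maximum) are straightforward to compute from the CSV output. The sketch below uses Python's statistics module with made-up DLP values, not the study's data:

```python
import statistics

def dose_summary(values):
    """Five-number summary of a list of dose indices.

    quantiles(..., n=4, method="inclusive") gives the conventional
    boxplot quartiles; other quantile conventions shift the cut points.
    """
    q1, med, q3 = statistics.quantiles(values, n=4, method="inclusive")
    return {"min": min(values), "q25": q1, "median": med,
            "q75": q3, "max": max(values)}

# Hypothetical per-examination DLP values in mGy.cm (illustrative only).
dlp = [100.0, 250.0, 300.0, 410.0, 950.0]
summary = dose_summary(dlp)
```

Setting the Notification Value at `summary["q75"]` reproduces the 75th-percentile rule the study arrives at.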
NASA Astrophysics Data System (ADS)
Carroll, Mark L.; Brown, Molly E.; Wooten, Margaret R.; Donham, Joel E.; Hubbard, Alfred B.; Ridenhour, William B.
2016-09-01
As our climate changes through time there is an ever-increasing need to quantify how and where it is changing so that mitigation strategies can be implemented. Urban areas have a disproportionate amount of warming due, in part, to the conductive properties of concrete and asphalt surfaces, surface albedo, heat capacity, lack of water, etc. that make up an urban environment. The NASA Climate Adaptation Science Investigation working group at Goddard Space Flight Center in Greenbelt, MD, conducted a study to collect temperature and humidity data at 15 min intervals from 12 sites at the center. These sites represent the major surface types at the center: asphalt, building roof, grass field, forest, and rain garden. The data show a strong distinction in the thermal properties of these surfaces at the center and the difference between the average values for the center compared to a local meteorological station. The data have been submitted to Oak Ridge National Laboratory Distributed Active Archive Center (ORNL-DAAC) for archival in comma separated value (csv) file format (Carroll et al., 2016) and can be found by following this link: http://daac.ornl.gov/cgi-bin/dsviewer.pl?ds_id=1319.
A multidimensional representation model of geographic features
Usery, E. Lynn; Timson, George; Coletti, Mark
2016-01-28
A multidimensional model of geographic features has been developed and implemented with data from The National Map of the U.S. Geological Survey. The model, programmed in C++ and implemented as a feature library, was tested with data from the National Hydrography Dataset demonstrating the capability to handle changes in feature attributes, such as increases in chlorine concentration in a stream, and feature geometry, such as the changing shoreline of barrier islands over time. Data can be entered directly, from a comma separated file, or features with attributes and relationships can be automatically populated in the model from data in the Spatial Data Transfer Standard format.
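The change handling described above (feature attributes such as chlorine concentration varying over time) can be sketched with a time-indexed attribute store. This Python class illustrates the idea only; it is not the USGS C++ feature library:

```python
import bisect

class Feature:
    """A geographic feature whose attributes vary over time.

    Each attribute keeps a sorted list of (timestamp, value) pairs;
    a query returns the value in effect at the requested time.
    """
    def __init__(self, name):
        self.name = name
        self._attrs = {}

    def set_attr(self, key, timestamp, value):
        series = self._attrs.setdefault(key, [])
        bisect.insort(series, (timestamp, value))

    def get_attr(self, key, timestamp):
        series = self._attrs[key]
        # Latest entry at or before the requested timestamp.
        idx = bisect.bisect_right(series, (timestamp, float("inf"))) - 1
        if idx < 0:
            raise KeyError(f"no value for {key!r} at or before {timestamp}")
        return series[idx][1]
```

The same pattern extends to geometry (e.g. a shoreline polygon per timestamp) by storing geometries as the values.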
Cyclonic Vortices in Polar Airmasses
NASA Astrophysics Data System (ADS)
Businger, Steven
Cyclonic vortices in polar airmasses are investigated to determine their storm-scale and mesoscale structures and the nature of the environments conducive to their formation. Case studies of polar low outbreaks show that the environments conducive to the development of strong polar lows include deep outflow of arctic air over open water and a broad closed low aloft. Once favorable environmental conditions for the formation of polar lows have developed, several storms may form in close proximity to each other during a relatively short time interval. Furthermore, these conditions may persist for several days. To develop a climatology of the synoptic environments conducive to the formation of polar lows, NMC gridded data were composited. The results reveal the presence of significant negative anomalies in the temperature and height fields at the 500 mb level on the days when mature polar lows were present, indicating the presence of strong positive vorticity and low static stability over the area. Aircraft observations made during the 1984 FOX field study indicate that convection in an incipient comma cloud was organized into distinct rainbands (~50 km wavelength), with tops extending to the tropopause. Equivalent-potential vorticity, computed from cross sections of the dropwindsonde data, showed that the region in which the convective activity was embedded was unstable to moist-symmetric overturnings. As the comma cloud approached a pre-existing polar front, a wave cyclone rapidly developed on the front. Surface data showed unexpectedly strong winds and heavy rain squalls when the comma cloud passed Juneau. Comprehensive data sets were collected in two comma cloud systems during CYCLES. Rainbands, with a wavelength of ~50 km, were present in both comma-cloud systems.
Precipitation cores, produced by embedded convection within the rainbands, contained updraft speeds of ~1-2 m s^-1 and relatively high liquid water contents; they retained their identities over periods of several hours. The spacing and orientation of the rainbands may be explained by the theory for mixed dynamic/convective instability developed by Sun (1978).
Prototype Methodology for Designing and Developing Computer-Assisted Instruction
1986-08-01
contains essential information and is not set off with commas. For example: The lawn mower that is broken is in the garage. Use "which" whenever the...phrase that follows contains supplementary or incidental information. "Which" clauses are set off by a pair of commas. For example: The lawn mower, which...is broken, is in the garage. If the lawn mower that is broken is in the garage, whereas the lawn mower that is working is in the yard, then the
Elghblawi, Ebtisam
2016-01-01
Dermoscopy is a method of growing significance in the diagnosis of dermatological pigmented skin diseases. However, in my case, mycology culture was negative and successful treatment was given on the basis of trichoscopy and Wood's lamp examination. I hereby describe a young boy with tinea capitis, multiple "comma hairs" and "zigzag hair", and a subtle additional feature, "Morse code-like hair", when intensification was applied. These dermatoscopic aspects were found in a child of skin Type 2 as a distinctive dermoscopic finding. PMID:28442876
High-temperature apparatus for chaotic mixing of natural silicate melts.
Morgavi, D; Petrelli, M; Vetere, F P; González-García, D; Perugini, D
2015-10-01
A unique high-temperature apparatus was developed to trigger chaotic mixing at high temperature (up to 1800 °C). This new apparatus, which we term Chaotic Magma Mixing Apparatus (COMMA), is designed to carry out experiments with high-temperature and high-viscosity (up to 10^6 Pa s) natural silicate melts. This instrument allows us to follow in time and space the evolution of the mixing process and the associated modulation of chemical composition. This is essential to understand the dynamics of magma mixing and related chemical exchanges. The COMMA device is tested by mixing natural melts from the Aeolian Islands (Italy). The experiment was performed at 1180 °C using shoshonite and rhyolite melts, resulting in a viscosity ratio of more than three orders of magnitude. This viscosity ratio is close to the maximum possible ratio of viscosity between high-temperature natural silicate melts. Results indicate that the generated mixing structures are topologically identical to those observed in natural volcanic rocks, highlighting the enormous potential of the COMMA to replicate, as a first approximation, the same mixing patterns observed in the natural environment. COMMA can be used to investigate in detail the space and time development of magma mixing, providing information about this fundamental petrological and volcanological process that would be impossible to investigate by direct observations. Among the potentials of this new experimental device is the construction of empirical relationships relating the mixing time, obtained through experimental time series, and chemical exchanges between the melts to constrain the mixing-to-eruption time of volcanic systems, a fundamental topic in volcanic hazard assessment.
Borneo vortex and mesoscale convective rainfall
NASA Astrophysics Data System (ADS)
Koseki, S.; Koh, T.-Y.; Teo, C.-K.
2014-05-01
We have investigated how the Borneo vortex develops over the equatorial South China Sea under cold surge conditions in December during the Asian winter monsoon. Composite analysis using reanalysis and satellite data sets has revealed that absolute vorticity and water vapour are transported by strong cold surges from upstream of the South China Sea to around the Equator. Rainfall is correspondingly enhanced over the equatorial South China Sea. A semi-idealized experiment reproduced the Borneo vortex over the equatorial South China Sea during a "perpetual" cold surge. The Borneo vortex is manifested as a meso-α cyclone with a comma-shaped rainband in the northeast sector of the cyclone. Vorticity budget analysis showed that the growth/maintenance of the meso-α cyclone was achieved mainly by the vortex stretching. This vortex stretching is due to the upward motion forced by the latent heat release around the cyclone centre. The comma-shaped rainband consists of clusters of meso-β-scale rainfall cells. The intense rainfall in the comma head (comma tail) is generated by the confluence of the warmer and wetter cyclonic easterly flow (cyclonic southeasterly flow) and the cooler and drier northeasterly surge in the northwestern (northeastern) sector of the cyclone. Intense upward motion and heavy rainfall resulted due to the low-level convergence and the favourable thermodynamic profile at the confluence zone. In particular, the convergence in the northwestern sector is responsible for maintenance of the meso-α cyclone system. At both meso-α and meso-β scales, the convergence is ultimately caused by the deviatoric strain in the confluence wind pattern but is significantly self-enhanced by the nonlinear dynamics.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bond, J.W.
1988-01-01
Data-compression codes offer the possibility of improving the throughput of existing communication systems in the near term. This study was undertaken to determine if data-compression codes could be utilized to provide message compression in a channel with up to a 0.10-bit error rate. The data-compression capabilities of codes were investigated by estimating the average number of bits-per-character required to transmit narrative files. The performance of the codes in a channel with errors (a noisy channel) was investigated in terms of the average numbers of characters-decoded-in-error and of characters-printed-in-error-per-bit-error. Results were obtained by encoding four narrative files, which were resident on an IBM-PC and use a 58-character set. The study focused on Huffman codes and suffix/prefix comma-free codes. Other data-compression codes, in particular, block codes and some simple variants of block codes, are briefly discussed to place the study results in context. Comma-free codes were found to have the most promising data compression because error propagation due to bit errors is limited to a few characters for these codes. A technique was found to identify a suffix/prefix comma-free code giving nearly the same data compression as a Huffman code with much less error propagation than the Huffman codes. Greater data compression can be achieved through the use of comma-free code word assignments based on conditional probabilities of character occurrence.
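Huffman coding, one of the two code families studied, assigns shorter codewords to more frequent characters. The textbook construction below is included only to make the bits-per-character measure concrete; it is not the report's code:

```python
import heapq
from collections import Counter

def huffman_code(text):
    """Build a Huffman code for `text`; returns {char: bitstring}."""
    freq = Counter(text)
    if len(freq) == 1:  # degenerate single-symbol alphabet
        return {next(iter(freq)): "0"}
    # Heap entries: (frequency, tiebreak, {char: partial codeword}).
    heap = [(n, i, {c: ""}) for i, (c, n) in enumerate(freq.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        n1, _, c1 = heapq.heappop(heap)
        n2, _, c2 = heapq.heappop(heap)
        merged = {c: "0" + w for c, w in c1.items()}
        merged.update({c: "1" + w for c, w in c2.items()})
        heapq.heappush(heap, (n1 + n2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

def avg_bits_per_char(text):
    """The study's figure of merit: mean codeword length over the file."""
    code = huffman_code(text)
    freq = Counter(text)
    return sum(freq[c] * len(code[c]) for c in freq) / len(text)
```

A Huffman code is prefix-free, which is what the comma-free codes trade slightly against: they give nearly the same compression while also resynchronizing after bit errors.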
Low-cost, email-based system for self blood pressure monitoring at home.
Nakajima, Kazuki; Nambu, Masayuki; Kiryu, Tohru; Tamura, Toshiyo; Sasaki, Kazuo
2006-01-01
We have developed a low-cost monitoring system, which allows subjects to send blood pressure (BP) data obtained at home to health-care professionals by email. The system consists of a wrist BP monitor and a personal computer (PC) with an Internet connection. The wrist BP monitor includes an advanced positioning sensor to verify that the wrist is placed properly at heart level. Subjects at home can self-measure their BP every day, automatically transfer the BP data to their PC each week, and then send a comma-separated values (CSV) file to their health-care professional by email. In a feasibility study, 10 subjects used the system for a mean period of 207 days (SD 149). The mean percent achievement of measurement in the 10 subjects was 84% (SD 12). There was a seasonal variation in systolic and diastolic BP, which was inversely correlated with temperature. Eight of the 10 subjects evaluated the system favourably. The results of the present study demonstrate the feasibility of our email-based system for self-monitoring of blood pressure. Its low cost means that it may have widespread application in future home telecare studies.
COMPASS: a suite of pre- and post-search proteomics software tools for OMSSA
Wenger, Craig D.; Phanstiel, Douglas H.; Lee, M. Violet; Bailey, Derek J.; Coon, Joshua J.
2011-01-01
Here we present the Coon OMSSA Proteomic Analysis Software Suite (COMPASS): a free and open-source software pipeline for high-throughput analysis of proteomics data, designed around the Open Mass Spectrometry Search Algorithm. We detail a synergistic set of tools for protein database generation, spectral reduction, peptide false discovery rate analysis, peptide quantitation via isobaric labeling, protein parsimony and protein false discovery rate analysis, and protein quantitation. We strive for maximum ease of use, utilizing graphical user interfaces and working with data files in the original instrument vendor format. Results are stored in plain text comma-separated values files, which are easy to view and manipulate with a text editor or spreadsheet program. We illustrate the operation and efficacy of COMPASS through the use of two LC–MS/MS datasets. The first is a dataset of a highly annotated mixture of standard proteins and manually validated contaminants that exhibits the identification workflow. The second is a dataset of yeast peptides, labeled with isobaric stable isotope tags and mixed in known ratios, to demonstrate the quantitative workflow. For these two datasets, COMPASS performs equivalently or better than the current de facto standard, the Trans-Proteomic Pipeline. PMID:21298793
A high-speed drug interaction search system for ease of use in the clinical environment.
Takada, Masahiro; Inada, Hiroshi; Nakazawa, Kazuo; Tani, Shoko; Iwata, Michiaki; Sugimoto, Yoshihisa; Nagata, Satoru
2012-12-01
With the advancement of pharmaceutical development, drug interactions have become increasingly complex. As a result, a computer-based drug interaction search system is required to organize the whole of drug interaction data. To overcome problems faced with the existing systems, we developed a drug interaction search system using a hash table, which offers higher processing speeds and easier maintenance operations compared with relational databases (RDB). In order to compare the performance of our system and MySQL RDB in terms of search speed, drug interaction searches were repeated for all 45 possible combinations of two out of a group of 10 drugs for two cases: 5,604 and 56,040 drug interaction data. As the principal result, our system was able to process the search approximately 19 times faster than the system using the MySQL RDB. Our system also has several other merits such as that drug interaction data can be created in comma-separated value (CSV) format, thereby facilitating data maintenance. Although our system uses the well-known method of a hash table, it is expected to resolve problems common to existing systems and to be an effective system that enables the safe management of drugs.
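The hash-table lookup such a system is built on can be sketched in Python with a dict keyed on an unordered drug pair; the CSV column names and example rows below are illustrative, not the system's actual schema:

```python
import csv
import io

def load_interactions(csv_text):
    """Build an in-memory hash table of drug interactions from CSV text.

    A frozenset key makes lookup order-independent: (A, B) and (B, A)
    hit the same entry.
    """
    table = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        key = frozenset((row["drug_a"], row["drug_b"]))
        table[key] = row["interaction"]
    return table

def check_interaction(table, drug1, drug2):
    """Return the interaction note for a pair, or None if none is recorded."""
    return table.get(frozenset((drug1, drug2)))

# Illustrative data only; a real dataset would hold thousands of pairs.
CSV_DATA = """drug_a,drug_b,interaction
warfarin,aspirin,increased bleeding risk
simvastatin,clarithromycin,myopathy risk
"""
interactions = load_interactions(CSV_DATA)
```

Each query is a single O(1) hash lookup, which is where the speed advantage over a join-based RDB query comes from, and maintaining the data stays as simple as editing the CSV file.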
Filtering NetCDF Files by Using the EverVIEW Slice and Dice Tool
Conzelmann, Craig; Romañach, Stephanie S.
2010-01-01
Network Common Data Form (NetCDF) is a self-describing, machine-independent file format for storing array-oriented scientific data. It was created to provide a common interface between applications and real-time meteorological and other scientific data. Over the past few years, there has been a growing movement within the community of natural resource managers in The Everglades, Fla., to use NetCDF as the standard data container for datasets based on multidimensional arrays. As a consequence, a need surfaced for additional tools to view and manipulate NetCDF datasets, specifically to filter the files by creating subsets of large NetCDF files. The U.S. Geological Survey (USGS) and the Joint Ecosystem Modeling (JEM) group are working to address these needs with applications like the EverVIEW Slice and Dice Tool, which allows users to filter grid-based NetCDF files, thus targeting those data most important to them. The major functions of this tool are as follows: (1) to create subsets of NetCDF files temporally, spatially, and by data value; (2) to view the NetCDF data in table form; and (3) to export the filtered data to a comma-separated value (CSV) file format. The USGS and JEM will continue to work with scientists and natural resource managers across The Everglades to solve complex restoration problems through technological advances.
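The tool's core operations, subsetting spatially and by data value and then exporting to CSV, can be sketched without the NetCDF layer. The function below works on plain (time, lat, lon, value) tuples and only illustrates the filtering idea; it is not the EverVIEW code:

```python
import csv
import io

def slice_and_dice(cells, lat_range, lon_range, min_value=None):
    """Yield grid cells inside a lat/lon box, optionally above a value floor.

    `cells` is an iterable of (time, lat, lon, value) tuples; reading
    them out of a NetCDF file is omitted to keep the sketch library-free.
    """
    lat_lo, lat_hi = lat_range
    lon_lo, lon_hi = lon_range
    for t, lat, lon, v in cells:
        if not (lat_lo <= lat <= lat_hi and lon_lo <= lon <= lon_hi):
            continue
        if min_value is not None and v < min_value:
            continue
        yield (t, lat, lon, v)

def to_csv(rows):
    """Export filtered cells to CSV text, the tool's third major function."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["time", "lat", "lon", "value"])
    writer.writerows(rows)
    return buf.getvalue()
```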
Nygren, G H; Nylin, S; Stefanescu, C
2006-11-01
Comma butterflies (Nymphalidae: Polygonia c-album L.) from one Belgian site and three Spanish sites were crossed with butterflies from a Swedish population in order to investigate inheritance of female host plant choice, egg mass and larval growth rate. We found three different modes of inheritance for the three investigated traits. In line with earlier results from crosses between Swedish and English populations, the results regarding female oviposition preference (choice between Urtica dioica and Salix caprea) showed X-linked inheritance to be of importance for the variation between Sweden and the other sites. Egg mass and growth rate did not show any sex-linked inheritance. Egg mass differences between populations seem to be controlled mainly by additive autosomal genes, as hybrids showed intermediate values. The growth rates of both hybrid types following reciprocal crossings were similar to each other but consistently higher than for the two source populations, suggesting a nonadditive mode of inheritance which is not sex-linked. The different modes of inheritance for host plant preference vs. important life history traits are likely to result in hybrids with unfit combinations of traits. This type of potential reproductive barrier based on multiple ecologically important traits deserves more attention, as it should be a common situation for instance in the early stages of population divergence in host plant usage, facilitating ecological speciation.
PASMet: a web-based platform for prediction, modelling and analyses of metabolic systems
Sriyudthsak, Kansuporn; Mejia, Ramon Francisco; Arita, Masanori; Hirai, Masami Yokota
2016-01-01
PASMet (Prediction, Analysis and Simulation of Metabolic networks) is a web-based platform for proposing and verifying mathematical models to understand the dynamics of metabolism. The advantages of PASMet include user-friendliness and accessibility, which enable biologists and biochemists to easily perform mathematical modelling. PASMet offers a series of user-functions to handle the time-series data of metabolite concentrations. The functions are organised into four steps: (i) Prediction of a probable metabolic pathway and its regulation; (ii) Construction of mathematical models; (iii) Simulation of metabolic behaviours; and (iv) Analysis of metabolic system characteristics. Each function contains various statistical and mathematical methods that can be used independently. Users who may not have enough knowledge of computing or programming can easily and quickly analyse their local data without software downloads, updates or installations. Users only need to upload their files in comma-separated values (CSV) format or enter their model equations directly into the website. Once the time-series data or mathematical equations are uploaded, PASMet automatically performs computation on server-side. Then, users can interactively view their results and directly download them to their local computers. PASMet is freely available with no login requirement at http://pasmet.riken.jp/ from major web browsers on Windows, Mac and Linux operating systems. PMID:27174940
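The "simulation of metabolic behaviours" step in such platforms typically integrates a system of ODEs built from the fitted model. A toy Euler integration of a linear pathway under mass-action assumptions (a generic sketch, not PASMet's own solver):

```python
def simulate_chain(k1, k2, a0, dt=0.01, steps=1000):
    """Euler integration of a toy linear pathway A -> B -> (sink).

    dA/dt = -k1*A ; dB/dt = k1*A - k2*B.
    Returns a time series of (t, A, B) tuples.
    """
    a, b = a0, 0.0
    series = [(0.0, a, b)]
    for i in range(1, steps + 1):
        da = -k1 * a
        db = k1 * a - k2 * b
        a += da * dt
        b += db * dt
        series.append((i * dt, a, b))
    return series
```

The output has exactly the shape of the time-series CSV the platform consumes (one row per time point, one column per metabolite), so uploading simulated data is a convenient way to validate a fitted model.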
Exploratory analysis regarding the domain definitions for computer based analytical models
NASA Astrophysics Data System (ADS)
Raicu, A.; Oanta, E.; Barhalescu, M.
2017-08-01
Our previous computer-based studies dedicated to structural problems using analytical methods defined the composite cross section of a beam as a result of Boolean operations with so-called 'simple' shapes. Through generalisation, the class of 'simple' shapes was extended to include areas bounded by curves approximated using spline functions and areas approximated as polygons. However, particular definitions lead to particular solutions. In order to move beyond the current limitations, we conceived a general definition of the cross sections, which are now considered calculus domains consisting of several subdomains. The corresponding set of input data uses complex parameterizations. This new vision allows us to naturally assign an arbitrary number of attributes to the subdomains. In this way, new phenomena that use map-wise information, such as metal alloy equilibrium diagrams, may be modelled. The hierarchy of the input data text files, which use the comma-separated-value format, and their structure are also presented and discussed in the paper. This new approach allows us to reuse the concepts and part of the data processing software instruments already developed. The software to be subsequently developed will be modularised and generalised in order to be used in upcoming projects that require rapid development of computer-based models.
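For area properties, composing a cross section by Boolean operations on 'simple' shapes reduces to signed sums over the subdomains. A minimal Python sketch, under the assumption that subtracted shapes lie wholly inside the added ones (e.g. a hole in a plate):

```python
def polygon_area(vertices):
    """Signed area of a simple polygon via the shoelace formula.

    `vertices` is a list of (x, y) pairs; counter-clockwise order
    gives a positive result.
    """
    area = 0.0
    n = len(vertices)
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]
        area += x1 * y2 - x2 * y1
    return area / 2.0

def composite_area(added, subtracted):
    """Area of a cross section built by Boolean add/subtract of polygons."""
    return (sum(abs(polygon_area(p)) for p in added)
            - sum(abs(polygon_area(p)) for p in subtracted))
```

The same subdomain decomposition carries other section properties (first and second moments of area) by summing the corresponding per-polygon integrals with the same signs.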
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kraus, Terrence D.
2017-04-01
This report specifies the electronic file format that was agreed upon to be used as the file format for normalized radiological data produced by the software tool developed under this TI project. The NA-84 Technology Integration (TI) Program project (SNL17-CM-635, Normalizing Radiological Data for Analysis and Integration into Models) investigators held a teleconference on December 7, 2017 to discuss the tasks to be completed under the TI program project. During this teleconference, the TI project investigators determined that the comma-separated values (CSV) file format is the most suitable file format for the normalized radiological data that will be output from the normalizing tool developed under this TI project. The CSV file format was selected because it provides the requisite flexibility to manage different types of radiological data (i.e., activity concentration, exposure rate, dose rate) from other sources [e.g., Radiological Assessment and Monitoring System (RAMS), Aerial Measuring System (AMS), Monitoring and Sampling]. The CSV file format also is suitable for the file format of the normalized radiological data because this normalized data can then be ingested by other software [e.g., RAMS, Visual Sampling Plan (VSP)] used by the NA-84's Consequence Management Program.
Coplen, Tyler B.; Wassenaar, Leonard I
2015-01-01
Although laser absorption spectrometry (LAS) instrumentation is easy to use, its incorporation into laboratory operations is not easy, owing to extensive offline manipulation of comma-separated-values files for outlier detection, between-sample memory correction, nonlinearity (δ-variation with water amount) correction, drift correction, normalization to VSMOW-SLAP scales, and difficulty in performing long-term QA/QC audits. METHODS: A Microsoft Access relational-database application, LIMS (Laboratory Information Management System) for Lasers 2015, was developed. It automates LAS data corrections and manages clients, projects, samples, instrument-sample lists, and triple-isotope (δ(17)O, δ(18)O, and δ(2)H values) instrumental data for liquid-water samples. It enables users to (1) graphically evaluate sample injections for variable water yields and high isotope-delta variance; (2) correct for between-sample carryover, instrumental drift, and δ nonlinearity; and (3) normalize final results to VSMOW-SLAP scales. RESULTS: Cost-free LIMS for Lasers 2015 enables users to obtain improved δ(17)O, δ(18)O, and δ(2)H values with liquid-water LAS instruments, even those with under-performing syringes. For example, LAS δ(2)H (VSMOW) measurements of USGS50 Lake Kyoga (Uganda) water using an under-performing syringe having ±10% variation in water concentration gave +31.7 ± 1.6 ‰ (2-σ standard deviation), compared with the reference value of +32.8 ± 0.4 ‰, after correction for variation in δ value with water concentration, between-sample memory, and normalization to the VSMOW-SLAP scale. CONCLUSIONS: LIMS for Lasers 2015 enables users to create systematic, well-founded instrument templates, import δ(2)H, δ(17)O, and δ(18)O results, evaluate performance with automatic graphical plots, correct for δ nonlinearity due to variable water concentration, correct for between-sample memory, adjust for drift, perform VSMOW-SLAP normalization, and perform long-term QA/QC audits easily.
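VSMOW-SLAP normalization, the final correction the abstract lists, is conventionally a two-point linear mapping through the measured and accepted values of two reference waters. A generic sketch (the measured standard values below are illustrative, not LIMS for Lasers output; -427.5 ‰ is the defined δ(2)H of SLAP on the VSMOW scale):

```python
def normalize_two_point(measured, std1, std2):
    """Map measured delta values onto a reference scale.

    std1 and std2 are (measured, accepted) pairs for two reference
    waters, e.g. a VSMOW-like and a SLAP-like standard.
    """
    m1, a1 = std1
    m2, a2 = std2
    slope = (a2 - a1) / (m2 - m1)
    return [a1 + slope * (m - m1) for m in measured]

# Illustrative calibration: the instrument reads +0.5 for the VSMOW-like
# standard (accepted 0.0) and -425.0 for the SLAP-like one (accepted -427.5).
VSMOW_STD = (0.5, 0.0)
SLAP_STD = (-425.0, -427.5)
```

By construction the mapping is exact at both anchor standards, and samples in between are corrected proportionally.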
Lim, Y A; Kim, H H; Joung, U S; Kim, C Y; Shin, Y H; Lee, S W; Kim, H J
2010-04-01
We developed a web-based program for a national surveillance system to determine baseline data regarding the supply and demand of blood products at sentinel hospitals in South Korea. Sentinel hospitals were invited to participate in a 1-month pilot test. The data for receipts and exports of blood from each hospital information system were converted into comma-separated value files according to a specific conversion rule. The daily data from the sites could be transferred to the web-based program server using a semi-automated submission procedure: pressing a key allowed the program to automatically compute the blood inventory level as well as other indices, including the minimal inventory ratio (MIR), ideal inventory ratio (IIR), supply index (SI) and utilisation index (UI). The national surveillance system was referred to as the Korean Blood Inventory Monitoring System (KBIMS), and the web-based program for KBIMS was referred to as the Blood Inventory Monitoring System (BMS). A total of 30 256 red blood cell (RBC) units were submitted as receipt data; however, only 83% of the receipt data (25 093 RBC units) were submitted to the BMS server as export data. Median values were 2.67 for MIR, 1.08 for IIR, 1.00 for SI, 0.88 for UI and 5.33 for the ideal inventory day. The BMS program was easy to use and is expected to provide a useful tool for monitoring hospital inventory levels. This information will provide baseline data regarding the supply and demand of blood products in South Korea.
Efficient Delivery and Visualization of Long Time-Series Datasets Using Das2 Tools
NASA Astrophysics Data System (ADS)
Piker, C.; Granroth, L.; Faden, J.; Kurth, W. S.
2017-12-01
For over 14 years the University of Iowa Radio and Plasma Wave Group has utilized a network transparent data streaming and visualization system for most daily data review and collaboration activities. This system, called Das2, was originally designed in support of the Cassini Radio and Plasma Wave Science (RPWS) investigation, but is now relied on for daily review and analysis of Voyager, Polar, Cluster, Mars Express, Juno and other mission results. In light of current efforts to promote automatic data distribution in space physics it seems prudent to provide an overview of our open source Das2 programs and interface definitions to the wider community and to recount lessons learned. This submission will provide an overview of interfaces that define the system, describe the relationship between the Das2 effort and Autoplot and will examine handling Cassini RPWS Wideband waveforms and dynamic spectra as examples of dealing with long time-series data sets. In addition, the advantages and limitations of the current Das2 tool set will be discussed, as well as lessons learned that are applicable to other data sharing initiatives. Finally, plans for future developments including improved catalogs to support 'no-software' data sources and redundant multi-server fail over, as well as new adapters for CSV (Comma Separated Values) and JSON (Javascript Object Notation) output to support Cassini closeout and the HAPI (Heliophysics Application Programming Interface) initiative are outlined.
Web-Based Customizable Viewer for Mars Network Overflight Opportunities
NASA Technical Reports Server (NTRS)
Gladden, Roy E.; Wallick, Michael N.; Allard, Daniel A.
2012-01-01
This software displays a full summary of information regarding the overflight opportunities between any set of lander and orbiter pairs that the user has access to view. The information display can be customized, allowing the user to choose which fields to view/hide and filter. The software works from a Web browser on any modern operating system. A full summary of information pertaining to an overflight is available, including the proposed, tentative, requested, planned, and implemented states. This gives the user a chance to quickly check for inconsistencies and fix any problems. Overflights from multiple lander/orbiter pairs can be compared instantly, and information can be filtered through the query and shown/hidden, giving the user a customizable view of the data. The information can be exported to a CSV (comma separated value) or XML (extensible markup language) file. The software only grants access to users who are authorized to view the information. This application is an addition to the MaROS Web suite. Prior to this addition, information pertaining to overflight opportunities included only a limited amount of data (displayed graphically) and could only be shown in strict temporal ordering. This new display shows more information, allows direct comparisons between overflights, and allows the data to be manipulated in ways that were not possible in the past. The current software solution is to use CSV files to view the overflight opportunities.
DOE Office of Scientific and Technical Information (OSTI.GOV)
J. R. Belt
2006-10-01
HPPCALC 2.1 was developed to analyze the raw data from a PNGV Hybrid Pulse Power Characterization (HPPC) test and produce seven standard plots that consist of resistance, power and available energy relationships. The purpose of the HPPC test is to extrapolate the total power capability within predetermined voltage limits of a prototype or full-production cell, regardless of chemistry, with respect to the PNGV goals as outlined in the PNGV Testing Manual, Revision 3. The power capability gives the Electrochemical Energy Storage team the tools to compare different battery sizes and chemistries for possible use in a hybrid electric vehicle. The Visual Basic program HPPCALC 2.1 opens the comma separated value file that is produced from a Maccor, Bitrode or Energy Systems tester. It extracts the necessary information and performs the appropriate calculations. This information is arranged into standard graphs of Resistance versus Depth of Discharge, Power versus Depth of Discharge, Power versus Energy, Energy versus Power, and Available Energy versus Power. These are the standard plots that are produced for each HPPC test. The primary metric for the HPPC test is the PNGV power, which is the power at which the available energy is equal to 300 Wh. The PNGV power is used to monitor the power degradation of the battery over the course of cycle or calendar life testing.
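The PNGV power metric described above can be recovered from an Available Energy versus Power curve by locating where that curve crosses 300 Wh. The following is a minimal sketch of that step; the sample numbers and the linear-interpolation approach are illustrative assumptions, not taken from HPPCALC 2.1 itself.

```python
# Hedged sketch: estimate the PNGV power (power at which available energy
# equals 300 Wh) from an Available Energy vs. Power curve by linear
# interpolation. Data points below are made up for illustration.

def pngv_power(powers_w, available_energy_wh, target_wh=300.0):
    """Interpolate the power at which available energy crosses target_wh
    (available energy typically falls as discharge power rises)."""
    pairs = sorted(zip(powers_w, available_energy_wh))
    for (p0, e0), (p1, e1) in zip(pairs, pairs[1:]):
        # Find the segment where the curve crosses the target energy.
        if (e0 - target_wh) * (e1 - target_wh) <= 0 and e0 != e1:
            frac = (target_wh - e0) / (e1 - e0)
            return p0 + frac * (p1 - p0)
    return None  # target never crossed within the measured range

# Illustrative curve: available energy decreases with power.
powers = [10_000, 20_000, 30_000, 40_000]
energies = [450.0, 380.0, 260.0, 120.0]
print(pngv_power(powers, energies))  # crosses 300 Wh between 20 and 30 kW
```

In a real workflow the (power, available energy) pairs would come from the tester's CSV file rather than literals.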
[Tinea capitis. Dermoscopic findings in 37 patients].
Arrazola-Guerrero, Jisel; Isa-Isa, Rafael; Torres-Guerrero, Edoardo; Arenas, Roberto
2015-01-01
Tinea capitis is a common fungal infection in children. Diagnosis is confirmed by mycological study, including direct examination of the samples with potassium hydroxide/chlorazol black and culture. Previous studies have reported the presence of "comma hairs" and "corkscrew hairs", as well as short hairs and black dots. To describe the dermoscopic patterns in the trichoscopic examination in patients with tinea capitis. A descriptive, observational and cross-sectional study was conducted on 37 patients with tinea capitis, studied in May 2012 at Dr. Manuel Gea González General Hospital in Mexico, and the Instituto Dermatológico y Cirugía de Piel Dr. Huberto Bogaert Díaz, in the Dominican Republic. Clinical, mycological and dermoscopic evaluations were performed. Of the 37 patients included, 28 were mixed-race cases from the Dominican Republic and 9 mixed-race cases from Mexico. Seventy-six percent were male and 24% female, and 94% were children. The following dermoscopic patterns were confirmed: "comma hairs" (41%), "corkscrew hairs" (22%), short hairs (49%), and black dots (33%). The presence of scales (89%), peripilar casts (46%), alopecia (65%), pustules (8%), and meliceric crusts (16%) was also observed. Dermoscopy in tinea capitis showed the presence of "comma hairs" and "corkscrew hairs". Scales, peripilar casts and alopecia were also found. It would be desirable to establish this diagnostic tool, particularly when an optical microscope or a mycology reference laboratory is not available. Copyright © 2013 Revista Iberoamericana de Micología. Published by Elsevier Espana. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hurst, Aaron M.; Bernstein, Lee A.; Chong, Su-Ann
A Structured Query Language (SQL) relational database has been developed based on the original (n,n'gamma) work carried out by A.M. Demidov et al. at the Nuclear Research Institute in Baghdad, Iraq ["Atlas of Gamma-Ray Spectra from the Inelastic Scattering of Reactor Fast Neutrons", Nuclear Research Institute, Baghdad, Iraq (Moscow, Atomizdat 1978)] for 105 independent measurements comprising 76 elemental samples of natural composition and 29 isotopically-enriched samples. The information from this Atlas includes: gamma-ray energies and intensities; nuclide and level data corresponding to where the gamma ray originated; and target (sample) experimental-measurement data. Taken together, this information allows for the extraction of the flux-weighted (n,n'gamma) cross sections for a given transition relative to a defined value. Currently, we are using the fast-neutron flux-weighted partial gamma-ray cross section from ENDF/B-VII.1 for the production of the 847-keV transition from the first excited 2+ state to the 0+ ground state in 56Fe, 468 mb. This value also takes into account contributions to the 847-keV transition following beta(-) decay of 56Mn formed in the 56Fe(n,p) reaction. However, this value can easily be adjusted to accommodate the user's preference. The (n,n'gamma) data have been compiled into a series of ASCII comma separated value tables, and a suite of Python scripts and C modules is provided to build the database. Upon building, the database can be interacted with directly via the SQLite engine or accessed via the Jupyter Notebook Python-browser interface. Several examples exploiting these utilities are also provided with the complete software package.
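The CSV-to-SQLite pipeline described above can be sketched in a few lines with Python's standard library. The column names, table layout, and intensity values below are assumptions for illustration; the actual schema ships with the package's build scripts. Only the 468 mb reference value is taken from the text.

```python
# Hedged sketch: load an (n,n'gamma) CSV table into SQLite and scale
# relative gamma-ray intensities by the 847-keV 56Fe reference cross
# section (468 mb), per the approach described above.
import csv, io, sqlite3

REF_CROSS_SECTION_MB = 468.0  # reference value from the text

csv_text = """nuclide,e_gamma_kev,rel_intensity
56Fe,847.0,1.000
56Fe,1238.3,0.452
"""

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE gammas (nuclide TEXT, e_gamma_kev REAL, rel_intensity REAL)")
rows = [(r["nuclide"], float(r["e_gamma_kev"]), float(r["rel_intensity"]))
        for r in csv.DictReader(io.StringIO(csv_text))]
conn.executemany("INSERT INTO gammas VALUES (?, ?, ?)", rows)

# Flux-weighted cross section relative to the defined reference value.
for nuclide, e, xs in conn.execute(
        "SELECT nuclide, e_gamma_kev, rel_intensity * ? FROM gammas",
        (REF_CROSS_SECTION_MB,)):
    print(f"{nuclide} {e:.1f} keV: {xs:.1f} mb")
```

Because the reference value is a query parameter, adjusting it to a user's preferred normalization requires no change to the stored table.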
The Vocabulary of English Punctuation (Coming to Terms).
ERIC Educational Resources Information Center
Zuidema, Leah A.
1996-01-01
Discusses the vocabulary of English punctuation terms, largely unchanged since the Norman conquest in 1066. Discusses the meaning of the period, colon, comma, question mark, exclamation point, slash, parenthesis, brackets, asterisk, hyphen, and ampersand. (RS)
Vanhoutte, Kurt J A; Stavenga, Doekele G
2005-05-01
The visual pigments in the compound eye of the comma butterfly, Polygonia c-album, were investigated in a specially designed epi-illumination microspectrophotometer. Absorption changes due to photochemical conversions of the visual pigments, or due to light-independent visual pigment decay and regeneration, were studied by measuring the eye shine, i.e., the light reflected from the tapetum located in each ommatidium proximal to the visual pigment-bearing rhabdom. The obtained absorbance difference spectra demonstrated the dominant presence of a green visual pigment. The rhodopsin and its metarhodopsin have absorption peak wavelengths at 532 nm and 492 nm, respectively. The metarhodopsin is removed from the rhabdom with a time constant of 15 min and the rhodopsin is regenerated with a time constant of 59 min (room temperature). A UV rhodopsin with metarhodopsin absorbing maximally at 467 nm was revealed, and evidence for a blue rhodopsin was obtained indirectly.
Method for coding low entropy data
NASA Technical Reports Server (NTRS)
Yeh, Pen-Shu (Inventor)
1995-01-01
A method of lossless data compression for efficient coding of an electronic signal of information sources of very low information rate is disclosed. In this method, S represents a non-negative source symbol set (s_0, s_1, s_2, ..., s_(N-1)) of N symbols with s_i = i. The difference between binary digital data is mapped into symbol set S. Consecutive symbols in symbol set S are then paired into a new symbol set Gamma, which defines a non-negative symbol set containing the symbols gamma_m obtained as the extension of the original symbol set S. These pairs are then mapped into a comma code, which is defined as a coding scheme in which every codeword is terminated with the same comma pattern, such as a 1. This allows direct coding and decoding of the n-bit positive integer digital data differences without the use of codebooks.
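The defining property of a comma code, as described above, is that every codeword ends in the same comma pattern, so a stream can be decoded with no codebook. The sketch below illustrates this with the simplest such code (symbol n encoded as n zeros followed by the comma bit 1); this zero-run mapping is an example choice for illustration, not the patent's exact mapping.

```python
# Hedged illustration of a comma code: each codeword ends with the same
# "comma" pattern (here a single 1), making the stream self-delimiting.

def encode(symbols):
    # Symbol n -> n zeros followed by the comma bit '1'.
    return "".join("0" * n + "1" for n in symbols)

def decode(bits):
    symbols, run = [], 0
    for b in bits:
        if b == "0":
            run += 1
        else:            # the comma bit terminates a codeword
            symbols.append(run)
            run = 0
    return symbols

msg = [0, 2, 1, 0, 3]
coded = encode(msg)
print(coded)  # -> "10010110001"
assert decode(coded) == msg
```

Because the comma pattern alone marks codeword boundaries, decoding is a single left-to-right pass with no lookup table, which is the practical appeal for very low-information-rate sources.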
NASA Astrophysics Data System (ADS)
Treverrow, Adam; Jun, Li; Jacka, Tim H.
2016-06-01
We present measurements of crystal c-axis orientations and mean grain area from the Dome Summit South (DSS) ice core drilled on Law Dome, East Antarctica. All measurements were made on location at the borehole site during drilling operations. The data are from 185 individual thin sections obtained between a depth of 117 m below the surface and the bottom of the DSS core at a depth of 1196 m. The median number of c-axis orientations recorded in each thin section was 100, with values ranging from 5 through to 111 orientations. The data from all 185 thin sections are provided in a single comma-separated value (csv) formatted file which contains the c-axis orientations in polar coordinates, depth information for each core section from which the data were obtained, the mean grain area calculated for each thin section and other data related to the drilling site. The data set is also available as a MATLAB™ structure array. Additionally, the c-axis orientation data from each of the 185 thin sections are summarized graphically in figures containing a Schmidt diagram, histogram of c-axis colatitudes and rose plot of c-axis azimuths. All these data are referenced by doi:10.4225/15/5669050CC1B3B and are available free of charge at https://data.antarctica.gov.au.
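A Schmidt diagram like the ones described above plots each c-axis on an equal-area net. The following sketch reads polar-coordinate orientations from a CSV file and converts them to equal-area plot coordinates; the column names and sample values are illustrative assumptions, not the published file's actual layout.

```python
# Hedged sketch: read c-axis orientations (colatitude, azimuth in degrees)
# from CSV and project them onto an equal-area (Schmidt) net, where the
# radius for colatitude theta is sqrt(2)*sin(theta/2), normalized so that
# theta = 90 degrees maps to radius 1.
import csv, io, math

csv_text = """colatitude_deg,azimuth_deg
10,45
85,200
"""

points = []
for row in csv.DictReader(io.StringIO(csv_text)):
    colat = math.radians(float(row["colatitude_deg"]))
    az = math.radians(float(row["azimuth_deg"]))
    r = math.sqrt(2.0) * math.sin(colat / 2.0)  # Schmidt equal-area radius
    points.append((r * math.sin(az), r * math.cos(az)))  # x east, y north

for x, y in points:
    print(f"{x:+.3f} {y:+.3f}")
```

The same CSV pass could accumulate colatitudes and azimuths for the histogram and rose plot mentioned in the abstract.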
Effects of polyamine inhibitors on zinc uptake by COMMA-1D mammary epithelial cells
DOE Office of Scientific and Technical Information (OSTI.GOV)
Allen, J.C.; Haedrich, L.H.
1991-03-15
Zn uptake or transport is stimulated by glucocorticoids in many types of epithelial cells, including the COMMA-1D mouse mammary cell line. The current objective was to determine whether polyamines also mediate glucocorticoid stimulation of Zn uptake. Initially, cells grown in lactogenic hormone-supplemented media (HM) had approximately 65% greater ⁶⁵Zn uptake over 24 h than cells in nonsupplemented growth media (GM). ⁶⁵Zn uptake from HM with 10⁻⁵ M methylglyoxal-bis(guanylhydrazone) (MGBG, an S-adenosyl-methionine decarboxylase inhibitor added to block polyamine synthesis) was less than from GM. Exogenous spermidine added to the MGBG-HM media increased ⁶⁵Zn uptake. However, up to 10 mM difluoromethylornithine (DFMO), a more specific inhibitor of spermidine synthesis, had no significant effect on 24-h ⁶⁵Zn uptake by cells in HM. In GM, DFMO caused a slight dose-dependent decrease in ⁶⁵Zn uptake over the range 10⁻⁶ to 5 × 10⁻³ M. Also, with 8 h of incubation, DFMO tended to decrease ⁶⁵Zn uptake in HM-stimulated cells. These data cannot yet distinguish between the possibilities that DFMO is inactivated during the 24-h incubation or that the dramatic effect of MGBG on ⁶⁵Zn uptake in these mammary-derived cells is not related to its inhibition of polyamine synthesis. Because COMMA-1D cells alter Zn uptake in response to lactogenic hormones and MGBG, the model system is suitable for further studies of the mechanisms of zinc transport in epithelia.
Gamberale-Stille, Gabriella; Söderlind, Lina; Janz, Niklas; Nylin, Sören
2014-08-01
In most phytophagous insects, the larval diet strongly affects future fitness and in species that do not feed on plant parts as adults, larval diet is the main source of nitrogen. In many of these insect-host plant systems, the immature larvae are considered to be fully dependent on the choice of the mothers, who, in turn, possess a highly developed host recognition system. This circumstance allows for a potential mother-offspring conflict, resulting in the female maximizing her fecundity at the expense of larval performance on suboptimal hosts. In two experiments, we aimed to investigate this relationship in the polyphagous comma butterfly, Polygonia c-album, by comparing the relative acceptance of low- and medium-ranked hosts between females and neonate larvae both within individuals between life stages, and between mothers and their offspring. The study shows a variation between females in oviposition acceptance of low-ranked hosts, and that the degree of acceptance in the mothers correlates with the probability of acceptance of the same host in the larvae. We also found a negative relationship between stages within individuals as there was a higher acceptance of lower ranked hosts in females who had abandoned said host as a larva. Notably, however, neonate larvae of the comma butterfly did not unconditionally accept to feed from the least favorable host species even when it was the only food source. Our results suggest the possibility that the disadvantages associated with a generalist oviposition strategy can be decreased by larval participation in host plant choice. © 2013 Institute of Zoology, Chinese Academy of Sciences.
NextGen flight deck data comm: auxiliary synthetic speech - phase I
DOT National Transportation Integrated Search
2012-10-22
Data Comm, a digital, text-based controller-pilot communication system, is critical to many NextGen improvements. With Data Comm, communication becomes a visual task. Although Data Comm brings many advantages, interacting with a visual display may ...
NextGen Flight Deck Data Comm : Auxiliary Synthetic Speech Phase II
DOT National Transportation Integrated Search
2015-07-01
Data Comm, a text-based controller-pilot communication system, is expected to yield several NextGen safety and efficiency benefits. With Data Comm, communication becomes a visual task, and may potentially increase head-down time on the flight deck ...
NextGen flight deck Data Comm : auxiliary synthetic speech phase I
DOT National Transportation Integrated Search
2012-12-31
Data Comm, a text-based controller-pilot communication system, is critical to many NextGen improvements. With Data Comm, communication becomes a visual task. Interacting with a visual Data Comm display may yield an unsafe increase in head-down time...
Patthi, Basavaraj; Kumar, Jishnu Krishna; Singla, Ashish; Gupta, Ritu; Prasad, Monika; Ali, Irfan; Dhama, Kuldeep; Niraj, Lav Kumar
2017-09-01
Oral diseases are a pandemic cause of morbidity with widespread geographic distribution. This technology-based era has brought about easier knowledge transfer than the traditional dependency on information obtained from family doctors. Hence, harvesting this system of trends can aid in oral disease quantification. To conduct an exploratory analysis of the changes in internet search volumes of oral diseases by using Google Trends© (GT©). GT© was utilized to provide real-world facts based on search terms related to categories, interest by region and interest over time. The time period chosen was from January 2004 to December 2016. Five different search terms were explored and compared based on the highest relative search volumes, along with comma separated value files, to obtain an insight into the highest search traffic. The search volume measured over the time span showed the term "Dental caries" to be the most searched in Japan, "Gingivitis" in Jordan, "Oral Cancer" in Taiwan, "No Teeth" in Australia, "HIV symptoms" in Zimbabwe, "Broken Teeth" in the United Kingdom, "Cleft palate" in the Philippines, and "Toothache" in Indonesia; the comparison of the top five searched terms showed "Gingivitis" to have the highest search volume. The results from the present study offer insight into a competent tool that can analyse and compare oral diseases over time. The trend research platform can be used on emerging diseases and their drift in geographic populations with great acumen. This tool can be utilized in forecasting, modulating marketing strategies and planning disability limitation techniques.
Interoperability In The New Planetary Science Archive (PSA)
NASA Astrophysics Data System (ADS)
Rios, C.; Barbarisi, I.; Docasal, R.; Macfarlane, A. J.; Gonzalez, J.; Arviset, C.; Grotheer, E.; Besse, S.; Martinez, S.; Heather, D.; De Marchi, G.; Lim, T.; Fraga, D.; Barthelemy, M.
2015-12-01
As the world becomes increasingly interconnected, there is a greater need to provide interoperability with software and applications that are commonly being used globally. For this purpose, the development of the new Planetary Science Archive (PSA), by the European Space Astronomy Centre (ESAC) Science Data Centre (ESDC), is focused on building a modern science archive that takes into account internationally recognised standards in order to provide access to the archive through tools from third parties, for example by the NASA Planetary Data System (PDS), the VESPA project from the Virtual Observatory of Paris as well as other international institutions. The protocols and standards currently supported by the new Planetary Science Archive are the Planetary Data Access Protocol (PDAP), the EuroPlanet-Table Access Protocol (EPN-TAP) and Open Geospatial Consortium (OGC) standards. The architecture of the PSA consists of a GeoServer (an open-source map server), the goal of which is to support use cases such as the distribution of search results, sharing and processing data through an OGC Web Feature Service (WFS) and a Web Map Service (WMS). This server also allows the retrieval of requested information in several standard output formats like Keyhole Markup Language (KML), Geography Markup Language (GML), shapefile, JavaScript Object Notation (JSON) and Comma Separated Values (CSV), among others. The provision of these various output formats enables end-users to transfer retrieved data into popular applications such as Google Mars and NASA World Wind.
A web-based relational database for monitoring and analyzing mosquito population dynamics.
Sucaet, Yves; Van Hemert, John; Tucker, Brad; Bartholomay, Lyric
2008-07-01
Mosquito population dynamics have been monitored on an annual basis in the state of Iowa since 1969. The primary goal of this project was to integrate light trap data from these efforts into a centralized back-end database and interactive website that is available through the internet at http://iowa-mosquito.ent.iastate.edu. For comparative purposes, all data were categorized according to the week of the year and normalized according to the number of traps running. Users can readily view current, weekly mosquito abundance compared with data from previous years. Additional interactive capabilities facilitate analyses of the data based on mosquito species, distribution, or a time frame of interest. All data can be viewed in graphical and tabular format and can be downloaded to a comma separated value (CSV) file for import into a spreadsheet or more specialized statistical software package. Having this long-term dataset in a centralized database/website is useful for informing mosquito and mosquito-borne disease control and for exploring the ecology of the species represented therein. In addition to mosquito population dynamics, this database is available as a standardized platform that could be modified and applied to a multitude of projects that involve repeated collection of observational data. The development and implementation of this tool provides capacity for the user to mine data from standard spreadsheets into a relational database and then view and query the data in an interactive website.
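The normalization step described above (categorize counts by week of the year, then divide by the number of traps running) is simple to express in code. The sketch below writes the result in the same CSV form the site offers for download; the column names and sample numbers are illustrative assumptions, not the project's actual schema.

```python
# Hedged sketch: normalize weekly light-trap counts by the number of traps
# running and emit a CSV suitable for a spreadsheet, as described above.
import csv, io
from datetime import date

records = [  # (collection date, species, count, traps running that week)
    (date(2007, 6, 4), "Culex pipiens", 42, 6),
    (date(2007, 6, 5), "Aedes vexans", 18, 6),
    (date(2007, 6, 12), "Culex pipiens", 30, 5),
]

out = io.StringIO()
writer = csv.writer(out)
writer.writerow(["week_of_year", "species", "count_per_trap"])
for d, species, count, traps in records:
    week = d.isocalendar()[1]   # categorize by week of the year
    writer.writerow([week, species, round(count / traps, 2)])

print(out.getvalue())
```

Normalizing per trap is what makes weeks with different trapping effort comparable across the multi-decade record.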
Coplen, Tyler B.; Wassenaar, Leonard I
2015-01-01
Rationale: Although laser absorption spectrometry (LAS) instrumentation is easy to use, its incorporation into laboratory operations is not easy, owing to extensive offline manipulation of comma-separated-values files for outlier detection, between-sample memory correction, nonlinearity (δ-variation with water amount) correction, drift correction, normalization to VSMOW-SLAP scales, and difficulty in performing long-term QA/QC audits. Methods: A Microsoft Access relational-database application, LIMS (Laboratory Information Management System) for Lasers 2015, was developed. It automates LAS data corrections and manages clients, projects, samples, instrument-sample lists, and triple-isotope (δ17O, δ18O, and δ2H values) instrumental data for liquid-water samples. It enables users to (1) graphically evaluate sample injections for variable water yields and high isotope-delta variance; (2) correct for between-sample carryover, instrumental drift, and δ nonlinearity; and (3) normalize final results to VSMOW-SLAP scales. Results: Cost-free LIMS for Lasers 2015 enables users to obtain improved δ17O, δ18O, and δ2H values with liquid-water LAS instruments, even those with under-performing syringes. For example, LAS δ2HVSMOW measurements of USGS50 Lake Kyoga (Uganda) water using an under-performing syringe having ±10 % variation in water concentration gave +31.7 ± 1.6 ‰ (2-σ standard deviation), compared with the reference value of +32.8 ± 0.4 ‰, after correction for variation in δ value with water concentration, between-sample memory, and normalization to the VSMOW-SLAP scale. Conclusions: LIMS for Lasers 2015 enables users to create systematic, well-founded instrument templates, import δ2H, δ17O, and δ18O results, evaluate performance with automatic graphical plots, correct for δ nonlinearity due to variable water concentration, correct for between-sample memory, adjust for drift, perform VSMOW-SLAP normalization, and perform long-term QA/QC audits easily. Published in 2015.
This article is a U.S. Government work and is in the public domain in the USA.
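The VSMOW-SLAP normalization that LIMS for Lasers automates is conventionally a two-point linear mapping: measured values are rescaled so the two reference waters take their defined scale values (0 ‰ for VSMOW; −427.5 ‰ for SLAP δ2H). A minimal sketch follows; the measured instrument readings are illustrative assumptions, not values from the paper.

```python
# Hedged sketch of two-point VSMOW-SLAP normalization for delta-2H.
# Defined scale values: VSMOW = 0 permil, SLAP = -427.5 permil.

VSMOW_TRUE, SLAP_TRUE = 0.0, -427.5

def normalize(delta_measured, vsmow_measured, slap_measured):
    """Map a measured delta onto the VSMOW-SLAP scale by the straight
    line through the two measured reference-water values."""
    slope = (VSMOW_TRUE - SLAP_TRUE) / (vsmow_measured - slap_measured)
    return VSMOW_TRUE + slope * (delta_measured - vsmow_measured)

# Example: the instrument reads VSMOW at +1.2 and SLAP at -421.8 permil.
print(round(normalize(30.0, 1.2, -421.8), 2))
```

By construction the mapping returns exactly 0 ‰ for the measured VSMOW value and −427.5 ‰ for the measured SLAP value, which is the defining check on the normalization.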
Heidel-Fischer, Hanna M; Freitak, Dalial; Janz, Niklas; Söderlind, Lina; Vogel, Heiko; Nylin, Sören
2009-01-01
Background: The mechanisms that shape the host plant range of herbivorous insects are to date not well understood, but knowledge of these mechanisms and the selective forces that influence them can expand our understanding of the larger ecological interaction. Nevertheless, it is well established that chemical defenses of plants influence the host range of herbivorous insects. While host plant chemistry is influenced by phylogeny, the growth forms of plants also appear to influence plant defense strategies, as first postulated by Feeny (the "plant apparency" hypothesis). In the present study we aim to investigate the molecular basis of the diverse host plant range of the comma butterfly (Polygonia c-album) by testing differential gene expression in the caterpillars on three host plants that are either closely related or share the same growth form. Results: In total, 120 genes were identified as differentially expressed in P. c-album after feeding on different host plants, 55 of them in the midgut and 65 in the rest of the body of the caterpillars. Expression patterns could be confirmed with an independent method for 14 of 27 tested genes. Pairwise similarities in upregulation in the midgut of the caterpillars were higher between plants that shared either growth form or phylogenetic relatedness. No known detoxifying enzymes were found to be differentially regulated in the midgut after feeding on different host plants. Conclusion: Our data suggest a complex picture of gene expression in response to host plant feeding. While each plant requires a unique gene regulation in the caterpillar, both phylogenetic relatedness and host plant growth form appear to influence the expression profile of the polyphagous comma butterfly, in agreement with phylogenetic studies of host plant utilization in butterflies. PMID:19878603
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miura, Yuka; Hagiwara, Natsumi; Radisky, Derek C.
2014-09-10
Activation of the epithelial-mesenchymal transition (EMT) program promotes cell invasion and metastasis, and is reversed through mesenchymal-epithelial transition (MET) after formation of distant metastases. Here, we show that an imbalance of the gene products encoded by the transcription factor C/EBPβ, LAP (liver-enriched activating protein) and LIP (liver-enriched inhibitory protein), can regulate both EMT- and MET-like phenotypic changes in mouse mammary epithelial cells. By using tetracycline-repressible LIP expression constructs, we found that SCp2 cells, a clonal epithelial line of COMMA1-D cells, expressed EMT markers, lost the ability to undergo alveolar-like morphogenesis in 3D Matrigel, and acquired properties of benign adenoma cells. Conversely, we found that inducible expression of LAP in SCg6 cells, a clonal fibroblastic line of COMMA1-D cells, caused them to begin expressing epithelial keratins with suppression of proliferation. The overexpression of the C/EBPβ gene products in these COMMA1-D derivatives was suppressed by long-term cultivation on tissue culture plastic, but gene expression was maintained in cells grown on Matrigel or exposed to proteasome inhibitors. Thus, imbalances of C/EBPβ gene products in mouse mammary epithelial cells, which are affected by contact with basement membrane, are identified as a potential regulator of metastatic potential. Highlights: • We created a temporal imbalance of C/EBPβ gene products in the mammary model cells. • The temporal up-regulation of LIP protein induced EMT-like cell behaviors. • The temporal up-regulation of LAP protein induced MET-like cell behaviors. • Excess amounts of C/EBPβ gene products were eliminated by proteasomal degradation. • Basement membrane components attenuated proteasome-triggered protein elimination.
Gamo, R; Floristan, U; Pampín, A; Caro, D; Pinedo, F; López-Estebaranz, J L
2015-10-01
The clinical distinction between basal cell carcinoma (BCC) and intradermal melanocytic nevus lesions on the face can be difficult, particularly in young patients or patients with multiple nevi. Dermoscopy is a useful tool for analyzing characteristic dermoscopic features of BCC, such as cartwheel structures, maple leaf-like areas, blue-gray nests and dots, and ulceration. It also reveals arborizing telangiectatic vessels and prominent curved vessels, which are typical of BCC, and comma vessels, which are typical of intradermal melanocytic nevi. It is, however, not always easy to distinguish between these 2 conditions, even when dermoscopy is used. We describe 2 facial lesions that posed a clinical and dermoscopic challenge in two 38-year-old patients; confocal microscopy showed separation between tumor nests and stroma and polarized nuclei, which are confocal microscopy features of basal cell carcinoma. Copyright © 2014 Elsevier España, S.L.U. y AEDV. All rights reserved.
Seismicity of Afghanistan and vicinity
Dewey, James W.
2006-01-01
This publication describes the seismicity of Afghanistan and vicinity and is intended for use in seismic hazard studies of that nation. Included are digital files with information on earthquakes that have been recorded in Afghanistan and vicinity through mid-December 2004. Chapter A provides an overview of the seismicity and tectonics of Afghanistan and defines the earthquake parameters included in the 'Summary Catalog' and the 'Summary of Macroseismic Effects.' Chapter B summarizes compilation of the 'Master Catalog' and 'Sub-Threshold Catalog' and documents their formats. The 'Summary Catalog' itself is presented as a comma-delimited ASCII file, the 'Summary of Macroseismic Effects' is presented as an html file, and the 'Master Catalog' and 'Sub-Threshold Catalog' are presented as flat ASCII files. Finally, this report includes as separate plates a digital image of a map of epicenters of earthquakes occurring since 1964 (Plate 1) and a representation of areas of damage or strong shaking from selected past earthquakes in Afghanistan and vicinity (Plate 2).
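A comma-delimited summary catalog like the one described above is straightforward to filter programmatically. The sketch below is illustrative only: the column names and sample rows are invented for the example and do not reflect the actual layout of the USGS files.

```python
import csv
import io

# Hypothetical excerpt of a comma-delimited earthquake summary catalog;
# the real column names and values in the USGS files may differ.
catalog_csv = """date,latitude,longitude,depth_km,magnitude
1998-02-04,37.08,70.09,33,5.9
2002-03-25,36.06,69.32,8,6.1
2004-04-05,36.51,71.03,187,4.8
"""

def strong_events(text, min_mag):
    """Return catalog rows whose magnitude meets the threshold."""
    reader = csv.DictReader(io.StringIO(text))
    return [row for row in reader if float(row["magnitude"]) >= min_mag]

events = strong_events(catalog_csv, 5.0)
print([e["date"] for e in events])  # dates of the two M>=5 events
```

The same pattern extends to filtering by region (latitude/longitude bounds) or depth for hazard-study subsets.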
Computational Tools for Parsimony Phylogenetic Analysis of Omics Data
Salazar, Jose; Amri, Hakima; Noursi, David
2015-01-01
High-throughput assays from genomics, proteomics, metabolomics, and next generation sequencing produce massive omics datasets that are challenging to analyze in biological or clinical contexts. Thus far, there is no publicly available program for converting quantitative omics data into input formats to be used in off-the-shelf robust phylogenetic programs. To the best of our knowledge, this is the first report on the creation of two Windows-based programs, OmicsTract and SynpExtractor, to address this gap. We note, as a way of introduction and development of these programs, that one particularly useful bioinformatics inferential model is the phylogenetic cladogram. Cladograms are multidimensional tools that show the relatedness between subgroups of healthy and diseased individuals and the latter's shared aberrations; they also reveal some characteristics of a disease that would not otherwise be apparent by other analytical methods. OmicsTract and SynpExtractor were written for the respective tasks of (1) accommodating advanced phylogenetic parsimony analysis (through the standard programs MIX [from PHYLIP] and TNT), and (2) extracting shared aberrations at the cladogram nodes. OmicsTract converts comma-delimited data tables by assigning each data point a binary value (“0” for normal states and “1” for abnormal states), then outputs the converted data tables in the proper input file formats for MIX or with embedded commands for TNT. SynpExtractor uses outfiles from MIX and TNT to extract the shared aberrations at each node of the cladogram, matching them with identifying labels from the dataset and exporting them to a comma-delimited file. Labels may be gene identifiers in gene-expression datasets or m/z values in mass spectrometry datasets.
By automating these steps, OmicsTract and SynpExtractor offer a veritable opportunity for rapid and standardized phylogenetic analyses of omics data; their model can also be extended to next generation sequencing (NGS) data. We make OmicsTract and SynpExtractor publicly and freely available for non-commercial use in order to strengthen and build capacity for the phylogenetic paradigm of omics analysis. PMID:26230532
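The core conversion step OmicsTract performs, mapping each quantitative value in a comma-delimited table to "0" (normal) or "1" (abnormal), can be sketched as follows. This is a simplification under stated assumptions: the gene names, the single reference value, and the tolerance rule are invented for illustration and are not the program's actual binarization criteria.

```python
import csv
import io

# Hypothetical expression table; values are binarized against an assumed
# per-dataset reference level, a deliberate simplification of OmicsTract's
# actual normal/abnormal assignment.
raw = """gene,sample1,sample2
GENE_A,1.0,3.2
GENE_B,0.9,1.1
"""

def binarize(text, reference=1.0, tolerance=0.5):
    """Map each value to '0' (within tolerance of reference) or '1' (abnormal)."""
    reader = csv.reader(io.StringIO(text))
    header = next(reader)
    out = [header]
    for row in reader:
        coded = ["1" if abs(float(v) - reference) > tolerance else "0"
                 for v in row[1:]]
        out.append([row[0]] + coded)
    return out

matrix = binarize(raw)
```

The resulting 0/1 matrix is the character-state form that parsimony programs such as MIX and TNT expect as input.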
Yang, Tao; Gu, Yongchun; Zhang, Li; Hua, Zequan
2014-03-01
We report a rare case of congenital tri-cavernous hemangiomas of the right buccal region, right accessory parotid gland, and masseter muscle region in an adult. The patient, a 25-year-old woman, complained of 3 masses in her right midcheek. Ultrasonographic and computed tomographic findings showed an irregular-shaped mass (multiple calcifications) with a well-defined margin in the masseter muscle region, an ellipse-shaped mass (multiple calcifications) with a well-defined margin in the right buccal region, and a comma-shaped mass (no calcifications) with a well-defined margin separate from the parotid gland in the right accessory parotid gland region. These iconographic findings suggested that the masses were all hemangiomas separately originating from the parotid gland, accessory parotid gland, and masseter muscle. The masses were completely removed through a standard parotid incision without postoperative facial palsy, skin deformity, and difficulty in secreting saliva. Findings from histologic examination of the tumor revealed multiple, thin-walled, and dilated blood vessels, confirming the diagnosis of cavernous hemangiomas. Ultrasonographic and computed tomographic findings were extremely useful in diagnosing the mass/masses as hemangioma before surgery, clarifying relationships between the mass and adjacent structures, and determining the surgical approach to the mass/masses.
Phrase-based Multimedia Information Extraction
2006-07-01
names with periods — J. K. Ramirez, T. Grant Smith, Lita S. Jones; names with commas — Hector Jones, Jr.; and conjoined names, such as Sherlock and Judy Holmes. Using both the type and token metrics (described above), we tested these extensions and improvements to the name identification module on
2003-04-15
of Albuquerque, New Mexico. Since the system has “bottomed out” one could project a straight line northeastward (with little eastward movement of… in determining if forecast model guidance is “on track.” 14. Subject Terms: CLOUDS, COMMA CLOUD, DRY LINE, GULF STRATUS, HEIGHT FALL CENTERS… 4-40 Warm Fronts, Squall Lines and Mesocyclones
Fallon, Nevada FORGE Geodetic Data
Blankenship, Doug; Eneva, Mariana; Hammond, William
2018-02-01
Fallon FORGE InSAR and geodetic GPS deformation data. InSAR shapefiles are packaged together as .MPK (ArcMap map package, compatible with other GIS platforms), and as .CSV comma-delimited plaintext. GPS data and additional metadata are linked to the Nevada Geodetic Laboratory database at the Univ. of Nevada, Reno (UNR).
Records Inventory Data Collection Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cole, Brian A.
1995-03-01
DATALINK was created to provide an easy-to-use data collection program for records management software products. It provides several useful tools for capturing and validating record index data in the field. It also allows users to easily create a comma-delimited ASCII text file for data export into most records management software products.
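The export format DATALINK produces, comma-delimited ASCII with one record per line, is easy to generate correctly with a standard CSV writer, which also handles the classic pitfall of a field that itself contains a comma. The field names below are invented for illustration; DATALINK's actual index fields are not documented here.

```python
import csv
import io

# Sketch of exporting record-index data as a comma-delimited ASCII file.
# Field names are hypothetical examples, not DATALINK's real schema.
records = [
    {"box": "A-101", "title": "Payroll ledgers, 1991", "location": "Vault 2"},
    {"box": "A-102", "title": "Contracts", "location": "Shelf 7"},
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["box", "title", "location"],
                        quoting=csv.QUOTE_MINIMAL)
writer.writeheader()
writer.writerows(records)
print(buf.getvalue())
```

Note that the title containing an embedded comma is automatically wrapped in quotes, which is what keeps a comma-delimited export importable by downstream records management software.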
Amplify Errors to Minimize Them
ERIC Educational Resources Information Center
Stewart, Maria Shine
2009-01-01
In this article, the author offers her experience of modeling mistakes and writing spontaneously in the computer classroom to get students' attention and elicit their editorial response. She describes how she taught her class about major sentence errors--comma splices, run-ons, and fragments--through her Sentence Meditation exercise, a rendition…
CO2calc: A User-Friendly Seawater Carbon Calculator for Windows, Mac OS X, and iOS (iPhone)
Robbins, L.L.; Hansen, M.E.; Kleypas, J.A.; Meylan, S.C.
2010-01-01
A user-friendly, stand-alone application for the calculation of carbonate system parameters was developed by the U.S. Geological Survey Florida Shelf Ecosystems Response to Climate Change Project in response to its Ocean Acidification Task. The application, by Mark Hansen and Lisa Robbins, USGS St. Petersburg, FL, Joanie Kleypas, NCAR, Boulder, CO, and Stephan Meylan, Jacobs Technology, St. Petersburg, FL, is intended as a follow-on to CO2SYS, originally developed by Lewis and Wallace (1998) and later modified for Microsoft Excel® by Denis Pierrot (Pierrot and others, 2006). Besides eliminating the need for using Microsoft Excel on the host system, CO2calc offers several improvements on CO2SYS, including: an improved graphical user interface for data entry and results; additional calculations of air-sea CO2 fluxes (for surface water calculations); the ability to tag data with sample name, comments, date, time, and latitude/longitude; the ability to use the system time and date and latitude/longitude (automatic retrieval of latitude and longitude available on iPhone® 3, 3GS, 4, and Windows® hosts with an attached National Marine Electronics Association (NMEA)-enabled GPS); the ability to process multiple files in a batch processing mode; an option to save sample information, data input, and calculated results as a comma-separated value (CSV) file for use with Microsoft Excel, ArcGIS®, or other applications; and an option to export points with geographic coordinates as a KMZ file for viewing and editing in Google Earth™.
Original data preprocessor for Femap/Nastran
NASA Astrophysics Data System (ADS)
Oanta, Emil M.; Panait, Cornel; Raicu, Alexandra
2016-12-01
Automatic data processing and visualization in finite element analysis of structural problems is a long-standing concern in mechanical engineering. The paper presents the `common database' concept, according to which the same information may be accessed from an analytical model as well as from a numerical one. In this way, input data expressed as comma-separated-value (CSV) files are loaded into the Femap/Nastran environment using original API codes, automatically generating the geometry of the model, the loads, and the constraints. The original API computer codes are general, making it possible to generate the input data of any model. In the next stages, the user may create the discretization of the model, set the boundary conditions, and perform a given analysis. If additional accuracy is needed, the analyst may delete the previous discretizations and, using the same automatically loaded information, create other discretizations and analyses. Moreover, if new, more accurate information regarding the loads or constraints is acquired, it may be modelled and then implemented in the data-generating program which creates the `common database'. This means that new, more accurate models may be easily generated. Another facility is the ability to control the CSV input files, so that several loading scenarios can be generated in Femap/Nastran. In this way, using original intelligent API instruments, the analyst can focus on accurately modelling the phenomena and on creative aspects, the repetitive and time-consuming activities being performed by the original computer-based instruments. Using this data processing technique we apply Asimov's principle `minimum change required / maximum desired response'.
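The `common database' workflow above hinges on programmatically emitting one CSV file per loading scenario for the API import scripts to consume. The Femap/Nastran API itself is proprietary and not shown; the sketch below only illustrates the scenario-generation side, and all column names and the load formula are assumptions made for the example.

```python
import csv
import io

# Hypothetical node table: (node_id, x, y, z). The vertical load column
# 'fz' scaled by a load factor is an invented example, not a Femap schema.
nodes = [(1, 0.0, 0.0, 0.0), (2, 1.0, 0.0, 0.0), (3, 1.0, 1.0, 0.0)]

def scenario_csv(nodes, load_factor):
    """Emit one CSV loading scenario that an API import script could read."""
    buf = io.StringIO()
    w = csv.writer(buf)
    w.writerow(["node_id", "x", "y", "z", "fz"])
    for nid, x, y, z in nodes:
        w.writerow([nid, x, y, z, -9.81 * load_factor])
    return buf.getvalue()

base = scenario_csv(nodes, 1.0)
```

Generating each scenario from the same node table is the point of the concept: the geometry stays in one place, and only the load columns vary between files.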
Strasser, Torsten; Peters, Tobias; Jägle, Herbert; Zrenner, Eberhart
2018-02-01
The ISCEV standards and recommendations for electrophysiological recordings in ophthalmology define a set of protocols with stimulus parameters, acquisition settings, and recording conditions, to unify the data and enable comparability of results across centers. Up to now, however, there are no standards to define the storage and exchange of such electrophysiological recordings. The aim of this study was to develop an open standard data format for the exchange and storage of visual electrophysiological data (ElVisML). We first surveyed existing data formats for biomedical signals and examined their suitability for electrophysiological data in ophthalmology. We then compared the suitability of text-based and binary formats, as well as encoding in Extensible Markup Language (XML) and character/comma-separated values. The results of the methodological consideration led to the development of ElVisML with an XML-encoded text-based format. This allows referential integrity, extensibility, the storing of accompanying units, as well as ensuring confidentiality and integrity of the data. A visualization of ElVisML documents (ElVisWeb) has additionally been developed, which facilitates the exchange of recordings on mailing lists and allows open access to data along with published articles. The open data format ElVisML ensures the quality, validity, and integrity of electrophysiological data transmission and storage as well as providing manufacturer-independent access and long-term archiving in a future-proof format. Standardization of the format of such neurophysiology data would promote the development of new techniques and open software for the use of neurophysiological data in both clinic and research.
Recent Development of an Earth Science App - FieldMove Clino
NASA Astrophysics Data System (ADS)
Vaughan, Alan; Collins, Nathan; Krus, Mike; Rourke, Peter
2014-05-01
As geological modelling and analysis move into 3D digital space, it becomes increasingly important to be able to rapidly integrate new data with existing databases, without the potential degradation caused by repeated manual transcription of numeric, graphical and meta-data. Digital field mapping offers significant benefits when compared with traditional paper mapping techniques, in that it can directly and interactively feed and be guided by downstream geological modelling and analysis. One of the most important pieces of equipment used by field geologists is the compass clinometer. Midland Valley's development team has recently released its highly anticipated FieldMove Clino app. FieldMove Clino is a digital compass-clinometer for data capture on a smartphone. The app allows the user to use their phone as a traditional hand-held bearing compass, as well as a digital compass-clinometer for rapidly measuring and capturing the georeferenced location and orientation of planar and linear features in the field. The user can also capture and store digital photographs and text notes. FieldMove Clino supports online Google Maps as well as offline maps, so that the user can import their own georeferenced basemaps. Data can be exported as comma-separated values (.csv) or Move™ (.mve) files and then imported directly into FieldMove™, Move™ or other applications. Midland Valley is currently pioneering tablet-based mapping and, along with its industrial and academic partners, will be using the application in field based projects throughout this year and will be integrating feedback in further developments of this technology.
Global Search Trends of Oral Problems using Google Trends from 2004 to 2016: An Exploratory Analysis
Patthi, Basavaraj; Singla, Ashish; Gupta, Ritu; Prasad, Monika; Ali, Irfan; Dhama, Kuldeep; Niraj, Lav Kumar
2017-01-01
Introduction: Oral diseases are a pandemic cause of morbidity with widespread geographic distribution. This technology-based era has brought about easier knowledge transfer than the traditional dependency on information obtained from family doctors. Hence, harvesting this system of trends can aid in oral disease quantification. Aim: To conduct an exploratory analysis of the changes in internet search volumes of oral diseases by using Google Trends© (GT©). Materials and Methods: GT© was utilized to provide real-world facts based on search terms related to categories, interest by region, and interest over time. The time period chosen was from January 2004 to December 2016. Five different search terms were explored and compared based on the highest relative search volumes, along with comma-separated value files, to obtain an insight into the highest search traffic. Results: The search volume measured over the time span showed the term “Dental caries” to be the most searched in Japan, “Gingivitis” in Jordan, “Oral Cancer” in Taiwan, “No Teeth” in Australia, “HIV symptoms” in Zimbabwe, “Broken Teeth” in the United Kingdom, “Cleft palate” in the Philippines, and “Toothache” in Indonesia; the comparison of the top five searched terms showed “Gingivitis” with the highest search volume. Conclusion: The results from the present study offer insight into a competent tool that can analyse and compare oral diseases over time. The trend research platform can be used on emerging diseases and their drift in geographic populations with great acumen. This tool can be utilized in forecasting, modulating marketing strategies, and planning disability limitation techniques. PMID:29207825
ERIC Educational Resources Information Center
Bullard, Sue Burzynski; Anderson, Nancy
2014-01-01
Effective writing requires mastering grammar. For journalists, this mastery is critical because research shows poor grammar erodes media credibility. College writing instructors say students do not understand basic grammar concepts, and greater numbers of students are enrolling in remedial writing classes. This quasi-experimental mixed methods…
2008-02-01
grants g, we abbreviate the condition Said(Alice, Perm(Alice, issue, Perm(Alice, issue, g))) as d(g) and we abbreviate the grant Perm(Alice, issue, g… as a string of symbols. For ease of exposition, we assume that each pair of parentheses and set braces has length 2, and each comma has length 1
Five Strategies for Remediating Sentence-Level Writing Deficiencies
ERIC Educational Resources Information Center
Quible, Zane K.
2006-01-01
Two types of sentence-level writing problems are often observed in student writing: (1) those that violate conventions of standard written English, such as subject-verb agreement errors and comma splices; and (2) those that involve a stylistic choice, such as beginning a sentence with an expletive structure like "There are" or using "if" rather…
NASA Astrophysics Data System (ADS)
Freire, E.; Acevedo, V.; Halac, E. B.; Polla, G.; López, M.; Reinoso, M.
2016-03-01
White virgules, commas, and dot designs on tricolored ceramics are sporadically found in different archaeological sites located in Northwestern Argentina area, as Puna and Quebrada de Humahuaca. This decorating style has been reported in several articles, but few previous archaeometric studies have been carried out on the pigment composition. Fragments from Puna and Quebrada archaeological sites, belonging to Regional Development Period (900-1430 AD), were analyzed by X-ray diffraction and Raman spectroscopy in order to characterize the pigments employed. Red and black pigments are based on iron and manganese oxides, as it has been extensively reported for the NW Argentina area. White pigments from white virgules, comma, and dot designs have shown different composition. Hydroxyapatite was found in samples from Doncellas site (North Puna region), and calcium and calcium-magnesium containing compounds, as vaterite and dolomite, along with titanium containing compounds were detected on samples from Abralaite (Central Puna region) and Gasoducto (Quebrada de Humahuaca region). It has been concluded that pigment composition is not characteristic of a unique region.
NASA Astrophysics Data System (ADS)
Anisuzzaman, S. M.; Abang, S.; Bono, A.; Krishnaiah, D.; Karali, R.; Safuan, M. K.
2017-06-01
Wax precipitation and deposition is one of the most significant flow assurance challenges in the production system of the crude oil. Wax inhibitors are developed as a preventive strategy to avoid an absolute wax deposition. Wax inhibitors are polymers which can be known as pour point depressants as they impede the wax crystals formation, growth, and deposition. In this study three formulations of wax inhibitors were prepared, ethylene vinyl acetate (EVA), ethylene vinyl acetate co-methyl methacrylate (EVA co-MMA) and ethylene vinyl acetate co-diethanolamine (EVA co-DEA), and the comparison of their efficiencies in terms of cloud point, pour point, performance inhibition efficiency (%PIE) and viscosity were evaluated. The cloud point and pour point for both EVA and EVA co-MMA were similar, 15°C and 10-5°C, respectively, whereas the cloud point and pour point for EVA co-DEA were better, 10°C and 10-5°C respectively. In conclusion, EVA co-DEA had shown the best %PIE (28.42%) which indicates the highest percentage reduction of wax deposit as compared to the other two inhibitors.
Conservative management of extradural hematoma: A report of sixty-two cases.
Zwayed, A Rahim H; Lucke-Wold, Brandon
2018-06-01
Extradural hematomas (EDH) are considered life threatening in that the risk for brain herniation is significant. The current accepted understanding within the literature is to treat EDH via surgical evacuation of the hematoma. In this case-series we report 62 cases of EDH managed conservatively without surgical intervention. Inclusion criteria were: Glasgow coma scale score 13-15, extradural hematoma confirmed by CT being less than 40 mm, less than 6 mm of midline shift, and no other surgical lesions present. Patients were initially observed in a surgical intensive care unit prior to discharge and had closely scheduled follow-up. Of the 62 cases none required emergent intervention and the majority had interval resolution of the epidural hematoma over time. Resolution was apparent by 21 days and definitive by 3 to 6 months. Patients with EDH who have a high Glasgow coma scale score 13-15, volume <40 mm, and less than 6 mm of midline shift should be considered for conservative management. Our study indicates that these patients will have interval resolution of hematoma over time without worsening of symptoms.
Mynodbcsv: lightweight zero-config database solution for handling very large CSV files.
Adaszewski, Stanisław
2014-01-01
Volumes of data used in science and industry are growing rapidly. When researchers face the challenge of analyzing them, their format is often the first obstacle. The lack of standardized ways of exploring different data layouts requires an effort each time to solve the problem from scratch. The possibility of accessing data in a rich, uniform manner, e.g., using Structured Query Language (SQL), would offer expressiveness and user-friendliness. Comma-separated values (CSV) are one of the most common data storage formats. Despite its simplicity, handling it becomes non-trivial as file size grows. Importing CSVs into existing databases is time-consuming and troublesome, or even impossible if the horizontal dimension reaches thousands of columns. Most databases are optimized for handling a large number of rows rather than columns; therefore, performance for datasets with non-typical layouts is often unacceptable. Other challenges include schema creation, updates and repeated data imports. To address the above-mentioned problems, I present a system for accessing very large CSV-based datasets by means of SQL. It's characterized by: "no copy" approach--data stay mostly in the CSV files; "zero configuration"--no need to specify database schema; written in C++, with boost [1], SQLite [2] and Qt [3], doesn't require installation and has very small size; query rewriting, dynamic creation of indices for appropriate columns and static data retrieval directly from CSV files ensure efficient plan execution; effortless support for millions of columns; due to per-value typing, using mixed text/numbers data is easy; very simple network protocol provides efficient interface for MATLAB and reduces implementation time for other languages. The software is available as freeware along with educational videos on its website [4]. It doesn't need any prerequisites to run, as all of the libraries are included in the distribution package.
I test it against existing database solutions using a battery of benchmarks and discuss the results.
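The "SQL over CSV without hand-written schema" idea at the heart of Mynodbcsv can be illustrated with the Python standard library. This is not Mynodbcsv itself (which is C++ and avoids copying data out of the CSV); the sketch below simply infers the column list from the header and loads the rows into an in-memory SQLite table so they can be queried with SQL. The table and column names come from the sample data, which is invented.

```python
import csv
import io
import sqlite3

# Tiny invented dataset standing in for a CSV file on disk.
data = """id,value
1,10
2,35
3,20
"""

rows = list(csv.reader(io.StringIO(data)))
header, body = rows[0], rows[1:]

con = sqlite3.connect(":memory:")
# "Zero configuration": derive the schema from the CSV header itself.
cols = ", ".join(f'"{c}"' for c in header)
con.execute(f"CREATE TABLE t ({cols})")
con.executemany(f"INSERT INTO t VALUES ({', '.join('?' * len(header))})", body)

# Per-value typing: values are stored as text and cast where needed.
total = con.execute('SELECT SUM(CAST(value AS REAL)) FROM t').fetchone()[0]
print(total)  # 65.0
```

Mynodbcsv's advantages over this naive approach, no data copy, dynamic index creation, and support for millions of columns, are exactly the parts that a full import into SQLite cannot reproduce.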
Murray, David; Stankovic, Lina; Stankovic, Vladimir
2017-01-01
Smart meter roll-outs provide easy access to granular meter measurements, enabling advanced energy services, ranging from demand response measures, tailored energy feedback and smart home/building automation. To design such services, train and validate models, access to data that resembles what is expected of smart meters, collected in a real-world setting, is necessary. The REFIT electrical load measurements dataset described in this paper includes whole house aggregate loads and nine individual appliance measurements at 8-second intervals per house, collected continuously over a period of two years from 20 houses. During monitoring, the occupants were conducting their usual routines. At the time of publishing, the dataset has the largest number of houses monitored in the United Kingdom at less than 1-minute intervals over a period greater than one year. The dataset comprises 1,194,958,790 readings, that represent over 250,000 monitored appliance uses. The data is accessible in an easy-to-use comma-separated format, is time-stamped and cleaned to remove invalid measurements, correctly label appliance data and fill in small gaps of missing data. PMID:28055033
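The cleaning step the REFIT authors describe, filling small gaps of missing data in a time-stamped series, can be sketched as follows. The 8-second cadence, the column layout, and the "repeat the last valid reading for at most a short gap" rule are assumptions for illustration, not the dataset's documented procedure.

```python
# Hypothetical time-stamped readings with one missing sample (None),
# standing in for an invalid measurement removed during cleaning.
readings = [
    ("2014-01-01T00:00:00", 245.0),
    ("2014-01-01T00:00:08", None),
    ("2014-01-01T00:00:16", 250.0),
]

def fill_small_gaps(series, max_gap=2):
    """Forward-fill missing values, but only for gaps up to max_gap samples."""
    filled, last, gap = [], None, 0
    for ts, value in series:
        if value is None and last is not None and gap < max_gap:
            value, gap = last, gap + 1
        elif value is not None:
            last, gap = value, 0
        filled.append((ts, value))
    return filled

clean = fill_small_gaps(readings)
```

Capping the gap length matters: forward-filling a long outage would fabricate appliance activity, whereas repeating one or two samples merely smooths over a dropped reading.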
The Myth of FANBOYS: Coordination, Commas, and College Composition Classes
ERIC Educational Resources Information Center
Reynolds, Brett
2011-01-01
The claim that the words "for," "and," "nor," "but," "or," "yet," and "so" (FANBOYS) constitute a complete list of English coordinating conjunctions is examined though syntactic analysis and found wanting. This analysis is presented as an illustration of the need for teachers constantly to question the choice of material that we present to our…
ERIC Educational Resources Information Center
Bitz, Michael
2010-01-01
This definitive book presents the newest research linking graphic narratives and literacy learning, as well as the tools teachers will need to make comic book projects a success in their classrooms. The Comic Book Project (www.comicbookproject.org) is an internationally celebrated initiative where children plan, write, design, and publish original…
ERIC Educational Resources Information Center
Hoover, Eric; Lipka, Sara
2013-01-01
Nobody wants to be here. In remedial English, earning no credit, stuck. Now--after months of commas, clauses, and four-paragraph essays--students have one last chance to write their way out. Twenty students sit at computers, poised to start the final in-class essay for English 002 at Montgomery College. Anybody can enroll here, and all kinds do.…
Road Signs to Writing: Language Curriculum, Levels C-D [Grades Three and Four]; Teacher's Guide.
ERIC Educational Resources Information Center
Oregon Univ., Eugene. Oregon Elementary English Project.
Developed by the Oregon Elementary English Project, this curriculum unit, intended for grades three and four, introduces students to some of the mechanics of writing and provides some practice in using these mechanics. The unit contains an introduction to writing and seven sections, covering beginning and end punctuation, commas, the apostrophe,…
A New Paradigm to Analyze Data Completeness of Patient Data.
Nasir, Ayan; Gurupur, Varadraj; Liu, Xinliang
2016-08-03
There is a need to develop a tool that will measure the data completeness of patient records using sophisticated statistical metrics. Patient data integrity is important in providing timely and appropriate care. Completeness is an important step, with an emphasis on understanding the complex relationships between data fields and their relative importance in delivering care. This tool will not only help understand where data problems are but also help uncover the underlying issues behind them. The objective was to develop a tool that can be used alongside a variety of health care database software packages to determine the completeness of individual patient records as well as aggregate patient records across health care centers and subpopulations. The methodology of this project is encapsulated within the Data Completeness Analysis Package (DCAP) tool, with the major components including concept mapping, CSV parsing, and statistical analysis. The results from testing DCAP with Healthcare Cost and Utilization Project (HCUP) State Inpatient Database (SID) data show that this tool is successful in identifying relative data completeness at the patient, subpopulation, and database levels. These results also solidify a need for further analysis and call for hypothesis-driven research to find underlying causes of data incompleteness. DCAP examines patient records and generates statistics that can be used to determine the completeness of individual patient data as well as the general thoroughness of record keeping in a medical database. It uses a component that is customized to the settings of the software package used for storing patient data, as well as a Comma Separated Values (CSV) file parser, to determine the appropriate measurements. DCAP itself is assessed through a proof-of-concept exercise using hypothetical data as well as available HCUP SID patient data.
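The cell-counting idea behind such completeness metrics can be sketched in a few lines. The field names and scoring below are illustrative assumptions, not DCAP's actual implementation:

```python
import csv
import io

# Hypothetical patient rows; the field names are invented for
# illustration and are not HCUP SID's actual schema.
RECORDS = """patient_id,age,sex,diagnosis,discharge_status
1,34,F,I10,
2,,M,E11,home
3,57,,,home
"""

def completeness(text):
    """Fraction of non-empty cells per record and over the whole table."""
    rows = list(csv.DictReader(io.StringIO(text)))
    per_record = {}
    filled = total = 0
    for row in rows:
        values = list(row.values())
        n = sum(1 for v in values if v not in (None, ""))
        per_record[row["patient_id"]] = n / len(values)
        filled += n
        total += len(values)
    return per_record, filled / total

per_record, overall = completeness(RECORDS)  # e.g. record '3' scores 0.6
```

A fuller tool would weight fields by clinical importance rather than counting every cell equally, which is part of what the concept-mapping component addresses.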
Molecular Imaging and Contrast Agent Database (MICAD): evolution and progress.
Chopra, Arvind; Shan, Liang; Eckelman, W C; Leung, Kam; Latterner, Martin; Bryant, Stephen H; Menkens, Anne
2012-02-01
The purpose of writing this review is to showcase the Molecular Imaging and Contrast Agent Database (MICAD; www.micad.nlm.nih.gov ) to students, researchers, and clinical investigators interested in the different aspects of molecular imaging. This database provides freely accessible, current, online scientific information regarding molecular imaging (MI) probes and contrast agents (CA) used for positron emission tomography, single-photon emission computed tomography, magnetic resonance imaging, X-ray/computed tomography, optical imaging and ultrasound imaging. Detailed information on >1,000 agents in MICAD is provided in a chapter format and can be accessed through PubMed. Lists containing >4,250 unique MI probes and CAs published in peer-reviewed journals and agents approved by the United States Food and Drug Administration as well as a comma separated values file summarizing all chapters in the database can be downloaded from the MICAD homepage. Users can search for agents in MICAD on the basis of imaging modality, source of signal/contrast, agent or target category, pre-clinical or clinical studies, and text words. Chapters in MICAD describe the chemical characteristics (structures linked to PubChem), the in vitro and in vivo activities, and other relevant information regarding an imaging agent. All references in the chapters have links to PubMed. A Supplemental Information Section in each chapter is available to share unpublished information regarding an agent. A Guest Author Program is available to facilitate rapid expansion of the database. Members of the imaging community registered with MICAD periodically receive an e-mail announcement (eAnnouncement) that lists new chapters uploaded to the database. Users of MICAD are encouraged to provide feedback, comments, or suggestions for further improvement of the database by writing to the editors at micad@nlm.nih.gov.
Ahamed, Nizam U; Sundaraj, Kenneth; Poo, Tarn S
2013-03-01
This article describes the design of a robust, inexpensive, easy-to-use, small, and portable online electromyography acquisition system for monitoring electromyography signals during rehabilitation. This single-channel (one-muscle) system was connected via the universal serial bus port to a programmable Windows handheld tablet personal computer for storage and analysis of the data by the end user. The raw electromyography signals were amplified in order to convert them to an observable scale. The inherent 50 Hz noise (Malaysia) from power-line electromagnetic interference was then eliminated using a single hybrid IC notch filter. These signals were sampled by a signal processing module and converted into 24-bit digital data. An algorithm was developed and programmed to transmit the digital data to the computer, where it was reassembled and displayed using software. Finally, the device was furnished with a graphical user interface to display the streaming muscle signal online on the handheld tablet personal computer. This battery-operated system was tested on the biceps brachii muscles of 20 healthy subjects, and the results were compared to those obtained with a commercial single-channel (one-muscle) electromyography acquisition system. For activities involving muscle contractions, the results obtained with the developed device were comparable, across various statistical parameters, to those from a commercially available physiological signal monitoring system for both male and female subjects. In addition, the key advantage of this system over conventional desktop personal computer-based acquisition systems is its portability, due to the use of a tablet personal computer, on which the results are accessible graphically as well as stored in text (comma-separated value) form.
Rosetta: Ensuring the Preservation and Usability of ASCII-based Data into the Future
NASA Astrophysics Data System (ADS)
Ramamurthy, M. K.; Arms, S. C.
2015-12-01
Field data obtained from dataloggers often take the form of comma separated value (CSV) ASCII text files. While ASCII based data formats have positive aspects, such as the ease of accessing the data from disk and the wide variety of tools available for data analysis, there are some drawbacks, especially when viewing the situation through the lens of data interoperability and stewardship. The Unidata data translation tool, Rosetta, is a web-based service that provides an easy, wizard-based interface for data collectors to transform their datalogger generated ASCII output into Climate and Forecast (CF) compliant netCDF files following the CF-1.6 discrete sampling geometries. These files are complete with metadata describing what data are contained in the file, the instruments used to collect the data, and other critical information that otherwise may be lost in one of many README files. The choice of the machine readable netCDF data format and data model, coupled with the CF conventions, ensures long-term preservation and interoperability, and that future users will have enough information to responsibly use the data. However, with the understanding that the observational community appreciates the ease of use of ASCII files, methods for transforming the netCDF back into a CSV or spreadsheet format are also built-in. One benefit of translating ASCII data into a machine readable format that follows open community-driven standards is that they are instantly able to take advantage of data services provided by the many open-source data server tools, such as the THREDDS Data Server (TDS). While Rosetta is currently a stand-alone service, this talk will also highlight efforts to couple Rosetta with the TDS, thus allowing self-publishing of thoroughly documented datasets by the data producers themselves.
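Writing CF-compliant netCDF requires a netCDF library, but the core of what Rosetta preserves, column data paired with descriptive metadata rather than a loose README, can be sketched with the standard library alone. The variable names and attributes here are assumptions for illustration:

```python
import csv
import io

# A datalogger-style CSV (columns invented for illustration).
RAW = """time,air_temp,rh
0,21.4,55
60,21.6,54
"""

# CF-style attributes one would carry into the netCDF file.
METADATA = {
    "air_temp": {"standard_name": "air_temperature", "units": "degC"},
    "rh": {"standard_name": "relative_humidity", "units": "percent"},
}

def annotate(text, metadata):
    """Bundle each CSV column with its descriptive attributes."""
    reader = csv.DictReader(io.StringIO(text))
    columns = {name: [] for name in reader.fieldnames}
    for row in reader:
        for name, value in row.items():
            columns[name].append(float(value))
    return {name: {"data": values, "attrs": metadata.get(name, {})}
            for name, values in columns.items()}

dataset = annotate(RAW, METADATA)
```

The wizard interface essentially elicits the `METADATA` half of this pairing from the data collector, so that the attributes travel with the data instead of being lost.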
Foreign Language Analysis and Recognition (FLARe) Initial Progress
2012-11-29
University Language Modeling ToolKit CoMMA Count Mediated Morphological Analysis CRUD Create, Read, Update & Delete CPAN Comprehensive Perl Archive...DATES COVERED (From - To) 1 October 2010 – 30 September 2012 4. TITLE AND SUBTITLE Foreign Language Analysis and Recognition (FLARe) Initial Progress...AFRL-RH-WP-TR-2012-0165 FOREIGN LANGUAGE ANALYSIS AND RECOGNITION (FLARE) INITIAL PROGRESS Brian M. Ore
Recruiting and Retaining Army Nurses: An Annotated Bibliography
1988-12-01
upon earlier research into the relationships among locus of control, organizational unit structure, job satisfaction, and registered nurses' intentions...Recruiting Command Program Analysis and Evaluation Directorate Research and Studies Division Fort Sheridan, IL 60037-6000...NAVAL POSTGRADUATE SCHOOL...Monterey, California RADM R. C. Austin, Superintendent; Harrison Shull, Provost. The research summarized herein was sponsored by the US Army Recruiting
Prosodic Boundaries in Writing: Evidence from a Keystroke Analysis
Fuchs, Susanne; Krivokapić, Jelena
2016-01-01
The aim of the paper is to investigate duration between successive keystrokes during typing in order to examine whether prosodic boundaries are expressed in the process of writing. In particular, we are interested in interkey durations that occur next to punctuation marks (comma and full stops while taking keystrokes between words as a reference), since these punctuation marks are often realized with minor or major prosodic boundaries during overt reading. A two-part experiment was conducted: first, participants’ keystrokes on a computer keyboard were recorded while writing an email to a close friend (in two conditions: with and without time pressure). Second, participants read the email they just wrote. Interkey durations were compared to pause durations at the same locations during read speech. Results provide evidence of significant differences between interkey durations between words, at commas and at full stops (from shortest to longest). These durations were positively correlated with silent pause durations during overt reading. A more detailed analysis of interkey durations revealed patterns that can be interpreted with respect to prosodic boundaries in speech production, namely as phrase-final and phrase-initial lengthening occurring at punctuation marks. This work provides initial evidence that prosodic boundaries are reflected in the writing process. PMID:27917129
Trick Simulation Environment 07
NASA Technical Reports Server (NTRS)
Lin, Alexander S.; Penn, John M.
2012-01-01
The Trick Simulation Environment is a generic simulation toolkit used for constructing and running simulations. This release includes a Monte Carlo analysis simulation framework and a data analysis package. It produces all auto documentation in XML. Also, the software is capable of inserting a malfunction at any point during the simulation. Trick 07 adds variable server output options and error messaging and is capable of using and manipulating wide characters for international support. Wide character strings are available as a fundamental type for variables processed by Trick. A Trick Monte Carlo simulation uses a statistically generated, or predetermined, set of inputs to iteratively drive the simulation. Also, there is a framework in place for optimization and solution finding where developers may iteratively modify the inputs per run based on some analysis of the outputs. The data analysis package is capable of reading data from external simulation packages such as MATLAB and Octave, as well as the common comma-separated values (CSV) format used by Excel, without the use of external converters. The file formats for MATLAB and Octave were obtained from their documentation sets, and Trick maintains generic file readers for each format. XML tags store the fields in the Trick header comments. For header files, XML tags for structures and enumerations, and the members within are stored in the auto documentation. For source code files, XML tags for each function and the calling arguments are stored in the auto documentation. When a simulation is built, a top level XML file, which includes all of the header and source code XML auto documentation files, is created in the simulation directory. Trick 07 provides an XML to TeX converter. The converter reads in header and source code XML documentation files and converts the data to TeX labels and tables suitable for inclusion in TeX documents. 
A malfunction insertion capability allows users to override the value of any simulation variable, or call a malfunction job, at any time during the simulation. Users may specify conditions, use the return value of a malfunction trigger job, or manually activate a malfunction. The malfunction action may consist of executing a block of input file statements in an action block, setting simulation variable values, calling a malfunction job, or turning simulation jobs on or off.
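The malfunction mechanism described above can be illustrated with a toy loop. This is not Trick's actual API, just a sketch of a condition-triggered variable override, with all names invented:

```python
def run(steps, malfunction=None):
    """Integrate a trivial model, letting a malfunction override state."""
    state = {"t": 0.0, "voltage": 28.0}
    history = []
    for _ in range(steps):
        state["t"] += 1.0
        # A malfunction is a (trigger, action) pair checked every step,
        # mimicking condition-based insertion.
        if malfunction:
            trigger, action = malfunction
            if trigger(state):
                action(state)
        history.append(state["voltage"])
    return history

# Force a bus undervoltage once t exceeds 3 s (values illustrative).
fault = (lambda s: s["t"] > 3.0, lambda s: s.__setitem__("voltage", 0.0))
trace = run(5, malfunction=fault)  # voltage drops to 0 on steps 4 and 5
```

Trick generalizes each piece: the trigger may be a dedicated job returning a value, and the action may be a block of input file statements rather than a single assignment.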
ERIC Educational Resources Information Center
Mitchell, Danielle
2008-01-01
Fayette County, once home to the Carnegies, the Mellons, and the Fricks, now punctuated by abandoned mines and coke ovens, is the second poorest county in Pennsylvania. Gay and lesbian students experience discrimination in this County. In this article, the author discusses her efforts to intervene in this complicated problem by deploying a…
DATALINK: Records inventory data collection software. User`s guide
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cole, B.A.
1995-03-01
DATALINK was created to provide an easy-to-use data collection program for records management software products. It provides several useful tools for capturing and validating record index data in the field. It also allows users to easily create a comma-delimited ASCII text file for data export into most records management software products. It runs on virtually any computer using MS-DOS.
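A comma-delimited export of the kind DATALINK produces is straightforward with standard CSV tooling; note that fields containing embedded commas must be quoted. The record fields below are invented for illustration, not DATALINK's schema:

```python
import csv
import io

# Inventory rows as a records-management tool might hold them.
records = [
    {"box": "A-101", "title": "Invoices, FY1994", "location": "Vault 2"},
    {"box": "A-102", "title": "Correspondence", "location": "Vault 2"},
]

def export_csv(rows):
    """Write a comma-delimited ASCII export, quoting embedded commas."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["box", "title", "location"])
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

text = export_csv(records)  # 'Invoices, FY1994' comes out quoted
```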
A genetic scale of reading frame coding.
Michel, Christian J
2014-08-21
The reading frame coding (RFC) of codes (sets) of trinucleotides is a genetic concept which has been largely ignored during the last 50 years. A first objective is the definition of a new and simple statistical parameter PrRFC for analysing the probability (efficiency) of reading frame coding (RFC) of any trinucleotide code. A second objective is to reveal different classes and subclasses of trinucleotide codes involved in reading frame coding: the circular codes of 20 trinucleotides and the bijective genetic codes of 20 trinucleotides coding the 20 amino acids. This approach allows us to propose a genetic scale of reading frame coding which ranges from 1/3 with the random codes (RFC probability identical in the three frames) to 1 with the comma-free circular codes (RFC probability maximal in the reading frame and null in the two shifted frames). This genetic scale shows, in particular, the reading frame coding probabilities of the 12,964,440 circular codes (PrRFC=83.2% on average), the 216 C(3) self-complementary circular codes (PrRFC=84.1% on average) including the code X identified in eukaryotic and prokaryotic genes (PrRFC=81.3%), and the 339,738,624 bijective genetic codes (PrRFC=61.5% on average) including the 52 codes without permuted trinucleotides (PrRFC=66.0% on average). Otherwise, the reading frame coding probabilities of each trinucleotide code coding an amino acid with the universal genetic code are also determined. The four amino acids Gly, Lys, Phe and Pro are coded by codes (not circular) with RFC probabilities equal to 2/3, 1/2, 1/2 and 2/3, respectively. The amino acid Leu is coded by a circular code (not comma-free) with an RFC probability equal to 18/19. The 15 other amino acids are coded by comma-free circular codes, i.e. with RFC probabilities equal to 1. The identification of coding properties in some classes of trinucleotide codes studied here may bring new insights into the origin and evolution of the genetic code.
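The comma-free property discussed above admits a direct computational check: a trinucleotide code is comma-free exactly when no codeword appears in a shifted frame of any concatenation of two codewords. A brute-force sketch (not the paper's PrRFC machinery):

```python
from itertools import product

def is_comma_free(code):
    """True if no codeword of the trinucleotide code occurs at offset
    1 or 2 inside any concatenation of two codewords from the code."""
    for a, b in product(code, repeat=2):
        pair = a + b  # six letters; frames start at positions 1 and 2
        if pair[1:4] in code or pair[2:5] in code:
            return False
    return True

assert is_comma_free({"AAC", "GTC"})
assert not is_comma_free({"AAA"})  # 'AAAAAA' reads 'AAA' in shifted frames
```

This is the sense in which a comma-free code has RFC probability 1: any window of a message reveals the reading frame, because misaligned frames never produce valid codewords.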
HOPE: An On-Line Piloted Handling Qualities Experiment Data Book
NASA Technical Reports Server (NTRS)
Jackson, E. B.; Proffitt, Melissa S.
2010-01-01
A novel on-line database for capturing most of the information obtained during piloted handling qualities experiments (either flight or simulated) is described. The Hyperlinked Overview of Piloted Evaluations (HOPE) web application is based on an open-source object-oriented Web-based front end (Ruby-on-Rails) that can be used with a variety of back-end relational database engines. The hyperlinked, on-line data book approach allows an easily traversed way of looking at a variety of collected data, including pilot ratings, pilot information, vehicle and configuration characteristics, test maneuvers, and individual flight test cards and repeat runs. It allows for on-line retrieval of pilot comments, both audio and transcribed, as well as time history data retrieval and video playback. Pilot questionnaires are recorded, as are pilot biographies. Simple statistics are calculated for each selected group of pilot ratings, allowing multiple ways to aggregate the data set (by pilot, by task, or by vehicle configuration, for example). Any number of per-run or per-task metrics can be captured in the database. The entire run metrics dataset can be downloaded in comma-separated text for further analysis off-line. It is expected that this tool will be made available upon request.
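The per-group rating statistics described above amount to a group-by aggregation over the comma-separated export. The column names below are assumptions for illustration, not HOPE's actual schema:

```python
import csv
import io
from collections import defaultdict

# Hypothetical export in the comma-separated form HOPE provides.
EXPORT = """pilot,task,rating
A,landing,3
A,tracking,4
B,landing,5
B,tracking,4
"""

def mean_rating(text, key):
    """Average the ratings grouped by any chosen column."""
    groups = defaultdict(list)
    for row in csv.DictReader(io.StringIO(text)):
        groups[row[key]].append(int(row["rating"]))
    return {k: sum(v) / len(v) for k, v in groups.items()}

by_pilot = mean_rating(EXPORT, "pilot")  # {'A': 3.5, 'B': 4.5}
by_task = mean_rating(EXPORT, "task")    # {'landing': 4.0, 'tracking': 4.0}
```

Passing a different `key` gives the alternative aggregations the data book offers (by pilot, by task, or by configuration).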
Violent Storm Strikes Western Europe
2010-03-03
Image acquired February 27, 2010: An extratropical cyclone named Xynthia brought hurricane-force winds and high waves to Western Europe at the end of February 2010, CNN reported. Winds as fast as 200 kilometers (125 miles) per hour reached as far inland as Paris, and at the storm’s peak, hurricane-force winds extended from Portugal to the Netherlands. Hundreds of people had to take refuge from rising waters on their rooftops. By March 1, at least 58 people had died, some of them struck by falling trees. Most of the deaths occurred in France, but the storm also caused casualties in England, Germany, Belgium, Spain, and Portugal. The Moderate Resolution Imaging Spectroradiometer (MODIS) on NASA’s Aqua satellite captured this image of Western Europe, acquired in two separate overpasses on February 27, 2010. MODIS captured the eastern half of the image around 10:50 UTC, and the western half about 12:30 UTC. Forming a giant comma shape, clouds stretch from the Atlantic Ocean to northern Italy. NASA image courtesy MODIS Rapid Response Team at NASA Goddard Space Flight Center. Caption by Michon Scott. Instrument: Aqua - MODIS For more information related to this image go to: earthobservatory.nasa.gov/NaturalHazards/view.php?id=42881
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1987-10-01
Huffman codes, comma-free codes, and block codes with shift indicators are important candidate message-compression codes for improving the efficiency of communications systems. This study was undertaken to determine if these codes could be used to increase the throughput of the fixed very-low-frequency (FVLF) communication system. This application involves the use of compression codes in a channel with errors.
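Of the candidate codes named above, Huffman codes are the most familiar: a prefix-free code built from symbol frequencies so that common symbols get short codewords. A generic construction sketch follows (unrelated to the FVLF study's implementation):

```python
import heapq
from collections import Counter

def huffman_code(text):
    """Build a prefix-free (Huffman) code from symbol frequencies."""
    # Heap entries: [frequency, tie-breaker id, partial codebook].
    heap = [[freq, i, {sym: ""}] for i, (sym, freq)
            in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    next_id = len(heap)
    while len(heap) > 1:
        lo = heapq.heappop(heap)
        hi = heapq.heappop(heap)
        # Merge the two rarest subtrees, prefixing their codewords.
        merged = {s: "0" + c for s, c in lo[2].items()}
        merged.update({s: "1" + c for s, c in hi[2].items()})
        heapq.heappush(heap, [lo[0] + hi[0], next_id, merged])
        next_id += 1
    return heap[0][2]

code = huffman_code("aaaabbc")  # 'a' gets the shortest codeword
```

Prefix-freeness is what makes the code decodable without separators; comma-free codes achieve self-synchronization by a different route, as the genetic-code entry above illustrates.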
Statistical Analysis of the Skaion Network Security Dataset
2012-09-01
DataType:=xlDelimited, _ TextQualifier:=xlDoubleQuote, ConsecutiveDelimiter:=True, Tab:=False, _ Semicolon:=False, Comma:=False, Space...Selection.TextToColumns Destination:=Range("E1"), DataType:=xlDelimited, _ TextQualifier:=xlDoubleQuote, ConsecutiveDelimiter:=True, Tab:=False...True Columns("F:F").Select Selection.TextToColumns Destination:=Range("F1"), DataType:=xlDelimited, _ TextQualifier:=xlDoubleQuote
Punctuation, Prosody, and Discourse: Afterthought Vs. Right Dislocation
Kalbertodt, Janina; Primus, Beatrice; Schumacher, Petra B.
2015-01-01
In a reading production experiment we investigate the impact of punctuation and discourse structure on the prosodic differentiation of right dislocation (RD) and afterthought (AT). Both discourse structure and punctuation are likely to affect the prosodic marking of these right-peripheral constructions, as certain prosodic markings are appropriate only in certain discourse structures, and punctuation is said to correlate with prosodic phrasing. With RD and AT clearly differing in discourse function (comment-topic structuring vs. disambiguation) and punctuation (comma vs. full stop), critical items in this study were manipulated with regard to the (mis-)match of these parameters. Since RD and AT are said to prosodically differ in pitch range, phrasing, and accentuation patterns, we measured the reduction of pitch range, boundary strength and prominence level. Results show an effect of both punctuation and discourse context (mediated by syntax) on phrasing and accentuation. Interestingly, for pitch range reduction no difference between RDs and ATs could be observed. Our results corroborate a language architecture model in which punctuation, prosody, syntax, and discourse-semantics are independent but interacting domains with correspondence constraints between them. Our findings suggest there are tight correspondence constraints between (i) punctuation (full stop and comma in particular) and syntax, (ii) prosody and syntax as well as (iii) prosody and discourse-semantics. PMID:26648883
Investigating the causes of wrap-up effects: evidence from eye movements and E-Z Reader.
Warren, Tessa; White, Sarah J; Reichle, Erik D
2009-04-01
Wrap-up effects in reading have traditionally been thought to reflect increased processing associated with intra- and inter-clause integration (Just, M. A. & Carpenter, P. A. (1980). A theory of reading: From eye fixations to comprehension. Psychological Review,87(4), 329-354; Rayner, K., Kambe, G., & Duffy, S. A. (2000). The effect of clause wrap-up on eye movements during reading. The Quarterly Journal of Experimental Psychology,53A(4), 1061-1080; cf. Hirotani, M., Frazier, L., & Rayner, K. (2006). Punctuation and intonation effects on clause and sentence wrap-up: Evidence from eye movements. Journal of Memory and Language,54, 425-443). We report an eye-tracking experiment with a strong manipulation of integrative complexity at a critical word that was either sentence-final, ended a comma-marked clause, or was not comma-marked. Although both complexity and punctuation had reliable effects, they did not interact in any eye-movement measure. These results as well as simulations using the E-Z Reader model of eye-movement control (Reichle, E. D., Warren, T., & McConnell, K. (2009). Using E-Z Reader to model the effects of higher-level language processing on eye movements during reading. Psychonomic Bulletin & Review,16(1), 1-20) suggest that traditional accounts of clause wrap-up are incomplete.
Borneo vortex and meso-scale convective rainfall
NASA Astrophysics Data System (ADS)
Koseki, S.; Koh, T.-Y.; Teo, C.-K.
2013-08-01
We have investigated how the Borneo vortex develops over the equatorial South China Sea under cold surge conditions in December during the Asian winter monsoon. Composite analysis using reanalysis and satellite datasets has revealed that absolute vorticity and water vapour are transported by strong cold surges from upstream of the South China Sea to around the equator. Rainfall is correspondingly enhanced over the equatorial South China Sea. A semi-idealized experiment reproduced the Borneo vortex over the equatorial South China Sea during a "perpetual" cold surge. The Borneo vortex is manifested as a meso-α cyclone with a comma-shaped rainband in the northeast sector of the cyclone. Vorticity budget analysis showed that the growth of the meso-α cyclone was achieved mainly by vortex stretching. The comma-shaped rainband consists of clusters of meso-β scale rainfall patches. The warm and wet cyclonic southeasterly flow meets with the cold and dry northeasterly surge forming a confluence front in the northeastern sector of the cyclone. Intense upward motion and heavy rainfall result both due to the low-level convergence and the favourable thermodynamic profile at the confluence front. At both meso-α and meso-β scales, the convergence is ultimately caused by the deviatoric strain in the confluence wind pattern but is much enhanced by nonlinear self-enhancement dynamics.
Newly described features resulting from high-magnification dermoscopy of tinea capitis.
Lacarrubba, Francesco; Verzì, Anna Elisa; Micali, Giuseppe
2015-03-01
Recent studies have reported "comma hairs" as a typical dermoscopic feature of tinea capitis observed at low magnification (×10). The aim of this study was to evaluate the dermoscopic aspects of tinea capitis at high magnification (×150) and its diagnostic role. Five children (2 boys and 3 girls; aged 4-10 years) with multiple scaly patches of alopecia underwent scalp dermoscopy, direct microscopic examinations, and mycological cultures of skin scrapings. Using low magnification (×30), typical comma hairs, "Morse code-like" hairs, and "zigzag" hairs were observed. When using high magnification (×150), additional features were horizontal white bands that appear as empty bands that are likely related to localized areas of fungal infection. These horizontal white bands are usually multiple and may cause the hair to bend and break. We also identified a new dermoscopic feature consisting of translucent, easily deformable hairs that look weakened and transparent and show unusual bends; they are likely the result of a massive fungal invasion involving the whole hair shaft. Direct microscopic examination showed fungal infection and results of mycological culture were positive for Microsporum canis in all cases. The identification of new findings using higher-magnification dermoscopy may enhance the diagnosis of tinea capitis and be of help to better understand some pathogenetic mechanisms.
Pauker, Efrat; Itzhak, Inbal; Baum, Shari R; Steinhauer, Karsten
2011-10-01
In reading, a comma in the wrong place can cause more severe misunderstandings than the lack of a required comma. Here, we used ERPs to demonstrate that a similar effect holds for prosodic boundaries in spoken language. Participants judged the acceptability of temporarily ambiguous English "garden path" sentences whose prosodic boundaries were either in line or in conflict with the actual syntactic structure. Sentences with incongruent boundaries were accepted less than those with missing boundaries and elicited a stronger on-line brain response in ERPs (N400/P600 components). Our results support the notion that mentally deleting an overt prosodic boundary is more costly than postulating a new one and extend previous findings, suggesting an immediate role of prosody in sentence comprehension. Importantly, our study also provides new details on the profile and temporal dynamics of the closure positive shift (CPS), an ERP component assumed to reflect prosodic phrasing in speech and music in real time. We show that the CPS is reliably elicited at the onset of prosodic boundaries in English sentences and is preceded by negative components. Its early onset distinguishes the speech CPS in adults both from prosodic ERP correlates in infants and from the "music CPS" previously reported for trained musicians.
The physics of musical scales: Theory and experiment
NASA Astrophysics Data System (ADS)
Durfee, Dallin S.; Colton, John S.
2015-10-01
The theory of musical scales involves mathematical ratios, harmonic resonators, beats, and human perception and provides an interesting application of the physics of waves and sound. We first review the history and physics of musical scales, with an emphasis on four historically important scales: twelve-tone equal temperament, Pythagorean, quarter-comma meantone, and Ptolemaic just intonation. We then present an easy way for students and teachers to directly experience the qualities of different scales using MIDI synthesis.
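The ratio arithmetic behind these four scales is compact enough to compute directly. The sketch below derives the Pythagorean and syntonic commas and compares the fifths of the tuning systems named above:

```python
import math
from fractions import Fraction

# Twelve pure fifths (3/2) overshoot seven octaves by the Pythagorean
# comma; the syntonic comma separates the Pythagorean major third
# (81/64) from the just major third (5/4).
pythagorean_comma = Fraction(3, 2) ** 12 / Fraction(2, 1) ** 7
syntonic_comma = Fraction(81, 64) / Fraction(5, 4)

# The fifth in each tuning system discussed:
equal_fifth = 2 ** (7 / 12)   # twelve-tone equal temperament
pure_fifth = 3 / 2            # Pythagorean (and just intonation)
meantone_fifth = 5 ** 0.25    # quarter-comma meantone: major thirds just

# Express the Pythagorean comma in cents (about 23.46 cents).
comma_cents = 1200 * math.log2(float(pythagorean_comma))
```

Quarter-comma meantone narrows each fifth by a quarter of the syntonic comma so that four stacked fifths span a just major third, which is why `meantone_fifth` comes out slightly flatter than both the equal-tempered and pure fifths.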
NASA Astrophysics Data System (ADS)
Ko, Yee Song; Cuervo-Reyes, Eduardo; Nüesch, Frank A.; Opris, Dorina M.
2016-04-01
The dielectric relaxation processes of polymethyl methacrylates that have been functionalized with Disperse Red 1 (DR1) in the side chain (DR1-co-MMA) were studied with temperature dependent impedance spectroscopy and thermally stimulated depolarization current (TSDC) techniques. Copolymers with dipole contents which varied between 10 mol% and 70 mol% were prepared. All samples showed dipole relaxations above the structural-glass transition temperature (Tg). The β-relaxation of the methyl methacrylate (MMA) repeating unit was most visible in DR1(10%)-co-MMA and rapidly vanished at higher dipole contents. DSC data reveal an increase of the Tg by 20 °C to 125 °C with the inclusion of the dipole into the polymethyl methacrylate (PMMA) as side chain. The impedance data of samples with several DR1 concentrations, taken at several temperatures above Tg, have been fitted with the Havriliak-Negami (HN) function. In all cases, the fits reveal a dielectric response that corresponds to power-law dipolar relaxations. TSDC measurements show that the copolymer can be poled, and that the induced polarization can be frozen by lowering the temperature well below the glass transition. Relaxation strengths Δε estimated by integrating the depolarization current are similar to those obtained from the impedance data, confirming the efficient freezing of the dipoles in the structural glass state.
GeoCSV: tabular text formatting for geoscience data
NASA Astrophysics Data System (ADS)
Stults, M.; Arko, R. A.; Davis, E.; Ertz, D. J.; Turner, M.; Trabant, C. M.; Valentine, D. W., Jr.; Ahern, T. K.; Carbotte, S. M.; Gurnis, M.; Meertens, C.; Ramamurthy, M. K.; Zaslavsky, I.; McWhirter, J.
2015-12-01
The GeoCSV design was developed within the GeoWS project as a way to provide a baseline of compatibility between tabular text data sets from various sub-domains in geoscience. Funded through NSF's EarthCube initiative, the GeoWS project aims to develop common web service interfaces for data access across hydrology, geodesy, seismology, marine geophysics, atmospheric science and other areas. The GeoCSV format is an essential part of delivering data via simple web services for discovery and utilization by both humans and machines. As most geoscience disciplines have developed and use data formats specific to their needs, tabular text data can play a key role as a lowest common denominator useful for exchanging and integrating data across sub-domains. The design starts with a core definition compatible with best practices described by the W3C CSV on the Web Working Group (CSVW). Compatibility with CSVW is intended to ensure the broadest usability of data expressed as GeoCSV. An optional, simple, but limited metadata description mechanism was added to allow inclusion of important metadata with comma separated data, while staying within CSVW's definition of a "dialect". The format is designed both for creating new datasets and for annotating existing tabular text data sets so that they are compliant with GeoCSV.
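A minimal sketch of how such a comment-prefixed metadata dialect can be read alongside the comma separated data; the "#" prefix and the "keyword: value" layout here are assumptions for illustration, not a normative rendering of the GeoCSV specification.

```python
import csv
import io

# A hypothetical GeoCSV-style file: "#" lines carry metadata, the rest is CSV.
sample = """# dataset: GeoCSV 2.0
# field_unit: ISO8601, degrees, degrees
# field_type: datetime, float, float
time,latitude,longitude
2015-01-01T00:00:00Z,34.0,-118.2
"""

metadata = {}
data_lines = []
for line in sample.splitlines():
    if line.startswith("#"):
        # Split "keyword: value" metadata lines into a dictionary.
        key, _, value = line.lstrip("# ").partition(":")
        metadata[key.strip()] = value.strip()
    else:
        data_lines.append(line)

# The remaining lines are plain CSV, readable by any standard parser.
rows = list(csv.DictReader(io.StringIO("\n".join(data_lines))))
```

Because the metadata lines are comments to an ordinary CSV reader, a consumer that ignores them still gets a valid table, which is the compatibility property the abstract describes.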
Health & Demographic Surveillance System Profile: The Birbhum population project (Birbhum HDSS).
Ghosh, Saswata; Barik, Anamitra; Majumder, Saikat; Gorain, Ashoke; Mukherjee, Subrata; Mazumdar, Saibal; Chatterjee, Kajal; Bhaumik, Sunil Kumar; Bandyopadhyay, Susanta Kumar; Satpathi, BiswaRanjan; Majumder, Partha P; Chowdhury, Abhijit
2015-02-01
The Birbhum HDSS was established in 2008 and covers 351 villages in four administrative blocks in rural areas of Birbhum district of West Bengal, India. The project currently follows 54 585 individuals living in 12 557 households. The population being followed up is economically underprivileged and socially marginalized. The HDSS, a prospective longitudinal cohort study, has been designed to study changes in population demographics, health and healthcare utilization. In addition to collecting data on vital statistics and antenatal and postnatal tracking, verbal autopsies are being performed. Moreover, periodic surveys capturing socio-demographic and economic conditions have been conducted twice. Data on nutritional status (children as well as adults), non-communicable diseases, smoking etc. have also been collected in special surveys. Currently, intervention studies on anaemia, undernutrition and common preschool childhood morbidities through behavioural changes are under way. For access to the data, a researcher needs to send a request to the Data Manager [suri.shds@gmail.com]. Data are shared in common formats such as comma-separated values (CSV), Microsoft Excel (XLSX) or Microsoft Access Database (MDB). The HDSS will soon upgrade its data management system to a more integrated platform, coordinated and guided by INDEPTH data sharing policy. © The Author 2014; all rights reserved. Published by Oxford University Press on behalf of the International Epidemiological Association.
SOLDESIGN user's manual copyright
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pillsbury, R.D. Jr.
1991-02-01
SOLDESIGN is a general purpose program for calculating and plotting magnetic fields, Lorentz body forces, resistances and inductances for a system of coaxial uniform current density solenoidal elements. The program was originally written in 1980 and has been evolving ever since. SOLDESIGN can be used with either interactive (terminal) or file input. Output can be to the terminal or to a file. All input is free-field with comma or space separators. SOLDESIGN contains an interactive help feature that allows the user to examine documentation while executing the program. Input to the program consists of a sequence of word commands and numeric data. Initially, the geometry of the elements or coils is defined by specifying either the coordinates of one corner of the coil or the coil centroid, a symmetry parameter to allow certain reflections of the coil (e.g., a split pair), the radial and axial builds, and either the overall current density or the total ampere-turns (NI). A more general quadrilateral element is also available. If inductances or resistances are desired, the number of turns must be specified. Field, force, and inductance calculations also require the number of radial current sheets (or integration points). Work is underway to extend the field, force, and, possibly, inductance calculations to non-coaxial solenoidal elements.
D0 Superconducting Solenoid Quench Data and Slow Dump Data Acquisition
DOE Office of Scientific and Technical Information (OSTI.GOV)
Markley, D.; /Fermilab
1998-06-09
This Dzero Engineering note describes the method by which the 2 Tesla Superconducting Solenoid Fast Dump and Slow Dump data are accumulated, tracked and stored. The 2 Tesla Solenoid has eleven data points that need to be tracked and then stored when a fast dump or a slow dump occurs. The TI555 (Texas Instruments) PLC (Programmable Logic Controller), which controls the DC power circuit that powers the Solenoid, also has access to all the voltage taps and other equipment in the circuit. The TI555 constantly logs these eleven points in a rotating memory buffer. When either a fast dump (dump switch opens) or a slow dump (power supply turns off) occurs, the TI555 organizes the respective data and downloads the data to a file on DO-CCRS2. The data in this file are moved over Ethernet and stored in a CSV (comma separated values) file which can easily be examined by Microsoft Excel or any other spreadsheet. The 2 Tesla solenoid control system also locks in first-fault information. The TI555 decodes the first fault and passes it along to the program collecting the data and storing it on DO-CCRS2. This first-fault information is then part of the file.
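The rotating-buffer-then-flush scheme described in the note can be sketched as follows; the buffer depth, channel names, and sample values are hypothetical stand-ins, not the actual D0 voltage-tap list.

```python
import csv
import io
from collections import deque

BUFFER_DEPTH = 5  # samples retained before a dump event (hypothetical)
buffer = deque(maxlen=BUFFER_DEPTH)  # rotating memory buffer: old samples fall off

# Eleven monitored points per sample; names are illustrative only.
channels = ["t"] + [f"vtap_{i}" for i in range(1, 11)]

# Simulate periodic logging: each append evicts the oldest sample once full.
for t in range(8):
    buffer.append([t] + [round(0.1 * t * i, 2) for i in range(1, 11)])

# On a fast or slow dump, flush whatever the buffer holds to a CSV file
# that any spreadsheet can open.
out = io.StringIO()
writer = csv.writer(out)
writer.writerow(channels)
writer.writerows(buffer)
```

The deque with `maxlen` plays the role of the PLC's rotating memory: only the most recent `BUFFER_DEPTH` samples survive to the CSV snapshot.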
Multielement geochemical dataset of surficial materials for the northern Great Basin
Coombs, Mary Jane; Kotlyar, Boris B.; Ludington, Steve; Folger, Helen W.; Mossotti, Victor G.
2002-01-01
This report presents geochemical data generated during mineral and environmental assessments for the Bureau of Land Management in northern Nevada, northeastern California, southeastern Oregon, and southwestern Idaho, along with metadata and map representations of selected elements. The dataset presented here is a compilation of chemical analyses of over 10,200 stream-sediment and soil samples originally collected during the National Uranium Resource Evaluation's (NURE) Hydrogeochemical and Stream Sediment Reconnaissance (HSSR) program of the Department of Energy and its predecessors and reanalyzed to support a series of mineral-resource assessments by the U.S. Geological Survey (USGS). The dataset also includes the analyses of additional samples collected by the USGS in 1992. The sample sites are in southeastern Oregon, southwestern Idaho, northeastern California, and, primarily, in northern Nevada. These samples were collected from 1977 to 1983, before the development of most of the present-day large-scale mining infrastructure in northern Nevada. As such, these data may serve as an important baseline for current and future geoenvironmental studies. Largely because of the very diverse analytical methods used by the NURE HSSR program, the original NURE analyses in this area yielded little useful geochemical information. The Humboldt, Malheur-Jordan-Andrews, and Winnemucca-Surprise studies were designed to provide useful geochemical data via improved analytical methods (lower detection levels and higher precision) and, in the Malheur-Jordan-Andrews and Winnemucca Surprise areas, to collect additional stream-sediment samples to increase sampling coverage. The data are provided in *.xls (Microsoft Excel) and *.csv (comma-separated-value) format. 
We also present graphically 35 elements, interpolated ("gridded") in a geographic information system (GIS) and overlain by major geologic trends, so that users may view the variation in elemental concentrations over the landscape and reach their own conclusions regarding correlation among geochemistry, geologic features, and known mineral deposits. Quality-control issues are discussed for the grids and data.
Agosto-Arroyo, Emmanuel; Coshatt, Gina M; Winokur, Thomas S; Harada, Shuko; Park, Seung L
2017-01-01
The molecular diagnostics laboratory faces the challenge of improving test turnaround time (TAT). Low and consistent TATs are of great clinical and regulatory importance, especially for molecular virology tests. Laboratory information systems (LISs) contain all the data elements necessary to do accurate quality assurance (QA) reporting of TAT and other measures, but these reports are in most cases still performed manually: a time-consuming and error-prone task. The aim of this study was to develop a web-based real-time QA platform that would automate QA reporting in the molecular diagnostics laboratory at our institution, and minimize the time expended in preparing these reports. Using a standard Linux, Nginx, MariaDB, PHP stack virtual machine running atop a Dell Precision 5810, we designed and built a web-based QA platform, code-named Alchemy. Data files pulled periodically from the LIS in comma-separated value format were used to autogenerate QA reports for the human immunodeficiency virus (HIV) quantitation, hepatitis C virus (HCV) quantitation, and BK virus (BKV) quantitation. Alchemy allowed the user to select a specific timeframe to be analyzed and calculated key QA statistics in real-time, including the average TAT in days, tests falling outside the expected TAT ranges, and test result ranges. Before implementing Alchemy, reporting QA for the HIV, HCV, and BKV quantitation assays took 45-60 min of personnel time per test every month. With Alchemy, that time has decreased to 15 min total per month. Alchemy allowed the user to select specific periods of time and analyzed the TAT data in-depth without the need of extensive manual calculations. Alchemy has significantly decreased the time and the human error associated with QA report generation in our molecular diagnostics laboratory. Other tests will be added to this web-based platform in future updates. 
This effort shows the utility of informatician-supervised resident/fellow programming projects as learning opportunities and workflow improvements in the molecular laboratory.
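The kind of turnaround-time statistics Alchemy derives from a comma-separated LIS extract can be sketched like this; the column names and the 7-day outlier threshold are invented for illustration, not Alchemy's actual schema.

```python
import csv
import io
from datetime import datetime

# Hypothetical LIS extract in comma-separated value format.
lis_csv = """test,received,reported
HIV quant,2017-01-02,2017-01-04
HCV quant,2017-01-03,2017-01-12
BKV quant,2017-01-05,2017-01-06
"""

rows = list(csv.DictReader(io.StringIO(lis_csv)))

def tat_days(row):
    """Turnaround time in days from specimen receipt to result report."""
    fmt = "%Y-%m-%d"
    received = datetime.strptime(row["received"], fmt)
    reported = datetime.strptime(row["reported"], fmt)
    return (reported - received).days

tats = [tat_days(r) for r in rows]
average_tat = sum(tats) / len(tats)
# Flag tests exceeding an (assumed) 7-day expected TAT range.
outliers = [r["test"] for r in rows if tat_days(r) > 7]
```

Restricting `rows` to a user-selected date window before computing the statistics would reproduce the timeframe selection the platform offers.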
JCoDA: a tool for detecting evolutionary selection.
Steinway, Steven N; Dannenfelser, Ruth; Laucius, Christopher D; Hayes, James E; Nayak, Sudhir
2010-05-27
The incorporation of annotated sequence information from multiple related species in commonly used databases (Ensembl, Flybase, Saccharomyces Genome Database, Wormbase, etc.) has increased dramatically over the last few years. This influx of information has provided a considerable amount of raw material for evaluation of evolutionary relationships. To aid in the process, we have developed JCoDA (Java Codon Delimited Alignment) as a simple-to-use visualization tool for the detection of site specific and regional positive/negative evolutionary selection amongst homologous coding sequences. JCoDA accepts user-inputted unaligned or pre-aligned coding sequences, performs a codon-delimited alignment using ClustalW, and determines the dN/dS calculations using PAML (Phylogenetic Analysis Using Maximum Likelihood, yn00 and codeml) in order to identify regions and sites under evolutionary selection. The JCoDA package includes a graphical interface for Phylip (Phylogeny Inference Package) to generate phylogenetic trees, manages formatting of all required file types, and streamlines passage of information between underlying programs. The raw data are output to user configurable graphs with sliding window options for straightforward visualization of pairwise or gene family comparisons. Additionally, codon-delimited alignments are output in a variety of common formats and all dN/dS calculations can be output in comma-separated value (CSV) format for downstream analysis. To illustrate the types of analyses that are facilitated by JCoDA, we have taken advantage of the well studied sex determination pathway in nematodes as well as the extensive sequence information available to identify genes under positive selection, examples of regional positive selection, and differences in selection based on the role of genes in the sex determination pathway. 
JCoDA is a configurable, open source, user-friendly visualization tool for performing evolutionary analysis on homologous coding sequences. JCoDA can be used to rapidly screen for genes and regions of genes under selection using PAML. It can be freely downloaded at http://www.tcnj.edu/~nayaklab/jcoda.
Côté, Richard G; Jones, Philip; Martens, Lennart; Kerrien, Samuel; Reisinger, Florian; Lin, Quan; Leinonen, Rasko; Apweiler, Rolf; Hermjakob, Henning
2007-10-18
Each major protein database uses its own conventions when assigning protein identifiers. Resolving the various, potentially unstable, identifiers that refer to identical proteins is a major challenge. This is a common problem when attempting to unify datasets that have been annotated with proteins from multiple data sources or querying data providers with one flavour of protein identifiers when the source database uses another. Partial solutions for protein identifier mapping exist but they are limited to specific species or techniques and to a very small number of databases. As a result, we have not found a solution that is generic enough and broad enough in mapping scope to suit our needs. We have created the Protein Identifier Cross-Reference (PICR) service, a web application that provides interactive and programmatic (SOAP and REST) access to a mapping algorithm that uses the UniProt Archive (UniParc) as a data warehouse to offer protein cross-references based on 100% sequence identity to proteins from over 70 distinct source databases loaded into UniParc. Mappings can be limited by source database, taxonomic ID and activity status in the source database. Users can copy/paste or upload files containing protein identifiers or sequences in FASTA format to obtain mappings using the interactive interface. Search results can be viewed in simple or detailed HTML tables or downloaded as comma-separated values (CSV) or Microsoft Excel (XLS) files suitable for use in a local database or a spreadsheet. Alternatively, a SOAP interface is available to integrate PICR functionality in other applications, as is a lightweight REST interface. We offer a publicly available service that can interactively map protein identifiers and protein sequences to the majority of commonly used protein databases. Programmatic access is available through a standards-compliant SOAP interface or a lightweight REST interface. 
The PICR interface, documentation and code examples are available at http://www.ebi.ac.uk/Tools/picr.
Eriksson, Jonatan; Andersson, Simone; Appelqvist, Roger; Wieslander, Elisabet; Truedsson, Mikael; Bugge, May; Malm, Johan; Dahlbäck, Magnus; Andersson, Bo; Fehniger, Thomas E; Marko-Varga, György
2016-01-01
Data from biological samples and medical evaluations plays an essential part in clinical decision making. This data is equally important in clinical studies, and it is critical to have an infrastructure that ensures that its quality is preserved throughout its entire lifetime. We are running a 5-year longitudinal clinical study, KOL-Örestad, with the objective to identify new COPD (Chronic Obstructive Pulmonary Disease) biomarkers in blood. In the study, clinical data and blood samples are collected from both private and public health-care institutions and stored at our research center in databases and biobanks, respectively. The blood is analyzed by mass spectrometry and the results from this analysis are then linked to the clinical data. We built an infrastructure that allows us to efficiently collect and analyze the data. We chose to use REDCap as the EDC (Electronic Data Capture) tool for the study due to its short setup-time, ease of use, and flexibility. REDCap allows users to easily design data collection modules based on existing templates. In addition, it provides two functions that allow users to import batches of data: through a web API (Application Programming Interface) as well as by uploading CSV files (Comma Separated Values). We created a software tool, DART (Data Rapid Translation), that translates our biomarker data into a format that fits REDCap's CSV templates. In addition, DART is configurable to work with many other data formats as well. We use DART to import our clinical chemistry data to the REDCap database. We have shown that a powerful and internationally adopted EDC tool such as REDCap can be extended so that it can be used efficiently in proteomic studies. In our study, we accomplish this by using DART to translate our clinical chemistry data to a format that fits the templates of REDCap.
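A DART-style translation step, reduced to its essence: read an instrument export and rewrite it under a target-compatible CSV header. The delimiter, column names, and field mapping below are invented for illustration, not the study's real data dictionary or REDCap's actual template.

```python
import csv
import io

# Hypothetical semicolon-delimited instrument export.
instrument_csv = "SampleID;CRP_mg_L\nP001;3.2\nP002;11.5\n"

# Hypothetical mapping from instrument columns to target field names.
field_map = {"SampleID": "record_id", "CRP_mg_L": "crp"}

reader = csv.DictReader(io.StringIO(instrument_csv), delimiter=";")
out = io.StringIO()
writer = csv.DictWriter(out, fieldnames=list(field_map.values()),
                        lineterminator="\n")
writer.writeheader()
for row in reader:
    # Rename each column according to the mapping; values pass through.
    writer.writerow({field_map[k]: v for k, v in row.items()})

redcap_csv = out.getvalue()
```

Keeping the mapping in a configuration table rather than in code is what makes such a translator adaptable to "many other data formats", as the abstract claims for DART.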
Gene ARMADA: an integrated multi-analysis platform for microarray data implemented in MATLAB.
Chatziioannou, Aristotelis; Moulos, Panagiotis; Kolisis, Fragiskos N
2009-10-27
The microarray data analysis realm is ever growing through the development of various tools, open source and commercial. However, there is an absence of predefined, rational algorithmic analysis workflows or batch standardized processing to incorporate all steps, from raw data import up to the derivation of significantly differentially expressed gene lists. This absence obfuscates the analytical procedure and obstructs the massive comparative processing of genomic microarray datasets. Moreover, the solutions provided heavily depend on the programming skills of the user, whereas in the case of GUI-embedded solutions, they do not provide direct support of various raw image analysis formats or a versatile and simultaneously flexible combination of signal processing methods. We describe here Gene ARMADA (Automated Robust MicroArray Data Analysis), a MATLAB-implemented platform with a Graphical User Interface. This suite integrates all steps of microarray data analysis including automated data import, noise correction and filtering, normalization, statistical selection of differentially expressed genes, clustering, classification and annotation. In its current version, Gene ARMADA fully supports 2-colour cDNA and Affymetrix oligonucleotide arrays, plus custom arrays for which experimental details are given in tabular form (Excel spreadsheet, comma separated values, tab-delimited text formats). It also supports the analysis of already processed results through its versatile import editor. Besides being fully automated, Gene ARMADA incorporates numerous functionalities of the Statistics and Bioinformatics Toolboxes of MATLAB. In addition, it provides numerous visualization and exploration tools plus customizable export data formats for seamless integration by other analysis tools or MATLAB, for further processing. Gene ARMADA requires MATLAB 7.4 (R2007a) or higher and is also distributed as a stand-alone application with MATLAB Component Runtime.
Gene ARMADA provides a highly adaptable, integrative, yet flexible tool which can be used for automated quality control, analysis, annotation and visualization of microarray data, constituting a starting point for further data interpretation and integration with numerous other tools.
Book Analysis: Command and Control of Theater Forces: Adequacy,
1988-04-01
AIR COMMAND AND STAFF COLLEGE STUDENT REPORT. BOOK ANALYSIS: COMMAND AND CONTROL OF THEATER FORCES: ADEQUACY. MAJOR JOHN J. WRIGHT, 88…ABOUT THE AUTHOR: The author is currently a student at the Air Command and Staff…College, Maxwell AFB AL. A 1973 ROTC graduate from Allegheny College, he attended undergraduate pilot training at Reese AFB TX. From 1975-1980, he was a
Type II Forward Storage Site Facilities. POMCUS System. Volume 1
1980-09-01
GENERAL ILLUMINATION…MEDICAL EQUIPMENT SET…CHARGER, BATTERY…PERISCOPE, BATTERY COMMAND…the Pact has improved its capability to exploit these advantages, the United States has responded by improving its reinforcement capability. In…areas, but they are also large enough and positioned in such a way as to capitalize on their basic advantages. Hence it is possible that the Pact
Writing Skills Course for Newly Commissioned Marine Corps Officers
1993-10-01
on the parked government vehicle were the main causes of the accident. (8 and 9) 4. That LCpl Frank Johnson's injuries were incurred in the line of…on the parked government vehicle were the main causes of the accident. (Findings of Fact 8 and 9) 4. That LCpl Frank Johnson's injuries were…sports, such as soccer, touch football, baseball, and karate. 3. Use a comma after an introductory word, phrase, or adverb clause. Adverb clauses are
1987-05-14
such as and, but, or the conjunction comma, as in apples, oranges and bananas. Once such a word was recognized, normal parsing was suspended; a portion…interpreted with respect to the "reaching the Stadium" event, as happening sometime after that. A new node…I picked up a banana…I noticed the banana was too green to…would be created in e/s structure ordered sometime after the "reaching the Stadium" event. On the other
N-string vertices in string field theory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bordes, J.; Abdurrahman, A.; Anton, F.
1994-03-15
We give the general form of the vertex corresponding to the interaction of an arbitrary number of strings. The technique employed relies on the "comma" representation of string field theory, where string fields and interactions are represented as matrices and operations between them, such as multiplication and trace. The general formulation presented here shows that the interaction vertex of N strings, for any arbitrary N, is given as a function of particular combinations of matrices corresponding to the change of representation between the full string and the half string degrees of freedom.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baert, A.L.; Fevery, J.; Marchal, G.
1983-03-01
In 5 patients with Budd-Chiari syndrome, computed tomography after intravenous bolus injection of iodinated contrast agents demonstrated images not previously seen in other diseases. The images are compatible with stagnation of contrast material at the periphery of the liver. In 3 of the 5 cases, grey-scale ultrasonography failed to document the normal hepatic veins draining into the inferior caval vein, but showed an intrahepatic network of comma-shaped venous structures. It is proposed that these two noninvasive approaches can help in establishing an early diagnosis.
Depiction of Trends in Administrative Healthcare Data from Hospital Information System.
Kalankesh, Leila R; Pourasghar, Faramarz; Jafarabadi, Mohammad Asghari; Khanehdan, Negar
2015-06-01
Administrative healthcare data are among the main components of a hospital information system. Such data can be analyzed and deployed for a variety of purposes. The principal aim of this research was to depict trends of administrative healthcare data from the HIS in a general hospital from March 2011 to March 2014. The data set used for this research was extracted from the SQL database of the hospital information system in Razi general hospital located in Marand. The data were saved as CSV (Comma Separated Values) in order to facilitate data cleaning and analysis. The variables of the data set included patient's age, gender, final diagnosis, final diagnosis code based on the ICD-10 classification system, date of hospitalization, date of discharge, LOS (Length of Stay), ward, and survival status of the patient. Data were analyzed and visualized after applying appropriate cleansing and preparing techniques. Morbidity showed a constant trend over three years. Pregnancy, childbirth and the puerperium were the leading category of final diagnosis (about 32.8%). The diseases of the circulatory system were the second class, accounting for 13 percent of the hospitalization cases. The diseases of the digestive system ranked third (10%). Patients aged between 14 and 44 constituted a higher proportion of total cases. Diseases of the circulatory system were the most common class of diseases among elderly patients (age ≥ 65). The highest rate of mortality was observed among patients with a final diagnosis of circulatory system diseases, followed by those with diseases of the respiratory system, and neoplasms. Mortality rates for the ICU and the CCU patients were 62% and 33% respectively. The longest average LOS (7.3 days) was observed among patients hospitalized in the ICU, while patients in the Obstetrics and Gynecology ward had the shortest average LOS (2.4 days).
Multiple regression analysis revealed that LOS was correlated with surgery, gender, type of payment, ward, class of final diagnosis and age. This study presents trends in administrative healthcare data residing in the hospital information system of a general public hospital. Patterns in morbidity, mortality and length of stay can inform decision making in health care management. Mining trends in administrative healthcare data can add value to health care management.
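The LOS-by-ward summaries reported above can be computed from a CSV extract along these lines; the column names and values are invented stand-ins, not the Razi hospital schema.

```python
import csv
import io
from collections import defaultdict
from datetime import date

# Illustrative HIS extract saved as comma-separated values.
his_csv = """ward,admit,discharge
ICU,2013-03-01,2013-03-09
ICU,2013-03-02,2013-03-08
Obstetrics,2013-03-01,2013-03-03
"""

los_by_ward = defaultdict(list)
for row in csv.DictReader(io.StringIO(his_csv)):
    admit = date.fromisoformat(row["admit"])
    discharge = date.fromisoformat(row["discharge"])
    # Length of stay in whole days for each hospitalization.
    los_by_ward[row["ward"]].append((discharge - admit).days)

mean_los = {ward: sum(days) / len(days) for ward, days in los_by_ward.items()}
```

The same grouping pattern extends directly to mortality rates or diagnosis-class counts by swapping the grouped column and the aggregated value.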
Noise Pollution: Do We Need a Solution? An Analysis of Noise in a Cardiac Care Unit.
Ryan, Kevin M; Gagnon, Matthew; Hanna, Tyler; Mello, Brad; Fofana, Mustapha; Ciottone, Gregory; Molloy, Michael
2016-08-01
Introduction: Hospitals are meant to be places for respite and healing; however, technological advances and reliance on monitoring alarms have made the environment increasingly noisy. The coronary care unit (CCU), like the emergency department, provides care to ill patients while being vulnerable to noise pollution. The World Health Organization (WHO; Geneva, Switzerland) recommends that, for optimum rest and healing, sound levels should average approximately 30 decibels (dB), with maximum readings less than 40 dB. Problem: The purpose of this study was to measure and analyze sound levels in three different locations in the CCU and to review alarm reports in relation to sound levels. Over a one-month period, sound recorders (Extech SDL600; Extech Instruments; Nashua, New Hampshire USA) were placed in three separate locations in the CCU at the West Roxbury Veterans' Administration (VA) Hospital (Roxbury, Massachusetts USA). Sound samples were recorded once per second, stored in Comma Separated Values format, and then exported to Microsoft Excel (Microsoft Corporation; Redmond, Washington USA). Averages were determined and plotted per hour, and alarm histories were reviewed to determine the effect of alarm noise on total noise at each location, as well as common alarm occurrences. Patient Room 1 consistently had the lowest average recordings, though all averages were >40 dB, despite decreases between 10:00 pm and 7:00 am. During daytime hours, recordings maintained levels >50 dB. Overnight noise remained above recommended levels for 55.25% of the period in Patient Room 1 and for 99.61% of the same period in Patient Room 7. The nurses' station remained the loudest of the three locations. Alarms per hour ranged from 20 to 26 during the day. Alarms per day averaged 57.17 in Patient Room 1, 122.03 in Patient Room 7, and 562.26 at the nurses' station.
Oxygen saturation alarms accounted for 33.59% of activity, and heart-related alarms (including ST segment and pacemaker) accounted for 49.24%. The CCU cares for ill patients requiring constant monitoring. Despite advances in technology, measured noise levels for the hospital studied exceeded WHO standards of 40 dB and peaks of 45 dB, even during night hours when patients require rest. Further work is required to reduce noise levels and to examine effects on patient satisfaction, clinical outcomes, and length of stay. Ryan KM, Gagnon M, Hanna T, Mello B, Fofana M, Ciottone G, Molloy M. Noise pollution: do we need a solution? An analysis of noise in a cardiac care unit. Prehosp Disaster Med. 2016;31(4):432-435.
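The overnight exceedance percentages above reduce to a simple computation over the once-per-second samples; a sketch, with illustrative values rather than the study's measured data:

```python
# Given one sound sample per second, compute the percentage of samples
# exceeding a dB threshold (the WHO nighttime maximum used in the study).
def pct_above(samples_db, threshold=40.0):
    over = sum(1 for s in samples_db if s > threshold)
    return 100.0 * over / len(samples_db)

# Six illustrative one-second readings, not measured data.
night = [38.2, 41.5, 44.0, 39.9, 46.1, 40.1]
print(round(pct_above(night), 2))  # 66.67
```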
BOREAS TGB-3 Plant Species Composition Data over the NSA Fen
NASA Technical Reports Server (NTRS)
Bubier, Jill L.; Hall, Forrest G. (Editor); Conrad, Sara K. (Editor)
2000-01-01
The BOReal Ecosystem-Atmosphere Study Trace Gas Biogeochemistry (BOREAS TGB-3) team collected several data sets that contributed to understanding the measured trace gas fluxes over sites in the Northern Study Area (NSA). This data set contains information about the composition of plant species that were within the collars used to measure Net Ecosystem Exchange of CO2 (NEE). The species composition was identified to understand the differences in NEE among the various plant communities in the NSA fen. The data were collected in July of 1994 and 1996. The data are contained in comma-delimited, ASCII files.
NASA Astrophysics Data System (ADS)
Vandegriff, J. D.; King, T. A.; Weigel, R. S.; Faden, J.; Roberts, D. A.; Harris, B. T.; Lal, N.; Boardsen, S. A.; Candey, R. M.; Lindholm, D. M.
2017-12-01
We present the Heliophysics Application Programmers Interface (HAPI), a new interface specification that both large and small data centers can use to expose time series data holdings in a standard way. HAPI was inspired by the similarity of existing services at many Heliophysics data centers, and these data centers have collaborated to define a single interface that captures best practices and represents what everyone considers the essential, lowest common denominator for basic data access. This low level access can serve as infrastructure to support greatly enhanced interoperability among analysis tools, with the goal being simplified analysis and comparison of data from any instrument, model, mission or data center. The three main services a HAPI server must perform are 1. list a catalog of datasets (one unique ID per dataset), 2. describe the content of one dataset (JSON metadata), and 3. retrieve numerical content for one dataset (stream the actual data). HAPI defines both the format of the query to the server, and the response from the server. The metadata is lightweight, focusing on use rather than discovery, and the data format is a streaming one, with Comma Separated Values (CSV) being required and binary or JSON streaming being optional. The HAPI specification is available at GitHub, where projects are also underway to develop reference implementation servers that data providers can adapt and use at their own sites. Also in the works are data analysis clients in multiple languages (IDL, Python, Matlab, and Java). Institutions which have agreed to adopt HAPI include Goddard (CDAWeb for data and CCMC for models), LASP at the University of Colorado Boulder, the Particles and Plasma Interactions node of the Planetary Data System (PPI/PDS) at UCLA, the Plasma Wave Group at the University of Iowa, the Space Sector at the Johns Hopkins Applied Physics Lab (APL), and the tsds.org site maintained at George Mason University. 
Over the next year, the adoption of a uniform way to access time series data is expected to significantly enhance interoperability within the Heliophysics data environment. https://github.com/hapi-server/data-specification
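As a rough illustration of the request side of the interface, a client might form a CSV data request as follows. The server URL and dataset ID are placeholders, and the parameter names (`id`, `time.min`, `time.max`) follow the early HAPI releases; consult the specification for the current form.

```python
from urllib.parse import urlencode

def hapi_data_url(server, dataset, tmin, tmax):
    """Build an early-HAPI-style CSV data request URL (sketch only)."""
    query = urlencode({"id": dataset, "time.min": tmin,
                       "time.max": tmax, "format": "csv"})
    return f"{server}/hapi/data?{query}"

# Placeholder server and dataset ID, not a real HAPI endpoint.
url = hapi_data_url("https://example.org", "DATASET_1",
                    "1998-01-01T00:00:00Z", "1998-01-02T00:00:00Z")
print(url)
```

The corresponding `/hapi/catalog` and `/hapi/info` requests differ only in path and parameters, which is what makes a common client across data centers feasible.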
Advanced Query and Data Mining Capabilities for MaROS
NASA Technical Reports Server (NTRS)
Wang, Paul; Wallick, Michael N.; Allard, Daniel A.; Gladden, Roy E.; Hy, Franklin H.
2013-01-01
The Mars Relay Operational Service (MaROS) comprises a number of tools to coordinate, plan, and visualize various aspects of the Mars Relay network. These tools operate at several levels: a Web-based user interface, a back-end "RESTlet" built in Java, and databases that store the data as they are received from the network. As part of MaROS, the innovators have developed and implemented a feature set that operates on several levels of the software architecture: an advanced querying capability, available through either the Web-based user interface or a back-end REST interface, for accessing all of the data gathered from the network. This software is not meant to replace the REST interface, but to augment it and expand the range of available data. The current REST interface provides specific data used by the MaROS Web application to display and visualize the information; however, the information returned from the REST interface has typically been pre-processed to return only a subset of the entire repository, particularly the information of interest to the GUI (graphical user interface). The new, advanced query and data mining capabilities allow users to retrieve the raw data and/or to perform their own data processing. The query language used to access the repository is a restricted subset of the structured query language (SQL) that can be built safely from the Web user interface, or entered as freeform SQL by a user. The results are returned in a CSV (Comma Separated Values) format for easy export to third-party tools and applications that can be used for data mining or user-defined visualization and interpretation. This is the first time that a service is capable of providing access to all cross-project relay data from a single Web resource.
Because MaROS contains the data for a variety of missions from the Mars network, which span both NASA and ESA, the software also establishes an access control list (ACL) on each data record in the database repository to enforce user access permissions through a multilayered approach.
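A hypothetical illustration of what a "restricted SQL subset" gate can look like: accept only single SELECT statements over a whitelist of tables before forwarding the query. The table names and the rule set are invented for the sketch and are not MaROS's actual validator.

```python
import re

# Invented table names for the example.
ALLOWED_TABLES = {"overflights", "relay_sessions"}

def is_allowed(sql):
    """Accept only a single SELECT over a whitelisted table (sketch)."""
    if ";" in sql:  # reject statement chaining
        return False
    m = re.match(r"(?is)^\s*select\s+.+?\s+from\s+(\w+)", sql)
    return bool(m) and m.group(1).lower() in ALLOWED_TABLES

print(is_allowed("SELECT pass_id FROM relay_sessions"))  # True
print(is_allowed("DROP TABLE relay_sessions"))           # False
```

A production gate would also handle joins, subqueries, and quoting, or build the statement from validated form fields instead of parsing freeform text.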
Agosto-Arroyo, Emmanuel; Coshatt, Gina M.; Winokur, Thomas S.; Harada, Shuko; Park, Seung L.
2017-01-01
Background: The molecular diagnostics laboratory faces the challenge of improving test turnaround time (TAT). Low and consistent TATs are of great clinical and regulatory importance, especially for molecular virology tests. Laboratory information systems (LISs) contain all the data elements necessary to do accurate quality assurance (QA) reporting of TAT and other measures, but these reports are in most cases still performed manually: a time-consuming and error-prone task. The aim of this study was to develop a web-based real-time QA platform that would automate QA reporting in the molecular diagnostics laboratory at our institution, and minimize the time expended in preparing these reports. Methods: Using a standard Linux, Nginx, MariaDB, PHP stack virtual machine running atop a Dell Precision 5810, we designed and built a web-based QA platform, code-named Alchemy. Data files pulled periodically from the LIS in comma-separated value format were used to autogenerate QA reports for the human immunodeficiency virus (HIV) quantitation, hepatitis C virus (HCV) quantitation, and BK virus (BKV) quantitation. Alchemy allowed the user to select a specific timeframe to be analyzed and calculated key QA statistics in real-time, including the average TAT in days, tests falling outside the expected TAT ranges, and test result ranges. Results: Before implementing Alchemy, reporting QA for the HIV, HCV, and BKV quantitation assays took 45–60 min of personnel time per test every month. With Alchemy, that time has decreased to 15 min total per month. Alchemy allowed the user to select specific periods of time and analyzed the TAT data in-depth without the need of extensive manual calculations. Conclusions: Alchemy has significantly decreased the time and the human error associated with QA report generation in our molecular diagnostics laboratory. Other tests will be added to this web-based platform in future updates. 
This effort shows the utility of informatician-supervised resident/fellow programming projects as learning opportunities and workflow improvements in the molecular laboratory. PMID:28480121
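The core of such a TAT report is a small aggregation over the periodic CSV pull; a sketch, with column names that are assumptions for illustration rather than the actual LIS export schema:

```python
import csv
import io

# Toy LIS export with hypothetical column names.
export = """test,tat_days
HIV,2.0
HIV,5.0
HCV,3.0
"""

def tat_stats(csv_text, test, max_tat):
    """Return (average TAT in days, count exceeding max_tat) for one assay."""
    tats = [float(r["tat_days"])
            for r in csv.DictReader(io.StringIO(csv_text))
            if r["test"] == test]
    avg = sum(tats) / len(tats)
    late = sum(1 for t in tats if t > max_tat)
    return avg, late

print(tat_stats(export, "HIV", 4.0))  # (3.5, 1)
```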
P-TRAP: a Panicle TRAit Phenotyping tool.
A L-Tam, Faroq; Adam, Helene; Anjos, António dos; Lorieux, Mathias; Larmande, Pierre; Ghesquière, Alain; Jouannic, Stefan; Shahbazkia, Hamid Reza
2013-08-29
In crops, inflorescence complexity and the shape and size of the seed are among the most important characters that influence yield. For example, rice panicles vary considerably in the number and order of branches, elongation of the axis, and the shape and size of the seed. Manual low-throughput phenotyping methods are time consuming, and the results are unreliable. However, high-throughput image analysis of the qualitative and quantitative traits of rice panicles is essential for understanding the diversity of the panicle as well as for breeding programs. This paper presents P-TRAP software (Panicle TRAit Phenotyping), a free open source application for high-throughput measurements of panicle architecture and seed-related traits. The software is written in Java and can be used with different platforms (the user-friendly Graphical User Interface (GUI) uses Netbeans Platform 7.3). The application offers three main tools: a tool for the analysis of panicle structure, a spikelet/grain counting tool, and a tool for the analysis of seed shape. The three tools can be used independently or simultaneously for analysis of the same image. Results are then reported in the Extensible Markup Language (XML) and Comma Separated Values (CSV) file formats. Images of rice panicles were used to evaluate the efficiency and robustness of the software. Compared to data obtained by manual processing, P-TRAP produced reliable results in a much shorter time. In addition, manual processing is not repeatable because dry panicles are vulnerable to damage. The software is very useful, practical and collects much more data than human operators. P-TRAP is a new open source software that automatically recognizes the structure of a panicle and the seeds on the panicle in numeric images. The software processes and quantifies several traits related to panicle structure, detects and counts the grains, and measures their shape parameters. 
In short, P-TRAP offers both efficient results and a user-friendly environment for experiments. The experimental results showed very good accuracy compared to field-operator measurements, expert verification, and well-known academic methods.
NASA Astrophysics Data System (ADS)
Allen, B. J.; Mansell, E. R.; Betten, D.
2014-12-01
Open questions exist regarding chemical transport by convection and the sensitivity of Lightning Nitrogen Oxide (LNOx) production to flash type (IC vs. CG), channel height, and channel length. To help answer these and other questions, the Deep Convective Clouds and Chemistry (DC3) field project was conducted during the spring of 2012. On 29 May 2012, observations of an Oklahoma supercell were collected by two mobile SMART-R radars, the mobile NOXP radar, multiple NEXRAD radars, the Oklahoma Lightning Mapping Array (LMA), and the NSF/NCAR HIAPER and NASA DC-8 aircraft. In this study, data from the mobile and NEXRAD radars are assimilated into the NSSL COMMAS model using the Ensemble Kalman Filter, beginning shortly after initiation of convection and ending when the aircraft made their final measurements of the storm's outflow. The model analyses produce a realistic representation of the kinematic character of the storm throughout this time period. COMMAS includes the NSSL multimoment microphysics, explicit cloud electrification, and a branched lightning discharge scheme, which is used to produce LNOx within the model via a method dependent upon air pressure and lightning channel length. Model results will be presented and compared to radar, lightning, and aircraft observations. Of particular importance, the vertical distribution of lightning, channel length of lightning, and LNOx production and transport in the model will be analyzed and compared to LMA observations and anvil-level outflow observations from the aircraft. In addition, to examine entrainment and detrainment of air by the storm and to provide a check on LNOx production and transport, trajectory analyses will be presented and the transport of inert trace gases such as carbon monoxide in the model will be analyzed and compared to aircraft measurements.
Bourezane, Y; Bourezane, Y
Trichoscopy (hair dermoscopy) is a non-invasive and very useful technique for the diagnosis and follow-up of hair and scalp disorders. In tinea capitis, specific aspects of the hair shaft have been described, the main ones being comma hair, corkscrew hair, bar code-like hair (BCH), and zigzag hair (ZZH). Herein we report a retrospective study of 24 patients with tinea capitis (TC). All patients underwent trichoscopic examination and mycological culture. Trichoscopy was abnormal in all 24 patients, showing hair-shaft abnormalities. We observed three types of images depending on the nature and mechanism of infection, and we discuss the different trichoscopic aspects of the hair shaft (comma hair, corkscrew hair, bar code-like hair, zigzag hair, broken hair, and black dots) resulting from three mechanisms of penetration of the fungus into the hair shaft (endothrix, ectothrix, and ectothrix-endothrix). All patients had positive mycological cultures: 15 with trichophytic TC (8 with Trichophyton tonsurans, 5 with T. soudanense, and 2 with T. verrucosum) and 9 with microsporic TC (7 with Microsporum audouinii and 2 with M. canis). We propose, for the first time to our knowledge, a classification of trichoscopic signs of TC. This classification will enable rapid diagnosis and prediction of the nature of the fungus before mycological culture. Our study shows the importance of trichoscopy in the diagnosis and monitoring of TC, as well as its very good correlation with mycological culture. We propose a new classification of trichoscopic signs dependent on the nature of the mycological agent and the mechanism of infection. Further prospective studies with more patients are needed to confirm this classification. Copyright © 2017 Elsevier Masson SAS. All rights reserved.
Saskatchewan Forest Fire Control Centre Surface Meteorological Data
NASA Technical Reports Server (NTRS)
Hall, Forrest G. (Editor); Newcomer, Jeffrey A. (Editor); Funk, Barry; Strub, Richard
2000-01-01
The Saskatchewan Forest Fire Control Centre (SFFCC) provided surface meteorological data to BOREAS from its archive. This data set contains hourly surface meteorological data from 18 meteorological stations located across Saskatchewan. Included are the parameters date, time, temperature, relative humidity, wind direction, wind speed, and precipitation. Temporally, the data cover the period of May through September of 1994 and 1995. The data are provided in comma-delimited ASCII files and are classified as AFM-Staff data. The data files are available on a CD-ROM (see document number 20010000884) or from the Oak Ridge National Laboratory (ORNL) Distributed Active Archive Center (DAAC).
Hair shafts in trichoscopy: clues for diagnosis of hair and scalp diseases.
Rudnicka, Lidia; Rakowska, Adriana; Kerzeja, Marta; Olszewska, Małgorzata
2013-10-01
Trichoscopy (hair and scalp dermoscopy) analyzes the structure and size of growing hair shafts, providing diagnostic clues for inherited and acquired causes of hair loss. Types of hair shaft abnormalities observed include exclamation mark hairs (alopecia areata, trichotillomania, chemotherapy-induced alopecia), Pohl-Pinkus constrictions (alopecia areata, chemotherapy-induced alopecia, blood loss, malnutrition), comma hairs (tinea capitis), corkscrew hairs (tinea capitis), coiled hairs (trichotillomania), flame hairs (trichotillomania), and tulip hairs (in trichotillomania, alopecia areata). Trichoscopy allows differential diagnosis of most genetic hair shaft disorders. This article proposes a classification of hair shaft abnormalities observed by trichoscopy. Copyright © 2013. Published by Elsevier Inc.
Explosive Forming of Butt Welded Pipe Reducers.
1979-04-01
Report No. M1052, April 1979 (M. W. Johnson; unclassified). Naval Ordnance Station, Louisville, KY. Explosive Forming of Butt Welded Pipe Reducers: a project of the Manufacturing Technology Program, Naval Sea Systems Command. Final report.
Cargo Movement Operations System (CMOS). Requirements Traceability Matrix Increment II
1990-05-17
Comment disposition: Accept [ ] Reject [ ]. Comment status: Open [ ] Closed [ ]. Comments (page, paragraph number, comment): 1. C-i, SS0-3: Change "workstation" to "processor". 2. C-2, SS0009 and SS0016: Change "workstation" to "processor". 3. C-6, SS0032 and SS0035: Change "workstation" to "processor". 4. C-9, SS0063: Add comma after "e.g." 5. C-i, SS0082: Change "workstation" to "processor". 6. C-17, SS0131 and SS0132: Change "workstation" to "processor". 7. C-28, SS0242: Change "workstation"
Cargo Movement Operations Systems (CMOS). Revised Draft Software Test Plan
1990-05-17
Comment disposition: Accept [ ] Reject [ ]. Comment status: Open [ ] Closed [ ]. Comments (page, paragraph number, comment): 1. Page 1, para. 1: Delete the period following this and all other single-digit paragraph numbers in order to comply with the format used in the DID. 2. Page 9, para. 3.1.3: Replace "Z-248" with "PC Workstation" in the second line of the paragraph. 3. Page 10, para. 3.2.1: Change the "?" to a "" in the second entry of Table 3.2.1. 4. Page 10, para. 3.2.2: Put parentheses around the phrase bounded by commas in the second and third lines, i.e.,
Hydraulics Graphics Package. Users Manual
1985-11-01
ENTER: VARIABLE/SEPARATOR/VALUE OR STRING GLBL,TETON DAM FAILURE ENTER: VARIABLE/SEPARATOR/VALUE OR STRING SLOC,DISCHARGE HISTOGRAM ENTER: VARIABLE/SEPARATOR/VALUE OR STRING YLBL,FLOW IN 1000 CFS ENTER: VARIABLE/SEPARATOR/VALUE OR STRING GLBL,TETON DAM FAILURE ENTER: VARIABLE/SEPARATOR/VALUE OR STRING SECNO,0 ENTER: VARIABLE/SEPARATOR/VALUE OR STRING GO [resulting Teton Dam failure discharge-histogram plot omitted]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hillson, Nathan
j5 automates and optimizes the design of the molecular biological process of cloning/constructing DNA. j5 enables users to benefit from (combinatorial) multi-part scar-less SLIC, Gibson, CPEC, Golden Gate assembly, or variants thereof, for which automation software does not currently exist, without the intense labor currently associated with the process. j5 inputs a list of the DNA sequences to be assembled, along with a Genbank, FASTA, jbei-seq, or SBOL v1.1 format sequence file for each DNA source. Given the list of DNA sequences to be assembled, j5 first determines the cost-minimizing assembly strategy for each part (direct synthesis, PCR/SOE, or oligo-embedding), designs DNA oligos with Primer3, adds flanking homology sequences (SLIC, Gibson, and CPEC; optimized with Primer3 for CPEC) or optimized overhang sequences (Golden Gate) to the oligos and direct synthesis pieces, and utilizes BLAST to check against oligo mis-priming and assembly piece incompatibility events. After identifying DNA oligos that are already contained within a local collection for reuse, the program estimates the total cost of direct synthesis and new oligos to be ordered. If j5 identifies putative assembly piece incompatibilities (multiple pieces with high flanking sequence homology), the program suggests hierarchical subassemblies where possible. The program outputs a comma-separated value (CSV) file, viewable in Excel or other spreadsheet software, that contains assembly design information (such as the PCR/SOE reactions to perform, their anticipated sizes and sequences, etc.) as well as a properly annotated Genbank file containing the sequence resulting from the assembly, and it appends the oligos to be ordered to the local oligo library. j5 condenses multiple independent assembly projects into 96-well format for high-throughput liquid-handling robotics platforms, and generates configuration files for the PR-PR biology-friendly robot programming language.
j5 thus provides a new way to design DNA assembly procedures much more productively and efficiently, not only in terms of time but also in terms of cost. To a large extent, however, j5 does not allow people to do something that could not be done before by hand given enough time and effort. An exception is that, since the very act of using j5 to design the DNA assembly process standardizes the experimental details and workflow, j5 enables a single person to concurrently perform the independent DNA construction tasks of an entire group of researchers. Currently, this is not readily possible, since separate researchers employ disparate design strategies and workflows, and furthermore their designs and workflows are very infrequently fully captured in an electronic format conducive to automation.
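Because the design output is plain CSV, downstream cost or inventory checks reduce to simple spreadsheet-style processing; a sketch with invented column names, not j5's actual output schema:

```python
import csv
import io

# Toy assembly-design CSV; the columns are hypothetical for this example.
design = """part,strategy,new_oligos,oligo_cost_usd
insert1,PCR,2,18.50
vector,PCR,0,0.00
insert2,synthesis,0,0.00
"""

def new_oligo_cost(csv_text):
    """Total estimated cost of newly ordered oligos across the design."""
    return sum(float(r["oligo_cost_usd"])
               for r in csv.DictReader(io.StringIO(csv_text)))

print(new_oligo_cost(design))  # 18.5
```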
On the reliability of hook echoes as tornado indicators
NASA Technical Reports Server (NTRS)
Forbes, G. S.
1981-01-01
A study of radar echoes associated with the tornadoes of the 3 April 1974 outbreak was performed to evaluate the usefulness of echo shape as an indicator of tornadic thunderstorms. The hook shape was usually successful in characterizing an echo as tornadic, with a false alarm rate of 16%. Because hook echoes were relatively rare, however, a less restrictive shape called distinctive was more successful at detecting tornadic thunderstorms, identifying 65% of the tornadic echoes. An echo had a distinctive shape if it possessed a marked appendage on its right rear flank or was in the shape of a spiral, comma or line echo wave pattern (LEWP). Characteristics of the distinctive echo are given.
Hydroacoustic Evaluation of Fish Passage Through Bonneville Dam in 2005
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ploskey, Gene R.; Weiland, Mark A.; Zimmerman, Shon A.
2006-12-04
The Portland District of the U.S. Army Corps of Engineers requested that the Pacific Northwest National Laboratory (PNNL) conduct fish-passage studies at Bonneville Dam in 2005. These studies support the Portland District's goal of maximizing fish-passage efficiency (FPE) and obtaining 95% survival for juvenile salmon passing Bonneville Dam. Major passage routes include 10 turbines and a sluiceway at Powerhouse 1 (B1), an 18-bay spillway, and eight turbines and a sluiceway at Powerhouse 2 (B2). In this report, we present results of two studies related to juvenile salmonid passage at Bonneville Dam. The studies were conducted between April 16 and July 15, 2005, encompassing most of the spring and summer migrations. Studies included evaluations of (1) Project fish passage efficiency and other major passage metrics, and (2) smolt approach and fate at B1 Sluiceway Outlet 3C from the B1 forebay. Some of the large appendices are presented only on the compact disk (CD) that accompanies the final report. Examples include six large comma-separated-value (.CSV) files of hourly fish passage, hourly variances, and Project operations for spring and summer from Appendix E, and large Audio Video Interleave (AVI) files with DIDSON movie clips of the area upstream of B1 Sluiceway Outlet 3C (Appendix H). Those video clips show smolts approaching the outlet, predators feeding on smolts, and vortices that sometimes entrained approaching smolts into turbines. The CD also includes Adobe Acrobat Portable Document Format (PDF) files of the entire report and appendices.
Klimasauskas, Edward P.; Miller, Marti L.; Bradley, Dwight C.
2007-01-01
Introduction The Kuskokwim mineral belt of Bundtzen and Miller (1997) forms an important metallogenic region in southwestern Alaska that has yielded more than 3.22 million ounces of gold and 400,000 ounces of silver. Precious-metal and related deposits in this region associated with Late Cretaceous to early Tertiary igneous complexes extend into the Taylor Mountains 1:250,000-scale quadrangle. The U.S. Geological Survey is in the process of conducting a mineral resource assessment of this region. This report presents analytical data collected during the third year of this multiyear study. A total of 138 rock geochemistry samples collected during the 2006 field season were analyzed using the ICP-AES/MS42, ICP-AES10, fire assay, and cold vapor atomic absorption methods described in more detail below. Analytical values are provided in percent (% or pct: 1 gram per 100 grams), parts per million (ppm: 1 gram per 1,000,000 grams), or parts per billion (ppb: 1 gram per 1,000,000,000 grams) as indicated in the column heading of the data table. Data are provided for download in Excel (*.xls), comma delimited (*.csv), dBase 4 (*.dbf) and as a point coverage in ArcInfo interchange (*.e00) formats available at http://pubs.usgs.gov/of/2007/1386/.
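The unit conventions defined above (percent, ppm, ppb) imply fixed conversion factors, which a quick sketch makes explicit: 1% = 10,000 ppm and 1 ppm = 1,000 ppb.

```python
# Concentration-unit conversions matching the definitions in the report:
# percent = g per 100 g, ppm = g per 1e6 g, ppb = g per 1e9 g.
def pct_to_ppm(pct):
    return pct * 10_000.0

def ppm_to_ppb(ppm):
    return ppm * 1_000.0

print(pct_to_ppm(0.5))   # 5000.0
print(ppm_to_ppb(2.5))   # 2500.0
```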
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perl, J; Villagomez-Bernabe, B; Currell, F
2015-06-15
Purpose: The stochastic nature of the subatomic world presents a challenge for physics education. Even experienced physicists can be amazed at the varied behavior of the electrons, x-rays, protons, neutrons, ions, and the many short-lived particles that make up the overall behavior of our accelerators, brachytherapy sources, and medical imaging systems. The all-particle Monte Carlo particle transport tool TOPAS (Tool for Particle Simulation), originally developed for proton therapy research, has been repurposed into a physics teaching tool, TOPAS-edu. Methods: TOPAS-edu students set up simulated particle sources, collimators, scatterers, imagers, and scoring setups by writing simple ASCII files (in the TOPAS Parameter Control System format). Students visualize geometry setups and particle trajectories in a variety of modes, from OpenGL graphics to VRML 3D viewers to GIF and PostScript image files. Results written to simple comma-separated values files are imported by the student into their preferred data analysis tool. Students can vary random seeds or adjust parameters of physics processes to better understand the stochastic nature of subatomic physics. Results: TOPAS-edu has been successfully deployed as the centerpiece of a physics course for master's students at Queen's University Belfast. Tutorials developed there take students through a step-by-step course on the basics of particle transport and interaction, scattering, Bremsstrahlung, etc. At each step in the course, students build simulated experimental setups and then analyze the simulated results. Lessons build one upon another so that a student might end up with a full simulation of a medical accelerator, a water phantom, or an imager. Conclusion: TOPAS-edu was well received by students. A second application of TOPAS-edu is currently in development at Zurich University of Applied Sciences, Switzerland.
It is our eventual goal to make TOPAS-edu available free of charge to any non-profit organization, along with associated tutorial materials developed by the TOPAS-edu community. Work supported in part by the U.S. Department of Energy under contract number DE-AC02-76SF00515. B. Villagomez-Bernabe is supported by CONACyT (Mexican Council for Science and Technology) project 231844.
Scanning electron microscope automatic defect classification of process induced defects
NASA Astrophysics Data System (ADS)
Wolfe, Scott; McGarvey, Steve
2017-03-01
With the integration of high-speed Scanning Electron Microscope (SEM) based Automated Defect Redetection (ADR) in both high volume semiconductor manufacturing and Research and Development (R and D), the need for reliable SEM Automated Defect Classification (ADC) has grown tremendously in the past few years. In many high volume manufacturing facilities and R and D operations, defect inspection is performed on EBeam (EB), Bright Field (BF) or Dark Field (DF) defect inspection equipment. A comma separated value (CSV) file is created by both the patterned and non-patterned defect inspection tools. The defect inspection result file contains a list of the inspection anomalies detected during the inspection tool's examination of each structure, or the examination of an entire wafer's surface for non-patterned applications. This file is imported into the Defect Review Scanning Electron Microscope (DRSEM). Following the defect inspection result file import, the DRSEM automatically moves the wafer to each defect coordinate and performs ADR. During ADR the DRSEM operates in a reference mode, capturing a SEM image at the exact position of each anomaly's coordinates and capturing a SEM image of a reference location in the center of the wafer. A defect reference image is created by subtracting the defect image from the reference image. The exact coordinates of the defect are calculated from the computed defect position and the anomaly's stage coordinate recorded when the high-magnification SEM defect image is captured. The captured SEM image is processed through either DRSEM ADC binning, exporting to a Yield Analysis System (YAS), or a combination of both. Process Engineers, Yield Analysis Engineers or Failure Analysis Engineers will manually review the captured images to ensure that either the YAS defect binning or the DRSEM defect binning is accurately classifying the defects.
This paper explores the feasibility of using a Hitachi RS4000 Defect Review SEM to perform Automatic Defect Classification, with the objective that total automated classification accuracy exceed that of human-based defect classification binning when the defects do not require multiple-process-step knowledge for accurate classification. The implementation of DRSEM ADC has the potential to improve the response time between defect detection and defect classification. Faster defect classification will allow rapid response to yield anomalies and ultimately reduce wafer and/or die yield loss.
Quantification of dsDNA using the Hitachi F-7000 Fluorescence Spectrophotometer and PicoGreen dye.
Moreno, Luis A; Cox, Kendra L
2010-11-05
Quantification of DNA, especially in small concentrations, is an important task with a wide range of biological applications including standard molecular biology assays such as synthesis and purification of DNA, diagnostic applications such as quantification of DNA amplification products, and detection of DNA molecules in drug preparations. During this video we will demonstrate the capability of the Hitachi F-7000 Fluorescence Spectrophotometer equipped with a Micro Plate Reader accessory to perform dsDNA quantification using Molecular Probes Quant-it PicoGreen dye reagent kit. The F-7000 Fluorescence Spectrophotometer offers high sensitivity and high speed measurements. It is a highly flexible system capable of measuring fluorescence, luminescence, and phosphorescence. Several measuring modes are available, including wavelength scan, time scan, photometry and 3-D scan measurement. The spectrophotometer has sensitivity in the range of 50 picomoles of fluorescein when using a 300 μL sample volume in the microplate, and is capable of measuring scan speeds of 60,000 nm/minute. It also has a wide dynamic range of up to 5 orders of magnitude which allows for the use of calibration curves over a wide range of concentrations. The optical system uses all reflective optics for maximum energy and sensitivity. The standard wavelength range is 200 to 750 nm, and can be extended to 900 nm when using one of the optional near infrared photomultipliers. The system allows optional temperature control for the plate reader from 5 to 60 degrees Celsius using an optional external temperature controlled liquid circulator. The microplate reader allows for the use of 96 well microplates, and the measuring speed for 96 wells is less than 60 seconds when using the kinetics mode. Software controls for the F-7000 and Microplate Reader are also highly flexible. Samples may be set in either column or row formats, and any combination of wells may be chosen for sample measurements. 
This allows for optimal utilization of the microplate. Additionally, the software allows importing microplate sample configurations created in Excel and saved in comma separated values, or "csv", format. Microplate measuring configurations can be saved and recalled by the software for convenience and increased productivity. Data results can be output to a standard report, to Excel, or to an optional Report Generator Program.
Quantification of dsDNA using the Hitachi F-7000 Fluorescence Spectrophotometer and PicoGreen Dye
Moreno, Luis A.; Cox, Kendra L.
2010-01-01
Quantification of DNA, especially in small concentrations, is an important task with a wide range of biological applications including standard molecular biology assays such as synthesis and purification of DNA, diagnostic applications such as quantification of DNA amplification products, and detection of DNA molecules in drug preparations. During this video we will demonstrate the capability of the Hitachi F-7000 Fluorescence Spectrophotometer equipped with a Micro Plate Reader accessory to perform dsDNA quantification using Molecular Probes Quant-it PicoGreen dye reagent kit. The F-7000 Fluorescence Spectrophotometer offers high sensitivity and high speed measurements. It is a highly flexible system capable of measuring fluorescence, luminescence, and phosphorescence. Several measuring modes are available, including wavelength scan, time scan, photometry and 3-D scan measurement. The spectrophotometer has sensitivity in the range of 50 picomoles of fluorescein when using a 300 μL sample volume in the microplate, and is capable of measuring scan speeds of 60,000 nm/minute. It also has a wide dynamic range of up to 5 orders of magnitude which allows for the use of calibration curves over a wide range of concentrations. The optical system uses all reflective optics for maximum energy and sensitivity. The standard wavelength range is 200 to 750 nm, and can be extended to 900 nm when using one of the optional near infrared photomultipliers. The system allows optional temperature control for the plate reader from 5 to 60 degrees Celsius using an optional external temperature controlled liquid circulator. The microplate reader allows for the use of 96 well microplates, and the measuring speed for 96 wells is less than 60 seconds when using the kinetics mode. Software controls for the F-7000 and Microplate Reader are also highly flexible. Samples may be set in either column or row formats, and any combination of wells may be chosen for sample measurements. 
This allows for optimal utilization of the microplate. Additionally, the software allows importing microplate sample configurations created in Excel and saved in comma separated values, or "csv", format. Microplate measuring configurations can be saved and recalled by the software for convenience and increased productivity. Data results can be output to a standard report, to Excel, or to an optional Report Generator Program. PMID:21189464
A Java-based tool for creating KML files from GPS waypoints
NASA Astrophysics Data System (ADS)
Kinnicutt, P. G.; Rivard, C.; Rimer, S.
2008-12-01
Google Earth provides a free tool with powerful capabilities for visualizing geoscience images and data. Commercial software tools exist for sophisticated digitizing and spatial modeling, but for the purposes of presentation, visualization and overlaying aerial images with data, Google Earth provides much of the needed functionality. Likewise, with current GPS (Global Positioning System) technologies and with Google Earth Plus, it is possible to upload GPS waypoints, tracks and routes directly into Google Earth for visualization. However, older-technology GPS units and even low-cost GPS units found today may lack the necessary communications interface to a computer (e.g. no Bluetooth, no WiFi, no USB, no Serial, etc.) or may have an incompatible interface, such as a Serial port with no USB adapter available. In such cases, any waypoints, tracks and routes saved in the GPS unit or recorded in a field notebook must be manually transferred to a computer for use in a GIS system or other program. This presentation describes a Java-based tool developed by the author that enables users to enter GPS coordinates in a user-friendly manner, then save these coordinates in the Keyhole Markup Language (KML) file format for visualization in Google Earth. The tool either accepts user-interactive input or reads input from a CSV (Comma Separated Value) file, which can be generated from any spreadsheet program. It accepts input in the form of lat/long or UTM (Universal Transverse Mercator) coordinates. The presentation illustrates the system's applicability through several small case studies. This free and lightweight tool simplifies the task of manually inputting GPS data into Google Earth for people working in the field without an automated mechanism for uploading the data; for instance, the user may not have internet connectivity or may not have the proper hardware or software.
Since it is a Java application and not a web-based tool, it can be installed on one's field laptop and the GPS data can be manually entered without the need for internet connectivity. The tool provides a table view of the GPS data but does not include a KML viewer for viewing the data overlain on an aerial view, as that functionality is provided by Google Earth. The tool's primary contribution is a more convenient method for entering GPS data manually when automated technologies are not available.
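The conversion such a tool performs can be sketched in a few lines. The following is a minimal Python sketch (the tool described above is written in Java, and the CSV column names `name`, `lat`, `lon` here are assumptions, not the tool's actual schema) that turns a CSV of lat/long waypoints into a minimal KML Placemark list.

```python
import csv
import io

def csv_to_kml(csv_text):
    """Convert CSV rows of name,lat,lon into a minimal KML document.

    The column names are hypothetical; a real tool would document
    its expected CSV layout.
    """
    placemarks = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        # KML coordinates are written longitude,latitude (note the order).
        placemarks.append(
            "  <Placemark><name>{name}</name>"
            "<Point><coordinates>{lon},{lat}</coordinates></Point>"
            "</Placemark>".format(**row)
        )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>\n'
        + "\n".join(placemarks)
        + "\n</Document></kml>"
    )

waypoints = "name,lat,lon\nCamp,44.21,-71.30\nSummit,44.27,-71.30\n"
print(csv_to_kml(waypoints))
```

The longitude-before-latitude ordering inside `<coordinates>` is the detail most often gotten wrong when writing KML by hand.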
The file Figure.tar.gz contains a directory for each WRF ensemble run. In these directories are *.csv files for each meteorology variable examined. These are comma-delimited text files that contain statistics for each observation site. Also provided is an R script that reads these files (the user would need to change directory pointers), computes the variability of error and bias of the ensemble at each site, and plots these for reproduction of figure 3. This dataset is associated with the following publication: Gilliam, R., C. Hogrefe, J. Godowitch, S. Napelenok, R. Mathur, and S.T. Rao. Impact of inherent meteorology uncertainty on air quality model predictions. Journal of Geophysical Research-Atmospheres. American Geophysical Union, Washington, DC, USA, 120(23): 12,259–12,280, (2015).
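The per-site bias and error-spread computation performed by the R script can be mirrored in a short sketch. This stdlib-only Python version uses made-up column names (`site`, `model`, `obs`), since the actual CSV layout of the dataset is not documented here; it groups model-minus-observation errors by site and summarizes them.

```python
import csv
import io
import statistics

def site_bias_and_spread(csv_text):
    """Group (model - observation) errors by site: (mean bias, spread).

    Column names 'site', 'model', 'obs' are assumptions; the real
    files carry per-site statistics in their own layout.
    """
    errors = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        err = float(row["model"]) - float(row["obs"])
        errors.setdefault(row["site"], []).append(err)
    # Bias is the mean error; spread is the population std. deviation.
    return {
        site: (statistics.mean(errs), statistics.pstdev(errs))
        for site, errs in errors.items()
    }

sample = "site,model,obs\nKATL,290.1,289.0\nKATL,291.0,289.5\nKORD,280.0,281.0\n"
print(site_bias_and_spread(sample))
```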
Sombatmaithai, Alita; Pattanaprichakul, Penvadee; Tuchinda, Papapit; Surawan, Theetat; Muanprasart, Chanai; Matthapan, Lalita; Bunyaratavej, Sumanas
2015-04-01
Tinea capitis is unusual and often misdiagnosed in healthy adults. We report a case of a healthy woman with a several-year history of asymptomatic, bizarre-shaped, non-scarring alopecia. She had used over-the-counter ketoconazole shampoo regularly for a long time. An initial potassium hydroxide preparation was negative for fungal organisms. The scalp biopsy revealed endothrix infection, and dermoscopic examination demonstrated the comma hair and corkscrew hair signs. The fungal culture grew Trichophyton tonsurans. Daily use of antifungal shampoo may have been an important factor concealing the clinical and laboratory findings needed to diagnose T. tonsurans tinea capitis in our case, which required a high index of clinical suspicion together with histopathological and dermoscopic examination.
Sombatmaithai, Alita; Pattanaprichakul, Penvadee; Tuchinda, Papapit; Surawan, Theetat; Muanprasart, Chanai; Matthapan, Lalita; Bunyaratavej, Sumanas
2015-01-01
Tinea capitis is unusual and often misdiagnosed in healthy adults. We report a case of a healthy woman with a several-year history of asymptomatic, bizarre-shaped, non-scarring alopecia. She had used over-the-counter ketoconazole shampoo regularly for a long time. An initial potassium hydroxide preparation was negative for fungal organisms. The scalp biopsy revealed endothrix infection, and dermoscopic examination demonstrated the comma hair and corkscrew hair signs. The fungal culture grew Trichophyton tonsurans. Daily use of antifungal shampoo may have been an important factor concealing the clinical and laboratory findings needed to diagnose T. tonsurans tinea capitis in our case, which required a high index of clinical suspicion together with histopathological and dermoscopic examination. PMID:26114071
Separation of solids by varying the bulk density of a fluid separating medium
Peterson, Palmer L.; Duffy, James B.; Tokarz, Richard D.
1978-01-01
A method and apparatus for separating objects having a density greater than a selected density value from objects having a density less than said selected density value. The method typically comprises: (a) providing a separation vessel having an upper and lower portion, said vessel containing a liquid having a density exceeding said selected density value; (b) reducing the apparent density of the liquid to said selected density value by introducing solid, bubble-like bodies having a density less than that of the liquid into the lower portion of the vessel and permitting them to rise therethrough; (c) introducing the objects to be separated into the separation vessel and permitting the objects having a density greater than the apparent density of the liquid to sink to the lower portion of the vessel, while the objects having a density less than said selected density value float in the upper portion of the vessel; and (d) separately removing the higher density objects in the lower portion and the lower density objects in the upper portion from the separation vessel. 
The apparatus typically comprises: (a) a vessel containing a liquid having a density such that at least part of said objects having a density exceeding said selected density value will float therein; (b) means to place said objects into said vessel; (c) means to reduce the effective density of at least a portion of said liquid to said selected density value, whereby said objects having a density exceeding said selected density value sink into said liquid and said objects having a density less than said selected density value remain afloat, said means to adjust the effective density comprising solid, bubble-like bodies having a density less than said selected density value and means for introducing said bodies into said liquid; and (d) means for separately removing said objects having a density exceeding said selected density value and said objects having a density less than said selected density value from said vessel.
Spreadsheet Toolkit for Ulysses Hi-Scale Measurements of Interplanetary Ions and Electrons
NASA Astrophysics Data System (ADS)
Reza, J. Z.; Lanzerotti, L. J.; Denker, C.; Patterson, D.; Amstrong, T. P.
2004-05-01
Throughout the entire Ulysses out-of-the-ecliptic solar polar mission, the Heliosphere Instrument for Spectra, Composition, and Anisotropy at Low Energies (HI-SCALE) has collected measurements of interplanetary ions and electrons. Time-series of electron and ion fluxes obtained since 1990 have been carefully calibrated and will be stored in a data management system, which will be publicly accessible via the WWW. The goal of the Virtual Solar Observatory (VSO) is to provide data uniformly and efficiently to a diverse user community. However, data dissemination can only be a first step, which has to be followed by a suite of data analysis tools that are tailored towards a diverse user community in science, technology, and education. The widespread use and familiarity of spreadsheets, which are available at low cost or open source for many operating systems, make them an interesting tool to investigate for the analysis of HI-SCALE data. The data are written in comma-separated values (CSV) format, which is commonly used in spreadsheet programs. CSV files can simply be linked as external data to spreadsheet templates, which in turn can be used to generate tables and figures of basic statistical properties and frequency distributions, temporal evolution of electron and ion spectra, comparisons of various energy channels, automatic detection of solar events, solar cycle variations, and space weather. Exploring spreadsheet-assisted data analysis in the context of information technology research, database information search and retrieval, and data visualization potentially impacts other VSO components, where diverse user communities are targeted. Finally, this presentation is the result of an undergraduate research project, which will allow us to evaluate the performance of user-based spreadsheet analysis "benchmarked" at the undergraduate skill level.
"AFacet": a geometry based format and visualizer to support SAR and multisensor signature generation
NASA Astrophysics Data System (ADS)
Rosencrantz, Stephen; Nehrbass, John; Zelnio, Ed; Sudkamp, Beth
2018-04-01
When simulating multisensor signature data (including SAR, LIDAR, EO, IR, etc.), geometry data are required that accurately represent the target. Most vehicular targets can, in real life, exist in many possible configurations. Examples of these configurations might include a rotated turret, an open door, a missing roof rack, or a seat made of metal or wood. Previously we have used the Modelman (.mmp) format and tool to represent and manipulate our articulable models. Unfortunately, Modelman is now an unsupported tool and an undocumented binary format. Some work has been done to reverse engineer a reader in Matlab so that the format could continue to be useful. This work was tedious and resulted in an incomplete conversion. In addition, the resulting articulable models could not be altered and re-saved in the Modelman format. The AFacet (.afacet) articulable facet file format is a replacement for the binary Modelman (.mmp) file format. There is a one-time, straightforward path for conversion from Modelman to the AFacet format. It is a simple ASCII, comma-separated, self-documenting format that is easily readable (and in many cases usefully editable) by a human with any text editor, preventing future obsolescence. In addition, because the format is simple, it is relatively easy for even the most novice programmer to create a program to read and write AFacet files in any language without any special libraries. This paper presents the AFacet format, as well as a suite of tools for creating, articulating, manipulating, viewing, and converting the 370+ models (at the time this paper was written) that have been converted to the AFacet format.
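As an illustration of how little code such a reader needs, the sketch below parses a hypothetical comma-separated facet file. The actual AFacet field layout is not given in the abstract, so the nine-floats-per-row (three vertices per triangular facet) convention and the `#` comment lines here are assumptions for illustration only.

```python
import csv
import io

def read_facets(text):
    """Parse comma-separated facet rows into vertex triples.

    Lines starting with '#' are treated as self-documenting comments;
    each remaining row is assumed to hold nine floats: three (x, y, z)
    vertices of one triangular facet. This layout is hypothetical.
    """
    facets = []
    for row in csv.reader(io.StringIO(text)):
        if not row or row[0].lstrip().startswith("#"):
            continue  # skip blank lines and comment lines
        nums = [float(v) for v in row]
        facets.append([tuple(nums[i:i + 3]) for i in range(0, 9, 3)])
    return facets

sample = (
    "# one facet: x1,y1,z1,x2,y2,z2,x3,y3,z3\n"
    "0,0,0,1,0,0,0,1,0\n"
)
print(read_facets(sample))
```

The point is the one the paper makes: a plain-text, comma-separated, self-documenting format can be read with nothing but a language's standard library.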
Surgical anatomy of round window and its implications for cochlear implantation.
Singla, A; Sahni, D; Gupta, A K; Loukas, M; Aggarwal, A
2014-04-01
The objective of this work was to study the morphometry and morphology of the round window (RW) and its relationships with the internal carotid artery, jugular bulb (JB), facial nerve and oval window (OW). Fifty cadaveric temporal bones were microdissected to expose the medial wall of the middle ear. The areas around the RW were cleared and its shape, height and width were noted. Its distances from the carotid canal (CC), jugular fossa (JF), facial canal (FC), and OW were measured. Oval, round, triangular, comma, quadrangular, and pear shapes of RW were observed. The average height and width of the RW were 1.62 ± 0.77 mm and 1.15 ± 0.39 mm, respectively. There was a statistically significant correlation (r = 0.4, P < 0.01) between the height and width. The distances between the RW and the CC, JF, FC, and OW were in the ranges 4.39-11.05 mm, 0.38-8.65 mm, 2.99-6.3 mm, and 1.39-3.57 mm, respectively. In 8% of cases, the distance between the RW and the JF was <1 mm. There were no statistically significant differences with regard to age group, gender, or side. Electrode insertion can be challenging in cases where the height and width of the RW are <1 mm. The thin bone separating the roof of the JF from the RW (<1 mm in 8%) highlights a potential risk of injury to the JB during cochleostomy placement. This information could be useful for selecting cochlear implant electrodes in order to avoid potential risks to vital neurovascular structures during implant surgery. Copyright © 2013 Wiley Periodicals, Inc.
Hiroyoshi, S; Mitsuhashi, J
1999-02-01
The relationship between sperm quantity in the duplex and that in the vasa deferentia was examined in the Asian comma butterfly, Polygonia c-aureum. In virgin males, the number of eupyrene sperm bundles in the duplex increased linearly with age, whereas that in the vasa deferentia was consistently small. However, numerous sperm were found in the vasa deferentia of males immediately after mating. The number of eupyrene sperm bundles in the vasa deferentia after mating increased significantly with age and with the time interval between matings. From these and other results, it was suggested that some sperm in the duplex are moved back to the vasa deferentia during mating, and that such sperm reflux provides a means of saving sperm for multiple matings.
Procedures for Separations within Batches of Values, 1. The Orderly Tool Kit and Some Heuristics
1989-03-01
Separations within batches of values, I. The orderly tool kit and some heuristics, by Thu Hoang (Universite Rene Descartes) and John W. Tukey
SEPARATION OF SCANDIUM VALUES FROM IRON VALUES BY SOLVENT EXTRACTION
Kuhlman, C.W. Jr.; Lang, G.P.
1961-12-19
A process is given for separating scandium from trivalent iron values. In this process, an aqueous nitric acid solution is contacted with a water- immiscible alkyl phosphate solution, the aqueous solution containing the values to be separated, whereby the scandium is taken up by the alkyl phosphate. The aqueous solution is preferably saturated with magnesium nitrate to retain the iron in the aqueous solution. (AEC)
Extratropical Cyclone over the United Kingdom
2014-02-14
Soggy winters are not unusual in the United Kingdom, but this winter has been in a category of its own. UK Met Office meteorologists had just declared January 2014 the wettest month on record for parts of southern Britain when another series of storms swept across the area in early February. The Moderate Resolution Imaging Spectroradiometer (MODIS) on NASA’s Terra satellite captured this image of an extratropical cyclone bearing down on the United Kingdom on February 12, 2014. Mature extratropical cyclones often feature comma-shaped cloud patterns that are the product of “conveyor belt” circulation. While heavy precipitation is often present near the low-pressure head of the comma, a slot of dry air usually trails the west side of the tail. The storm brought the United Kingdom yet another round of heavy rain, as well as winds that exceeded 160 kilometers (100 miles) per hour. It snarled traffic, disrupted train service, and caused power outages for more than 700,000 people. The storm also exacerbated severe flooding in southern England. More than 5,800 homes have flooded since early December, according to media reports. Authorities have deployed thousands of soldiers to towns and cities in southern England to help with flood recovery. Meanwhile, the Met Office was forecasting more of the same. They warned that another system bearing heavy rain and winds was lining up to push into the United Kingdom from the southeast on Friday morning. NASA Earth Observatory image by Jesse Allen, using data from the Land Atmosphere Near real-time Capability for EOS (LANCE). Caption by Adam Voiland. More info: earthobservatory.nasa.gov/NaturalHazards/view.php?id=83127 Instrument: Terra - MODIS Credit: NASA Earth Observatory NASA image use policy. NASA Goddard Space Flight Center enables NASA’s mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics.
Goddard plays a leading role in NASA’s accomplishments by contributing compelling scientific knowledge to advance the Agency’s mission.
URANIUM DECONTAMINATION WITH RESPECT TO ZIRCONIUM
Vogler, S.; Beederman, M.
1961-05-01
A process is given for separating uranium values from a nitric acid aqueous solution containing uranyl values, zirconium values and tetravalent plutonium values. The process comprises contacting said solution with a substantially water-immiscible liquid organic solvent containing alkyl phosphate, separating an organic extract phase containing the uranium, zirconium, and tetravalent plutonium values from an aqueous raffinate, contacting said organic extract phase with an aqueous solution 2M to 7M in nitric acid and also containing an oxalate ion-containing substance, and separating a uranium- containing organic raffinate from aqueous zirconium- and plutonium-containing extract phase.
Tsukamoto, Takafumi; Yasunaga, Takuo
2014-11-01
Eos (Extensible object-oriented system) is a powerful application for image processing of electron micrographs. Ordinarily, Eos is operated through character user interfaces (CUI) under operating systems (OS) such as OS X or Linux, which is not user-friendly. Users of Eos therefore need to be expert at image processing of electron micrographs and to have some knowledge of computer science as well. However, not everyone who needs Eos is comfortable with a CUI. We therefore extended Eos into an OS-independent web system with graphical user interfaces (GUI) by integrating it with a web browser. The advantage of using a web browser is not only that it gives Eos a GUI, but also that it lets Eos work in a distributed computational environment. Using Ajax (Asynchronous JavaScript and XML) technology, we implemented a more comfortable user interface in the web browser. Eos has more than 400 commands related to image processing for electron microscopy, and the usage of each command differs from the others. Since the beginning of development, Eos has managed its user interfaces through interface definition files called "OptionControlFile", written in CSV (Comma-Separated Value) format; each command has an "OptionControlFile" that holds the information needed to generate its interface and usage. The developed GUI system, called "Zephyr" (Zone for Easy Processing of HYpermedia Resources), also reads the "OptionControlFile" and produces a web user interface automatically, because this mechanism is mature and convenient. The basic actions of the client-side system have been implemented and support auto-generation of web forms, with functions for execution, image preview, and file upload to a web server. Thus the system can execute Eos commands with the options unique to each command and carry out image analysis.
Two problems remain: image file formats for visualization, and a workspace for analysis. Image file format information is useful for checking whether an input/output file is correct, and a common workspace for analysis is needed because the client is physically separated from the server. We solved the file format problem by extending the rules of the Eos OptionControlFile. To solve the workspace problem, we have developed two types of system. The first uses only the local environment: the user runs a web server provided by Eos, accesses a web client through a web browser, and manipulates local files through the GUI in the browser. The second employs PIONE (Process-rule for Input/Output Negotiation Environment), a platform we are developing that works in a heterogeneous distributed environment. Users can put their resources, such as microscopic images and text files, into the server-side environment supported by PIONE, and experts can write a PIONE rule definition, which defines a workflow of image processing. PIONE then runs each image-processing step on a suitable computer, following the defined rules. PIONE supports interactive manipulation, so users can try a command with various setting values. In this situation, we contribute the auto-generation of a GUI for a PIONE workflow. As an advanced function, we have developed a module to log user actions. The logs include information such as the parameter settings used in image processing, the sequence of commands, and so on. Used effectively, these logs offer many advantages. For example, when an expert discovers some image-processing know-how, other users can share the logs containing it, and by analyzing logs we may derive recommended workflows for image analysis. To implement a social platform for image processing for electron microscopists, we have also developed the system infrastructure. © The Author 2014.
Published by Oxford University Press on behalf of The Japanese Society of Microscopy. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
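The auto-generation idea above — deriving a GUI form from a CSV interface-definition table — can be sketched briefly. The real OptionControlFile schema is not given in the abstract, so the columns `flag`, `type`, and `description` and the widget mapping below are assumptions made for illustration.

```python
import csv
import io

def build_form_spec(option_csv):
    """Turn a CSV option-definition table into a simple form spec.

    The columns 'flag', 'type', 'description' are hypothetical;
    the actual OptionControlFile layout is defined by Eos itself.
    """
    fields = []
    for row in csv.DictReader(io.StringIO(option_csv)):
        fields.append({
            "name": row["flag"],
            # Map the declared value type onto an HTML-ish widget kind;
            # anything unrecognized falls back to a plain text input.
            "widget": {"int": "number", "float": "number",
                       "file": "file"}.get(row["type"], "text"),
            "label": row["description"],
        })
    return fields

sample = (
    "flag,type,description\n"
    "-i,file,Input image\n"
    "-bin,int,Binning factor\n"
)
print(build_form_spec(sample))
```

A client like the one described could render each field dict as one form widget, which is what makes a single CSV definition reusable for both CUI help text and web-form generation.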
Separation by solvent extraction
Holt, Jr., Charles H.
1976-04-06
17. A process for separating fission product values from uranium and plutonium values contained in an aqueous solution, comprising adding an oxidizing agent to said solution to secure uranium and plutonium in their hexavalent state; contacting said aqueous solution with a substantially water-immiscible organic solvent while agitating and maintaining the temperature at from -1.degree. to -2.degree. C. until the major part of the water present is frozen; continuously separating a solid ice phase as it is formed; separating a remaining aqueous liquid phase containing fission product values and a solvent phase containing plutonium and uranium values from each other; melting at least the last obtained part of said ice phase and adding it to said separated liquid phase; and treating the resulting liquid with a new supply of solvent whereby it is practically depleted of uranium and plutonium.
Global transport calculations with an equivalent barotropic system
NASA Technical Reports Server (NTRS)
Salby, Murry L.; O'Sullivan, Donal; Garcia, Rolando R.; Tribbia, Joseph
1990-01-01
Transport properties of the two-dimensional equations governing equivalent barotropic motion are investigated on the sphere. This system has ingredients such as forcing, equivalent depth, and thermal dissipation explicitly represented, and takes into account compression effects associated with vertical motion along isentropic surfaces. Horizontal transport properties of this system are investigated under adiabatic and diabatic conditions for different forms of dissipation, and over a range of resolutions. It is shown that forcing representative of time-mean and amplified conditions at 10 mb leads to the behavior typical of observations at this level. The displacement of the polar night vortex and its distortion into a comma shape are evident, as is irreversible mixing under sufficiently strong forcing amplitude. It is shown that thermal dissipation influences the behavior significantly by inhibiting the amplification of unstable eddies and thereby the horizontal stirring of air.
A Prototype Web-based system for GOES-R Space Weather Data
NASA Astrophysics Data System (ADS)
Sundaravel, A.; Wilkinson, D. C.
2010-12-01
The Geostationary Operational Environmental Satellite-R Series (GOES-R) makes use of advanced instruments and technologies to monitor the Earth's surface and provide accurate space weather data. The first GOES-R series satellite is scheduled to be launched in 2015. The data from the satellite will be widely used by scientists for space weather modeling and prediction. This project looks into how these datasets can be made available to scientists on the Web and how they can assist research. We are working to develop a prototype web-based system that allows users to browse, search and download these data. The GOES-R datasets will be archived in NetCDF (Network Common Data Form) and CSV (Comma Separated Values) formats. NetCDF is a self-describing data format that contains both the metadata information and the data, stored in an array-oriented fashion. The web-based system will offer services in two ways: via a web application (portal) and via web services. Using the web application, users can download data in NetCDF or CSV format and can also plot a graph of the data. The web page displays the various categories of data and the time intervals for which data are available. The web application (client) sends the user query to the server, which then connects to the data sources to retrieve the data and delivers it to the users. Data access will also be provided via SOAP (Simple Object Access Protocol) and REST (Representational State Transfer) web services. These provide functions that other applications can use to fetch data for further processing. To build the prototype system, we are making use of proxy data from existing GOES and POES space weather datasets. Java is the programming language used to develop the tools that format data to NetCDF and CSV. For the web technology we have chosen Grails to develop both the web application and the services.
Grails is an open source web application framework based on the Groovy language. We are also making use of the THREDDS (Thematic Realtime Environmental Distributed Data Services) server to publish and access the NetCDF files. We have completed developing software tools to generate NetCDF and CSV data files, as well as tools to translate NetCDF to CSV. The current phase of the project involves designing and developing the web interface.
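The NetCDF-to-CSV translation described above can be sketched in miniature. The sketch below uses Python's standard csv module rather than the project's Java tools, and the variable name and timestamps are invented for illustration:

```python
import csv
import io

def to_csv(times, values, var_name="xray_flux"):
    """Serialize a time series (as it might be read from one NetCDF
    variable) into CSV text. Field names are illustrative only."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["time", var_name])   # header row
    for t, v in zip(times, values):
        writer.writerow([t, v])           # one observation per row
    return buf.getvalue()

print(to_csv(["2010-12-01T00:00Z", "2010-12-01T00:01Z"], [1.2e-6, 1.3e-6]))
```

A real translator would first read the variable and its metadata from the NetCDF file (e.g. with a NetCDF library) before emitting rows like these.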
Sediment-Hosted Copper Deposits of the World: Deposit Models and Database
Cox, Dennis P.; Lindsey, David A.; Singer, Donald A.; Diggles, Michael F.
2003-01-01
This publication contains four descriptive models and four grade-tonnage models for sediment-hosted copper deposits. Descriptive models are useful in exploration planning and resource assessment because they enable the user to identify deposits in the field and to identify areas on geologic and geophysical maps where deposits could occur. Grade and tonnage models are used in resource assessment to predict the likelihood of different combinations of grades and tonnages that could occur in undiscovered deposits in a specific area. They are also useful in exploration for deciding which deposit types meet the economic objectives of the exploration company. The models in this report supersede the sediment-hosted copper models in USGS Bulletin 1693 (Cox, 1986, and Mosier and others, 1986) and are subdivided into a general type and three subtypes. The general model is useful in classifying deposits whose features are obscured by metamorphism or are otherwise poorly described, and for assessing regions in which the geologic environments are poorly understood. The three subtypes are based on differences in deposit form and environments of deposition. These differences are described under subtypes in the general model. Deposit models are based on the descriptions of geologic environments and physical characteristics, and on metal grades and tonnages of many individual deposits. Data used in this study are presented in a database representing 785 deposits in nine continents. This database was derived partly from data published by Kirkham and others (1994) and partly from new information in recent publications. To facilitate the construction of grade and tonnage models, the information, presented by Kirkham in disaggregated form, was brought together to provide a single grade and a single tonnage for each deposit. Throughout the report, individual deposits are defined as being more than 2,000 meters from the nearest adjacent deposit.
The deposit models are presented here as a PDF file. The database can be most conveniently read in FileMaker Pro. For those who do not have the FileMaker application, Microsoft Excel, tab-delimited ASCII, and comma-separated-value files are included. The reader may be interested in a similar publication on porphyry copper deposits (Singer and others, 2005), also available online. The Google Earth image is not intended to be viewed at the highest possible magnification because the resolution of the database is plus or minus two kilometers. At extreme zoom settings, the deposit locations may not coincide with the Google Earth images of the mine workings.
Kraus, K.A.; Moore, G.E.
1959-02-01
A process is presented for the separation of protactinium values from an aqueous solution containing Pa and Th values, comprising establishing in the solution an HCl concentration of from 4 to 11 molar, contacting the resulting solution with an anion-exchange adsorbent, such as a polystyrene divinylbenzene polymer with quaternary amines as the active exchange group, to effect the adsorption of Pa values upon the adsorbent while leaving Th values in the solution, and then washing the separated Pa-bearing adsorbent with an aqueous solution of HCl of less than 4 M to exclusively elute Pa values from the adsorbent. If hexavalent U values are contained in the original solution, they are adsorbed on the resin together with Pa. A separation is effected chromatographically by percolating the resin with aqueous HCl.
Dvořák, Martin; Svobodová, Jana; Dubský, Pavel; Riesová, Martina; Vigh, Gyula; Gaš, Bohuslav
2015-03-01
Although the classical formula of peak resolution was derived to characterize the extent of separation only for Gaussian peaks of equal areas, it is often used even when the peaks follow non-Gaussian distributions and/or have unequal areas. This practice can result in misleading information about the extent of separation in terms of the severity of peak overlap. We propose here the use of the equivalent peak resolution value, a term based on relative peak overlap, to characterize the extent of separation that had been achieved. The definition of equivalent peak resolution is not constrained either by the form(s) of the concentration distribution function(s) of the peaks (Gaussian or non-Gaussian) or the relative area of the peaks. The equivalent peak resolution value and the classically defined peak resolution value are numerically identical when the separated peaks are Gaussian and have identical areas and SDs. Using our new freeware program, Resolution Analyzer, one can calculate both the classically defined and the equivalent peak resolution values. With the help of this tool, we demonstrate here that the classical peak resolution values mischaracterize the extent of peak overlap even when the peaks are Gaussian but have different areas. We show that under ideal conditions of the separation process, the relative peak overlap value is easily accessible by fitting the overall peak profile as the sum of two Gaussian functions. The applicability of the new approach is demonstrated on real separations. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Noncommutative Field Theories and (super)string Field Theories
NASA Astrophysics Data System (ADS)
Aref'eva, I. Ya.; Belov, D. M.; Giryavets, A. A.; Koshelev, A. S.; Medvedev, P. B.
2002-11-01
In these lecture notes we explain and discuss some ideas concerning noncommutative geometry in general, as well as noncommutative field theories and string field theories. We consider noncommutative quantum field theories, emphasizing the issue of their renormalizability and the UV/IR mixing. Sen's conjectures on open string tachyon condensation and their application to D-brane physics have led to wide investigation of the covariant string field theory proposed by Witten about 15 years ago. We review the main ingredients of cubic (super)string field theories using various formulations: functional, operator, conformal, and half-string formalisms. The main technical tools used to study the conjectured D-brane decay into the closed string vacuum through tachyon condensation are presented. We also describe methods used to study the cubic open string field theory around the tachyon vacuum: construction of the sliver state, and "comma" and matrix representations of vertices.
NASA Astrophysics Data System (ADS)
Kraljić, K.; Strüngmann, L.; Fimmel, E.; Gumbel, M.
2018-01-01
The genetic code is degenerate, and it is assumed that this redundancy provides error detection and correction mechanisms in the translation process. However, the biological meaning of the code's structure is still under current research. This paper presents the Genetic Code Analysis Toolkit (GCAT), which provides workflows and algorithms for the analysis of the structure of nucleotide sequences. In particular, sets or sequences of codons can be transformed and tested for circularity, comma-freeness, dichotomic partitions, and other properties. GCAT comes with a fertile editor custom-built to work with the genetic code and a batch mode for multi-sequence processing. With the ability to read FASTA files or load sequences from GenBank, the tool can be used for the mathematical and statistical analysis of existing sequence data. GCAT is Java-based and provides a plug-in concept for extensibility. Availability: Open source. Homepage: http://www.gcat.bio/
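Comma-freeness, one of the properties GCAT tests, has a compact definition: a codon set is comma-free if no codon of the set appears in a shifted reading frame of any concatenation of two codons from the set. A minimal Python sketch of that definition (not GCAT's actual implementation, which is Java):

```python
def is_comma_free(codons):
    """Return True if no codon appears in a frame shifted by 1 or 2
    within any concatenation of two codons from the set."""
    codons = set(codons)
    for u in codons:
        for v in codons:
            w = u + v
            # the two reading frames that straddle the u/v junction
            if w[1:4] in codons or w[2:5] in codons:
                return False
    return True

print(is_comma_free({"ACG", "TTC"}))  # True: no shifted read lands in the set
print(is_comma_free({"AAA"}))         # False: AAA+AAA shifted is AAA again
```

The second example shows why homogeneous codons can never belong to a comma-free code: every shifted frame of their concatenation reproduces the codon itself.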
Are two spaces better than one? The effect of spacing following periods and commas during reading.
Johnson, Rebecca L; Bui, Becky; Schmitt, Lindsay L
2018-04-24
The most recent edition of the American Psychological Association (APA) Manual states that two spaces should follow the punctuation at the end of a sentence. This is in contrast to the one-space requirement from previous editions. However, to date, there has been no empirical support for either convention. In the current study, participants performed (1) a typing task to assess spacing usage and (2) an eye-tracking experiment to assess the effect that punctuation spacing has on reading performance. Although comprehension was not affected by punctuation spacing, the eye movement record suggested that initial processing of the text was facilitated when periods were followed by two spaces, supporting the change made to the APA Manual. Individuals' typing usage also influenced these effects such that those who use two spaces following a period showed the greatest overall facilitation from reading with two spaces.
National geochronological and natural radioelement data bases
Zartman, Robert E.; Bush, Charles A.; Abston, C.C.
1995-01-01
This CD-ROM contains both the National Geochronological Data Base [NGDB] and the Natural Radioelement Data Base [NRDB]. Supporting location, geologic, and reference information is provided for both data bases. The NGDB is a compilation of more than 30,000 individual published Pb-alpha, fission-track, K-Ar, Rb-Sr, U-Th-Pb, and Sm-Nd rock and mineral ages reported on approximately 18,000 dated samples from the United States. A program is provided to search the data files by latitude and longitude, state, analytical method, and age range. The NGDB is provided as quote-comma delimited files that can be entered into most commercial spreadsheet programs. The NRDB gives gamma-ray spectrometric analyses of the natural radioelements (U, Th, and K) for more than 8500 whole-rock samples obtained under the USGS Natural Radioelement Distribution Project. A program is provided to search the data files by state, keyword, U content, Th content, and K content.
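Quote-comma delimited files of the kind described for the NGDB can be read with any CSV parser that honors quoting; the record below is invented for illustration (the actual NGDB field layout is not reproduced here):

```python
import csv
import io

# Hypothetical quote-comma delimited line in the NGDB style; the
# sample number, method, mineral, and coordinates are invented.
raw = '"CO-123","K-Ar","biotite","38.5","-105.2","64.1"\n'

# csv.reader strips the quotes and would preserve any commas
# that appeared inside quoted fields
row = next(csv.reader(io.StringIO(raw)))
print(row)
```

This is also why spreadsheet programs import such files cleanly: the quoting protects embedded commas in text fields like locality descriptions.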
Range expansion through fragmented landscapes under a variable climate
Bennie, Jonathan; Hodgson, Jenny A; Lawson, Callum R; Holloway, Crispin TR; Roy, David B; Brereton, Tom; Thomas, Chris D; Wilson, Robert J
2013-01-01
Ecological responses to climate change may depend on complex patterns of variability in weather and local microclimate that overlay global increases in mean temperature. Here, we show that high-resolution temporal and spatial variability in temperature drives the dynamics of range expansion for an exemplar species, the butterfly Hesperia comma. Using fine-resolution (5 m) models of vegetation surface microclimate, we estimate the thermal suitability of 906 habitat patches at the species' range margin for 27 years. Population and metapopulation models that incorporate this dynamic microclimate surface improve predictions of observed annual changes to population density and patch occupancy dynamics during the species' range expansion from 1982 to 2009. Our findings reveal how fine-scale, short-term environmental variability drives rates and patterns of range expansion through spatially localised, intermittent episodes of expansion and contraction. Incorporating dynamic microclimates can thus improve models of species range shifts at spatial and temporal scales relevant to conservation interventions. PMID:23701124
New objective of the "New Horizons" in the Kuiper belt
NASA Astrophysics Data System (ADS)
Vidmachenko, A. P.
2018-05-01
The scientific purpose of the study of the small planet 2014 MU69 is to characterize its geology and morphology and to map its surface composition: the search for ammonia, carbon monoxide, methane, water ice, etc. It is also planned to study its surface and its history of formation and development, measure its temperature, and display the 3D topography in order to find out what it looks like and how it differs, for example, from cometary nuclei, asteroids, and dwarf planets such as Pluto; to search for any signs of activity, such as comas; to search for and explore possible satellites and/or rings; to determine the mass; and so on. The spacecraft will visit MU69 on 1 January 2019. It is planned to approach its surface to a distance of about 3500 km. This will allow obtaining images of the surface with a resolution of up to 30 m.
SEPARATION OF URANYL AND RUTHENIUM VALUES BY THE TRIBUTYL PHOSPHATE EXTRACTION PROCESS
Wilson, A.S.
1961-05-01
A process is given for separating uranyl values from ruthenium values contained in an aqueous 3 to 4 M nitric acid solution. After the addition of hydrogen peroxide to obtain a concentration of 0.3 M, the uranium is selectively extracted with kerosene-diluted tributyl phosphate.
Televising Supreme Court and Other Federal Court Proceedings: Legislation and Issues
2006-11-08
The report examines issues raised by televising Supreme Court and other federal court proceedings, including the possible effect on judicial proceedings, separation of powers concerns, and the purported educational value of such coverage. Proponents and opponents address myriad issues in the electronic media coverage debate, among them the democratic values of government transparency, separation of powers, and due process.
Gester, Kathrin; Jansen, Sebastian V; Stahl, Marion; Steinseifer, Ulrich
2015-05-01
Even though the separation of blood into erythrocyte-rich and erythrocyte-poor areas is well known in physiological setups such as small vessels, it has recently come into focus in small gaps in cardiovascular applications. Studies show that separation effects occur, for example, in gaps in hydrodynamic bearings, where they can have a positive effect on hemolysis. Separation effects depend on the hematocrit value, but due to visualization issues, studies in small gaps used very low hematocrit values. In this study, a test setup and an evaluation method for the investigation of separation effects of blood with hematocrit values of 30, 45, and 60% were developed. The erythrocyte distribution was evaluated by means of gray scale value distribution. This principle is based on the fact that an erythrocyte-rich region is more opaque than an erythrocyte-poor region. The experimental setup is designed in a way that no further processes (e.g., fluorescence labeling) need to be carried out which might change the properties of the membrane of the erythrocytes, and therefore their flow properties. Additionally, the method is executable with basic laboratory equipment, which makes it applicable for many laboratories. To validate the feasibility of the method, the influence of the diameter and the flow rate on the migration of erythrocytes were studied in micro channels for three different physiological hematocrit values. Even though no individual cells were traced, plasma layer and areas of high erythrocyte concentration could be identified. Dependencies of the erythrocyte distribution on flow rate and channel diameter were validated. The influence of the hematocrit value was demonstrated as well and showed the hematocrit value to be a crucial factor when investigating cell separation. The experimental results were consistent with findings in the literature. 
As the developed method is suitable for physiological hematocrit values and easy to handle, it provides an optimal basis for cell separation studies in gap models with whole blood, for example, hydrodynamic bearings, where it can be used to optimize these devices. Copyright © 2014 International Center for Artificial Organs and Transplantation and Wiley Periodicals, Inc.
BASIC PEROXIDE PRECIPITATION METHOD OF SEPARATING PLUTONIUM FROM CONTAMINANTS
Seaborg, G.T.; Perlman, I.
1959-02-10
A process is described for the separation from each other of uranyl values, tetravalent plutonium values, and fission products contained in an aqueous acidic solution. First the pH of the solution is adjusted to between 2.5 and 8, and hydrogen peroxide is then added to the solution, causing precipitation of uranium peroxide, which carries any plutonium values present, while the fission products remain in solution. Separation of the uranium and plutonium values is then effected by dissolving the peroxide precipitate in an acidic solution and incorporating a second carrier precipitate, selective for plutonium. The plutonium values are thus carried from the solution while the uranium remains dissolved. The second carrier precipitate may be selected from among the group consisting of rare earth fluorides and oxalates, zirconium phosphate, and bismuth phosphate.
Facets : a Cloudcompare Plugin to Extract Geological Planes from Unstructured 3d Point Clouds
NASA Astrophysics Data System (ADS)
Dewez, T. J. B.; Girardeau-Montaut, D.; Allanic, C.; Rohmer, J.
2016-06-01
Geological planar facets (stratification, fault, joint…) are key features for unraveling the tectonic history of a rock outcrop or appreciating the stability of a hazardous rock cliff. Measuring their spatial attitude (dip and strike) is generally performed by hand with a compass/clinometer, which is time consuming, requires some degree of censoring (i.e. refusing to measure some features judged unimportant at the time), is not always possible for fractures higher up on the outcrop, and is somewhat hazardous. 3D virtual geological outcrops hold the potential to alleviate these issues, but a convenient software environment for efficiently segmenting massive 3D point clouds into individual planar facets was lacking. FACETS is a dedicated plugin within CloudCompare v2.6.2 (http://cloudcompare.org/) implemented to perform planar facet extraction, calculate their dip and dip direction (i.e. azimuth of steepest descent), and report the extracted data in interactive stereograms. Two algorithms perform the segmentation: Kd-Tree and Fast Marching. Both divide the point cloud into sub-cells, then compute elementary planar objects and aggregate them progressively according to a planeity threshold into polygons. The boundaries of the polygons are adjusted around segmented points with a tension parameter, and the facet polygons can be exported as 3D polygon shapefiles to third-party GIS software or simply as ASCII comma separated files. One of the great features of FACETS is the capability to explore not only planar objects but also 3D points with normals using the stereogram tool. Poles can be readily displayed, queried, and manually segmented interactively. The plugin blends seamlessly into CloudCompare to leverage all its other 3D point cloud manipulation features. A demonstration of the tool is presented to illustrate these different features. While designed for geological applications, FACETS could be more widely applied to any planar objects.
aCLIMAX 4.0.1, The new version of the software for analyzing and interpreting INS spectra
NASA Astrophysics Data System (ADS)
Ramirez-Cuesta, A. J.
2004-03-01
In Inelastic Neutron Scattering Spectroscopy, the neutron scattering intensity is plotted versus neutron energy loss, giving a spectrum that looks like an infrared or a Raman spectrum. Unlike IR or Raman, INS has no selection rules, i.e. all transitions are in principle observable. This particular characteristic makes INS a test bed for Density Functional Theory calculations of vibrational modes. aCLIMAX is the first user-friendly program, within the Windows environment, that uses the output of normal modes to generate the calculated INS spectrum of the model molecule, making it much easier to establish a connection between theory and experiment. Program summary: Title of program: aCLIMAX 4.0.1. Catalogue identifier: ADSW. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADSW. Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland. Operating systems: Windows 95 onwards, except Windows ME, where it does not work. Programming language used: Visual Basic. Memory requirements: 64 MB. No. of processors: 1. Has the code been parallelized: No. No. of bytes in distributed program, including test data, etc.: 2 432 775. No. of lines in distributed program, including test data, etc.: 17 998. Distribution format: tar gzip file. Nature of physical problem: Calculation of the Inelastic Neutron Scattering spectra from DFT calculations of the vibrational density of states for molecules. Method of solution: INS spectral intensity calculated from normal modes analysis; isolated molecule approximation. Typical running time: From a few seconds to a few minutes, depending on the size of the molecule. Unusual features of the program: Special care has to be taken on computers whose regional options differ from those of English-speaking countries: the decimal separator has to be set to "." (dot) instead of the usual "," (comma) that most countries use.
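The decimal-separator pitfall noted above can also be handled when pre-processing numeric input outside the program; a small sketch with a hypothetical helper (not part of aCLIMAX):

```python
def parse_number(text, decimal_separator=","):
    """Convert a locale-formatted number string (e.g. '3,14' in many
    European locales) to a float by normalizing the decimal separator
    to '.' before parsing."""
    if decimal_separator != ".":
        text = text.replace(decimal_separator, ".")
    return float(text)

print(parse_number("3,14"))                          # 3.14
print(parse_number("2.71", decimal_separator="."))   # 2.71
```

A replace-based approach like this is only safe when the input uses no thousands separators; full locale handling would need the locale conventions of the source data.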
A mesoscale vortex over Halley Station, Antarctica
DOE Office of Scientific and Technical Information (OSTI.GOV)
Turner, J.; Lachlan-Cope, T.A.; Warren, D.E.
1993-05-01
A detailed analysis of the evolution and structure of a mesoscale vortex and associated cloud comma that developed at the eastern edge of the Weddell Sea, Antarctica, during the early part of January 1986 is presented. The system remained quasi-stationary for over three days close to the British research station Halley (75°36'S, 26°42'W) and gave severe weather with gale-force winds and prolonged snow. The formation and development of the system were investigated using conventional surface and upper-air meteorological observations taken at Halley, analyses from the U.K. Meteorological Office 15-level model, and satellite imagery and sounder data from the TIROS-N-NOAA series of polar orbiting satellites. The thermal structure of the vortex was examined using atmospheric profiles derived from radiance measurements from the TIROS Operational Vertical Sounder. Details of the wind field were examined using cloud motion vectors derived from a sequence of Advanced Very High Resolution Radiometer images. The vortex developed inland of the Brunt Ice Shelf in a strong baroclinic zone separating warm air, which had been advected polewards down the eastern Weddell Sea, and cold air descending from the Antarctic Plateau. The system intensified when cold, continental air associated with an upper-level short-wave trough was advected into the vortex. A frontal cloud band developed when slantwise ascent of warm air took place at the leading edge of the cold-air outbreak. Most of the precipitation associated with the low occurred on this cloud band. The small sea surface-atmospheric temperature differences gave only limited heat fluxes and there was no indication of deep convection associated with the system. The vortex was driven by baroclinic forcing and had some features in common with the baroclinic type of polar lows that occur in the Northern Hemisphere. 25 refs., 14 figs.
Separate Populations of Neurons in Ventral Striatum Encode Value and Motivation
Gentry, Ronny N.; Goldstein, Brandon L.; Hearn, Taylor N.; Barnett, Brian R.; Kashtelyan, Vadim; Roesch, Matthew R.
2013-01-01
Neurons in the ventral striatum (VS) fire to cues that predict differently valued rewards. It is unclear whether this activity represents the value associated with the expected reward or the level of motivation induced by reward anticipation. To distinguish between the two, we trained rats on a task in which we varied value independently from motivation by manipulating the size of the reward expected on correct trials and the threat of punishment expected upon errors. We found that separate populations of neurons in VS encode expected value and motivation. PMID:23724077
STRIPPING OF URANIUM FROM ORGANIC EXTRACTANTS
Crouse, D.J. Jr.
1962-09-01
A liquid-liquid extraction method is given for recovering uranium values from uranium-containing solutions. Uranium is removed from a uranium-containing organic solution by contacting said organic solution with an aqueous ammonium carbonate solution substantially saturated in uranium values. A uranium-containing precipitate is thereby formed which is separated from the organic and aqueous phases. Uranium values are recovered from this separated precipitate. (AEC)
Fundamental study of phosphor separation by controlling magnetic force
NASA Astrophysics Data System (ADS)
Wada, Kohei; Mishima, Fumihito; Akiyama, Yoko; Nishijima, Shigehiro
2013-11-01
The phosphor wastes consist of phosphors with different emission colors: green (LAP), red (YOX), blue (BAM), and white (HP). It is required to recover and reuse the rare earth phosphors, which have high market value. In this study, we tried to separate the phosphors by magnetic separation with an HTS bulk magnet, utilizing the differences in magnetic susceptibility among the types of phosphors. We succeeded in the successive separation of HP, which has low market value, from YOX and BAM, which contain the rare earths, using the magnetic Archimedes method. In this method, the vertical and radial components of the magnetic force were used.
Klimasauskas, Edward P.; Miller, Marti L.; Bradley, Dwight C.; Karl, Sue M.; Baichtal, James F.; Blodgett, Robert B.
2006-01-01
The Kuskokwim mineral belt of Bundtzen and Miller (1997) forms an important metallogenic region in southwestern Alaska that has yielded more than 3.22 million ounces of gold and 400,000 ounces of silver. Precious-metal and related deposits in this region associated with Late Cretaceous to early Tertiary igneous complexes extend into the Taylor Mountains 1:250,000-scale quadrangle. The U.S. Geological Survey is conducting geologic mapping and a mineral resource assessment of this area that will provide a better understanding of the geologic framework and regional geochemistry, and may provide targets for mineral exploration and development. During the 2004 field season, 137 rock samples were collected for a variety of purposes. The four digital files accompanying this report reflect the type of analysis performed and its intended purpose, and are available for download as an Excel workbook, comma-delimited format (*.csv), dBase 4 files (*.dbf), or as point coverages in ArcInfo interchange format (*.e00). Data values are provided in percent, pct (1 gram per 100 grams), or parts per million, ppm (1 gram per 1,000,000 grams), per the column heading in the table. All samples were analyzed for a suite of 42 trace elements (icp42.*) to provide data for use in geochemical exploration as well as some baseline data. Selected samples were analyzed by additional methods: 104 targeted geochemical exploration samples were analyzed for gold, arsenic, and mercury (auashg.*); 21 of these samples were also analyzed to obtain concentrations of 10 loosely bound metals (icp10.*); 33 rock samples were analyzed for major element oxides to support the regional mapping program (reg.*), of which 28 sedimentary rock samples were also analyzed for total carbon and carbonate carbon.
Software to Control and Monitor Gas Streams
NASA Technical Reports Server (NTRS)
Arkin, C.; Curley, Charles; Gore, Eric; Floyd, David; Lucas, Damion
2012-01-01
This software package interfaces with various gas stream devices such as pressure transducers, flow meters, flow controllers, valves, and analyzers such as a mass spectrometer. The software provides excellent user interfacing with various windows that provide time-domain graphs, valve state buttons, priority-colored messages, and warning icons. The user can configure the software to save as much or as little data as needed to a comma-delimited file. The software also includes an intuitive scripting language for automated processing. The configuration allows for the assignment of measured values or calibration so that raw signals can be viewed as usable pressures, flows, or concentrations in real time. The software is based on those used in two safety systems for shuttle processing and one volcanic gas analysis system. Mass analyzers typically have very unique applications and vary from job to job. As such, software available on the market is usually inadequate or targeted on a specific application (such as EPA methods). The goal was to develop powerful software that could be used with prototype systems. The key problem was to generalize the software to be easily and quickly reconfigurable. At Kennedy Space Center (KSC), the prior art consists of two primary methods. The first method was to utilize LabVIEW and a commercial data acquisition system. This method required rewriting code for each different application and only provided raw data. To obtain data in engineering units, manual calculations were required. The second method was to utilize one of the embedded computer systems developed for another system. This second method had the benefit of providing data in engineering units, but was limited in the number of control parameters.
ERIC Educational Resources Information Center
Ryan, Ellen
1994-01-01
New Council for Advancement and Support of Education (CASE) standards for college and university fund raising establish three key rules for campaign reporting: (1) separation of gifts for featured and unspecified objectives; (2) separation of current from deferred gifts; and (3) disclosure of both face value and discounted present value of…
Determining the Separation and Position Angles of Orbiting Binary Stars: Comparison of Three Methods
NASA Astrophysics Data System (ADS)
Walsh, Ryan; Boule, Cory; Andrews, Katelyn; Penfield, Andrew; Ross, Ian; Lucas, Gaylon; Braught, Trisha; Harfenist, Steven; Goodale, Keith
2015-07-01
To initiate a long term binary star research program, undergraduate students compared the accuracy and ease of measuring the separations and position angles of three long period binary pairs using three different measurement techniques. It was found that digital image capture using BackyardEOS software and subsequent analysis in Adobe Photoshop was the most accurate and easiest to use of our three methods. The systems WDS J17419+7209 (STF 2241AB), WDS 19418+5032 (STFA 46AB), and WDS 16362+5255 (STF 2087AB) were found to have separations and position angles of: 30", 16°; 39.7", 133°; and 3.1", 104°, respectively. This method produced separation values within 1.3" and position angle values within 1.3° of the most recently observed values found in the Washington Double Star Catalog.
OpenStereo: Open Source, Cross-Platform Software for Structural Geology Analysis
NASA Astrophysics Data System (ADS)
Grohmann, C. H.; Campanha, G. A.
2010-12-01
Free and open source software (FOSS) is increasingly seen as a synonym of innovation and progress. Freedom to run, copy, distribute, study, change, and improve the software (through access to the source code) assures a high level of positive feedback between users and developers, which results in stable, secure, and constantly updated systems. Several software packages for structural geology analysis are available to the user, either under commercial licenses or downloadable at no cost from the Internet. Some provide basic stereographic projection tools such as plotting poles, great circles, density contouring, eigenvector analysis, data rotation, etc., while others perform more specific tasks, such as paleostress or geotechnical/rock stability analysis. This variety also means a wide range of data formats for input, Graphical User Interface (GUI) designs, and graphic export formats. The majority of packages are built for MS-Windows, and even though there are packages for the UNIX-based MacOS, there are no native packages for *nix (UNIX, Linux, BSD, etc.) Operating Systems (OS), forcing users to run these programs with emulators or virtual machines. Those limitations led us to develop OpenStereo, an open source, cross-platform software package for stereographic projections and structural geology. The software is written in Python, a high-level, cross-platform programming language, and the GUI is designed with wxPython, which provides a consistent look regardless of the OS. Numeric operations (like matrix and linear algebra) are performed with the Numpy module, and all graphic capabilities are provided by the Matplotlib library, including on-screen plotting and graphic exporting to common desktop formats (emf, eps, ps, pdf, png, svg). Data input is done with simple ASCII text files, with values of dip direction and dip/plunge separated by spaces, tabs, or commas.
The user can open multiple files at the same time (or the same file more than once) and overlay different elements of each dataset (poles, great circles etc.). The GUI shows the opened files in a tree structure, similar to the "layers" of many illustration programs, where the vertical order of the files in the tree reflects the drawing order of the selected elements. At this stage, the software performs plotting of poles to planes, lineations, great circles, density contours and rose diagrams. A set of statistics is calculated for each file, and its eigenvalues and eigenvectors are used to suggest whether the data are clustered about a mean value or distributed along a girdle. Modified Flinn, triangular and histogram plots are also available. The next step of development will focus on tools such as merging and rotation of datasets, the ability to save 'projects', and paleostress analysis. In its current state, OpenStereo requires Python, wxPython, Numpy and Matplotlib installed on the system. We recommend installing PythonXY or the Enthought Python Distribution on MS-Windows and MacOS machines, since all dependencies are provided. Most Linux distributions provide an easy way to install all dependencies through software repositories. OpenStereo is released under the GNU General Public License. Programmers willing to contribute are encouraged to contact the authors directly. FAPESP Grant #09/17675-5
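The input convention the abstract describes (dip direction and dip per line, separated by spaces, tabs or commas) is easy to parse. A minimal sketch, assuming a hypothetical layout of one `dip_direction dip` pair per line (not OpenStereo's actual parser), reads such a file and converts each plane to the direction cosines of its pole, the quantity used in eigenvector analysis:

```python
import math
import re

def parse_attitudes(text):
    """Parse 'dip_direction dip' pairs separated by spaces, tabs or commas."""
    rows = []
    for line in text.strip().splitlines():
        fields = [f for f in re.split(r"[,\s]+", line.strip()) if f]
        if len(fields) >= 2:
            rows.append((float(fields[0]), float(fields[1])))
    return rows

def pole_cosines(dip_dir, dip):
    """Direction cosines (north, east, down) of the pole to a plane."""
    trend = math.radians((dip_dir + 180.0) % 360.0)  # pole trend opposes dip direction
    plunge = math.radians(90.0 - dip)                # pole plunge complements dip
    return (math.cos(plunge) * math.cos(trend),
            math.cos(plunge) * math.sin(trend),
            math.sin(plunge))

# Mixed delimiters, as the abstract allows
data = "120, 45\n130\t50\n125 40"
attitudes = parse_attitudes(data)
poles = [pole_cosines(dd, d) for dd, d in attitudes]
```

From a list of poles like this, the orientation matrix and its eigenvalues (used to distinguish clusters from girdles) follow directly.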
Miccio, Joseph; Parikh, Shruti; Marinaro, Xavier; Prasad, Atulya; McClain, Steven; Singer, Adam J; Clark, Richard A F
2016-03-01
Current methods of assessing burn depth are limited and are primarily based on visual assessments by burn surgeons. This technique has been shown to have only 60% accuracy, and a more accurate, simple, noninvasive method is needed to determine burn wound depth. Forward-looking infrared (FLIR) thermography is both noninvasive and user-friendly, with the potential to rapidly assess burn depth. The purpose of this paper is to determine whether early changes in burn temperature (first 3 days) can be a predictor of burn depth as assessed by vertical scarring 28 days after injury. While the animals were under general anesthesia, 20 burns were created on the back of each of two female Yorkshire swine using a 2.5 cm × 2.5 cm × 7.5 cm, 150 g aluminum bar, for a total of 40 burns. FLIR imaging was performed at both early (1, 2 and 3 days) and late (7, 10, 14, 17, 21, 24 and 28 days) time points. Burns were imaged from a height of 12 inches above the skin surface. FLIR ExaminIR© software was used to examine the infrared thermographs. One hundred temperature points from edge to edge across the center of each burn were collected at all time points and were exported as a comma-separated values (CSV) file. The CSV file was processed and analyzed using a MATLAB program. The temperature profiles through the center of the burns generated parabola-like curves. The lowest temperature (temperature minimum) and a line midway between the temperature minimum and the ambient skin temperature at the burn edges were defined, and the area of the curve was calculated (the "temperature half-area"). Half-area values 2 days after burn had higher correlations with scar depth than did the minimum temperatures. However, burns that became warmer from 1 day to 2 days after injury had a lower scar depth than burns that became cooler, and this trend was best predicted by temperature minima. When data were analyzed as a diagnostic test for sensitivity and specificity using >3mm scarring, i.e. 
a full-thickness burn, as a clinically relevant criterion standard, temperature minima at 2 days after burn was found to be the most sensitive and specific test. FLIR imaging is a fast and simple tool that has been shown to predict burn wound outcome in a porcine vertical injury progression model. Data showed that more severe burn wounds get cooler between 1 and 2 days after burn. We found four analytic methods of FLIR images that were predictive of burn progression at 1 and 2 days after burn; however, temperature minima 2 days after burn appeared to be the best predictive test for injury progression to a full-thickness burn. Although these results must be validated in clinical studies, FLIR imaging has the potential to aid clinicians in assessing burn severity and thereby assisting in burn wound management. Copyright © 2015 Elsevier Ltd and ISBI. All rights reserved.
Prediction of Phase Separation of Immiscible Ga-Tl Alloys
NASA Astrophysics Data System (ADS)
Kim, Yunkyum; Kim, Han Gyeol; Kang, Youn-Bae; Kaptay, George; Lee, Joonho
2017-06-01
Phase separation temperature of Ga-Tl liquid alloys was investigated using the constrained drop method. With this method, density and surface tension were investigated together. Despite strong repulsive interactions, molar volume showed ideal mixing behavior, whereas surface tension of the alloy was close to that of pure Tl due to preferential adsorption of Tl. Phase separation temperatures and surface tension values obtained with this method were close to the theoretically calculated values using three different thermodynamic models.
A critique of the naturalistic fallacy thesis.
Tullberg, Jan; Tullberg, Birgitta S
2001-09-01
If the prescriptive "ought" is separated from the factual "is," an intellectual analysis of the real world is by definition without normative value. The naturalistic fallacy thesis -- maintaining that normative and descriptive spheres must remain separated -- is often presented in a weak sense that seems reasonable. However, only in a strong sense -- by strictly separating facts and values -- are fallacy accusations supported. We claim that this naturalistic fallacy thesis is unsound and that normative statements instead should be based on rational understanding as found in the Darwinian and social sciences. The Cartesian compromise should be abandoned, since only naturalism can provide a cogent framework for better understanding and support ethics with a solid foundation. Many people nurture values based on tradition, whim, subgroup identification etc., and they demand respect for those values. However, we can demand respect for values only when they have a rational foundation. The common belief in the thesis of naturalistic fallacy is an anti-intellectual device that shields values from rational inquiry.
Santercole, Viviana; Delmonte, Pierluigi; Kramer, John K G
2012-03-01
Commercial fish oils and foods containing fish may contain trans and/or isomerized fatty acids (FA) produced during processing or as part of prepared foods. The current American Oil Chemists' Society (AOCS) official method for marine oils (method Ce 1i-07) is based on separation by use of poly(ethylene glycol) (PEG) columns, for example Supelcowax-10 or equivalent, which do not resolve most unsaturated FA geometric isomers. Highly polar 100-m cyanopropyl siloxane (CPS) columns, for example SP-2560 and CP Sil 88 are recommended for separation of geometric FA isomers. Complementary separations were achieved by use of two different elution temperature programs with the same CPS column. This study is the first direct comparison of the separations achieved by use of 30-m Supelcowax-10 and 100-m SP-2560 columns for fatty acid methyl esters (FAME) prepared from the same fish oil and fish muscle sample. To simplify the identification of the FA in these fish samples, FA were fractionated on the basis of the number and type of double bonds by silver-ion solid-phase extraction (Ag⁺-SPE) before GC analysis. The results showed that a combination of the three GC separations was necessary to resolve and identify most of the unsaturated FA, FA isomers, and other components of fish products, for example phytanic and phytenic acids. Equivalent chain length (ECL) values of most FAME in fish were calculated from the separations achieved by use of both GC columns; the values obtained were shown to be consistent with previously reported values for the Supelcowax-10 column. ECL values were also calculated for the FA separated on the SP-2560 column. The calculated ECL values were equally valid under isothermal and temperature-programmed elution GC conditions, and were valuable for confirmation of the identity of several unsaturated FAME in the fish samples. 
When analyzing commercially prepared fish foods, deodorized marine oils, or foods fortified with marine oils, it is strongly recommended that quantitative data acquired by use of PEG columns be complemented with data obtained from separations using highly polar CPS columns.
77 FR 46699 - Honey From the People's Republic of China: Preliminary Results of Review
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-06
... quantity and value, its separate rate status, structure and affiliations, sales process, accounting and... quantity and value, separate rate status, structure and affiliations, sales process, accounting and... (CIT August 10, 2009) (''Commerce may, of course, begin its total AFA selection process by defaulting...
NASA Technical Reports Server (NTRS)
Fanourakis, Sofia
2015-01-01
My main project was to determine and implement updates to be made to MODEAR (Mission Operations Data Enterprise Architecture Repository) process definitions to be used for CST-100 (Crew Space Transportation-100) related missions. Emphasis was placed on the scheduling aspect of the processes. In addition, I was to complete other tasks as given. Some of the additional tasks were: to create pass-through command look-up tables for the flight controllers, finish one of the MDT (Mission Operations Directorate Display Tool) displays, gather data on what is included in the CST-100 public data, develop a VBA (Visual Basic for Applications) script to create a csv (Comma-Separated Values) file with specific information from spreadsheets containing command data, create a command script for the November MCC-ASIL (Mission Control Center-Avionics System Integration Laboratory) testing, and take notes for one of the TCVB (Terminal Configured Vehicle B-737) meetings. In order to make progress in my main project I scheduled meetings with the appropriate subject matter experts, prepared material for the meetings, and assisted in the discussions in order to understand the process or processes at hand. After such discussions I made updates to various MODEAR processes and process graphics. These meetings have resulted in significant updates to the processes that were discussed. In addition, the discussions have helped the departments responsible for these processes better understand the work ahead and provided material to help document how their products are created. I completed my other tasks utilizing resources available to me and, when necessary, consulting with the subject matter experts. 
Outputs resulting from my other tasks were: two completed and one partially completed pass-through command look-up tables for the flight controllers, significant updates to one of the MDT displays, a spreadsheet containing data on what is included in the CST-100 public data, a tool to create a csv file with specific information from spreadsheets containing command data, a command script for the November MCC-ASIL testing which resulted in a successful test day identifying several potential issues, and notes from one of the TCVB meetings, which were used to keep the teams up to date on what was discussed and decided. I have learned a great deal working at NASA these last four months. I was able to meet and work with amazing individuals, further develop my technical knowledge, expand my knowledge base regarding human spaceflight, and contribute to the CST-100 missions. My work at NASA has strengthened my desire to continue my education in order to make further contributions to the field, and has given me the opportunity to see the advantages of a career at NASA.
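The CSV-export task above was done in VBA; the same idea can be sketched in Python, where the field names and command records are illustrative only (the actual CST-100 command data is not public). The point of using a real CSV writer rather than string joins is that fields containing commas are quoted automatically:

```python
import csv
import io

# Hypothetical command records; field names are illustrative only.
commands = [
    {"mnemonic": "PWR_ON", "target": "CST-100", "notes": "primary bus, channel A"},
    {"mnemonic": "VENT_OPEN", "target": "CST-100", "notes": "hold 5 s"},
]

buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=["mnemonic", "target", "notes"])
writer.writeheader()
writer.writerows(commands)  # fields containing commas are quoted automatically
csv_text = buffer.getvalue()
```

Writing to a `StringIO` buffer keeps the sketch self-contained; in practice the writer would target a file opened with `newline=""`.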
Nonvolatile chemical cues affect host-plant ranking by gravid Polygonia c-album females.
Mozūraitis, Raimondas; Murtazina, Rushana; Nylin, Sören; Borg-Karlson, Anna-Karin
2012-01-01
In a multiple-choice test, the preference of egg-laying Polygonia c-album (comma butterfly) females was studied for oviposition on plants bearing surrogate leaves treated with crude methanol extracts obtained from leaves of seven host-plant species: Humulus lupulus, Urtica dioica, Ulmus glabra, Salix caprea, Ribes nigrum, Corylus avellana, and Betula pubescens. The ranking order of surrogate leaves treated with host-plant extracts corresponded well to that reported on natural foliage, except R. nigrum. Thus, host-plant choice in P. c-album seems to be highly dependent on chemical cues. Moreover, after two subsequent fractionations using reversed-phase chromatography the nonvolatile chemical cues residing in the most polar water-soluble fractions evidently provided sufficient information for egg-laying females to discriminate and rank between the samples of more and less preferred plants, since the ranking in these assays was similar to that for natural foliage or whole methanol extracts, while the physical traits of the surrogate leaves remained uniform.
Expression and effect of inhibition of aminopeptidase-A during nephrogenesis.
Dijkman, Henry B P M; Assmann, Karel J M; Steenbergen, Eric J; Wetzels, Jack F M
2006-02-01
Aminopeptidase-A (APA) is a metalloprotease that cleaves N-terminal aspartyl and glutamyl residues from peptides. Its best-known substrate is angiotensin II (Ang II), the most active compound of the renin-angiotensin system (RAS). The RAS is involved in renal development, and most components of the RAS are expressed in the developing kidney. Thus far, APA has not been studied in detail. In the present study we have evaluated the expression of APA at the protein, mRNA, and enzyme activity (EA) levels in the kidney during nephrogenesis. Furthermore, we have studied the effect of inhibiting APA EA by injection of anti-APA antibodies into 1-day-old mice. APA expression was observed from the comma stage onwards, predominantly in the developing podocytes and the brush borders of proximal tubular cells. Notably, APA was absent from the medulla and the renal arterioles. Inhibition of APA EA caused temporary podocyte foot-process effacement, suggesting at minimum a role for APA during nephrogenesis.
Whissell, Cynthia
2013-12-01
Titles of articles in seven highly ranked multidisciplinary psychology journals for every fifth year between 1966 and 2011 (inclusive) were studied in terms of title length, word length, punctuation density, and word pleasantness, activation, and concreteness (assessed by the Dictionary of Affect in Language). Titles grew longer (by three words) and were more frequently punctuated (by one colon or comma for every other article) between 1966 and 2011. This may reflect the increasing complexity of psychology and satisfy readers' requirements for more specific information. There were significant differences among journals (e.g., titles in the Annual Review of Psychology were scored by the Dictionary of Affect as the most pleasant, and those in Psychological Bulletin as the least pleasant) and among categories of journals (e.g., titles in review journals employed longer words than those in research or association journals). Differences were stable across time and were employed to successfully categorize titles from a validation sample.
Mineral Resources Data System (MRDS)
Mason, G.T.; Arndt, R.E.
1996-01-01
The U.S. Geological Survey (USGS) operates the Mineral Resources Data System (MRDS), a digital system that contained 111,955 records on Sept. 1, 1995. Records describe metallic and industrial commodity deposits, mines, prospects, and occurrences in the United States and selected other countries. These records have been created over the years by USGS commodity specialists and through cooperative agreements with geological surveys of U.S. States and other countries. This CD-ROM contains the complete MRDS data base, several subsets of it, and software to allow data retrieval and display. Data retrievals are made by using GSSEARCH, a program that is included on this CD-ROM. Retrievals are made by specifying fields or any combination of the fields that provide information on deposit name, location, commodity, deposit model type, geology, mineral production, reserves, and references. A tutorial is included. Retrieved records may be printed or written to a hard disk file in four different formats: ASCII, fixed-width, comma-delimited, and dBASE-compatible.
NASA Astrophysics Data System (ADS)
Lusebrink, Inka; Burkhardt, Dirk; Gedig, Thomas; Dettner, Konrad; Mosandl, Armin; Seifert, Karlheinz
2007-02-01
Most species of the rove beetle genus Stenus employ the spreading alkaloid stenusine as an escape mechanism on water surfaces. In case of danger, they emit stenusine from their pygidial glands, and it propels them over the water very quickly. Stenusine is a chiral molecule with four stereoisomers: (2'R,3R)-, (2'S,3R)-, (2'S,3S)-, and (2'R,3S)-stenusine. The percentile ratio of these four isomers is known only for the most common species of the genus, Stenus comma. With the intention of determining the stereoisomer ratios of five additional species from the two subgenera Stenus and Hypostenus, we used GC/mass spectrometry measurements with a chiral phase. The results showed that the ratio differs within the genus. These findings can serve as a basis for chemotaxonomy. It is also possible that the biological function of stenusine, e.g., as an antibiotic or fungicide, varies with changing stereoisomer composition.
Liquid-feeding strategy of the proboscis of butterflies
NASA Astrophysics Data System (ADS)
Lee, Seung Chul; Lee, Sang Joon; Center for Biofluid and Biomimic Research Team
2015-11-01
The liquid-feeding strategy of the proboscis of butterflies was experimentally investigated. Firstly, liquid uptake from a pool through the proboscis of a nectar-feeding butterfly, the cabbage white (Pieris rapae), was tested. The liquid-intake flow at the submerged proboscis was visualized by micro-particle image velocimetry. The periodic liquid-feeding flow is induced by the systaltic motion of the cibarial pump. The Reynolds and Womersley numbers of the liquid-intake flow in the proboscis are low enough to assume quasi-steady laminar flow. Next, liquid feeding from wet surfaces by the brush-tipped proboscis of a nymphalid butterfly, the Asian comma (Polygonia c-aureum), was investigated. The tip of the proboscis was observed, in particular the brush-like sensilla styloconica. The liquid-feeding flow between the proboscis and wet surfaces was also quantitatively visualized. During drinking from a wet surface, the sensilla styloconica enhance the liquid uptake rate by accumulating liquid. This work was supported by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIP) (No. 2008-0061991).
Caryoscope: An Open Source Java application for viewing microarray data in a genomic context
Awad, Ihab AB; Rees, Christian A; Hernandez-Boussard, Tina; Ball, Catherine A; Sherlock, Gavin
2004-01-01
Background Microarray-based comparative genome hybridization experiments generate data that can be mapped onto the genome. These data are interpreted more easily when represented graphically in a genomic context. Results We have developed Caryoscope, which is an open source Java application for visualizing microarray data from array comparative genome hybridization experiments in a genomic context. Caryoscope can read General Feature Format files (GFF files), as well as comma- and tab-delimited files, that define the genomic positions of the microarray reporters for which data are obtained. The microarray data can be browsed using an interactive, zoomable interface, which helps users identify regions of chromosomal deletion or amplification. The graphical representation of the data can be exported in a number of graphic formats, including publication-quality formats such as PostScript. Conclusion Caryoscope is a useful tool that can aid in the visualization, exploration and interpretation of microarray data in a genomic context. PMID:15488149
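The GFF input described above is a tab-separated, nine-column format. A minimal sketch, simplified from full GFF (only the fields a genome-context viewer actually needs are kept, and the example line is invented), shows how one reporter position record can be read:

```python
def parse_gff_line(line):
    """Parse one GFF (tab-separated) feature line into a reporter position.

    Simplified sketch: real GFF has nine columns; here we keep only the
    fields a genome-context viewer needs (sequence name, feature, span).
    """
    fields = line.rstrip("\n").split("\t")
    seqname, feature = fields[0], fields[2]
    start, end = int(fields[3]), int(fields[4])   # GFF coordinates are 1-based
    return {"seq": seqname, "feature": feature, "start": start, "end": end}

# Hypothetical reporter line in GFF column order
line = "chr1\tarray\treporter\t15000\t15600\t.\t+\t.\tID=reporter_001"
rec = parse_gff_line(line)
```

A comma-delimited variant of the same data would differ only in the split character, which is why tools like Caryoscope can accept both with one code path.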
Mating system and the evolution of sex-specific mortality rates in two nymphalid butterflies.
Wiklund, Christer; Gotthard, Karl; Nylin, Sören
2003-09-07
Life-history theory predicts that organisms should invest resources into intrinsic components of lifespan only to the degree that it pays off in terms of reproductive success. The benefit of a long life may differ between the sexes, and different mating systems may therefore select for different sex-specific mortality rates. In insects with polyandrous mating systems, females mate throughout their lives and male reproductive success is likely to increase monotonically with lifespan. In monandrous systems, where the mating season is less protracted because receptive females are available only at the beginning of the flight season, male mating success should be less dependent on a long lifespan. Here, we show, in a laboratory experiment without predation, that the duration of the mating season is longer in the polyandrous comma butterfly, Polygonia c-album, than in the monandrous peacock butterfly, Inachis io, and that, in line with predictions, male lifespan is shorter than female lifespan in I. io, whereas male and female lifespans are similar in P. c-album.
Characterization of microporous separators for lithium-ion batteries
NASA Astrophysics Data System (ADS)
Venugopal, Ganesh; Moore, John; Howard, Jason; Pendalwar, Shekhar
Several properties, including porosity, pore-size distribution, thickness, electrochemical stability and mechanical properties, have to be optimized before a membrane can qualify as a separator for a lithium-ion battery. In this paper we present results of characterization studies carried out on some commercially available lithium-ion battery separators. The relevance of these results to battery performance and safety is also discussed. Porosity values were measured using a simple liquid absorption test, and gas permeabilities were measured using a novel pressure-drop technique that is similar in principle to the Gurley test. For separators from one particular manufacturer, the trend observed in the pressure-drop times was found to be in agreement with the Gurley numbers reported by the separator manufacturer. Shutdown characteristics of the separators were studied by measuring the impedance of batteries containing the separators as a function of temperature. Overcharge tests were also performed to confirm that separator shutdown is indeed a useful mechanism for preventing thermal runaway situations. Polyethylene-containing separators, in particular trilayer laminates of polypropylene, polyethylene and polypropylene, appear to have the most attractive properties for preventing thermal runaway in lithium-ion cells.
Plume effects on the flow around a blunted cone at hypersonic speeds
NASA Technical Reports Server (NTRS)
Atcliffe, P.; Kumar, D.; Stollery, J. L.
1992-01-01
Tests at M = 8.2 show that a simulated rocket plume at the base of a blunted cone can cause large areas of separated flow, with dramatic effects on the heat transfer rate distribution. The plume was simulated by solid discs of varying sizes or by an annular jet of gas. Flow over the cone without a plume is fully laminar and attached. Using a large disc, the boundary layer is laminar at separation at the test Reynolds number. Transition occurs along the separated shear layer and the boundary layer quickly becomes turbulent. The reduction in heat transfer associated with a laminar separated region is followed by rising values as transition occurs and the heat transfer rates towards the rear of the cone substantially exceed the values obtained without a plume. With the annular jet or a small disc, separation occurs much further aft, so that heat transfer rates at the front of the cone are comparable with those found without a plume. Downstream of separation the shear layer now remains laminar and the heat transfer rates to the surface are significantly lower than the attached flow values.
2015-10-30
pressure values onto the SD card. The addition of free and open-source Arduino libraries allowed for the seamless integration of the shield into the...alert the user when replacing the separator is necessary. Methods: A sensor was built to measure and record differential pressure values within the...from the transducers during simulated blockages were transformed into pressure values using linear regression equations from the calibration data
Determining Binary Star Orbits Using Kepler's Equation
NASA Astrophysics Data System (ADS)
Boule, Cory; Andrews, Kaitlyn; Penfield, Andrew; Puckette, Ian; Goodale, Keith; Harfenist, Steven
2017-04-01
Students calculated ephemerides and generated orbits of four well-known binary systems. Using an iterative technique in Microsoft® Excel® to solve Kepler's equation, separation and position angle values were generated as well as plots of the apparent orbits. Current position angle and separation values were measured in the field and compared well to the calculated values for the stars: STF1196AB,C, STF296AB, STF296AB and STF60AB.
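The iterative spreadsheet technique mentioned above can be sketched compactly. Kepler's equation, E - e·sin(E) = M, has no closed-form solution for the eccentric anomaly E, so it is solved numerically; the sketch below uses Newton's method (the Excel approach may differ, e.g. fixed-point iteration), and the sample values are illustrative rather than taken from the paper's four systems:

```python
import math

def eccentric_anomaly(M, e, tol=1e-12, max_iter=50):
    """Solve Kepler's equation  E - e*sin(E) = M  for E by Newton iteration.

    M: mean anomaly in radians; e: orbital eccentricity (0 <= e < 1).
    """
    E = M if e < 0.8 else math.pi   # standard starting guess
    for _ in range(max_iter):
        # Newton step on f(E) = E - e*sin(E) - M
        delta = (E - e * math.sin(E) - M) / (1.0 - e * math.cos(E))
        E -= delta
        if abs(delta) < tol:
            break
    return E

# Example: mean anomaly 1.0 rad, eccentricity 0.5
E = eccentric_anomaly(1.0, 0.5)
```

Once E is known, the true anomaly and radius vector follow from standard relations, and projecting through the orbital elements yields the separation and position angle that the students compared against field measurements.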
Beaton, R.H.
1960-06-28
A process is given for separating tri- or tetravalent plutonium from fission products in an aqueous solution by complexing the fission products with oxalate, tannate, citrate, or tartrate anions at a pH value of at least 2.4 (preferably between 2.4 and 4), and contacting a cation exchange resin with the solution whereby the plutonium is adsorbed while the complexed fission products remain in solution.
Psychological Separation, Ethnic Identity and Adjustment in Chicano/Latinos.
ERIC Educational Resources Information Center
Rodriguez, Ester R.; Bernstein, Bianca L.
This study examined the relationship between psychological separation and college adjustment in a Chicano/Latino sample, a group which has traditionally not valued psychological separation (N=137). Ethnic identity as a moderator variable was also explored. The Psychological Separation Inventory, Student Adjustment to College Questionnaire, and the…
PROCESS FOR SEPARATING AMERICIUM AND CURIUM FROM RARE EARTH ELEMENTS
Baybarz, R.D.; Lloyd, M.H.
1963-02-26
This invention relates to methods of separating americium and curium values from rare earth values. In accordance with the invention americium, curium, and rare earth values are sorbed on an anion exchange resin. A major portion of the rare earth values are selectively stripped from the resin with a concentrated aqueous solution of lithium chloride, and americium, curium, and a minor portion of rare earth values are then stripped from the resin with a dilute aqueous solution of lithium chloride. The americium and curium values are further purified by increasing the concentration of lithium chloride in the solution to at least 8 molar and selectively extracting rare earth values from the resulting solution with a monoalkylphosphoric acid. (AEC)
Method for the concentration and separation of actinides from biological and environmental samples
Horwitz, E. Philip; Dietz, Mark L.
1989-01-01
A method and apparatus for the quantitative recovery of actinide values from biological and environmental samples by passing appropriately prepared samples in a mineral acid solution through a separation column of a dialkyl(phenyl)-N,N-dialkylcarbamoylmethylphosphine oxide dissolved in tri-n-butyl phosphate on an inert substrate, which selectively extracts the actinide values. The actinide values can be eluted either as a group or individually, and their presence quantitatively detected by alpha counting.
Evans, Robert J.; Chum, Helena L.
1994-01-01
A process of using fast pyrolysis in a carrier gas to convert a plastic waste feedstream having a mixed polymeric composition in a manner such that pyrolysis of a given polymer to its high value monomeric constituent occurs prior to pyrolysis of other plastic components therein comprising: selecting a first temperature program range to cause pyrolysis of said given polymer to its high value monomeric constituent prior to a temperature range that causes pyrolysis of other plastic components; selecting a catalyst and support for treating said feed streams with said catalyst to effect acid or base catalyzed reaction pathways to maximize yield or enhance separation of said high value monomeric constituent in said temperature program range; differentially heating said feed stream at a heat rate within the first temperature program range to provide differential pyrolysis for selective recovery of optimum quantities of the high value monomeric constituent prior to pyrolysis of other plastic components; separating the high value monomeric constituents, selecting a second higher temperature range to cause pyrolysis of a different high value monomeric constituent of said plastic waste and differentially heating the feedstream at the higher temperature program range to cause pyrolysis of the different high value monomeric constituent; and separating the different high value monomeric constituent.
Evans, Robert J.; Chum, Helena L.
1993-01-01
A process of using fast pyrolysis in a carrier gas to convert a plastic waste feedstream having a mixed polymeric composition in a manner such that pyrolysis of a given polymer to its high value monomeric constituent occurs prior to pyrolysis of other plastic components therein comprising: selecting a first temperature program range to cause pyrolysis of said given polymer to its high value monomeric constituent prior to a temperature range that causes pyrolysis of other plastic components; selecting a catalyst and support for treating said feed streams with said catalyst to effect acid or base catalyzed reaction pathways to maximize yield or enhance separation of said high value monomeric constituent in said temperature program range; differentially heating said feed stream at a heat rate within the first temperature program range to provide differential pyrolysis for selective recovery of optimum quantities of the high value monomeric constituent prior to pyrolysis of other plastic components; separating the high value monomeric constituents; selecting a second higher temperature range to cause pyrolysis of a different high value monomeric constituent of said plastic waste and differentially heating the feedstream at the higher temperature program range to cause pyrolysis of the different high value monomeric constituent; and separating the different high value monomeric constituent.
Developmental Dynamics of X-Chromosome Dosage Compensation by the DCC and H4K20me1 in C. elegans
Kramer, Maxwell; Kranz, Anna-Lena; Su, Amanda; Winterkorn, Lara H.; Albritton, Sarah Elizabeth; Ercan, Sevinc
2015-01-01
In Caenorhabditis elegans, the dosage compensation complex (DCC) specifically binds to and represses transcription from both X chromosomes in hermaphrodites. The DCC is composed of an X-specific condensin complex that interacts with several proteins. During embryogenesis, the DCC starts localizing to the X chromosomes around the 40-cell stage, followed by X-enrichment of H4K20me1 between the 100-cell and comma stages. Here, we analyzed dosage compensation of the X chromosome between sexes, and the roles of dpy-27 (condensin subunit), dpy-21 (non-condensin DCC member), set-1 (H4K20 monomethylase) and set-4 (H4K20 di-/tri-methylase) in X chromosome repression, using mRNA-seq and ChIP-seq analyses across several developmental time points. We found that the DCC starts repressing the X chromosomes by the 40-cell stage, but X-linked transcript levels remain significantly higher in hermaphrodites than in males through the comma stage of embryogenesis. Dpy-27 and dpy-21 are required for X chromosome repression throughout development, but particularly in early embryos the dpy-27 and dpy-21 mutations produced distinct expression changes, suggesting a DCC-independent role for dpy-21. We previously hypothesized that the DCC increases H4K20me1 by reducing set-4 activity on the X chromosomes. Accordingly, in the set-4 mutant, H4K20me1 increased more on the autosomes than on the X, equalizing H4K20me1 levels between the X and autosomes. The H4K20me1 increase on the autosomes led to slight repression, resulting in a relative X derepression. H4K20me1 depletion in the set-1 mutant showed greater X derepression than the equalization of H4K20me1 levels between X and autosomes in the set-4 mutant, indicating that the H4K20me1 level is important, but the X-to-autosome balance of H4K20me1 contributes only slightly to X repression. Thus H4K20me1 by itself is not a downstream effector of the DCC.
In summary, X chromosome dosage compensation starts in early embryos as the DCC localizes to the X, and is strengthened in later embryogenesis by H4K20me1. PMID:26641248
Hikosaka, Okihide
2014-01-01
Gaze is strongly attracted to visual objects that have been associated with rewards. Key to this function is a basal ganglia circuit originating from the caudate nucleus (CD), mediated by the substantia nigra pars reticulata (SNr), and aiming at the superior colliculus (SC). Notably, subregions of CD encode values of visual objects differently: stably by CD tail [CD(T)] vs. flexibly by CD head [CD(H)]. Are the stable and flexible value signals processed separately throughout the CD-SNr-SC circuit? To answer this question, we identified SNr neurons by their inputs from CD and outputs to SC and examined their sensitivity to object values. The direct input from CD was identified by SNr neuron's inhibitory response to electrical stimulation of CD. We found that SNr neurons were separated into two groups: 1) neurons inhibited by CD(T) stimulation, located in the caudal-dorsal-lateral SNr (cdlSNr), and 2) neurons inhibited by CD(H) stimulation, located in the rostral-ventral-medial SNr (rvmSNr). Most of CD(T)-recipient SNr neurons encoded stable values, whereas CD(H)-recipient SNr neurons tended to encode flexible values. The output to SC was identified by SNr neuron's antidromic response to SC stimulation. Among the antidromically activated neurons, many encoded only stable values, while some encoded only flexible values. These results suggest that CD(T)-cdlSNr-SC circuit and CD(H)-rvmSNr-SC circuit transmit stable and flexible value signals, largely separately, to SC. The speed of signal transmission was faster through CD(T)-cdlSNr-SC circuit than through CD(H)-rvmSNr-SC circuit, which may reflect automatic and controlled gaze orienting guided by these circuits. PMID:25540224
Valenzuela, Carlos Y
2017-02-13
Direct tests of the random or non-random distribution of nucleotides on genomes have been devised to test the hypothesis of neutral, nearly-neutral or selective evolution. These tests are based on the direct base distribution and are independent of the functional (coding or non-coding) or structural (repeated or unique sequences) properties of the DNA. The first approach described the longitudinal distribution of bases in tandem repeats under the Bose-Einstein statistics. A huge deviation from randomness was found. A second approach was the study of the base distribution within dinucleotides whose bases were separated by 0, 1, 2… K nucleotides. Again an enormous difference from the random distribution was found, with significance values beyond the range of standard tables and programs. These test values were periodic and included the 16 dinucleotides. For example, a high "positive" (more observed than expected dinucleotides) value, found in dinucleotides whose bases were separated by (3K + 2) sites, was preceded by two smaller "negative" (less observed than expected dinucleotides) values, whose bases were separated by (3K) or (3K + 1) sites. We examined mtDNAs, prokaryote genomes and some eukaryote chromosomes and found that the significant non-random interactions and periodicities were present up to 1000 or more sites of base separation, and in human chromosome 21 until separations of more than 10 million sites. Each nucleotide has its own significant value of its distance to neutrality; this yields 16 hierarchical significances. A three-dimensional table with the number of sites of separation between the bases and the 16 significances (the third dimension is the dinucleotide, individual or taxon involved) gives directly an evolutionary state of the analyzed genome that can be used to obtain phylogenies. An example is provided.
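The observed-versus-expected dinucleotide count at a given base separation can be illustrated with a minimal sketch. The expected-count model below (independent base composition) and the z-score normalization are simplifying assumptions for illustration, not the paper's exact statistic:

```python
from collections import Counter

def dinucleotide_z(seq, x, y, k):
    """Observed vs. expected count of the dinucleotide (x, y) whose two
    bases are separated by k intervening sites, with a z-score computed
    under a simple independent-base-composition model (illustrative only,
    not the paper's exact test)."""
    n = len(seq) - k - 1                      # number of (i, i+k+1) pairs
    freq = Counter(seq)
    p = {b: c / len(seq) for b, c in freq.items()}
    observed = sum(1 for i in range(n)
                   if seq[i] == x and seq[i + k + 1] == y)
    expected = n * p.get(x, 0.0) * p.get(y, 0.0)
    var = expected * (1.0 - p.get(x, 0.0) * p.get(y, 0.0))
    z = (observed - expected) / var ** 0.5 if var > 0 else 0.0
    return observed, expected, z

# A period-3 repeat shows the kind of (3K + 2)-site excess described above:
# in "ACG" repeated, bases separated by 2 intervening sites always match.
obs, exp, z = dinucleotide_z("ACG" * 90, "A", "A", 2)
```

For such a sequence the observed count far exceeds the composition-based expectation, giving the large "positive" values the abstract describes.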
Forest-cover-type separation using RADARSAT-1 synthetic aperture radar imagery
Mark D. Nelson; Kathleen T. Ward; Marvin E. Bauer
2009-01-01
RADARSAT-1 synthetic aperture radar data, speckle reduction, and texture measures provided for separation among forest types within the Twin Cities metropolitan area, MN, USA. The highest transformed divergence values for 16-bit data resulted from speckle filtering while the highest values for 8-bit data resulted from the orthorectified image, before and after...
Kinetics of motility-induced phase separation and swim pressure
NASA Astrophysics Data System (ADS)
Patch, Adam; Yllanes, David; Marchetti, M. Cristina
Active Brownian particles (ABPs) represent a minimal model of active matter consisting of self-propelled spheres with purely repulsive interactions and rotational noise. We correlate the time evolution of the mean pressure towards its steady state value with the kinetics of motility-induced phase separation. For parameter values corresponding to phase separated steady states, we identify two dynamical regimes. The pressure grows monotonically in time during the initial regime of rapid cluster formation, overshooting its steady state value and then quickly relaxing to it, and remains constant during the subsequent slower period of cluster coalescence and coarsening. The overshoot is a distinctive feature of active systems. NSF-DMR-1305184, NSF-DGE-1068780, ACI-1341006, FIS2015-65078-C02, BIFI-ZCAM.
Van Winkle, Q.; Kraus, K.A.
1959-10-27
A process is presented for separating polonium, protactinium, or mixtures thereof in aqueous solution from bismuth, zirconium, lead, and niobium values contained in the solution. The method comprises providing hydrochloric acid in the solution in a concentration of at least 5N, contacting the aqueous solution with a substantially water-immiscible organic solvent such as diisopropyl ketone, and separating the aqueous phase containing the bismuth, zirconium, lead, and niobium from the organic extract phase containing the polonium, protactinium, or mixture thereof.
Russell, E.R.; Adamson, A.W.; Schubert, J.; Boyd, G.E.
1957-10-29
A process for separating plutonium values from aqueous solutions which contain the plutonium in minute concentrations is described. These values can be removed from an aqueous solution by taking an aqueous solution containing a salt of zirconium, titanium, hafnium or thorium, adding an aqueous solution of silicate and phosphoric acid anions to the metal salt solution, and separating, washing and drying the precipitate which forms when the two solutions are mixed. The aqueous plutonium-containing solution is then acidified and passed over the above-described precipitate, causing the plutonium values to be adsorbed by the precipitate.
Method for the concentration and separation of actinides from biological and environmental samples
Horwitz, E.P.; Dietz, M.L.
1989-05-30
A method and apparatus for the quantitative recovery of actinide values from biological and environmental samples by passing appropriately prepared samples in a mineral acid solution through a separation column of a dialkyl(phenyl)-N,N-dialkylcarbamoylmethylphosphine oxide dissolved in tri-n-butyl phosphate on an inert substrate, which selectively extracts the actinide values. The actinide values can be eluted either as a group or individually, and their presence quantitatively detected by alpha counting. 3 figs.
Use of the electro-separation method for improvement of the utility value of winter rapeseeds
NASA Astrophysics Data System (ADS)
Kovalyshyn, S. J.; Shvets, O. P.; Grundas, S.; Tys, J.
2013-12-01
The paper presents the results of a study of the use of electro-separation methods for improvement of the utility value of 5 winter rapeseed cultivars. The process of electro-separation of rapeseed was conducted on a prototype apparatus built at the Laboratory of Application of Electro-technologies in Agriculture, Lviv National Agriculture University. The process facilitated separation of damaged, low quality seeds from the sowing material. The initial mean level of mechanically damaged seeds in the winter rapeseed cultivars studied varied within the range of 15.8-20.1%. Verification of the amount of seeds with mechanical damage was performed on X-ray images of seeds acquired by means of a digital X-ray apparatus. In the course of analysis of the X-ray images, it was noted that the mean level of mechanical damage to the seeds after the electro-separation was in the range of 2.1-3.8%. The application of the method of separation of rapeseeds in the corona discharge field yielded a significant reduction of the level of seeds with mechanical damage. The application of the method in practice may effectively contribute to improvement of the utility value of sowing material or seed material for production of edible oil.
Evans, R.J.; Chum, H.L.
1994-06-14
A process is described using fast pyrolysis to convert a plastic waste feed stream containing polycarbonate and ABS to high value monomeric constituents prior to pyrolysis of other plastic components therein comprising: selecting a first temperature program range to cause pyrolysis of a given polymer to its high value monomeric constituents prior to a temperature range that causes pyrolysis of other plastic components; selecting acid or base catalysts and an oxide or carbonate support for treating the feed stream to effect acid or base catalyzed reaction pathways to maximize yield or enhance separation of the high value monomeric constituents of polycarbonate and ABS in the first temperature program range; differentially heating the feed stream at a heat rate within the first temperature program range to provide differential pyrolysis for selective recovery of optimum quantities of the high value monomeric constituents prior to pyrolysis of other plastic components; separating the high value monomeric constituents; selecting a second higher temperature program range to cause pyrolysis of a different high value monomeric constituent of the plastic waste and differentially heating the feed stream at the second higher temperature program range to cause pyrolysis of different high value monomeric constituents; and separating the different high value monomeric constituents. 68 figs.
Evans, R.J.; Chum, H.L.
1998-10-13
A process is described for using fast pyrolysis in a carrier gas to convert a plastic waste feed stream having a mixed polymeric composition in a manner such that pyrolysis of a given polymer to its high value monomeric constituent occurs prior to pyrolysis of other plastic components therein comprising: selecting a first temperature program range to cause pyrolysis of said given polymer to its high value monomeric constituent prior to a temperature range that causes pyrolysis of other plastic components; selecting a catalyst and support for treating said feed streams with said catalyst to effect acid or base catalyzed reaction pathways to maximize yield or enhance separation of said high value monomeric constituent in said temperature program range; differentially heating said feed stream at a heat rate within the first temperature program range to provide differential pyrolysis for selective recovery of optimum quantities of the high value monomeric constituent prior to pyrolysis of other plastic components; separating the high value monomeric constituents; selecting a second higher temperature range to cause pyrolysis of a different high value monomeric constituent of said plastic waste and differentially heating the feed stream at the higher temperature program range to cause pyrolysis of the different high value monomeric constituent; and separating the different high value monomeric constituent. 83 figs.
Evans, Robert J.; Chum, Helena L.
1998-01-01
A process of using fast pyrolysis in a carrier gas to convert a plastic waste feedstream having a mixed polymeric composition in a manner such that pyrolysis of a given polymer to its high value monomeric constituent occurs prior to pyrolysis of other plastic components therein comprising: selecting a first temperature program range to cause pyrolysis of said given polymer to its high value monomeric constituent prior to a temperature range that causes pyrolysis of other plastic components; selecting a catalyst and support for treating said feed streams with said catalyst to effect acid or base catalyzed reaction pathways to maximize yield or enhance separation of said high value monomeric constituent in said temperature program range; differentially heating said feed stream at a heat rate within the first temperature program range to provide differential pyrolysis for selective recovery of optimum quantities of the high value monomeric constituent prior to pyrolysis of other plastic components; separating the high value monomeric constituents; selecting a second higher temperature range to cause pyrolysis of a different high value monomeric constituent of said plastic waste and differentially heating the feedstream at the higher temperature program range to cause pyrolysis of the different high value monomeric constituent; and separating the different high value monomeric constituent.
Evans, Robert J.; Chum, Helena L.
1994-01-01
A process of using fast pyrolysis to convert a plastic waste feed stream containing polycarbonate and ABS to high value monomeric constituents prior to pyrolysis of other plastic components therein comprising: selecting a first temperature program range to cause pyrolysis of a given polymer to its high value monomeric constituents prior to a temperature range that causes pyrolysis of other plastic components; selecting acid or base catalysts and an oxide or carbonate support for treating the feed stream to effect acid or base catalyzed reaction pathways to maximize yield or enhance separation of the high value monomeric constituents of polycarbonate and ABS in the first temperature program range; differentially heating the feed stream at a heat rate within the first temperature program range to provide differential pyrolysis for selective recovery of optimum quantities of the high value monomeric constituents prior to pyrolysis of other plastic components; separating the high value monomeric constituents; selecting a second higher temperature program range to cause pyrolysis of a different high value monomeric constituent of the plastic waste and differentially heating the feed stream at the second higher temperature program range to cause pyrolysis of different high value monomeric constituents; and separating the different high value monomeric constituents.
SEPARATION OF RADIOACTIVE COLUMBIUM TRACER
Glendenin, L.E.; Gest, H.
1958-08-26
A process is presented for the recovery of radioactive columbium from solutions containing such columbium together with radioactive tellurium. The columbium and tellurium values are separated from such solutions by means of an inorganic oxide carrier precipitate, such as MnO2. This oxide carrier precipitate and its associated columbium and tellurium values are then dissolved in an aqueous acidic solution, and nonradioactive tellurium, in an ionic form, is then introduced into such solution, for example in the form of H2TeO3. The tellurium present in the solution is then reduced to the elemental state, precipitates, and is then separated from the supernatant solution. A basic ferric acetate precipitate is formed in the supernatant and carries the remaining columbium values therefrom. After separation, this basic ferric acetate precipitate is dissolved, and the ferric ions are removed by means of an organic solvent extraction process utilizing ether. The remaining solution contains carrier-free columbium as its only metal ion.
Two optimal working regimes of the "long" Iguasu gas centrifuge
NASA Astrophysics Data System (ADS)
Borman, V. D.; Bogovalov, S. V.; Borisevich, V. D.; Tronin, I. V.; Tronin, V. N.
2016-09-01
We argue, on the basis of optimization calculations, that the optimal separative power of the Iguasu gas centrifuge with a 2 m rotor, as a function of pressure, has two local maxima, corresponding to pressures of p_max1 = 35 mmHg and p_max2 = 350 mmHg. The optimal separative power values at these maxima differ by 0.6%. The low-pressure maximum is caused by the thermal drive, whereas the high-pressure maximum is caused by both thermal and mechanical drives. The high-pressure maximum is located on a wide "plateau" from p_1 = 200 mmHg to p_2 = 500 mmHg, over which the optimal separative power changes within a range of 0.7%. In this way, the Iguasu gas centrifuge has two optimal working regimes with different sets of working parameters and slightly different values of the separative power. Calculations show that the high-pressure regime is less sensitive to parameter changes than the low-pressure one.
Capillary electrophoresis-based assay of phosphofructokinase-1.
Malina, Andrew; Bryant, Sherrisse K; Chang, Simon H; Waldrop, Grover L; Gilman, S Douglass
2014-02-15
An assay was developed for phosphofructokinase-1 (PFK-1) using capillary electrophoresis (CE). In the glycolytic pathway, this enzyme catalyzes the rate-limiting step from fructose-6-phosphate and magnesium-bound adenosine triphosphate (Mg-ATP) to fructose-1,6-bisphosphate and magnesium-bound adenosine diphosphate (Mg-ADP). This enzyme has recently become a research target because of the importance of glycolysis in cancer and obesity. The CE assay for PFK-1 is based on the separation and detection by ultraviolet (UV) absorbance at 260 nm of Mg-ATP and Mg-ADP. The separation was enhanced by the addition of Mg²⁺ to the separation buffer. Inhibition studies of PFK-1 by aurintricarboxylic acid and palmitoyl coenzyme A were also performed. An IC₅₀ value was determined for aurintricarboxylic acid, and this value matched values in the literature obtained using coupled spectrophotometric assays. This assay for PFK-1 directly monitors the enzyme-catalyzed reaction, and the CE separation reduces the potential of spectral interference by inhibitors.
Capillary Electrophoresis-Based Assay of Phosphofructokinase-1
Malina, Andrew; Bryant, Sherrisse K.; Chang, Simon H.; Waldrop, Grover L.; Gilman, S. Douglass
2013-01-01
An assay was developed for phosphofructokinase-1 (PFK-1) using capillary electrophoresis (CE). In the glycolytic pathway, this enzyme catalyzes the rate-limiting step from fructose-6-phosphate and magnesium-bound adenosine triphosphate (Mg-ATP) to fructose-1,6-bisphosphate and magnesium-bound adenosine diphosphate (Mg-ADP). This enzyme has recently become a research target because of the importance of glycolysis in cancer and obesity. The CE assay for PFK-1 is based on the separation and detection by UV absorbance at 260 nm of Mg-ATP and Mg-ADP. The separation was enhanced by addition of Mg2+ to the separation buffer. Inhibition studies of PFK-1 by aurintricarboxylic acid and palmitoyl coenzyme A were also performed. An IC50 value was determined for aurintricarboxylic acid, and this value matched values in the literature obtained using coupled spectrophotometric assays. This assay for PFK-1 directly monitors the enzyme-catalyzed reaction, and the CE separation reduces the potential of spectral interference by inhibitors. PMID:24444856
Turbulence measurements in hypersonic shock-wave boundary-layer interaction flows
NASA Technical Reports Server (NTRS)
Mikulla, V.; Horstman, C. C.
1976-01-01
Turbulent intensity and Reynolds shear stress measurements are presented for two nonadiabatic hypersonic shock-wave boundary-layer interaction flows, one with and one without separation. These measurements were obtained using a new hot-wire probe specially designed for heated flows. Comparison of the separated and attached flows shows a significant increase above equilibrium values in the turbulent intensity and shear stress downstream of the interaction region for the attached case, while for the separated case, the turbulent fluxes remain close to equilibrium values. This effect results in substantial differences in turbulence lifetime for the two flows. We propose that these differences are due to a coupling between the turbulent energy and separation bubble unsteadiness, a hypothesis supported by the statistical properties of the turbulent fluctuations.
Evans, R.J.; Chum, H.L.
1994-10-25
A process of using fast pyrolysis in a carrier gas to convert a plastic waste feedstream having a mixed polymeric composition in a manner such that pyrolysis of a given polymer to its high value monomeric constituent occurs prior to pyrolysis of other plastic components therein comprising: selecting a first temperature program range to cause pyrolysis of said given polymer to its high value monomeric constituent prior to a temperature range that causes pyrolysis of other plastic components; selecting a catalyst and support for treating said feed streams with said catalyst to effect acid or base catalyzed reaction pathways to maximize yield or enhance separation of said high value monomeric constituent in said temperature program range; differentially heating said feed stream at a heat rate within the first temperature program range to provide differential pyrolysis for selective recovery of optimum quantities of the high value monomeric constituent prior to pyrolysis of other plastic components; separating the high value monomeric constituents; selecting a second higher temperature range to cause pyrolysis of a different high value monomeric constituent of said plastic waste and differentially heating the feedstream at the higher temperature program range to cause pyrolysis of the different high value monomeric constituent; and separating the different high value monomeric constituent. 83 figs.
Evans, Robert J.; Chum, Helena L.
1994-01-01
A process of using fast pyrolysis in a carrier gas to convert a plastic waste feedstream having a mixed polymeric composition in a manner such that pyrolysis of a given polymer to its high value monomeric constituent occurs prior to pyrolysis of other plastic components therein comprising: selecting a first temperature program range to cause pyrolysis of said given polymer to its high value monomeric constituent prior to a temperature range that causes pyrolysis of other plastic components; selecting a catalyst and support for treating said feed streams with said catalyst to effect acid or base catalyzed reaction pathways to maximize yield or enhance separation of said high value monomeric constituent in said temperature program range; differentially heating said feed stream at a heat rate within the first temperature program range to provide differential pyrolysis for selective recovery of optimum quantities of the high value monomeric constituent prior to pyrolysis of other plastic components; separating the high value monomeric constituents; selecting a second higher temperature range to cause pyrolysis of a different high value monomeric constituent of said plastic waste and differentially heating the feedstream at the higher temperature program range to cause pyrolysis of the different high value monomeric constituent; and separating the different high value monomeric constituent.
Evans, R.J.; Chum, H.L.
1994-04-05
A process is described for using fast pyrolysis in a carrier gas to convert a plastic waste feedstream having a mixed polymeric composition in a manner such that pyrolysis of a given polymer to its high value monomeric constituent occurs prior to pyrolysis of other plastic components therein comprising: selecting a first temperature program range to cause pyrolysis of said given polymer to its high value monomeric constituent prior to a temperature range that causes pyrolysis of other plastic components; selecting a catalyst and support for treating said feed streams with said catalyst to effect acid or base catalyzed reaction pathways to maximize yield or enhance separation of said high value monomeric constituent in said temperature program range; differentially heating said feed stream at a heat rate within the first temperature program range to provide differential pyrolysis for selective recovery of optimum quantities of the high value monomeric constituent prior to pyrolysis of other plastic components; separating the high value monomeric constituents; selecting a second higher temperature range to cause pyrolysis of a different high value monomeric constituent of said plastic waste and differentially heating the feedstream at the higher temperature program range to cause pyrolysis of the different high value monomeric constituent; and separating the different high value monomeric constituent. 87 figures.
Evans, R.J.; Chum, H.L.
1994-10-25
A process of using fast pyrolysis in a carrier gas to convert a plastic waste feedstream having a mixed polymeric composition in a manner such that pyrolysis of a given polymer to its high value monomeric constituent occurs prior to pyrolysis of other plastic components therein comprising: selecting a first temperature program range to cause pyrolysis of said given polymer to its high value monomeric constituent prior to a temperature range that causes pyrolysis of other plastic components; selecting a catalyst and support for treating said feed streams with said catalyst to effect acid or base catalyzed reaction pathways to maximize yield or enhance separation of said high value monomeric constituent in said temperature program range; differentially heating said feed stream at a heat rate within the first temperature program range to provide differential pyrolysis for selective recovery of optimum quantities of the high value monomeric constituent prior to pyrolysis of other plastic components; separating the high value monomeric constituents; selecting a second higher temperature range to cause pyrolysis of a different high value monomeric constituent of said plastic waste and differentially heating the feedstream at the higher temperature program range to cause pyrolysis of the different high value monomeric constituent; and separating the different high value monomeric constituent. 83 figs.
Recalibration of the Shear Stress Transport Model to Improve Calculation of Shock Separated Flows
NASA Technical Reports Server (NTRS)
Georgiadis, Nicholas J.; Yoder, Dennis A.
2013-01-01
The Menter Shear Stress Transport (SST) k-ω turbulence model is one of the most widely used two-equation Reynolds-averaged Navier-Stokes turbulence models for aerodynamic analyses. The model extends Menter's baseline (BSL) model to include a limiter that prevents the calculated turbulent shear stress from exceeding a prescribed fraction of the turbulent kinetic energy via a proportionality constant, a1, set to 0.31. Compared to other turbulence models, the SST model yields superior predictions of mild adverse pressure gradient flows, including those with small separations. In shock/boundary-layer interaction regions, the SST model produces separations that are too large, while the BSL model is on the other extreme, predicting separations that are too small. In this paper, changing a1 to a value near 0.355 is shown to significantly improve predictions of shock-separated flows. Several cases are examined computationally, and experimental data are also considered to justify raising the value of a1 used for shock-separated flows.
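The a1 constant enters through the SST model's shear-stress limiter on the eddy viscosity, nu_t = a1*k / max(a1*omega, S*F2), which is the standard Menter formulation. A minimal sketch of how raising a1 increases the eddy viscosity wherever the limiter is active (the numerical values are illustrative assumptions, not data from the paper):

```python
def sst_eddy_viscosity(k, omega, strain_rate, F2, a1=0.31):
    """Kinematic eddy viscosity with Menter's shear-stress limiter:
    nu_t = a1 * k / max(a1 * omega, S * F2).
    When S * F2 exceeds a1 * omega the limiter is active and caps the
    turbulent shear stress at a1 times the turbulent kinetic energy."""
    return a1 * k / max(a1 * omega, strain_rate * F2)

# In a shock-separated region (large strain rate, F2 ~ 1) the limiter is
# active, so raising a1 from 0.31 to 0.355 raises nu_t proportionally,
# adding turbulent mixing and shrinking the predicted separation:
low = sst_eddy_viscosity(k=1.0, omega=100.0, strain_rate=500.0, F2=1.0, a1=0.31)
high = sst_eddy_viscosity(k=1.0, omega=100.0, strain_rate=500.0, F2=1.0, a1=0.355)
```

Away from such regions (small strain rate) the limiter is inactive and nu_t reverts to k/omega, so the change to a1 leaves attached-flow predictions largely untouched.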
Optimal control of orientation and entanglement for two dipole-dipole coupled quantum planar rotors.
Yu, Hongling; Ho, Tak-San; Rabitz, Herschel
2018-05-09
Optimal control simulations are performed for orientation and entanglement of two dipole-dipole coupled identical quantum rotors. The rotors at various fixed separations lie on a model non-interacting plane with an applied control field. It is shown that optimal control of orientation or entanglement represents two contrasting control scenarios. In particular, the maximally oriented state (MOS) of the two rotors has a zero entanglement entropy and is readily attainable at all rotor separations. In contrast, the maximally entangled state (MES) has a zero orientation expectation value and is most conveniently attainable at small separations where the dipole-dipole coupling is strong. It is demonstrated that the peak orientation expectation value attained by the MOS at large separations exhibits a long-time revival pattern due to the small energy splittings arising from the extremely weak dipole-dipole coupling between the degenerate product states of the two free rotors. Moreover, it is found that the peak entanglement entropy value attained by the MES remains largely unchanged as the two rotors are transported to large separations after turning off the control field. Finally, optimal control simulations of transition dynamics between the MOS and the MES reveal the intricate interplay between orientation and entanglement.
Fu, Shuangcheng; Fang, Yong; Yuan, Huixin; Tan, Wanjiang; Dong, Yiwen
2017-09-01
Hydrocyclones can be applied to recycle waste plastics with different densities by separating the plastics based on their differences in density. In this process, the medium density is one of the key parameters, and its optimum value is not simply the average of the densities of the two kinds of plastics being separated. Based on a force analysis and the equation of motion of particles in the hydrocyclone, a formula to calculate the optimum separation medium density has been deduced. This value of the medium's density is a function of various parameters, including the diameter, density, radial position and tangential velocity of the particles, and the viscosity of the medium. Tests of the separation performance of the hydrocyclone have been conducted with PET and PVC particles. The theoretical result appeared to be in good agreement with the experimental results.
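The role of those parameters can be sketched with a Stokes-drag force balance: the radial migration velocity changes sign with the density difference, so a medium density between the two plastic densities drives the species toward opposite outlets. The functional form and all numbers below are illustrative assumptions, not the formula deduced in the paper:

```python
def radial_settling_velocity(d, rho_p, rho_m, v_t, r, mu):
    """Stokes-regime radial migration velocity of a spherical particle in a
    swirling flow: centrifugal/buoyancy force proportional to
    (rho_p - rho_m) * v_t**2 / r balanced against Stokes drag.
    Positive -> outward migration; negative -> inward (SI units)."""
    return (rho_p - rho_m) * d ** 2 * v_t ** 2 / (18.0 * mu * r)

# With a medium density chosen between the two plastic densities, the two
# species migrate in opposite directions (all values assumed for illustration):
v_pet = radial_settling_velocity(d=3e-3, rho_p=1380.0, rho_m=1390.0,
                                 v_t=5.0, r=0.05, mu=1e-3)
v_pvc = radial_settling_velocity(d=3e-3, rho_p=1400.0, rho_m=1390.0,
                                 v_t=5.0, r=0.05, mu=1e-3)
```

Because the velocity also scales with d**2, v_t**2 and 1/r, the optimum medium density in a real hydrocyclone depends on particle size and position, which is consistent with the parameter list the abstract reports.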
DOE Office of Scientific and Technical Information (OSTI.GOV)
Snow, Mathew S.; Finck, Martha R.; Carney, Kevin P.
2017-02-01
Ta, Hf, and W analyses from complex matrices (including environmental samples) require high purification of these analytes from each other and from major/trace matrix constituents; however, current state-of-the-art Ta/Hf/W separations rely on traditional anion exchange approaches that suffer from relatively similar distribution coefficient (Kd) values for these analytes. This work reports assessment of three commercially available extraction chromatographic resins (TEVA, TRU, and UTEVA) for Ta/Hf/W separations. Batch contact studies show differences in Ta/W,Hf Kd values of up to 10^6, representing an improvement of a factor of 100 and 300 in Ta/Hf and Ta/W Kd values (respectively) over AG1x4 resin. Variations in the Kd values as a function of HCl concentration for TRU resin show that this resin is well suited for Ta/Hf/W separations, with Ta/Hf, Ta/W, and W/Hf Kd value improvements of 10, 200, and 30 (respectively) over AG1x4 resin. Finally, analyses of digested soil samples (NIST 2710a) using TRU resin and tandem TEVA-TRU columns demonstrate the ability to achieve extremely high purification (>99%) of Ta and W from each other and from Hf, as well as very high purification of Ta and W from the major and trace elemental constituents present in soils, using a single chromatographic step.
Rios, Pedro; Stuart, Julie Ann; Grant, Ed
2003-12-01
Annual plastic flows through the business and consumer electronics manufacturing supply chain include nearly 3 billion lb of high-value engineering plastics derived from petroleum. The recovery of resource value from this stream presents critical challenges in areas of materials identification and recycling process design that demand new green engineering technologies applied together with life cycle assessment and ecological supply chain analysis to create viable plastics-to-plastics supply cycles. The sustainable recovery of potentially high-value engineering plastics streams requires that recyclers either avoid mixing plastic parts or purify later by separating smaller plastic pieces created in volume reduction (shredding) steps. Identification and separation constitute significant barriers in the plastics-to-plastics recycling value proposition. In the present work, we develop a model that accepts randomly arriving electronic products to study scenarios by which a recycler might identify and separate high-value engineering plastics as well as metals. Using discrete event simulation, we compare current mixed plastics recovery with spectrochemical plastic resin identification and subsequent sorting. Our results show that limited disassembly with whole-part identification can produce substantial yields in separated streams of recovered engineering thermoplastics. We find that disassembly with identification does not constitute a bottleneck, but rather, with relatively few workers, can be configured to pull the process and thus decrease maximum staging space requirements.
Turbulence measurements in hypersonic shock-wave boundary-layer interaction flows
NASA Technical Reports Server (NTRS)
Mikulla, V.; Horstman, C. C.
1976-01-01
Turbulent intensity and Reynolds shear stress measurements are presented for two nonadiabatic hypersonic shock-wave boundary-layer interaction flows, one with and one without separation. These measurements were obtained using a new hot-wire probe specially designed for heated flows. Comparison of the separated and attached flows shows a significant increase above equilibrium values in the turbulent intensity and shear stress downstream of the interaction region for the attached case, while for the separated case, the turbulent fluxes remain close to equilibrium values. This effect results in substantial differences in turbulence lifetimes for the two flows. It is proposed that these differences are due to a coupling between the turbulent energy and separation bubble unsteadiness, a hypothesis supported by the statistical properties of the turbulent fluctuations.
An automated graphics tool for comparative genomics: the Coulson plot generator
2013-01-01
Background Comparative analysis is an essential component of biology. When applied to genomics, for example, analysis may require comparisons between the predicted presence and absence of genes in a group of genomes under consideration. Frequently, genes can be grouped into small categories based on functional criteria, for example membership of a multimeric complex, participation in a metabolic or signaling pathway, or shared sequence features and/or paralogy. These patterns of retention and loss are highly informative for the prediction of function, and hence possible biological context, and can provide great insights into the evolutionary history of cellular functions. However, representation of such information in a standard spreadsheet is a poor visual means from which to extract patterns within a dataset. Results We devised the Coulson plot, a new graphical representation that exploits a matrix of pie charts to display comparative genomics data. Each pie is used to describe a complex or process from a separate taxon, and is divided into sectors corresponding to the number of proteins (subunits) in a complex/process. The predicted presence or absence of proteins in each complex is indicated by occupancy of a given sector; this format is visually highly accessible and makes pattern recognition rapid and reliable. A key to the identity of each subunit, plus hierarchical naming of taxa and coloring, are included. A Java-based application, the Coulson plot generator (CPG), automates graphic production, taking a tab- or comma-delimited text file as input and generating an editable portable document format (PDF) or SVG file. Conclusions CPG software may be used to rapidly convert spreadsheet data to a graphical matrix pie chart format. The representation essentially retains all of the information from the spreadsheet but presents it in a graphically rich format, making comparisons and identification of patterns significantly clearer.
While the Coulson plot format was originally devised for comparative genomics, the software can be used to visualize any dataset where entity occupancy is compared between different classes. Availability CPG software is available at sourceforge http://sourceforge.net/projects/coulson and http://dl.dropbox.com/u/6701906/Web/Sites/Labsite/CPG.html PMID:23621955
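The abstract describes CPG's input as a tab- or comma-delimited text file of presence/absence calls. A hypothetical sketch of parsing such a matrix into per-taxon sector occupancy, the data model behind each pie (the column names and file layout here are invented for illustration, not CPG's documented format):

```python
import csv
import io

# Hypothetical input: rows are taxa, columns are complex subunits, 1 = present.
data = """taxon,SubA,SubB,SubC
Human,1,1,1
Yeast,1,0,1
Trypanosome,0,0,1
"""

occupancy = {}
for row in csv.DictReader(io.StringIO(data)):
    taxon = row.pop("taxon")
    # The filled sectors of this taxon's pie: subunits predicted present.
    occupancy[taxon] = [name for name, flag in row.items() if flag == "1"]

print(occupancy["Yeast"])  # ['SubA', 'SubC']
```

Rendering each list as filled sectors of a pie, one pie per taxon, reproduces the matrix-of-pies layout the paper describes.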
Exploring the universal ecological responses to climate change in a univoltine butterfly.
Fenberg, Phillip B; Self, Angela; Stewart, John R; Wilson, Rebecca J; Brooks, Stephen J
2016-05-01
Animals with distinct life stages are often exposed to different temperatures during each stage. Thus, how temperature affects these life stages should be considered for broadly understanding the ecological consequences of climate warming on such species. For example, temperature variation during particular life stages may affect respective changes in body size, phenology, and geographic range, which have been identified as the "universal" ecological responses to climate change. While each of these responses has been separately documented across a number of species, it is not known whether each response occurs together within a species. The influence of temperature during particular life stages may help explain each of these ecological responses to climate change. Our goal was to determine if monthly temperature variation during particular life stages of a butterfly species can predict respective changes in body size and phenology. We also refer to the literature to assess if temperature variability during the adult stage influences range change over time. Using historical museum collections paired with monthly temperature records, we show that changes in body size and phenology of the univoltine butterfly, Hesperia comma, are partly dependent upon temporal variation in summer temperatures during key stages of their life cycle. June temperatures, which are likely to affect growth rate of the final larval instar, are important for predicting adult body size (for males only; showing a positive relationship with temperature). July temperatures, which are likely to influence the pupal stage, are important for predicting the timing of adult emergence (showing a negative relationship with temperature). Previous studies show that August temperatures, which act on the adult stage, are linked to range change. 
Our study highlights the importance of considering temperature variation during each life stage over historic time-scales for understanding intraspecific response to climate change. Range edge studies of ectothermic species that have annual life cycles, long time-series occurrence data, and associated temperature records (ideally at monthly resolutions) could be useful model systems for intraspecific tests of the universal ecological responses to climate change and for exploring interactive effects. © 2016 The Authors. Journal of Animal Ecology © 2016 British Ecological Society.
Measuring the economic value of wildlife: a caution
T. H. Stevens
1992-01-01
Wildlife values appear to be very sensitive to whether species are evaluated separately or together, and value estimates often seem inconsistent with neoclassical economic theory. Wildlife value estimates must therefore be used with caution. Additional research about the nature of individual value structures for wildlife is needed.
Uptake of liquid from wet surfaces by the brush-tipped proboscis of a butterfly.
Lee, Seung Chul; Lee, Sang Joon
2014-11-06
This study investigated the effect of the brush-tipped proboscis of the Asian comma (Polygonia c-aureum) on wet-surface feeding. The tip region of this proboscis was observed, with particular attention to two microstructures: the intake slits through which liquid passes into the proboscis, and the brush-like sensilla styloconica. The sensilla styloconica are connected laterally to the intake slits in the tip region. The liquid-feeding flow between the proboscis and the wet surface was measured by micro-particle image velocimetry. During liquid feeding, the sensilla styloconica region accumulates liquid by pinning the air-liquid interface to the tips of the sensilla styloconica, so that the intake slit region remains immersed. The film flow that passes through the sensilla styloconica region shows a parabolic velocity profile, and the corresponding flow rate is proportional to the cube of the length of the sensilla styloconica. Based on these observations, we demonstrated that the sensilla styloconica promote the uptake of liquid from wet surfaces. This study may inspire the development of a microfluidic device to collect liquid from moist substrates.
Mating system and the evolution of sex-specific mortality rates in two nymphalid butterflies.
Wiklund, Christer; Gotthard, Karl; Nylin, Sören
2003-01-01
Life-history theory predicts that organisms should invest resources into intrinsic components of lifespan only to the degree that it pays off in terms of reproductive success. The benefit of a long life may differ between the sexes and different mating systems may therefore select for different sex-specific mortality rates. In insects with polyandrous mating systems, females mate throughout their lives and male reproductive success is likely to increase monotonously with lifespan. In monandrous systems, where the mating season is less protracted because receptive females are available only at the beginning of the flight season, male mating success should be less dependent on a long lifespan. Here, we show, in a laboratory experiment without predation, that the duration of the mating season is longer in the polyandrous comma butterfly, Polygonia c-album, than in the monandrous peacock butterfly, Inachis io, and that, in line with predictions, male lifespan is shorter than female lifespan in I. io, whereas male and female lifespans are similar in P. c-album. PMID:12964985
BOREAS HYD-2 Estimated Snow Water Equivalent (SWE) from Microwave Measurements
NASA Technical Reports Server (NTRS)
Powell, Hugh; Chang, Alfred T. C.; Hall, Forrest G. (Editor); Knapp, David E. (Editor); Smith, David E. (Technical Monitor)
2000-01-01
The surface meteorological data collected at the Boreal Ecosystem-Atmosphere Study (BOREAS) tower and ancillary sites are being used as inputs to an energy balance model to monitor the amount of snow storage in the boreal forest region. The BOREAS Hydrology (HYD)-2 team used Snow Water Equivalent (SWE) derived from an energy balance model and in situ observed SWE to compare the SWE inferred from airborne and spaceborne microwave data, and to assess the accuracy of microwave retrieval algorithms. The major external measurements that are needed are snowpack temperature profiles, in situ snow areal extent, and SWE data. The data in this data set were collected during February 1994 and cover portions of the Southern Study Area (SSA), Northern Study Area (NSA), and the transect areas. The data are available from BORIS as comma-delimited tabular ASCII files. The SWE data are available from the Earth Observing System Data and Information System (EOSDIS) Oak Ridge National Laboratory (ORNL) Distributed Active Archive Center (DAAC). The data files are available on a CD-ROM (see document number 20010000884).
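Since the abstract notes the SWE data are delivered as comma-delimited tabular ASCII files, here is a generic sketch of reading such a table; the column names and values are invented for illustration and are not the actual BORIS schema:

```python
import csv
import io

# Invented example rows; the real BOREAS HYD-2 files define their own columns.
text = """site,date,swe_mm
SSA-OJP,1994-02-05,41.2
NSA-OBS,1994-02-06,55.0
"""

rows = list(csv.DictReader(io.StringIO(text)))
mean_swe = sum(float(r["swe_mm"]) for r in rows) / len(rows)
print(round(mean_swe, 1))  # 48.1
```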
Different Trichoscopic Features of Tinea Capitis and Alopecia Areata in Pediatric Patients
El-Taweel, Abd-Elaziz; El-Esawy, Fatma; Abdel-Salam, Osama
2014-01-01
Background. Diagnosis of patchy hair loss in pediatric patients is often a matter of considerable debate among dermatologists. Trichoscopy is a rapid and noninvasive tool to detect more details of patchy hair loss. Like clinical dermatology, trichoscopy works parallel to the skin surface and perpendicular to the histological plane; like histopathology, it thus allows the viewing of structures not seen by the naked eye. Objective. To compare the different trichoscopic features of tinea capitis and alopecia areata in pediatric patients. Patients and Methods. This study included 40 patients, 20 patients with tinea capitis and 20 patients with alopecia areata. All underwent clinical examination, laboratory investigations (10% KOH and fungal culture), and trichoscopic examination. Results. In tinea capitis patients, comma-shaped hairs, corkscrew hairs, and zigzag-shaped hairs were the diagnostic trichoscopic features of tinea capitis. In alopecia areata patients, the most specific trichoscopic features were yellow dots, exclamation mark hairs, and short vellus hairs. Conclusion. Trichoscopy can be used as a noninvasive tool for rapid diagnosis of tinea capitis and alopecia areata in pediatric patients. PMID:25024698
DOE Office of Scientific and Technical Information (OSTI.GOV)
Booth, Brian W., E-mail: brbooth@clemson.edu; Institute for Biological Interfaces of Engineering, Clemson University, Clemson, SC 29634; Boulanger, Corinne A.
2010-02-01
Amphiregulin (AREG), a ligand for epidermal growth factor receptor, is required for mammary gland ductal morphogenesis and mediates estrogen actions in vivo, emerging as an essential growth factor during mammary gland growth and differentiation. The COMMA-D β-geo (CDβgeo) mouse mammary cell line displays characteristics of normal mammary progenitor cells including the ability to regenerate a mammary gland when transplanted into the cleared fat pad of a juvenile mouse, nuclear label retention, and the capacity to form anchorage-independent mammospheres. We demonstrate that AREG is essential for formation of floating mammospheres by CDβgeo cells and that the mitogen activated protein kinase signaling pathway is involved in AREG-mediated mammosphere formation. Addition of exogenous AREG promotes mammosphere formation in cells where AREG expression is knocked down by siRNA and mammosphere formation by AREG-/- mammary epithelial cells. AREG knockdown inhibits mammosphere formation by duct-limited mammary progenitor cells but not lobule-limited mammary progenitor cells. These data demonstrate AREG mediates the function of a subset of mammary progenitor cells in vitro.
McAllister, Chris T.; Duszynski, Donald W.; Fisher, Robert N.; Austin, Christopher C.
2014-01-01
Between September 1990 and November 1991, 19 Sphenomorphus spp. skinks, including nine S. jobiense, three S. simus, and seven Solomon ground skinks, S. solomonis (Boulenger), were collected from Madang and Morobe Provinces, Papua New Guinea (PNG), and examined for coccidia. A single S. solomonis was found to be infected with a new species of Eimeria Schneider, 1875. Oöcysts of Eimeria perkinsae n. sp. are ellipsoidal with a smooth, colourless, bi-layered wall, measure 18.6 × 14.7 μm, and have a length/width (L/W) ratio of 1.3; both micropyle and oöcyst residuum are absent, but a fragmented polar granule is present. Sporocysts are ovoidal, 8.9 × 6.4 μm, L/W 1.4; neither Stieda, sub-Stieda, nor para-Stieda bodies are present; the sporocyst residuum consists of a loose cluster of granules dispersed between the sporozoites. Sporozoites are comma-shaped with spheroidal anterior and posterior refractile bodies. This represents the first report of coccidia from this skink genus.
Cohen, Michael R.; Smetzer, Judy L.
2014-01-01
These medication errors have occurred in health care facilities at least once. They will happen again—perhaps where you work. Through education and alertness of personnel and procedural safeguards, they can be avoided. You should consider publishing accounts of errors in your newsletters and/or presenting them at your inservice training programs. Your assistance is required to continue this feature. The reports described here were received through the Institute for Safe Medication Practices (ISMP) Medication Errors Reporting Program. Any reports published by ISMP will be anonymous. Comments are also invited; the writers’ names will be published if desired. ISMP may be contacted at the address shown below. Errors, close calls, or hazardous conditions may be reported directly to ISMP through the ISMP Web site (www.ismp.org), by calling 800-FAIL-SAFE, or via e-mail at ismpinfo@ismp.org. ISMP guarantees the confidentiality and security of the information received and respects reporters’ wishes as to the level of detail included in publications. PMID:25477591
Kan, Hyo; Tsukagoshi, Kazuhiko
2017-07-01
Protein mixtures were separated using tube radial distribution chromatography (TRDC) in a polytetrafluoroethylene (PTFE) capillary (internal diameter = 100 µm) separation tube. Separation by TRDC is based on the annular flow in phase separation multiphase flow and features an open-tube capillary without the use of specific packing agents or application of high voltages. Preliminary experiments were conducted to examine the effects of pH and salt concentration on the phase diagram of the ternary mixed solvent solution of water-acetonitrile-ethyl acetate (8:2:1 volume ratio) and on the TRDC system using the ternary mixed solvent solution. A model protein mixture containing peroxidase, lysozyme, and bovine serum albumin was analyzed via TRDC with the ternary mixed solvent solution at various pH values, i.e., buffer-acetonitrile-ethyl acetate (8:2:1 volume ratio). Proteins were separated on the chromatograms by the TRDC system, where the elution order was determined by the relation between the isoelectric points of the proteins and the pH values of the solvent solution. Copyright © 2017 Elsevier B.V. All rights reserved.
Hosseinzadeh Nik, Tahereh; Shahsavari, Negin; Ghadirian, Hannaneh; Ostad, Seyed Nasser
2016-07-01
The aim of this randomized clinical study was to investigate the effectiveness of acetaminophen 650 mg or liquefied ibuprofen 400 mg in pain control of orthodontic patients during separation with an elastic separator. A total of 101 patients with specific inclusion criteria were divided randomly into three groups (acetaminophen, liquefied ibuprofen, and placebo). They were instructed to take their drugs one hour before separator placement and every six hours afterward (five doses in total). They recorded their discomfort on visual analog scales immediately after separator placement, 2 hours later, 6 hours later, at bedtime, and 24 hours after separator placement. Repeated-measures analysis of variance (ANOVA) was used to compare the mean pain scores between the three groups. Data were collected from 89 patients. The pain increased with time in all groups. Pain scores were statistically lower in the analgesic groups compared with the placebo group (P value < 0.001), but no statistically significant difference was found in mean pain scores between the two drug groups (acetaminophen and liquefied ibuprofen) (P value = 1). Acetaminophen and liquefied ibuprofen have similar potential in pain reduction during separation.
SEPARATION OF THORIUM FROM URANIUM
Bane, R.W.
1959-09-01
A description is given for the separation of thorium from uranium by forming an aqueous acidic solution containing ionic species of thorium, uranyl uranium, and hydroxylamine, flowing the solution through a column containing a phenol-formaldehyde type cation exchange resin to selectively adsorb substantially all the thorium values and a portion of the uranium values, flowing a dilute solution of hydrochloric acid through the column to desorb the uranium values, and then flowing through the column a dilute aqueous acidic solution containing an ion, such as bisulfate, which has a complexing effect upon thorium, to desorb substantially all of the thorium.
ADSORPTION METHOD FOR SEPARATING THORIUM VALUES FROM URANIUM VALUES
Boyd, G.E.; Russell, E.R.; Schubert, J.
1959-08-01
An improved ion exchange method is described for recovery of uranium and thorium values as separate fractions from an aqueous acidic solution containing less than 10^-3 M thorium ions and between 0.1 and 1 M uranyl ions. The solution is passed through a bed of cation exchange resin in the acid form to adsorb all the thorium ions and a portion of the uranyl ions. The uranium is eluted by means of aqueous 0.1 to 0.4 M sulfuric acid. The thorium may then be stripped from the resin by elution with aqueous 0.5 M oxalic acid.
Pyrolysis of polystyrene - polyphenylene oxide to recover styrene and useful products
Evans, Robert J.; Chum, Helena L.
1995-01-01
A process of using fast pyrolysis in a carrier gas to convert a polystyrene and polyphenylene oxide plastic waste to a given polystyrene and polyphenylene oxide prior to pyrolysis of other plastic components therein comprising: selecting a first temperature range to cause pyrolysis of given polystyrene and polyphenylene oxide and its high value monomeric constituent prior to a temperature range that causes pyrolysis of other plastic components; selecting a catalyst and a support and treating the feed stream with the catalyst to affect acid or base catalyzed reaction pathways to maximize yield or enhance separation of high value monomeric constituent of styrene from polystyrene and polyphenylene oxide in the first temperature range; differentially heating the feed stream at a heat rate within the first temperature range to provide differential pyrolysis for selective recovery of the high value monomeric constituent of styrene from polystyrene and polyphenylene oxide prior to pyrolysis of other plastic components; separating the high value monomer constituent of styrene; selecting a second higher temperature range to cause pyrolysis to a different derived high value product of polyphenylene oxide from the plastic waste and differentially heating the feed stream at the higher temperature range to cause pyrolysis of the plastic into a polyphenylene oxide derived product; and separating the different derived high value polyphenylene oxide product.
PROCESS OF SEPARATING PLUTONIUM VALUES BY ELECTRODEPOSITION
Wahl, A.C.
1958-04-15
A process is described of separating plutonium values from an aqueous solution by electrodeposition. The process consists of subjecting an aqueous 0.1 to 1.0 N nitric acid solution containing plutonium ions to electrolysis between inert metallic electrodes. A current density of one milliampere to one ampere per square centimeter of cathode surface and a temperature between 10 and 60 °C are maintained. Plutonium is electrodeposited on the cathode surface and recovered.
Kinetics of motility-induced phase separation and swim pressure
NASA Astrophysics Data System (ADS)
Patch, Adam; Yllanes, David; Marchetti, M. Cristina
2017-01-01
Active Brownian particles (ABPs) represent a minimal model of active matter consisting of self-propelled spheres with purely repulsive interactions and rotational noise. Here we examine the pressure of ABPs in two dimensions in both closed boxes and systems with periodic boundary conditions and show that its nonmonotonic behavior with density is a general property of ABPs and is not the result of finite-size effects. We correlate the time evolution of the mean pressure towards its steady-state value with the kinetics of motility-induced phase separation. For parameter values corresponding to phase-separated steady states, we identify two dynamical regimes. The pressure grows monotonically in time during the initial regime of rapid cluster formation, overshooting its steady-state value and then quickly relaxing to it, and remains constant during the subsequent slower period of cluster coalescence and coarsening. The overshoot is a distinctive feature of active systems.
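The ABP model the abstract describes (self-propelled spheres with rotational noise) can be sketched with a minimal update rule for a single free particle; this omits the repulsive interactions and confining walls that the paper's pressure and phase-separation results depend on, and the parameter values are illustrative only:

```python
import math
import random

def abp_step(x, y, theta, v0=1.0, dr=1.0, dt=1e-3, rng=random):
    """One Euler-Maruyama step for a free active Brownian particle:
    self-propulsion at speed v0 along heading theta, with rotational
    diffusion of strength Dr. Interactions are omitted in this sketch."""
    x += v0 * math.cos(theta) * dt
    y += v0 * math.sin(theta) * dt
    theta += math.sqrt(2.0 * dr * dt) * rng.gauss(0.0, 1.0)
    return x, y, theta

rng = random.Random(0)  # fixed seed for reproducibility
x = y = theta = 0.0
for _ in range(1000):
    x, y, theta = abp_step(x, y, theta, rng=rng)
print(math.isfinite(x) and math.isfinite(y))  # True
```

On times long compared with 1/Dr the heading decorrelates and the trajectory crosses over from ballistic to diffusive motion, which is the persistence that drives motility-induced phase separation once repulsion is added.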
Australia's TERN: Advancing Ecosystem Data Management in Australia
NASA Astrophysics Data System (ADS)
Phinn, S. R.; Christensen, R.; Guru, S.
2013-12-01
Globally, there is a consistent movement towards more open, collaborative and transparent science, where the publication and citation of data is considered standard practice. Australia's Terrestrial Ecosystem Research Network (TERN) is a national research infrastructure investment designed to support the ecosystem science community through all stages of the data lifecycle. TERN has developed and implemented a comprehensive network of 'hard' and 'soft' infrastructure that enables Australia's ecosystem scientists to collect, publish, store, share, discover and re-use data in ways not previously possible. The aim of this poster is to demonstrate how TERN has successfully delivered infrastructure that is enabling a significant cultural and practical shift in Australia's ecosystem science community towards consistent approaches for data collection, meta-data, data licensing, and data publishing. TERN enables multiple disciplines, within the ecosystem sciences, to more effectively and efficiently collect, store and publish their data. A critical part of TERN's approach has been to build on existing data collection activities, networks and skilled people to enable further coordination and collaboration to build each data collection facility and coordinate data publishing. Data collection in TERN is through discipline based facilities, covering long term collection of: (1) systematic plot based measurements of vegetation structure, composition and faunal biodiversity; (2) instrumented towers making systematic measurements of solar, water and gas fluxes; and (3) satellite and airborne maps of biophysical properties of vegetation, soils and the atmosphere. Several other facilities collect and integrate environmental data to produce national products for fauna and vegetation surveys, soils and coastal data, as well as integrated or synthesised products for modelling applications. 
Data management, publishing and sharing in TERN are implemented through a tailored data licensing framework suitable for ecosystem data, national standards for metadata, a DOI-minting service, and context-appropriate data repositories and portals. The TERN data infrastructure is based on a loosely coupled 'network of networks.' Overall, the data formats used across the TERN facilities range from NetCDF and comma-separated values to descriptive documents. Metadata standards include ISO 19115, Ecological Metadata Language and rich, semantically enabled contextual information. Data services range from Web Mapping Service, Web Feature Service and OPeNDAP to file servers and KNB Metacat. These approaches enable each data collection facility to maintain its discipline-based data collection and storage protocols. TERN facility metadata are harvested regularly for the central TERN Data Discovery Portal and converted to a national standard format. This approach enables centralised discovery, access, and re-use of data simply and effectively, while maintaining disciplinary diversity. Effort is still required to support the cultural shift towards acceptance of effective data management, publication, sharing and re-use as standard practice. To this end, TERN's future activities will be directed to supporting this transformation, undertaking 'education' to enable ecosystem scientists to take full advantage of TERN's infrastructure, and providing training and guidance for best-practice data management.
NASA Astrophysics Data System (ADS)
Leadbetter, Adam; Arko, Robert; Chandler, Cynthia; Shepherd, Adam
2014-05-01
"Linked Data" is a term used in Computer Science to encapsulate a methodology for publishing data and metadata in a structured format so that links may be created and exploited between objects. Berners-Lee (2006) outlines the following four design principles of a Linked Data system: Use Uniform Resource Identifiers (URIs) as names for things. Use HyperText Transfer Protocol (HTTP) URIs so that people can look up those names. When someone looks up a URI, provide useful information, using the standards (Resource Description Framework [RDF] and the RDF query language [SPARQL]). Include links to other URIs so that they can discover more things. In 2010, Berners-Lee revisited his original design plan for Linked Data to encourage data owners along a path to "good Linked Data". This revision involved the creation of a five star rating system for Linked Data outlined below. One star: Available on the web (in any format). Two stars: Available as machine-readable structured data (e.g. An Excel spreadsheet instead of an image scan of a table). Three stars: As two stars plus the use of a non-proprietary format (e.g. Comma Separated Values instead of Excel). Four stars: As three stars plus the use of open standards from the World Wide Web Commission (W3C) (i.e. RDF and SPARQL) to identify things, so that people can point to your data and metadata. Five stars: All the above plus link your data to other people's data to provide context Here we present work building on the SeaDataNet common vocabularies served by the NERC Vocabulary Server, connecting projects such as the Rolling Deck to Repository (R2R) and the Biological and Chemical Oceanography Data Management Office (BCO-DMO) and other vocabularies such as the Marine Metadata Interoperability Ontology Register and Repository and the NASA Global Change Master Directory to create a Linked Ocean Data cloud. 
Publishing the vocabularies and metadata in standard RDF XML and exposing SPARQL endpoints renders them five-star Linked Data repositories. The benefits of this approach include: increased interoperability between the metadata created by projects; improved data discovery, as users of SeaDataNet, R2R and BCO-DMO terms can find data using labels with which they are familiar; the ability to explore the data with both standard tools and newly developed custom tools; and, because standards are used, custom tools that are easier to develop. Linked Data is a concept which has been in existence for nearly a decade, and has a simple set of formal best practices associated with it. Linked Data is increasingly being seen as a driver of the next generation of "community science" activities. While many data providers in the oceanographic domain may be unaware of Linked Data, they may also be providing it at one of its lower levels. Here we have shown that it is possible to deliver the highest standard of Linked Oceanographic Data, and some of the benefits of the approach.
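The four design principles above can be illustrated with a minimal sketch in Python's standard library: things are named with HTTP URIs, and triples link a local resource to terms in external vocabularies. The dataset URI, predicate choices, and the NERC Vocabulary Server term below are illustrative of the pattern only, not actual records from the projects named here.

```python
# Minimal Linked Data sketch: format RDF triples in N-Triples syntax,
# linking a hypothetical local dataset URI to external vocabulary terms.

def ntriple(subject: str, predicate: str, obj: str) -> str:
    """Format one RDF triple in N-Triples syntax (URIs in angle brackets)."""
    return f"<{subject}> <{predicate}> <{obj}> ."

# A made-up dataset linked to a SeaDataNet-style parameter term and typed
# as a dcat:Dataset (both predicates are real W3C/DC vocabulary URIs).
triples = [
    ntriple("http://example.org/dataset/42",
            "http://purl.org/dc/terms/subject",
            "http://vocab.nerc.ac.uk/collection/P01/current/TEMPPR01/"),
    ntriple("http://example.org/dataset/42",
            "http://www.w3.org/1999/02/22-rdf-syntax-ns#type",
            "http://www.w3.org/ns/dcat#Dataset"),
]

document = "\n".join(triples)
print(document)
```

Because every name is a resolvable HTTP URI, a consumer can follow any of these links to discover more, which is exactly the fifth star of the rating scheme.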
Automated collection of imaging and phenotypic data to centralized and distributed data repositories
King, Margaret D.; Wood, Dylan; Miller, Brittny; Kelly, Ross; Landis, Drew; Courtney, William; Wang, Runtang; Turner, Jessica A.; Calhoun, Vince D.
2014-01-01
Accurate data collection at the ground level is vital to the integrity of neuroimaging research. Similarly important is the ability to connect and curate data in order to make it meaningful and sharable with other investigators. Collecting data, especially with several different modalities, can be time consuming and expensive. These issues have driven the development of automated collection of neuroimaging and clinical assessment data within COINS (Collaborative Informatics and Neuroimaging Suite). COINS is an end-to-end data management system. It provides a comprehensive platform for data collection, management, secure storage, and flexible data retrieval (Bockholt et al., 2010; Scott et al., 2011). It was initially developed for the investigators at the Mind Research Network (MRN), but is now available to neuroimaging institutions worldwide. Self Assessment (SA) is an application embedded in the Assessment Manager (ASMT) tool in COINS. It is an innovative tool that allows participants to fill out assessments via the web-based Participant Portal. It eliminates the need for paper collection and data entry by allowing participants to submit their assessments directly to COINS. Instruments (surveys) are created through ASMT and include many unique question types and associated SA features that can be implemented to help the flow of assessment administration. SA provides an instrument queuing system with an easy-to-use drag-and-drop interface for research staff to set up participants' queues. After a queue has been created for the participant, they can access the Participant Portal via the internet to fill out their assessments. This allows them the flexibility to participate from home, a library, on site, etc. The collected data is stored in a PostgreSQL database at MRN. This data is only accessible by users who have explicit permission to access the data through their COINS user accounts and access to the MRN network.
This allows for high-volume data collection with minimal user access to PHI (protected health information). An added benefit of using COINS is the ability to collect, store and share imaging data and assessment data with no interaction with outside tools or programs. All study data collected (imaging and assessment) is stored and exported with a participant's unique subject identifier, so there is no need to keep extra spreadsheets or databases to link and keep track of the data. Data is easily exported from COINS via the Query Builder and study portal tools, which allow fine-grained selection of data to be exported into comma separated value file format for easy import into statistical programs. There is a great need for data collection tools that limit human intervention and error while at the same time providing users with intuitive design. COINS aims to be a leader in database solutions for research studies collecting data from several different modalities. PMID:24926252
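A minimal sketch of this kind of comma-separated-value export, using Python's standard csv module. The subject identifiers, instrument name, and field names below are invented for illustration; they are not COINS's actual schema.

```python
import csv
import io

# Hypothetical assessment rows keyed by a unique subject identifier,
# as produced by a query over collected study data.
rows = [
    {"subject_id": "SUBJ0001", "instrument": "MoodScale", "score": 4},
    {"subject_id": "SUBJ0002", "instrument": "MoodScale", "score": 11},
]

# Write the rows as CSV, suitable for import into R or other
# statistical programs.
buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=["subject_id", "instrument", "score"])
writer.writeheader()
writer.writerows(rows)
csv_text = buffer.getvalue()
print(csv_text)
```

Keying every exported row on the subject identifier is what removes the need for side spreadsheets linking imaging and assessment data.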
NASA Astrophysics Data System (ADS)
Palanisamy, Giriprakash; Wilson, Bruce E.; Cook, Robert B.; Lenhardt, Chris W.; Santhana Vannan, Suresh; Pan, Jerry; McMurry, Ben F.; Devarakonda, Ranjeet
2010-12-01
The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC) is one of the science-oriented data centers in EOSDIS, aligned primarily with terrestrial ecology. The ORNL DAAC archives and serves data from NASA-funded field campaigns (such as BOREAS, FIFE, and LBA), regional and global data sets relevant to biogeochemical cycles, land validation studies for remote sensing, and source code for some terrestrial ecology models. Users of the ORNL DAAC include field ecologists, remote sensing scientists, modelers at various scales, synthesis science groups, a range of educational users (particularly baccalaureate and graduate instruction), and decision support analysts. It is clear that the wide range of users served by the ORNL DAAC have differing needs and differing capabilities for accessing and using data. It is also not possible for the ORNL DAAC, or the other data centers in EOSDIS, to develop all of the tools and interfaces to support even most of the potential uses of data directly. As is typical of information technology supporting a research enterprise, user needs will continue to evolve rapidly over time, and users themselves cannot predict future needs, as those needs depend on the results of current investigation. The ORNL DAAC is addressing these needs by targeted implementation of web services and tools which can be consumed by other applications, so that a modeler can retrieve data in netCDF format with the Climate and Forecast (CF) convention and a field ecologist can retrieve subsets of that same data in a comma separated value format, suitable for use in Excel or R. Tools such as our MODIS Subsetting capability, the Spatial Data Access Tool (SDAT; based on OGC web services), and OPeNDAP-compliant servers such as THREDDS particularly enable such diverse means of access. We also seek interoperability of metadata, recognizing that terrestrial ecology is a field where there are a very large number of relevant data repositories.
ORNL DAAC metadata is published to several metadata repositories using the Open Archives Initiative Protocol for Metadata Harvesting (OAI-PMH), to increase the chances that users can find data holdings relevant to their particular scientific problem. ORNL also seeks to leverage technology across these various data projects and encourage standardization of processes and technical architecture. This standardization is behind current efforts involving the use of Drupal and Fedora Commons. This poster describes the current and planned approaches that the ORNL DAAC is taking to enable cost-effective interoperability among data centers, both across the NASA EOSDIS data centers and across the international spectrum of terrestrial ecology-related data centers. The poster will highlight the standards that we are currently using across data formats, metadata formats, and data protocols. References: [1] Devarakonda R., et al. Mercury: reusable metadata management, data discovery and access system. Earth Science Informatics (2010), 3(1): 87-94. [2] Devarakonda R., et al. Data sharing and retrieval using OAI-PMH. Earth Science Informatics (2011), 4(1): 1-5.
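As a rough illustration of how an OAI-PMH harvest request is formed, the sketch below builds a ListRecords request URL with Python's standard library. The base URL and set name are placeholders, not ORNL DAAC endpoints; a real harvester would fetch the URL over HTTP and parse the returned XML (e.g. with xml.etree.ElementTree).

```python
from urllib.parse import urlencode

# OAI-PMH requests are plain HTTP GETs whose query string carries the
# protocol "verb" plus arguments such as the metadata format requested.
def listrecords_url(base_url, metadata_prefix="oai_dc", set_spec=None):
    params = {"verb": "ListRecords", "metadataPrefix": metadata_prefix}
    if set_spec is not None:
        params["set"] = set_spec  # optional selective-harvesting argument
    return f"{base_url}?{urlencode(params)}"

# Hypothetical repository and set name, for illustration only.
url = listrecords_url("https://example.org/oai", set_spec="terrestrial-ecology")
print(url)
```

Repeating the request with the resumptionToken returned in each response page is how a harvester walks a large repository, which is the mechanism that lets many catalogs stay synchronized cheaply.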
2016-01-01
Background After the Fukushima Dai-ichi Nuclear Power Station accident in Japan on March 11, 2011, a large number of comments, both positive and negative, were posted on social media. Objective The objective of this study was to clarify the characteristics of the trend in the number of tweets posted on Twitter, and to estimate how long public concern regarding the accident continued. We surveyed the attenuation period of the occurrence of terms related to radiation exposure as a surrogate endpoint for the duration of concern. Methods We retrieved 18,891,284 tweets from Twitter data between March 11, 2011 and March 10, 2012, containing 143 variables in Japanese. We selected radiation, radioactive, sievert (Sv), becquerel (Bq), and gray (Gy) as keywords to estimate the attenuation period of public concern regarding radiation exposure. These data, formatted as comma-separated values, were transferred into a Statistical Analysis System (SAS) dataset for analysis, and survival analysis methodology was followed using the SAS LIFETEST procedure. This study was approved by the institutional review board of Hokkaido University and informed consent was waived. Results A Kaplan-Meier curve was used to show the rate of Twitter users posting a message after the accident that included one or more of the keywords. The term Sv occurred in tweets up to one year after the first tweet. Among the Twitter users studied, 75.32% (880,108/1,168,542) tweeted the word radioactive and 9.20% (107,522/1,168,542) tweeted the term Sv. The first reduction was observed within the first 7 days after March 11, 2011. The means and standard errors (SEs) of the duration from the first tweet on March 11, 2011 were 31.9 days (SE 0.096) for radioactive and 300.6 days (SE 0.181) for Sv. These keywords were still being used at the end of the study period. The mean attenuation period for radioactive was one month, and approximately one year for radiation and radiation units.
The difference in mean duration between the keywords was attributed to the effect of mass media. Regularly posted messages, such as daily radiation dose reports, were relatively easy to detect from their time and formatted contents. The survival estimation indicated that public concern about the nuclear power plant accident remained after one year. Conclusions Although the simple plot of the number of tweets did not show clear results, we estimated the mean attenuation period as approximately one month for the keyword radioactive, and found that the keywords were still being used in posts at the end of the study period. Further research is required to quantify the effect of other phrases in social media data. The results of this exploratory study should advance progress in influencing and quantifying the communication of risk. PMID:27888168
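The survival methodology described above can be sketched with a minimal Kaplan-Meier estimator in plain Python. The observations below are invented for illustration and are not the study's data: each pair is (duration in days, event observed), where an "event" is a user's last use of the keyword and users still posting at study end are censored.

```python
# Minimal Kaplan-Meier estimator: at each event time, the survival
# probability is multiplied by (1 - events / number still at risk).
def kaplan_meier(observations):
    """Return [(time, survival_probability)] at each distinct event time."""
    survival = 1.0
    curve = []
    for time in sorted({t for t, event in observations if event}):
        n_at_risk = sum(1 for t, _ in observations if t >= time)
        n_events = sum(1 for t, event in observations if t == time and event)
        survival *= 1 - n_events / n_at_risk
        curve.append((time, survival))
    return curve

# Five hypothetical users; the last one is censored at study end (day 365).
data = [(7, True), (7, True), (30, True), (300, True), (365, False)]
print(kaplan_meier(data))
```

Censored observations contribute to the at-risk counts but never trigger a drop in the curve, which is how "keywords still being used at the end of the study period" enter the estimate.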
Parallel basal ganglia circuits for decision making.
Hikosaka, Okihide; Ghazizadeh, Ali; Griggs, Whitney; Amita, Hidetoshi
2018-03-01
The basal ganglia control body movements based mainly on their values. Critical for this mechanism are dopamine neurons, which send unpredicted value signals mainly to the striatum. This mechanism enables animals to change their behaviors flexibly, eventually choosing a valuable behavior. However, this may not be the best behavior, because the flexible choice is focused on recent, and therefore limited, experiences (i.e., short-term memories). Our old and recent studies suggest that the basal ganglia contain separate circuits that process value signals in a completely different manner. These circuits are insensitive to recent changes in value, yet gradually accumulate the value of each behavior (i.e., movement or object choice). These stable circuits eventually encode the values of many behaviors and then retain the value signals for a long time (i.e., long-term memories). They are innervated by a separate group of dopamine neurons that retain value signals even when no reward is predicted. Importantly, the stable circuits can control motor behaviors (e.g., of hand or eye) quickly and precisely, which allows animals to automatically acquire valuable outcomes based on historical life experiences. Such behaviors would be called 'skills', which are crucial for survival. The stable circuits are localized in the posterior part of the basal ganglia, separately from the flexible circuits located in the anterior part. To summarize, the flexible and stable circuits in the basal ganglia, working together but independently, enable animals (and humans) to reach valuable goals in various contexts.
Chum, H.L.; Evans, R.J.
1992-08-04
A process is described for using fast pyrolysis in a carrier gas to convert a waste phenolic resin containing feedstreams in a manner such that pyrolysis of said resins and a given high value monomeric constituent occurs prior to pyrolyses of the resins in other monomeric components therein comprising: selecting a first temperature program range to cause pyrolysis of said resin and a given high value monomeric constituent prior to a temperature range that causes pyrolysis of other monomeric components; selecting, if desired, a catalyst and a support and treating said feedstreams with said catalyst to effect acid or basic catalyzed reaction pathways to maximize yield or enhance separation of said high value monomeric constituent in said first temperature program range to utilize reactive gases such as oxygen and steam in the pyrolysis process to drive the production of specific products; differentially heating said feedstreams at a heat rate within the first temperature program range to provide differential pyrolysis for selective recovery of optimum quantity of said high value monomeric constituent prior to pyrolysis of other monomeric components therein; separating said high value monomeric constituent; selecting a second higher temperature program range to cause pyrolysis of a different high value monomeric constituent of said phenolic resins waste and differentially heating said feedstreams at said higher temperature program range to cause pyrolysis of said different high value monomeric constituent; and separating said different high value monomeric constituent. 11 figs.
Chum, Helena L.; Evans, Robert J.
1992-01-01
A process of using fast pyrolysis in a carrier gas to convert a waste phenolic resin containing feedstreams in a manner such that pyrolysis of said resins and a given high value monomeric constituent occurs prior to pyrolyses of the resins in other monomeric components therein comprising: selecting a first temperature program range to cause pyrolysis of said resin and a given high value monomeric constituent prior to a temperature range that causes pyrolysis of other monomeric components; selecting, if desired, a catalyst and a support and treating said feedstreams with said catalyst to effect acid or basic catalyzed reaction pathways to maximize yield or enhance separation of said high value monomeric constituent in said first temperature program range to utilize reactive gases such as oxygen and steam in the pyrolysis process to drive the production of specific products; differentially heating said feedstreams at a heat rate within the first temperature program range to provide differential pyrolysis for selective recovery of optimum quantity of said high value monomeric constituent prior to pyrolysis of other monomeric components therein; separating said high value monomeric constituent; selecting a second higher temperature program range to cause pyrolysis of a different high value monomeric constituent of said phenolic resins waste and differentially heating said feedstreams at said higher temperature program range to cause pyrolysis of said different high value monomeric constituent; and separating said different high value monomeric constituent.
Koltun, G.F.; Ostheimer, Chad J.; Griffin, Michael S.
2006-01-01
Velocity, bathymetry, and transverse (cross-channel) mixing characteristics were studied in a 34-mile study reach of the Ohio River extending from the lower pool of the Captain Anthony Meldahl Lock and Dam, near Willow Grove, Ky., to just downstream from the confluence of the Licking and Ohio Rivers, near Newport, Ky. Information gathered in this study ultimately will be used to parameterize hydrodynamic and water-quality models that are being developed for the study reach. Velocity data were measured at an average cross-section spacing of about 2,200 feet by means of boat-mounted acoustic Doppler current profilers (ADCPs). ADCP data were postprocessed to create text files describing the three-dimensional velocity characteristics in each transect. Bathymetry data were measured at an average transect spacing of about 800 feet by means of a boat-mounted single-beam echosounder. Depth information obtained from the echosounder was postprocessed with water-surface slope and elevation information collected during the surveys to compute stream-bed elevations. The bathymetry data were written to text files formatted as a series of space-delimited x-, y-, and z-coordinates. Two separate dye-tracer studies were done on different days in overlapping stream segments in an 18.3-mile section of the study reach to assess transverse mixing characteristics in the Ohio River. Rhodamine WT dye was injected into the river at a constant rate, and concentrations were measured in downstream cross sections, generally spaced 1 to 2 miles apart. The dye was injected near the Kentucky shoreline during the first study and near the Ohio shoreline during the second study. Dye concentrations were measured along transects in the river by means of calibrated fluorometers equipped with flow-through chambers, automatic temperature compensation, and internal data loggers.
The use of flow-through chambers permitted water to be pumped continuously out of the river from selected depths and through the fluorometer for measurement as the boat traversed the river. Time-tagged concentration readings were joined with horizontal coordinate data simultaneously captured from a differentially corrected Global Positioning System (GPS) device to create a plain-text, comma-separated value (CSV) file containing spatially tagged dye-concentration data. Plots showing the transverse variation in relative dye concentration indicate that, within the stream segments sampled, complete transverse mixing of the dye did not occur. In addition, the highest concentrations of dye tended to be nearest the side of the river from which the dye was injected. Velocity, bathymetry, and dye-concentration data collected during this study are available for Internet download by means of hyperlinks in this report. Data contained in this report were collected between October 2004 and March 2006.
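As a rough sketch of the postprocessing step that joins time-tagged concentration readings with GPS fixes into a comma-separated file, the Python below matches each reading to the nearest-in-time fix and writes one CSV row per reading. The field names and data are invented for illustration, not the report's actual format.

```python
import bisect
import csv
import io

# Hypothetical GPS fixes (seconds, latitude, longitude) and fluorometer
# readings (seconds, dye concentration); both streams are time-tagged.
gps_fixes = [(0, 39.1001, -84.5001), (10, 39.1002, -84.5005), (20, 39.1003, -84.5010)]
readings = [(2, 1.8), (11, 2.4), (19, 3.1)]

def nearest_fix(t, fixes):
    """Return the GPS fix whose timestamp is closest to time t."""
    times = [f[0] for f in fixes]
    i = bisect.bisect_left(times, t)
    candidates = fixes[max(0, i - 1):i + 1]
    return min(candidates, key=lambda f: abs(f[0] - t))

# Join each reading with its nearest fix and emit spatially tagged CSV rows.
buffer = io.StringIO()
writer = csv.writer(buffer)
writer.writerow(["time_s", "lat", "lon", "concentration"])
for t, conc in readings:
    _, lat, lon = nearest_fix(t, gps_fixes)
    writer.writerow([t, lat, lon, conc])
print(buffer.getvalue())
```

In practice the join might interpolate between fixes rather than snap to the nearest one; nearest-neighbor is the simplest version of the idea.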
Discordant Place-Based Literacies in the Hilton Head, South Carolina Runway Extension Debate
ERIC Educational Resources Information Center
Cooney, Emily
2014-01-01
In making a case for ecocomposition, Sidney Dobrin has claimed that writing, place, and environment cannot be separated. As Donehower, Hogg, and Schell and Deborah Brandt might argue, literacy cannot be separated from place either. But it might sometimes be separated from environment as an ecosystem that has value distinct from, and without the…
TRIDENT 1 third stage motor separation system
NASA Technical Reports Server (NTRS)
Welch, B. H.; Richter, B. J.; Sue, P.
1977-01-01
The third stage engine separation system has shown through test and analysis that it can effectively and reliably perform its function. The weight of the hardware associated with this system is well within the targeted value.
NASA Astrophysics Data System (ADS)
Turan, Muhammed K.; Sehirli, Eftal; Elen, Abdullah; Karas, Ismail R.
2015-07-01
Gel electrophoresis (GE) is one of the most widely used methods to separate DNA, RNA and protein molecules according to size, weight and quantity parameters in many areas such as genetics, molecular biology, biochemistry and microbiology. The main way to separate each molecule is to find the borders of each molecule fragment. This paper presents a software application that shows the column edges of DNA fragments in three steps. In the first step, the application obtains lane histograms of agarose gel electrophoresis images by projection along the x-axis. In the second step, it utilizes the k-means clustering algorithm to classify the point values of the lane histogram into left-side values, right-side values and undesired values. In the third step, the column edges of DNA fragments are shown by using a mean algorithm and mathematical processes to separate DNA fragments from the background in a fully automated way. In addition, the application presents the locations of DNA fragments and how many DNA fragments exist on images captured by a scientific camera.
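The first two steps of this pipeline (projection onto the x-axis to get a lane histogram, then k-means clustering of the histogram values) can be sketched in plain Python on a toy grayscale "image". The two-row image, k = 2, and the naive initialization below are illustrative simplifications, not the paper's implementation.

```python
# Step 1: project a grayscale image (list of rows, 0 = dark band,
# 255 = background) onto the x-axis by summing each column.
def column_histogram(image):
    return [sum(col) for col in zip(*image)]

# Step 2: a simple 1-D k-means to split histogram values into clusters
# (e.g. dark lane columns vs. bright background columns).
def kmeans_1d(values, k=2, iters=20):
    # Naive initialization: spread initial centers across the sorted values.
    centers = sorted(values)[:: max(1, len(values) // k)][:k]
    for _ in range(iters):
        groups = [[] for _ in centers]
        for v in values:
            i = min(range(len(centers)), key=lambda j: abs(v - centers[j]))
            groups[i].append(v)
        centers = [sum(g) / len(g) if g else c for g, c in zip(groups, centers)]
    return centers

# Toy image with two dark two-column "lanes" on a bright background.
image = [
    [255, 0, 0, 255, 255, 0, 0, 255],
    [255, 0, 0, 255, 255, 0, 0, 255],
]
hist = column_histogram(image)
centers = sorted(kmeans_1d(hist, k=2))
print(hist, centers)
```

The boundaries between columns assigned to the dark cluster and columns assigned to the bright cluster are what the paper reports as the column edges of the DNA fragments.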
Formation of ordered microphase-separated pattern during spin coating of ABC triblock copolymer.
Huang, Weihuan; Luo, Chunxia; Zhang, Jilin; Han, Yanchun
2007-03-14
In this paper, the authors have systematically studied the microphase separation and crystallization during spin coating of an ABC triblock copolymer, polystyrene-b-poly(2-vinylpyridine)-b-poly(ethylene oxide) (PS-b-P2VP-b-PEO). The microphase separation of PS-b-P2VP-b-PEO and the crystallization of the PEO blocks can be modulated by the types of the solvent and the substrate, the spinning speed, and the copolymer concentration. An ordered microphase-separated pattern, in which the PEO and P2VP blocks adsorb to the substrate and PS protrusions form hexagonal dots above the P2VP domains, can be obtained only when PS-b-P2VP-b-PEO is dissolved in N,N-dimethylformamide and the films are spin coated onto a polar substrate, silicon wafers or mica. The mechanism of the formation of the regular pattern by microphase separation is found to be mainly related to the inducement of the substrate (the middle block P2VP wetting the polar substrate), the rapid loss of the solvent during the early stage of spin coating, and the slow evaporation of the remaining solvent during the subsequent stage. On the other hand, the probability of the crystallization of the PEO blocks during spin coating decreases with reduced film thickness. When the film thickness reaches a certain value (3.0 nm), the extensive crystallization of PEO is effectively prohibited and an ordered microphase-separated pattern over large areas can be routinely prepared. When the film thickness exceeds another definite value (12.0 nm), the crystallization of PEO dominates the surface morphology. For films with thickness between these two values, microphase separation and crystallization can occur simultaneously.
Separate-channel analysis of two-channel microarrays: recovering inter-spot information.
Smyth, Gordon K; Altman, Naomi S
2013-05-26
Two-channel (or two-color) microarrays are cost-effective platforms for comparative analysis of gene expression. They are traditionally analysed in terms of the log-ratios (M-values) of the two channel intensities at each spot, but this analysis does not use all the information available in the separate channel observations. Mixed models have been proposed to analyse intensities from the two channels as separate observations, but such models can be complex to use and the gain in efficiency over the log-ratio analysis is difficult to quantify. Mixed models yield test statistics for which the null distributions can be specified only approximately, and some approaches do not borrow strength between genes. This article reformulates the mixed model to clarify the relationship with the traditional log-ratio analysis, to facilitate information borrowing between genes, and to obtain an exact distributional theory for the resulting test statistics. The mixed model is transformed to operate on the M-values and A-values (average log-expression for each spot) instead of on the log-expression values. The log-ratio analysis is shown to ignore information contained in the A-values. The relative efficiency of the log-ratio analysis is shown to depend on the size of the intra-spot correlation. A new separate-channel analysis method is proposed that assumes a constant intra-spot correlation coefficient across all genes. This approach permits the mixed model to be transformed into an ordinary linear model, allowing the data analysis to use a well-understood empirical Bayes analysis pipeline for linear modeling of microarray data. This yields statistically powerful test statistics that have an exact distributional theory. The log-ratio, mixed model and common correlation methods are compared using three case studies. The results show that separate-channel analyses that borrow strength between genes are more powerful than log-ratio analyses.
The common correlation analysis is the most powerful of all. The common correlation method proposed in this article for separate-channel analysis of two-channel microarray data is no more difficult to apply in practice than the traditional log-ratio analysis. It provides an intuitive and powerful means to conduct analyses and make comparisons that might otherwise not be possible.
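The M-value/A-value transformation at the center of this reformulation can be written down directly: for each spot's two channel intensities R and G, M = log2(R/G) is the log-ratio and A = (log2 R + log2 G)/2 is the average log-intensity. The intensities below are illustrative.

```python
import math

def ma_values(red, green):
    """Transform a spot's two channel intensities into (M, A)."""
    m = math.log2(red / green)                     # log-ratio
    a = (math.log2(red) + math.log2(green)) / 2    # average log-expression
    return m, a

# A four-fold change in the red channel relative to green:
m, a = ma_values(1024.0, 256.0)
print(m, a)
```

A traditional log-ratio analysis keeps only M for each spot; the separate-channel methods discussed above recover the information carried by A as well.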
METHOD OF SEPARATING TETRAVALENT PLUTONIUM VALUES FROM CERIUM SUB-GROUP RARE EARTH VALUES
Duffield, R.B.; Stoughton, R.W.
1959-02-01
A method is presented for separating plutonium from the cerium sub-group of rare earths when both are present in an aqueous solution. The method consists in adding an excess of alkali metal carbonate to the solution, which causes the formation of a soluble plutonium carbonate complex and at the same time forms an insoluble cerium-group rare earth carbonate. The pH value must be adjusted to between 5.5 and 7.5, and prior to the precipitation step the plutonium must be reduced to the tetravalent state, since only tetravalent plutonium will form the soluble carbonate complex.
Intelligibility for Binaural Speech with Discarded Low-SNR Speech Components.
Schoenmaker, Esther; van de Par, Steven
2016-01-01
Speech intelligibility in multitalker settings improves when the target speaker is spatially separated from the interfering speakers. A factor that may contribute to this improvement is the improved detectability of target-speech components due to binaural interaction in analogy to the Binaural Masking Level Difference (BMLD). This would allow listeners to hear target speech components within specific time-frequency intervals that have a negative SNR, similar to the improvement in the detectability of a tone in noise when these contain disparate interaural difference cues. To investigate whether these negative-SNR target-speech components indeed contribute to speech intelligibility, a stimulus manipulation was performed where all target components were removed when local SNRs were smaller than a certain criterion value. It can be expected that for sufficiently high criterion values target speech components will be removed that do contribute to speech intelligibility. For spatially separated speakers, assuming that a BMLD-like detection advantage contributes to intelligibility, degradation in intelligibility is expected already at criterion values below 0 dB SNR. However, for collocated speakers it is expected that higher criterion values can be applied without impairing speech intelligibility. Results show that degradation of intelligibility for separated speakers is only seen for criterion values of 0 dB and above, indicating a negligible contribution of a BMLD-like detection advantage in multitalker settings. These results show that the spatial benefit is related to a spatial separation of speech components at positive local SNRs rather than to a BMLD-like detection improvement for speech components at negative local SNRs.
Autonomously Propelled Motors for Value-Added Product Synthesis and Purification.
Srivastava, Sarvesh K; Schmidt, Oliver G
2016-06-27
A proof-of-concept design for autonomous, self-propelling motors towards value-added product synthesis and separation is presented. The hybrid motor design consists of two distinct functional blocks. The first, a sodium borohydride (NaBH4) granule, serves both as a reaction prerequisite for the reduction of vanillin and also as a localized solid-state fuel in the reaction mixture. The second, capping functional block, consisting of a graphene-polymer composite, serves as a hydrophobic matrix to attract the reaction product vanillyl alcohol (VA), resulting in facile separation of this edible value-added product. These autonomously propelled motors were fabricated at a length scale down to 400 μm, and once introduced in the reaction environment showed rapid bubble propulsion followed by high-purity separation of the reaction product (VA) by virtue of the graphene-polymer cap acting as a mesoporous sponge. The concept has excellent potential for the synthesis/isolation of industrially important compounds, affinity-based product separation, pollutant remediation (such as heavy metal chelation/adsorption), as well as localized fuel gradients as an alternative to external fuel dependency. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Separability of spatiotemporal spectra of image sequences. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Eckert, Michael P.; Buchsbaum, Gershon; Watson, Andrew B.
1992-01-01
The spatiotemporal power spectrum of 14 image sequences was calculated in order to determine the degree to which the spectra are separable in space and time, and to assess the validity of the commonly used exponential correlation model found in the literature. The spectrum was expanded by a Singular Value Decomposition into a sum of separable terms, and an index of spatiotemporal separability was defined as the fraction of the signal energy that can be represented by the first (largest) separable term. All spectra were found to be highly separable, with an index of separability above 0.98. The power spectra of the sequences were well fit by a separable model. The power spectrum model corresponds to a product of exponential autocorrelation functions separable in space and time.
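The separability index defined above (energy fraction of the first SVD term) can be sketched as follows; the exponential profiles are illustrative stand-ins, not the measured spectra:

```python
import numpy as np

def separability_index(spectrum):
    """Fraction of the signal energy captured by the first (largest)
    separable term of the SVD expansion of a 2-D power spectrum
    (rows = spatial frequency, columns = temporal frequency)."""
    s = np.linalg.svd(spectrum, compute_uv=False)
    return s[0] ** 2 / np.sum(s ** 2)

# A perfectly space-time separable spectrum is an outer product of a
# spatial and a temporal profile, so its index is 1.0 (up to rounding).
fs = np.exp(-np.arange(8.0))        # illustrative spatial profile
ft = np.exp(-0.5 * np.arange(6.0))  # illustrative temporal profile
idx = separability_index(np.outer(fs, ft))
```

An index near 1 (the abstract reports above 0.98) means a single separable term carries almost all the energy.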
NASA Astrophysics Data System (ADS)
Reese, D. R.; Lignières, F.; Ballot, J.; Dupret, M.-A.; Barban, C.; van't Veer-Menneret, C.; MacGregor, K. B.
2017-05-01
Context. Mode identification has remained a major obstacle in the interpretation of pulsation spectra in rapidly rotating stars. This has motivated recent work on calculating realistic multi-colour mode visibilities in this type of star. Aims: We would like to test mode identification methods and seismic diagnostics in rapidly rotating stars, using oscillation spectra that are based on these new theoretical predictions. Methods: We investigate the auto-correlation function and Fourier transform of theoretically calculated frequency spectra, in which modes are selected according to their visibilities. Given that intrinsic mode amplitudes are determined by non-linear saturation and cannot currently be theoretically predicted, we experimented with various ad hoc prescriptions for setting the mode amplitudes, including using random values. Furthermore, we analyse the ratios between mode amplitudes observed in different photometric bands to see to what extent they can identify modes. Results: When non-random intrinsic mode amplitudes are used, our results show that it is possible to extract a mean value for the large frequency separation or half its value and, sometimes, twice the rotation rate, from the auto-correlation of the frequency spectra. Furthermore, the Fourier transforms are mostly sensitive to the large frequency separation or half its value. The combination of the two methods may therefore measure and distinguish the two types of separations. When the intrinsic mode amplitudes include random factors, which seems more representative of real stars, the results are far less favourable. It is only when the large separation or half its value coincides with twice the rotation rate that it might be possible to detect the signature of a frequency regularity. We also find that amplitude ratios are a good way of grouping together modes with similar characteristics. 
By analysing the frequencies of these groups, it is possible to constrain mode identification, as well as determine the large frequency separation and the rotation rate.
METHOD FOR RECOVERING PLUTONIUM VALUES FROM SOLUTION USING A BISMUTH HYDROXIDE CARRIER PRECIPITATE
Faris, B.F.
1961-04-25
Carrier precipitation processes for separating plutonium values from aqueous solutions are described. In accordance with the invention a bismuth hydroxide precipitate is formed in the plutonium-containing solution, thereby carrying plutonium values from the solution.
Sartory, Walter K.; Eveleigh, John W.
1976-01-01
A method and apparatus for operating a continuous flow blood separation centrifuge are provided. The hematocrit of the entrant whole blood is continuously maintained at an optimum constant value by the addition of plasma to the entrant blood. The hematocrit of the separated red cells is monitored to indicate the degree of separation taking place, thereby providing a basis for regulating the flow through the centrifuge.
Improved separability criteria via some classes of measurements
NASA Astrophysics Data System (ADS)
Shen, Shu-Qian; Li, Ming; Li-Jost, Xianqing; Fei, Shao-Ming
2018-05-01
The entanglement detection via local measurements can be experimentally implemented. Based on mutually unbiased measurements and general symmetric informationally complete positive-operator-valued measures, we present separability criteria for bipartite quantum states, which, by theoretical analysis, are stronger than the related existing criteria via these measurements. Two detailed examples are supplemented to show the efficiency of the presented separability criteria.
Lee, Kyung Hee; Kang, Seung Kwan; Goo, Jin Mo; Lee, Jae Sung; Cheon, Gi Jeong; Seo, Seongho; Hwang, Eui Jin
2017-03-01
To compare the relationship between Ktrans from DCE-MRI and K1 from dynamic 13N-NH3 PET, with simultaneous and separate MR/PET, in the VX-2 rabbit carcinoma model. MR/PET was performed simultaneously and separately, 14 and 15 days after VX-2 tumor implantation at the paravertebral muscle. The Ktrans and K1 values were estimated using an in-house software program. The relationships between Ktrans and K1 were analyzed using Pearson's correlation coefficients and linear/non-linear regression functions. Assuming a linear relationship, Ktrans and K1 exhibited moderate positive correlations with both simultaneous (r=0.54-0.57) and separate (r=0.53-0.69) imaging. However, while the Ktrans and K1 from separate imaging were linearly correlated, those from simultaneous imaging exhibited a non-linear relationship. The amount of change in K1 associated with a unit increase in Ktrans varied depending on the Ktrans value. The relationship between Ktrans and K1 may therefore be misinterpreted with separate MR and PET acquisition. Copyright© 2017, International Institute of Anticancer Research (Dr. George J. Delinasios), All rights reserved.
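The distinction drawn above, a similar correlation coefficient hiding either a linear or a saturating relationship, can be illustrated with synthetic data; the functional forms, coefficients, and noise level below are illustrative assumptions, not the study's data:

```python
import numpy as np

# Hypothetical paired perfusion parameters for 40 measurements.
rng = np.random.default_rng(1)
k_trans = rng.uniform(0.05, 0.6, 40)

# Linear relation (as reported for separate imaging)
k1_separate = 0.8 * k_trans + rng.normal(0.0, 0.03, 40)
# Saturating relation (as reported for simultaneous imaging)
k1_simult = 0.5 * (1.0 - np.exp(-6.0 * k_trans)) + rng.normal(0.0, 0.03, 40)

r_lin = np.corrcoef(k_trans, k1_separate)[0, 1]
r_sat = np.corrcoef(k_trans, k1_simult)[0, 1]
# Pearson's r is high for both, yet only the first relation is linear:
# for the saturating curve the change in K1 per unit Ktrans shrinks as
# Ktrans grows, which is how a non-linearity can hide behind a similar
# correlation coefficient.
```

This is why the abstract warns that a correlation coefficient alone cannot distinguish the two regimes; the regression function must be inspected as well.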
Numerical simulation of adverse-pressure-gradient boundary layer with or without roughness
NASA Astrophysics Data System (ADS)
Mottaghian, Pouya; Yuan, Junlin; Piomelli, Ugo
2014-11-01
Large-eddy and direct numerical simulations are carried out on a flat-plate boundary layer over smooth and rough surfaces, with an adverse pressure gradient. The deceleration is achieved by imposing a wall-normal freestream velocity profile, and is strong enough to cause separation at the wall. The Reynolds number based on momentum thickness and freestream velocity at the inlet is 600. Numerical sandgrain roughness is applied based on an immersed boundary method, yielding a flow that is transitionally rough. The turbulence intensity increases before separation, and reaches a higher value for the rough case, indicating stronger mixing. Roughness also causes a higher momentum deficit near the wall, leading to earlier separation. This is consistent with previous observations made on rough-wall flow separation over a ramp. In both cases, the turbulent kinetic energy peaks inside the shear layer above the detachment region, with higher values in the rough case; it then decreases approaching the reattachment region. Near the wall inside the separation bubble, the near-zero turbulence intensity indicates that the turbulent structures are lifted up in the separation region. Compared with the smooth case, the shear layer is farther from the wall and the reattachment length is longer on the rough wall.
Incidence angle bounds for lip flow separation of three 13.97-centimeter-diameter inlets
NASA Technical Reports Server (NTRS)
Luidens, R. W.; Abbott, J. M.
1976-01-01
Low speed wind tunnel tests were conducted to establish a procedure for determining inlet-lip flow separation and to make a preliminary examination of the incidence angle bounds for lip flow separation on inlets intended for the nacelles of STOL (short takeoff and landing) aircraft. Three inlets were tested. Two of the inlets had short centerbodies with lower lip area contraction ratios of 1.30 and 1.44. The third inlet had a cylindrical centerbody extended forward into the inlet throat with a lower lip area contraction ratio of 1.44. The inlets were sized to fit a 13.97-centimeter-diameter fan. For inlet throat Mach numbers less than about 0.43, the lip flow separation angle was increased by either increasing the ratio of throat velocity to freestream velocity (Vt/Vo) or by increasing the lower lip area contraction ratio. For throat Mach numbers greater than a certain value (ranging from 0.43 to 0.52), increasing the throat Mach number in some cases resulted in a decrease in the lip flow separation angle. Extending a cylindrical centerbody into the inlet throat increased the flow separation angle for nearly all values of Vt/Vo.
May, Jody C.; McLean, John A.
2013-01-01
The influence of three different drift gases (helium, nitrogen, and argon) on the separation mechanism in traveling wave ion mobility spectrometry is explored through ion trajectory simulations which include considerations for ion diffusion based on kinetic theory and the electrodynamic traveling wave potential. The model developed for this work is an accurate depiction of a second-generation commercial traveling wave instrument. Three ion systems (cocaine, MDMA, and amphetamine) whose reduced mobility values have previously been measured in different drift gases are represented in the simulation model. The simulation results presented here provide a fundamental understanding of the separation mechanism in traveling wave, which is characterized by three regions of ion motion: (1) ions surfing on a single wave, (2) ions exhibiting intermittent roll-over onto subsequent waves, and (3) ions experiencing a steady state roll-over which repeats every few wave cycles. These regions of ion motion are accessed through changes in the gas pressure, wave amplitude, and wave velocity. Resolving power values extracted from simulated arrival times suggest that momentum transfer in helium gas is generally insufficient to access regions (2) and (3) where ion mobility separations occur. Ion mobility separations by traveling wave are predicted to be effectual for both nitrogen and argon, with slightly lower resolving power values observed for argon as a result of band-broadening due to collisional scattering. For the simulation conditions studied here, the resolving power in traveling wave plateaus between regions (2) and (3), with further increases in wave velocity contributing only minor improvements in separations. PMID:23888124
Code of Federal Regulations, 2012 CFR
2012-04-01
... contracts, including, but not limited to, premium rate structure and premium processing, insurance... discrete cash values that may vary in amount in accordance with the investment experience of the separate...
METHOD OF SEPARATING URANIUM VALUES, PLUTONIUM VALUES AND FISSION PRODUCTS BY CHLORINATION
Brown, H.S.; Seaborg, G.T.
1959-02-24
The separation of plutonium and uranium from each other and from other substances is described. In general, the method comprises the steps of contacting the uranium with chlorine in the presence of a holdback material selected from the group consisting of lanthanum oxide and thorium oxide to form a uranium chloride higher than uranium tetrachloride, and thereafter heating the uranium chloride thus formed to a temperature at which the uranium chloride is volatilized off but below the volatilization temperature of plutonium chloride.
Green, Norman W.; Duraiswamy, Kandaswamy; Lumpkin, Robert E.
1978-07-25
In a continuous process for recovery of values contained in a solid carbonaceous material, the carbonaceous material is comminuted and then subjected to flash pyrolysis in the presence of a particulate heat source over an overflow weir to form a pyrolysis product stream containing a carbon containing solid residue and volatilized hydrocarbons. After the carbon containing solid residue is separated from the pyrolysis product stream, values are obtained by condensing volatilized hydrocarbons. The particulate source of heat is formed by oxidizing carbon in the solid residue and separating out the fines.
Wang, Yin; Zhao, Nan-jing; Liu, Wen-qing; Yu, Yang; Fang, Li; Meng, De-shuo; Hu, Li; Zhang, Da-hai; Ma, Min-jun; Xiao, Xue; Wang, Yu; Liu, Jian-guo
2015-02-01
In recent years, the technology of laser induced breakdown spectroscopy has developed rapidly. As a new material composition detection technology, laser induced breakdown spectroscopy can detect multiple elements simultaneously, quickly and simply, without any complex sample preparation, and can realize field, in-situ composition detection of the sample to be tested. This technology is very promising in many fields. It is very important to separate, fit and extract spectral feature lines in laser induced breakdown spectroscopy, which is the cornerstone of spectral feature recognition and subsequent element concentration inversion research. In order to realize effective separation, fitting and extraction of spectral feature lines in laser induced breakdown spectroscopy, the initial parameters for spectral line fitting before iteration were analyzed and determined. The spectral feature line of chromium (Cr I: 427.480 nm) in fly ash gathered from a coal-fired power station, which was overlapped with another line (Fe I: 427.176 nm), was separated from the other one and extracted by using the damped least squares method. Based on Gauss-Newton iteration, the damped least squares method adds a damping factor to the step and adjusts the step length dynamically according to the feedback information after each iteration, in order to prevent the iteration from diverging and to ensure that it converges quickly. The damped least squares method helps to obtain better results when separating, fitting and extracting spectral feature lines, and gives more accurate intensity values for these lines. The spectral feature lines of chromium in samples containing different concentrations of chromium were separated and extracted, and the intensity values of the corresponding spectral lines were then obtained by using the damped least squares method and the least squares method separately. 
The calibration curves, showing the relationship between spectral line intensity values and chromium concentrations in the different samples, were plotted, and their respective linear correlations were compared. The experimental results showed that the linear correlation between the intensity values of the spectral feature lines and the concentrations of chromium in the different samples obtained by the damped least squares method was better than that obtained by the least squares method. Therefore, the damped least squares method is stable, reliable and suitable for separating, fitting and extracting spectral feature lines in laser induced breakdown spectroscopy.
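The damping strategy described in this abstract is essentially the Levenberg-Marquardt scheme. A minimal sketch of separating two overlapped lines with it, using SciPy's implementation on synthetic data for the Cr I / Fe I pair mentioned above; all amplitudes, widths, and noise levels here are illustrative assumptions, not the authors' values:

```python
import numpy as np
from scipy.optimize import curve_fit  # Levenberg-Marquardt when unbounded

def two_gaussians(x, a1, c1, w1, a2, c2, w2):
    """Sum of two Gaussian line profiles; a = amplitude, c = center (nm), w = width."""
    return (a1 * np.exp(-((x - c1) / w1) ** 2)
            + a2 * np.exp(-((x - c2) / w2) ** 2))

# Synthetic overlapped lines near the Fe I 427.176 nm / Cr I 427.480 nm pair
x = np.linspace(426.8, 427.9, 200)
true_params = (1.0, 427.176, 0.08, 0.6, 427.480, 0.08)
rng = np.random.default_rng(0)
y = two_gaussians(x, *true_params) + rng.normal(0.0, 0.01, x.size)

# Damped least squares fit from rough initial guesses
popt, _ = curve_fit(two_gaussians, x, y, p0=(0.8, 427.1, 0.1, 0.5, 427.5, 0.1))
# popt now separates the two lines: each fitted amplitude is the
# intensity of one line, free of the overlap.
```

The damping factor itself is managed internally by the solver: it grows when a step would increase the residual (pulling the update toward gradient descent) and shrinks as the fit nears convergence (approaching pure Gauss-Newton), which is the dynamic step-length adjustment the abstract describes.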
26 CFR 1.148-9 - Arbitrage rules for refunding issues.
Code of Federal Regulations, 2010 CFR
2010-04-01
... par bond, its stated principal amount, and in reference to any other bond, its present value. (3... allocated to the refunding of a separate prior issue is based on the present value of the refunded debt... satisfying the representative allocation method if that investment is valued at fair market value on the...
Method and apparatus for ion mobility spectrometry with alignment of dipole direction (IMS-ADD)
Shvartsburg, Alexandre A [Richland, WA; Tang, Keqi [Richland, WA; Smith, Richard D [Richland, WA
2007-01-30
Techniques and instrumentation are described for analyses of substances, including complex samples/mixtures that require separation prior to characterization of individual components. A method is disclosed for separation of ion mixtures and identification of ions, including protein and other macromolecular ions and their different structural isomers. Analyte ions are not free to rotate during the separation, but are substantially oriented with respect to the drift direction. Alignment is achieved by applying, at a particular angle to the drift field, a much stronger alternating electric field that "locks" the ion dipoles with moments exceeding a certain value. That value depends on the buffer gas composition, pressure, and temperature, but may be as low as ~3 Debye under certain conditions. The presently disclosed method measures the direction-specific cross-sections that provide the structural information complementing that obtained from known methods, and, when coupled to those methods, increases the total peak capacity and specificity of gas-phase separations. Simultaneous 2-D separations by direction-specific cross sections along and orthogonally to the ion dipole direction are also possible.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sugiyama, T.; Sugura, K.; Enokida, Y.
2015-03-15
Lithium-6 is used as a blanket material for sufficient tritium production in DT-fueled fusion reactors. A core-shell type adsorbent was proposed for lithium isotope separation by chromatography. The mass transfer model in a chromatographic column consisted of four steps: convection and dispersion in the column, transfer through liquid films, intra-particle diffusion, and adsorption or desorption at the local adsorption sites. A model was developed, and concentration profiles and their time variation in the column were numerically simulated. It became clear that core-shell type adsorbents with a thin porous shell were saturated more rapidly than fully porous ones and established a sharp edge of the adsorption band. This is a very important feature because lithium isotope separation requires long-distance development of the adsorption band. The values of HETP (Height Equivalent to a Theoretical Plate) for a column packed with the core-shell adsorbent were estimated from the statistical moments of the step response curve. The value of HETP decreased with the thickness of the porous shell. A core-shell type adsorbent is, then, useful for lithium isotope separation. (authors)
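The moment-based HETP estimate mentioned above can be sketched as follows; the Gaussian elution curve and column length are illustrative assumptions, not values from the study:

```python
import numpy as np

def hetp_from_moments(t, c, column_length):
    """Estimate HETP from the statistical moments of an elution curve c(t):
    number of theoretical plates N = mu1^2 / variance, HETP = L / N.
    (A step-response curve would first be differentiated to obtain c(t).)
    Assumes a uniformly spaced time grid t."""
    mu1 = np.sum(t * c) / np.sum(c)                 # first moment: mean residence time
    var = np.sum((t - mu1) ** 2 * c) / np.sum(c)    # second central moment
    return column_length * var / mu1 ** 2

# Illustrative Gaussian elution peak: mean 100 s, sigma 5 s, 1 m column
t = np.linspace(0.0, 200.0, 2001)
c = np.exp(-0.5 * ((t - 100.0) / 5.0) ** 2)
h = hetp_from_moments(t, c, 1.0)  # -> ~0.0025 m (= L * sigma^2 / mu^2)
```

A sharper adsorption band gives a smaller variance for the same residence time, hence a lower HETP, which is why the thin-shell adsorbent performs better in the abstract's simulations.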
Dell'Aquila, Caterina
2002-09-05
Five tricyclic antidepressants (TADs), desipramine, nortriptyline, imipramine, doxepin and amitriptyline, were separated by using N,N,N',N'-tetramethyl-1,3-butanediamine (TMBD) as an additive in the background electrolyte solution. Because the tricyclic antidepressants are similar in structure, mass and pKa values, their separation by capillary zone electrophoresis requires careful manipulation of parameters such as the pH and the composition of the electrolyte solution. As basic drugs, the TADs interact with the silanol groups on the capillary wall, giving rise to peak broadening and asymmetry, non-reproducible migration times and loss of selectivity. Different concentrations of TMBD (40, 60, 100 and 150 mM) were used at pH 9.5, but only 100 mM TMBD allowed a good separation and a high efficiency for all the TADs. At this pH the separation was not possible without the additive. This result is due to the reduced electroosmotic flow, whose mobility is at a value of 10^-9 m^2 V^-1 s^-1.
NASA Astrophysics Data System (ADS)
Bogovalov, S. V.; Borman, V. D.; Borisevich, V. D.; Davidenko, O. V.; Tronin, I. V.; Tronin, V. N.
2016-09-01
The results of optimization calculations of the separative power of the "high-speed" Iguasu gas centrifuge are presented. The Iguasu gas centrifuge has a rotational speed of 1000 m/s and a rotor length of 1 m. The dependence of the optimal separative power on the pressure of the working gas at the rotor wall was obtained using numerical simulations. It is shown that the maximum of the optimal separative power corresponds to a pressure of 1100 mmHg. The maximum value of the separative power is 31.9 SWU.
CONTINUOUS CHELATION-EXTRACTION PROCESS FOR THE SEPARATION AND PURIFICATION OF METALS
Thomas, J.R.; Hicks, T.E.; Rubin, B.; Crandall, H.W.
1959-12-01
A continuous process is presented for separating metal values and groups of metal values from each other. A complex mixture, e.g., neutron-irradiated uranium, can be resolved into its component parts. In the present process the values are dissolved in an acidic solution and adjusted to the proper oxidation state. The solution is then contacted with an extractant phase comprising a fluorinated beta-diketone in an organic solvent under certain pH conditions, whereupon plutonium and zirconium are extracted. Plutonium is extracted from the foregoing extract with reducing aqueous solutions or under specified acidic conditions and can be recovered from the aqueous solution. Zirconium is then removed with an oxalic acid aqueous phase. The uranium is recovered from the residual original solution using hexone and hexone-diketone extractants, leaving residual fission products in the original solution. The uranium is extracted from the hexone solution with dilute nitric acid. Improved separations and purifications are achieved using recycled scrub solutions and the "self-salting" effect of uranyl ions.
Lai, S; Wang, J; Jahng, G H
2001-01-01
A new pulse sequence, dubbed FAIR exempting separate T(1) measurement (FAIREST), in which a slice-selective saturation recovery acquisition is added to the standard FAIR (flow-sensitive alternating inversion recovery) scheme, was developed for quantitative perfusion imaging and multi-contrast fMRI. The technique allows for clean separation between, and thus simultaneous assessment of, BOLD and perfusion effects, while quantitative cerebral blood flow (CBF) and tissue T(1) values are monitored online. Online CBF maps were obtained using the FAIREST technique, and the measured CBF values were consistent with the off-line CBF maps obtained using the FAIR technique in combination with a separate sequence for T(1) measurement. Finger-tapping activation studies were carried out to demonstrate the applicability of the FAIREST technique in a typical fMRI setting for multi-contrast fMRI. The relative CBF and BOLD changes induced by finger-tapping were 75.1 +/- 18.3 and 1.8 +/- 0.4%, respectively, and the relative oxygen consumption rate change was 2.5 +/- 7.7%. The results from correlation of the T(1) maps with the activation images on a pixel-by-pixel basis show that the mean T(1) value of the CBF activation pixels is close to the T(1) of gray matter, while the mean T(1) value of the BOLD activation pixels is close to the T(1) range of blood and cerebrospinal fluid. Copyright 2001 John Wiley & Sons, Ltd.
Mu'min, Gea Fardias; Prawisudha, Pandji; Zaini, Ilman Nuran; Aziz, Muhammad; Pasek, Ari Darmawan
2017-09-01
This study employs a wet torrefaction (also known as hydrothermal) process at low temperature. The process simultaneously acts as waste processing and separation of mixed waste for subsequent utilization as an alternative fuel. It is also applied for the delamination and separation of non-recyclable laminated aluminum waste into separable aluminum and plastic. A 2.5-L reactor was used to examine the wet torrefaction process at temperatures below 200°C. It was observed that the processed mixed waste was converted into two different products: a mushy organic part and a bulky plastic part. Using mechanical separation, the two products can be separated into a granular organic product and a plastic bulk for further treatment. TGA analysis showed no changes in the plastic composition and no intrusion of the plastic fraction into the organic fraction, indicating that both fractions had been completely separated by wet torrefaction. The separated plastic fraction obtained from the wet torrefaction treatment also had a relatively high calorific value (approximately 44 MJ/kg), justifying its use as an alternative fuel. The non-recyclable plastic fraction of the laminated aluminum was delaminated and separated from its aluminum counterpart at a temperature of 170°C using an additional acetic acid concentration of 3%, leaving less than 25% of the plastic content in the aluminum part. Plastic products from both samples had high calorific values of more than 30 MJ/kg, which is sufficient for them to be converted and used as fuel. Copyright © 2017 Elsevier Ltd. All rights reserved.
Time Variation of the Distance Separating Bomb and Dive Bomber Subsequent to Bomb Release
NASA Technical Reports Server (NTRS)
Mathews, Charles W.
1952-01-01
A study has been made of the variation of the distance separating bomb and aircraft with time after release, as applied to dive-bombing operations. Separation distances determined from this study are presented in terms of two variables only, dive angle and maximum airplane accelerometer reading; the values of separation distance include the effects of delay in initiation of the pull-out and lag in attainment of the maximum normal acceleration. The report contains analysis and calculations of the separation distances between bomb and dive bomber following bomb release. Separation distances as determined by the dive angle and the maximum airplane accelerometer reading are presented in a single chart.
ERIC Educational Resources Information Center
Machamer, Peter; Douglas, Heather
1999-01-01
Criticizes Hugh Lacey's separation of cognitive values and social values in discussions of the nature of science. Claims that attempting to distinguish between cognitive and social values ignores crucial complexities in the development and use of knowledge. Proposes that the proper distinction be between legitimate and illegitimate reasons in science as…
American Teachers: What Values Do They Hold?
ERIC Educational Resources Information Center
Slater, Robert O.
2008-01-01
In a liberal-democratic society there is always a desire to separate the teaching of values from the teaching of reading, writing, and mathematics, the so-called value-neutral subjects. But teachers have learned--and every parent who has done homework with his child knows--that, like it or not, they teach values in the course of teaching these…
ION-EXCHANGE METHOD FOR SEPARATING RADIUM FROM RADIUM-BARIUM MIXTURES
Fuentevilla, M.E.
1959-06-30
An improved process is presented for separating radium from an aqueous feed solution containing radium and barium values and a complexing agent for these metals. In this process a feed solution containing radium and barium ions and a complexing agent for said ions is cycled through an exchange zone containing ion exchange resin. The radium-enriched resin is then stripped of radium values to form a regeneration liquid, a portion of which is collected as an enriched product, the remaining portion being recycled to the exchange zone to further enrich the ion exchange resin in radium.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Farid, N.R.; Kennedy, C.
We assessed the efficacy of a new thyroxine radioimmunoassay kit (Abbott) in which polyethylene glycol is used to separate bound from free hormone. Mean serum thyroxine was 88 ± 15 (±SD) µg/liter for 96 normal persons. Results for hypothyroid and hyperthyroid persons were clearly separated from those for normal individuals. Women taking oral contraceptive preparations showed variable increases in their serum thyroxine values. The coefficient of variation ranged from 1 to 3% within assay and from 5.4 to 11% among different assays. Excellent parallelism was demonstrated between thyroxine values estimated by this method and those obtained either by competitive protein binding or by a separate radioimmunoassay for the hormone.
The Determination of Caffeine in Coffee: Sense or Nonsense?
ERIC Educational Resources Information Center
Beckers, Jozef L.
2004-01-01
The presence of caffeine in coffee is determined by the use of separation devices and UV-vis spectrophotometry. The results indicate that combining various analytical tools helps in perceiving why the concentration values obtained through UV-vis spectrophotometry are higher than those obtained with separation methods.
Integral equation theory study on the phase separation in star polymer nanocomposite melts.
Zhao, Lei; Li, Yi-Gui; Zhong, Chongli
2007-10-21
The polymer reference interaction site model theory is used to investigate phase separation in star polymer nanocomposite melts. Two kinds of spinodal curves were obtained: a classic fluid phase boundary for relatively low nanoparticle-monomer attraction strength and a network phase boundary for relatively high nanoparticle-monomer attraction strength. The network phase boundaries are much more sensitive to nanoparticle-monomer attraction strength than the fluid phase boundaries. The interplay among arm number, arm length, and nanoparticle-monomer attraction strength was systematically investigated. When the arm lengths are short, the network phase boundary shows a marked shift toward less miscibility with increasing arm number. When the arm lengths are long enough, the network phase boundaries show the opposite trend. There exists a crossover arm number value for star polymer nanocomposite melts, below which the network phase separation is consistent with that of chain polymer nanocomposite melts. However, the network phase separation shows qualitatively different behaviors when the arm number is larger than this value.
Joint Blind Source Separation by Multi-set Canonical Correlation Analysis
Li, Yi-Ou; Adalı, Tülay; Wang, Wei; Calhoun, Vince D
2009-01-01
In this work, we introduce a simple and effective scheme to achieve joint blind source separation (BSS) of multiple datasets using multi-set canonical correlation analysis (M-CCA) [1]. We first propose a generative model of joint BSS based on the correlation of latent sources within and between datasets. We specify source separability conditions, and show that, when the conditions are satisfied, the group of corresponding sources from each dataset can be jointly extracted by M-CCA through maximization of correlation among the extracted sources. We compare source separation performance of the M-CCA scheme with other joint BSS methods and demonstrate the superior performance of the M-CCA scheme in achieving joint BSS for a large number of datasets, group of corresponding sources with heterogeneous correlation values, and complex-valued sources with circular and non-circular distributions. We apply M-CCA to analysis of functional magnetic resonance imaging (fMRI) data from multiple subjects and show its utility in estimating meaningful brain activations from a visuomotor task. PMID:20221319
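The joint BSS scheme described above rests on canonical correlation analysis. A minimal NumPy sketch of the two-dataset case (plain CCA, the pairwise building block that M-CCA generalizes to many datasets) is given below; the mixing matrices and the sinusoidal shared source are illustrative assumptions, not data or code from the paper:

```python
import numpy as np

def cca(X, Y):
    """Classical two-dataset CCA via SVD of the whitened cross-covariance.

    X, Y: (channels x samples) arrays. Returns the canonical correlations
    (descending) and one projection matrix per dataset. M-CCA extends this
    pairwise objective to many datasets; this sketch shows only the
    two-set building block.
    """
    n = X.shape[1]
    X = X - X.mean(axis=1, keepdims=True)
    Y = Y - Y.mean(axis=1, keepdims=True)
    Cxx, Cyy, Cxy = X @ X.T / n, Y @ Y.T / n, X @ Y.T / n

    def inv_sqrt(C):
        # inverse matrix square root via eigendecomposition (whitening)
        w, V = np.linalg.eigh(C)
        return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

    Wx, Wy = inv_sqrt(Cxx), inv_sqrt(Cyy)
    U, s, Vt = np.linalg.svd(Wx @ Cxy @ Wy)
    return s, Wx @ U, Wy @ Vt.T

# Demo: one sinusoidal source shared by two otherwise independent datasets.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 2000)
shared = np.sin(2 * np.pi * 7 * t)
A1 = np.array([[1.0, 0.4], [0.5, 1.0]])   # hypothetical mixing matrices
A2 = np.array([[0.8, -0.3], [0.6, 0.9]])
X = A1 @ np.vstack([shared, rng.normal(size=t.size)])
Y = A2 @ np.vstack([shared, rng.normal(size=t.size)])

corrs, Ax, Ay = cca(X, Y)
# The leading canonical correlation reflects the shared source (near 1);
# the second reflects only the independent noise channels (near 0).
```

Maximizing correlation among the extracted components is exactly how the M-CCA scheme groups corresponding sources across datasets; here the first canonical pair recovers the shared sinusoid from both mixtures.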
SEPARATION OF PLUTONIUM FROM FISSION PRODUCTS BY A COLLOID REMOVAL PROCESS
Schubert, J.
1960-05-24
A method is given for separating plutonium from uranium fission products. An acidic aqueous solution containing plutonium and uranium fission products is subjected to a process for separating ionic values from colloidal matter suspended therein while the pH of the solution is maintained between 0 and 4. Certain of the fission products, in particular zirconium, niobium, lanthanum, and barium, are in a colloidal state within this pH range, while plutonium remains in an ionic form. Dialysis, ultracentrifugation, and ultrafiltration are suitable methods of separating plutonium ions from the colloids.
Personality and Values as Predictors of Medical Specialty Choice
ERIC Educational Resources Information Center
Taber, Brian J.; Hartung, Paul J.; Borges, Nicole J.
2011-01-01
Research rarely considers the combined influence of personality traits and values in predicting behavioral outcomes. We aimed to advance a germinal line of inquiry that addresses this gap by separately and simultaneously examining personality traits and physician work values to predict medical specialty choice. First-year medical students (125…
46 CFR 164.023-13 - Production tests and inspections.
Code of Federal Regulations, 2011 CFR
2011-10-01
... Constant Rate of Traverse tensile testing machine, capable of initial clamp separation of ten inches and a... the acceptance testing values but not less than the performance minimums. (2) Length/weight values must be within 5 percent of the acceptance testing values but not less than the performance minimums...
7 CFR 1767.19 - Liabilities and other credits.
Code of Federal Regulations, 2013 CFR
2013-01-01
... Issued A. This account shall include the face value of membership certificates outstanding. A detailed....2Memberships Subscribed But Unissued This account shall include the face value of memberships subscribed for... account shall include, in a separate subdivision for each class and series of bonds, the face value of the...
7 CFR 1767.19 - Liabilities and other credits.
Code of Federal Regulations, 2011 CFR
2011-01-01
... Issued A. This account shall include the face value of membership certificates outstanding. A detailed....2Memberships Subscribed But Unissued This account shall include the face value of memberships subscribed for... account shall include, in a separate subdivision for each class and series of bonds, the face value of the...
7 CFR 1767.19 - Liabilities and other credits.
Code of Federal Regulations, 2012 CFR
2012-01-01
... Issued A. This account shall include the face value of membership certificates outstanding. A detailed....2Memberships Subscribed But Unissued This account shall include the face value of memberships subscribed for... account shall include, in a separate subdivision for each class and series of bonds, the face value of the...
7 CFR 1767.19 - Liabilities and other credits.
Code of Federal Regulations, 2014 CFR
2014-01-01
... Issued A. This account shall include the face value of membership certificates outstanding. A detailed....2Memberships Subscribed But Unissued This account shall include the face value of memberships subscribed for... account shall include, in a separate subdivision for each class and series of bonds, the face value of the...
Zhu, L-D; Hiltunen, Erkki; Li, Zhaohua
2017-12-15
Using naked iron oxide (Fe3O4) and yttrium iron oxide (Y3Fe5O12) nanoparticles as flocculants, the harvesting efficiency of Chlorella vulgaris biomass was investigated. The harvesting process includes two steps: the separation of microalgae from the culture solution with the magnetic nanoparticles, and then the separation of the algae from the magnetic nanoparticles. The optimal dosages and pH values for the magnetic harvesting of microalgal biomass were determined. Results showed that Y3Fe5O12 nanoparticles were more efficient in microalgal biomass harvesting than Fe3O4 nanoparticles. To achieve more than 90% harvesting efficiency, the optimal dosages for Fe3O4 and Y3Fe5O12 were 10 and 2.5 g/L, while the appropriate pH values were 6.2 and 7.3, respectively. The harvesting efficiency of both Fe3O4 and Y3Fe5O12 nanoparticles increased as the pH value decreased. The experimental results also showed that at a higher pH value Fe3O4 nanoparticles were much more easily separated from the flocs than Y3Fe5O12: 62.9% of the Fe3O4 nanoparticles could be detached from the aggregates when the floc pH value reached 12.3.
Separation Potential for Multicomponent Mixtures: State-of-the Art of the Problem
NASA Astrophysics Data System (ADS)
Sulaberidze, G. A.; Borisevich, V. D.; Smirnov, A. Yu.
2017-03-01
Various approaches used in introducing a separation potential (value function) for multicomponent mixtures have been analyzed. It has been shown that all known potentials do not satisfy the Dirac-Peierls axioms for a binary mixture of uranium isotopes, which makes their practical application difficult. This is mainly due to the impossibility of constructing a "standard" cascade, whose role in the case of separation of binary mixtures is played by the ideal cascade. As a result, the only universal search method for optimal parameters of the separation cascade is their numerical optimization by the criterion of the minimum number of separation elements in it.
NASA Aeroelasticity Handbook Volume 2: Design Guides Part 2
NASA Technical Reports Server (NTRS)
Ramsey, John K. (Editor)
2006-01-01
The NASA Aeroelasticity Handbook comprises a database (in three formats) of NACA and NASA aeroelasticity flutter data through 1998 and a collection of aeroelasticity design guides. The Microsoft Access format provides the capability to search for specific data, retrieve it, and present it in a tabular or graphical form unique to the application. The full-text NACA and NASA documents from which the data originated are provided in portable document format (PDF), and these are hyperlinked to their respective data records. This provides full access to all available information from the data source. Two other electronic formats, one delimited by commas and the other by spaces, are provided for use with other software capable of reading text files. To the best of the author's knowledge, this database represents the most extensive collection of NACA and NASA flutter data in electronic form compiled to date by NASA. Volume 2 of the handbook contains a convenient collection of aeroelastic design guides covering fixed wings, turbomachinery, propellers and rotors, panels, and model scaling. This handbook provides an interactive database and design guides for use in the preliminary aeroelastic design of aerospace systems and can also be used in validating or calibrating flutter-prediction software.
Perfect harmony: A mathematical analysis of four historical tunings
NASA Astrophysics Data System (ADS)
Page, Michael F.
2004-10-01
In Western music, a musical interval defined by the frequency ratio of two notes is generally considered consonant when the ratio is composed of small integers. Perfect harmony or an "ideal just scale," which has no exact solution, would require the division of an octave into 12 notes, each of which would be used to create six other consonant intervals. The purpose of this study is to analyze four well-known historical tunings to evaluate how well each one approximates perfect harmony. The analysis consists of a general evaluation in which all consonant intervals are given equal weighting and a specific evaluation for three preludes from Bach's "Well-Tempered Clavier," for which intervals are weighted in proportion to the duration of their occurrence. The four tunings, 5-limit just intonation, quarter-comma meantone temperament, well temperament (Werckmeister III), and equal temperament, are evaluated by measures of centrality, dispersion, distance, and dissonance. When all keys and consonant intervals are equally weighted, equal temperament demonstrates the strongest performance across a variety of measures, although it is not always the best tuning. Given C as the starting note for each tuning, equal temperament and well temperament perform strongly for the three "Well-Tempered Clavier" preludes examined.
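The kind of deviation such a study measures can be illustrated numerically. The sketch below (an illustration of the underlying arithmetic, not the paper's actual weighting scheme or measures) compares 12-tone equal temperament against 5-limit just-intonation ratios in cents; the set of intervals listed is an assumption about which consonances count:

```python
import math

# Just-intonation target ratios for six common consonant intervals
# (an assumed interval set, for illustration only).
JUST = {"minor third": 6/5, "major third": 5/4, "perfect fourth": 4/3,
        "perfect fifth": 3/2, "minor sixth": 8/5, "major sixth": 5/3}

# Semitone steps of each interval in 12-tone equal temperament.
STEPS = {"minor third": 3, "major third": 4, "perfect fourth": 5,
         "perfect fifth": 7, "minor sixth": 8, "major sixth": 9}

def cents(ratio):
    """Size of a frequency ratio in cents (1200 cents per octave)."""
    return 1200.0 * math.log2(ratio)

def et_deviation(name):
    """Deviation (cents) of the equal-tempered interval from just."""
    return cents(2.0 ** (STEPS[name] / 12.0)) - cents(JUST[name])

for name in JUST:
    print(f"{name:14s} {et_deviation(name):+7.2f} cents")
```

The equal-tempered fifth falls about 2 cents flat of 3/2, while the major third is nearly 14 cents sharp of 5/4; a weighted aggregate of such deviations is the sort of quantity the study's dispersion and distance measures summarize.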
NASA Astrophysics Data System (ADS)
Whyte, C.; Leigh, R. J.; Lobb, D.; Williams, T.; Remedios, J. J.; Cutter, M.; Monks, P. S.
2009-12-01
A breadboard demonstrator of a novel UV/VIS grating spectrometer has been developed based upon a concentric arrangement of a spherical meniscus lens, concave spherical mirror and curved diffraction grating, suitable for a range of atmospheric remote sensing applications from the ground or space. The spectrometer is compact and provides high optical efficiency and performance benefits over traditional instruments. The concentric design is capable of handling high relative apertures, owing to spherical aberration and coma being near zero at all surfaces. The design also provides correction for transverse chromatic aberration and distortion, in addition to correcting for the distortion called "smile", the curvature of the slit image formed at each wavelength. These properties render this design capable of superior spectral and spatial performance with size and weight budgets significantly lower than standard configurations. This form of spectrometer design offers the potential for an exceptionally compact instrument for differential optical absorption spectroscopy (DOAS) applications from LEO, GEO, HAP or ground-based platforms. The breadboard demonstrator has been shown to offer high throughput and a stable Gaussian line shape over a spectral range from 300 to 450 nm at 0.5 nm resolution, suitable for a number of typical DOAS applications.
The parser doesn't ignore intransitivity, after all
Staub, Adrian
2015-01-01
Several previous studies (Adams, Clifton, & Mitchell, 1998; Mitchell, 1987; van Gompel & Pickering, 2001) have explored the question of whether the parser initially analyzes a noun phrase that follows an intransitive verb as the verb's direct object. Three eyetracking experiments examined this issue in more detail. Experiment 1 strongly replicated the finding (van Gompel & Pickering, 2001) that readers experience difficulty on this noun phrase in normal reading, and found that this difficulty occurs even with a class of intransitive verbs for which a direct object is categorically prohibited. Experiment 2, however, demonstrated that this effect is not due to syntactic misanalysis, but is instead due to disruption that occurs when a comma is absent at a subordinate clause/main clause boundary. Exploring a different construction, Experiment 3 replicated the finding (Pickering & Traxler, 2003; Traxler & Pickering, 1996) that when a noun phrase “filler” is an implausible direct object for an optionally transitive relative clause verb, processing difficulty results; however, there was no evidence for such difficulty when the relative clause verb was strictly intransitive. Taken together, the three experiments undermine the support for the claim that the parser initially ignores a verb's subcategorization restrictions. PMID:17470005
Paneth, N; Vinten-Johansen, P; Brody, H; Rip, M
1998-10-01
Contemporaneous with John Snow's famous study of the 1854 London cholera epidemic were 2 other investigations: a local study of the Broad Street outbreak and an investigation of the entire epidemic, undertaken by England's General Board of Health. More than a quarter-century prior to Koch's description of Vibrio comma, a Board of Health investigator saw microscopic "vibriones" in the rice-water stools of cholera patients that, in his later life, he concluded had been cholera bacilli. Although this finding was potential evidence for Snow's view that cholera was due to a contagious and probably live agent transmitted in the water supply, the Board of Health rejected Snow's conclusions. The Board of Health amassed a huge amount of information which it interpreted as supportive of its conclusion that the epidemic was attributable not so much to water as to air. Snow, by contrast, systematically tested his hypothesis that cholera was water-borne by exploring evidence that at first glance ran contrary to his expectations. Snow's success provides support for using a hypothetico-deductive approach in epidemiology, based on tightly focused hypotheses strongly grounded in pathophysiology.
Code of Federal Regulations, 2012 CFR
2012-07-01
... economy values require input of the weighted grams/mile values for total hydrocarbons (HC), carbon... the weighted grams/mile values for the FTP test for HC, CO and CO2; and, additionally for methanol... paragraph (f) of this section. (2) Calculate separately the grams/mile values for the cold transient phase...
RECOVERY AND SEPARATION OF LITHIUM VALUES FROM SALVAGE SOLUTIONS
Hansford, D.L.; Raabe, E.W.
1963-08-20
Lithium values can be recovered from an aqueous basic solution by reacting the values with a phosphate salt soluble in the solution, forming an aqueous slurry of the resultant aqueous insoluble lithium phosphate, contacting the slurry with an organic cation exchange resin in the acid form until the slurry has been clarified, and thereafter recovering lithium values from the resin. (AEC)
Value Representations by Rank Order in a Distributed Network of Varying Context Dependency
ERIC Educational Resources Information Center
Mullett, Timothy L.; Tunney, Richard J.
2013-01-01
We report the results of a human fMRI experiment investigating the influence of context upon value judgement. Trials were separated into high and low value blocks such that it is possible to investigate the effect of a change in surrounding trials upon the encoding of financial value. The ventral striatum was dependent upon "local context", with…
Value loss of hardwood lumber during air-drying
Leland F. Hanks; Margaret K. Peirsol
1975-01-01
Dry lumber prices were applied to green and air-dried lumber that was measured with a dry board rule. Values were summed by species, lumber grade, and thickness class. Differences between green and air-dried lumber value have been termed value losses and are given in dollars and in percentages. The percentages have been separated into loss due to shrinkage and loss due...
Arbitration between controlled and impulsive choices
Economides, M.; Guitart-Masip, M.; Kurth-Nelson, Z.; Dolan, R.J.
2015-01-01
The impulse to act for immediate reward often conflicts with more deliberate evaluations that support long-term benefit. The neural architecture that negotiates this conflict remains unclear. One account proposes a single neural circuit that evaluates both immediate and delayed outcomes, while another outlines separate impulsive and patient systems that compete for behavioral control. Here we designed a task in which a complex payout structure divorces the immediate value of acting from the overall long-term value, within the same outcome modality. Using model-based fMRI in humans, we demonstrate separate neural representations of immediate and long-term values, with the former tracked in the anterior caudate (AC) and the latter in the ventromedial prefrontal cortex (vmPFC). Crucially, when subjects' choices were compatible with long-run consequences, value signals in AC were down-weighted and those in vmPFC were enhanced, while the opposite occurred when choice was impulsive. Thus, our data implicate a trade-off in value representation between AC and vmPFC as underlying controlled versus impulsive choice. PMID:25573670
26 CFR 1.149(d)-1A - Limitations on advance refundings.
Code of Federal Regulations, 2010 CFR
2010-04-01
... savings test. If any separate issue in a multipurpose issue increases the aggregate present value debt service savings on the entire multipurpose issue or reduces the present value debt service losses on that...
Peppard, D.F.
1960-02-01
A process of separating hafnium nitrate from zirconium nitrate contained in a nitric acid solution by selectively extracting the zirconium nitrate with a water-immiscible alkyl phosphate is reported.
Giddings, J C
1995-05-26
While the use of multiple dimensions in separation systems can create very high peak capacities, the effectiveness of the enhanced peak capacity in resolving large numbers of components depends strongly on whether the distribution of component peaks is ordered or disordered. Peak overlap is common in disordered distributions, even with a very high peak capacity. It is therefore of great importance to understand the origin of peak order/disorder in multidimensional separations and to address the question of whether any control can be exerted over observed levels of order and disorder and thus separation efficacy. It is postulated here that the underlying difference between ordered and disordered distributions of component peaks in separation systems is related to sample complexity as measured by a newly defined parameter, the sample dimensionality s, and by the derivative dimensionality s'. It is concluded that the type and degree of order and disorder is determined by the relationship of s (or s') to the dimensionality n of the separation system employed. Thus for some relatively simple samples (defined as having small s values), increased order and a consequent enhancement of resolution can be realized by increasing n. The resolution enhancement is in addition to the normal gain in resolving power resulting from the increased peak capacity of multidimensional systems. However, for other samples (having even smaller s values), an increase in n provides no additional benefit in enhancing component separability.
Double Star Measurements Using a Webcam and CCD Camera, Annual Report of 2016
NASA Astrophysics Data System (ADS)
Schlimmer, Jeorg
2018-01-01
This report shows the results of 223 double star measurements from 2016; minimum separation is 1.23 a.s. (STF1024AB), maximum separation is 371 a.s. (STF1424AD). The mean value of all measurements is 18.7 a.s.
The separation of some recalcitrant polychlorinated biphenyl (PCB) isomers in extracts from environmental compartments has been a daunting task for environmental chemists. Summed quantitation values for coeluting PCB isomers are often reported. This composite data obscures the ac...
Laplace Boundary-Value Problem in Paraboloidal Coordinates
ERIC Educational Resources Information Center
Duggen, L.; Willatzen, M.; Voon, L. C. Lew Yan
2012-01-01
This paper illustrates both a problem in mathematical physics, whereby the method of separation of variables, while applicable, leads to three ordinary differential equations that remain fully coupled via two separation constants and a five-term recurrence relation for series solutions, and an exactly solvable problem in electrostatics, as a…
Separation methods and chemical and nutritional characteristics of tomato pomace
USDA-ARS?s Scientific Manuscript database
Tomato processing generates a large amount of pomace as a low value by-product primarily used as livestock feed or disposed. The objectives of this research were to investigate the chemical and nutritional characteristics and determine effective separation methods of peel and seed of commercial toma...
ERIC Educational Resources Information Center
Nicholson, Linda J.
1980-01-01
Schools are socializing agents, acting in addition to the family to maintain gender bias. Historically, schools were intended to channel young men out of the family into the public sphere. It is in the schools that sex role separation occurs through the separation of spheres in which tasks and abilities are valued. (FG)
40 CFR 1054.505 - How do I test engines?
Code of Federal Regulations, 2010 CFR
2010-07-01
... following methods for confirming torque values for nonhandheld engines: (i) Calculate torque-related cycle... is valid. (ii) Evaluate each mode separately to validate the duty cycle. All torque feedback values recorded during non-idle sampling periods must be within ±2 percent of the reference value or within ±0.27...
Validation of Scale of Commitment to Democratic Values among Secondary Students
ERIC Educational Resources Information Center
Gafoor, K. Abdul
2015-01-01
This study reports development of a reliable and valid instrument for assessing the commitment to democratic values among secondary school students in Kerala from 57 likert type statements originally developed in 2007 by Gafoor and Thushara to assess commitment to nine values avowed in the Indian Constitution. Nine separate maximum likelihood…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-23
... OFFICE OF PERSONNEL MANAGEMENT 5 CFR Part 843 RIN 3206-AM29 Federal Employees' Retirement System; Present Value Conversion Factors for Spouses of Deceased Separated Employees AGENCY: Office of Personnel... Subpart C of Part 843--Present Value Conversion Factors for Earlier Commencing Date of Annuities of...
American Teachers: What Do They Believe?
ERIC Educational Resources Information Center
Slater, Robert O.
2008-01-01
While educators desire to separate the teaching of values from the teaching of reading, writing, and mathematics--the so-called value neutral subjects--they nonetheless do teach values in the course of teaching these subjects. Teaching is as much a moral effort as it is an intellectual enterprise. This article examines the National Opinion…
The Potential Consequence of Using Value-Added Models to Evaluate Teachers
ERIC Educational Resources Information Center
Shen, Zuchao; Simon, Carlee Escue; Kelcey, Ben
2016-01-01
Value-added models try to separate the contribution of individual teachers or schools to students' learning growth measured by standardized test scores. There is a policy trend to use value-added modeling to evaluate teachers because of its face validity and superficial objectiveness. This article investigates the potential long term consequences…
Badiani, Anna; Montellato, Lara; Bochicchio, Davide; Anfossi, Paola; Zanardi, Emanuela; Maranesi, Magda
2004-08-11
Proximate composition and fatty acid profile, conjugated linoleic acid (CLA) isomers included, were determined in separable lean of raw and cooked lamb rib loins. The cooking methods compared, which were also investigated for cooking yields and true nutrient retention values, were dry heating of fat-on cuts and moist heating of fat-off cuts; the latter method was tested as a sort of dietetic approach against the more traditional former type. With significantly (P < 0.05) lower cooking losses, dry heating of fat-on rib-loins produced slightly (although only rarely significantly) higher retention values for all of the nutrients considered, including CLA isomers. On the basis of the retention values obtained, both techniques led to a minimum migration of lipids into the separable lean, which was higher (P < 0.05) in dry heating than in moist heating, and was characterized by the prevalence of saturated and monounsaturated fatty acids. On the whole, the response to cooking of the class of CLA isomers (including that of the nutritionally most important isomer cis-9,trans-11) was more similar to that of the monounsaturated than the polyunsaturated fatty acids.
Theoretical Study of pKa Values for Trivalent Rare-Earth Metal Cations in Aqueous Solution.
Yu, Donghai; Du, Ruobing; Xiao, Ji-Chang; Xu, Shengming; Rong, Chunying; Liu, Shubin
2018-01-18
Molecular acidity of trivalent rare-earth metal cations in aqueous solution is an important factor in the efficiency of their extraction and separation processes. In this work, the aqueous acidity of these metal ions has been quantitatively investigated using a few theoretical approaches. Our computational results, expressed in terms of pKa values, agree well with the tetrad effect of trivalent rare-earth ions extensively reported in the extraction and separation of these elements. Strong linear relationships have been observed between the acidity and quantum electronic descriptors such as the molecular electrostatic potential on the acidic nucleus and the sum of the valence natural atomic orbital energies of the dissociating proton. Making use of the predicted pKa values, we have also predicted the major ionic forms of these species in aqueous environments with different pH values, which can be employed to rationalize the behavioral differences of rare-earth metal cations during the extraction process. Our present results should provide needed insights not only for the qualitative understanding of the extraction and separation between yttrium and lanthanide elements but also for the prediction of novel and more efficient rare-earth metal extractants in the future.
Towards a Logical Distinction Between Swarms and Aftershock Sequences
NASA Astrophysics Data System (ADS)
Gardine, M.; Burris, L.; McNutt, S.
2007-12-01
The distinction between swarms and aftershock sequences has, up to this point, been fairly arbitrary and non-uniform. Typically a 0.5 to 1 order-of-magnitude difference between the mainshock and largest aftershock has been the traditional choice, but there are many exceptions. Seismologists have generally assumed that the mainshock carries most of the energy, but this is only true if it is sufficiently large compared to the size and numbers of aftershocks. Here we present a systematic division based on the energy of the aftershock sequence compared to the energy of the largest event of the sequence. The amount of aftershock energy in the sequence can be calculated using the b-value of the frequency-magnitude relation with a fixed choice of magnitude separation (M-mainshock minus M-largest aftershock). Assuming that the energy of an aftershock sequence is less than the energy of the mainshock, the b-value at which the aftershock energy exceeds the mainshock energy determines the boundary between aftershock sequences and swarms. The amount of energy for various choices of b-value is also calculated using different values of magnitude separation. When the minimum b-value at which the sequence energy exceeds that of the largest event/mainshock is plotted against the magnitude separation, a linear trend emerges. Values plotting above this line represent swarms and values plotting below it represent aftershock sequences. This scheme has the advantage that it represents a physical quantity, energy, rather than only statistical features of earthquake distributions. As such it may be useful to help distinguish swarms from mainshock/aftershock sequences and to better determine the underlying causes of earthquake swarms.
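The energy comparison described above can be sketched numerically. The script below assumes Gutenberg-Richter frequency-magnitude statistics and the standard log10(E) ∝ 1.5M energy scaling (multiplicative constants cancel in the ratio); the mainshock magnitude, magnitude floor, and bin width are illustrative assumptions, not values from the abstract:

```python
import numpy as np

def sequence_energy_ratio(b, dm, m_main=6.0, m_min=0.0, width=0.1):
    """Summed aftershock-sequence energy divided by mainshock energy.

    Assumes Gutenberg-Richter statistics N(>=M) = 10**(a - b*M),
    normalized so exactly one event reaches the largest-aftershock
    magnitude (m_main - dm), and energy scaling E ~ 10**(1.5*M).
    All constants here are illustrative.
    """
    m_largest = m_main - dm
    mags = np.arange(m_min, m_largest + 1e-9, width)
    # cumulative counts N(>= m), scaled so N(>= m_largest) = 1
    cum = 10.0 ** (-b * (mags - m_largest))
    # per-bin event counts: differences of cumulative counts
    # (the top bin holds the single largest aftershock)
    counts = cum - np.append(cum[1:], 0.0)
    sequence_energy = np.sum(counts * 10.0 ** (1.5 * mags))
    return sequence_energy / 10.0 ** (1.5 * m_main)

def critical_b(dm, b_grid=np.arange(0.5, 3.0, 0.01)):
    """Smallest b on the grid at which the sequence energy exceeds the
    mainshock energy -- the proposed swarm/aftershock boundary."""
    for b in b_grid:
        if sequence_energy_ratio(b, dm) > 1.0:
            return b
    return None
```

Larger b (relatively more small events) pushes a sequence toward swarm-like energy partitioning, and the critical b rises with the magnitude separation dm, consistent with the boundary trend the abstract describes.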
Time-frequency approach to underdetermined blind source separation.
Xie, Shengli; Yang, Liu; Yang, Jun-Mei; Zhou, Guoxu; Xiang, Yong
2012-02-01
This paper presents a new time-frequency (TF) underdetermined blind source separation approach based on the Wigner-Ville distribution (WVD) and the Khatri-Rao product to separate N non-stationary sources from M (M < N) mixtures. First, an improved method is proposed for estimating the mixing matrix, where the negative value of the auto WVD of the sources is fully considered. Then, after extracting all the auto-term TF points, the auto WVD value of the sources at every auto-term TF point can be found exactly with the proposed approach, no matter how many active sources there are, as long as N ≤ 2M-1. Further discussion about the extraction of auto-term TF points is made, and finally numerical simulation results are presented to show the superiority of the proposed algorithm by comparing it with existing ones.
ERIC Educational Resources Information Center
Hamberger, Nan Marie; Moore, Robert L.
Noting that one way to break boundaries that separate one person from another is to use the writing experience to identify and analyze values, this paper presents guidelines for defining values, discussing values, and teaching about values. Teaching and discussion aids are provided to enhance the examination of narratives and biographies, which…
Code of Federal Regulations, 2012 CFR
2012-04-01
... rules of § 1.861-9T(g)(1) and (2) using either tax book value or fair market value under the method... group uses the tax book value method, the member's portions of COFL, CSLL, and CODL accounts are limited... tax book value of assets transferred in intercompany transactions shall be determined without regard...
Code of Federal Regulations, 2011 CFR
2011-04-01
... rules of § 1.861-9T(g)(1) and (2) using either tax book value or fair market value under the method... group uses the tax book value method, the member's portions of COFL, CSLL, and CODL accounts are limited... tax book value of assets transferred in intercompany transactions shall be determined without regard...
Code of Federal Regulations, 2014 CFR
2014-04-01
... rules of § 1.861-9T(g)(1) and (2) using either tax book value, fair market value, or alternative tax...) shall not apply. If the group uses the tax book value method, the member's portions of COFL, CSLL, and... paragraph (c)(2)(ii), the tax book value of assets transferred in intercompany transactions shall be...
Code of Federal Regulations, 2013 CFR
2013-04-01
... rules of § 1.861-9T(g)(1) and (2) using either tax book value, fair market value, or alternative tax...) shall not apply. If the group uses the tax book value method, the member's portions of COFL, CSLL, and... paragraph (c)(2)(ii), the tax book value of assets transferred in intercompany transactions shall be...
SEPARATION OF URANIUM FROM ZIRCONIUM AND NIOBIUM BY SOLVENT EXTRACTION
Voiland, E.E.
1958-05-01
A process for separation of the uranium from zirconium and/or niobium values contained in 3 to 7M aqueous nitric acid solutions is described. This is accomplished by adding phosphoric acid anions to the nitric acid solution containing the uranium, zirconium, and/or niobium in an amount sufficient to make the solution 0.05 to 0.2M in phosphate ion and contacting the solution with an organic water-immiscible solvent such as MEK, whereby the uranyl values are taken up by the extract phase while the zirconium and niobium preferentially remain in the aqueous raffinate.
NASA Astrophysics Data System (ADS)
Alam, Md. Mehboob; Deur, Killian; Knecht, Stefan; Fromager, Emmanuel
2017-11-01
The extrapolation technique of Savin [J. Chem. Phys. 140, 18A509 (2014)], which was initially applied to range-separated ground-state-density-functional Hamiltonians, is adapted in this work to ghost-interaction-corrected (GIC) range-separated ensemble density-functional theory (eDFT) for excited states. While standard extrapolations rely on energies that decay as μ⁻² in the large range-separation-parameter μ limit, we show analytically that (approximate) range-separated GIC ensemble energies converge more rapidly (as μ⁻³) towards their pure wavefunction theory values (μ → +∞ limit), thus requiring a different extrapolation correction. The purpose of such a correction is to further improve on the convergence and, consequently, to obtain more accurate excitation energies for a finite (and, in practice, relatively small) μ value. As a proof of concept, we apply the extrapolation method to He and small molecular systems (viz., H2, HeH+, and LiH), thus considering different types of excitations such as Rydberg, charge transfer, and double excitations. Potential energy profiles of the first three and four singlet Σ+ excitation energies in HeH+ and H2, respectively, are studied with a particular focus on avoided crossings for the latter. Finally, the extraction of individual state energies from the ensemble energy is discussed in the context of range-separated eDFT, as a perspective.
Rowe, Daniel B; Bruce, Iain P; Nencka, Andrew S; Hyde, James S; Kociuba, Mary C
2016-04-01
Achieving a reduction in scan time with minimal inter-slice signal leakage is one of the significant obstacles in parallel MR imaging. In fMRI, multiband-imaging techniques accelerate data acquisition by simultaneously magnetizing the spatial frequency spectrum of multiple slices. The SPECS model eliminates the consequential inter-slice signal leakage from the slice unaliasing, while maintaining an optimal reduction in scan time and activation statistics in fMRI studies. When the combined k-space array is inverse Fourier reconstructed, the resulting aliased image is separated into the un-aliased slices through a least squares estimator. Without the additional spatial information from a phased array of receiver coils, slice separation in SPECS is accomplished with acquired aliased images in shifted FOV aliasing pattern, and a bootstrapping approach of incorporating reference calibration images in an orthogonal Hadamard pattern. The aliased slices are effectively separated with minimal expense to the spatial and temporal resolution. Functional activation is observed in the motor cortex, as the number of aliased slices is increased, in a bilateral finger tapping fMRI experiment. The SPECS model incorporates calibration reference images together with coefficients of orthogonal polynomials into an un-aliasing estimator to achieve separated images, with virtually no residual artifacts and functional activation detection in separated images. Copyright © 2015 Elsevier Inc. All rights reserved.
Accounting for host cell protein behavior in anion-exchange chromatography.
Swanson, Ryan K; Xu, Ruo; Nettleton, Daniel S; Glatz, Charles E
2016-11-01
Host cell proteins (HCP) are a problematic set of impurities in downstream processing (DSP) as they behave most similarly to the target protein during separation. Approaching DSP with the knowledge of HCP separation behavior would be beneficial for the production of high purity recombinant biologics. Therefore, this work was aimed at characterizing the separation behavior of complex mixtures of HCP during a commonly used method: anion-exchange chromatography (AEX). An additional goal was to evaluate the performance of a statistical methodology, based on the characterization data, as a tool for predicting protein separation behavior. Aqueous two-phase partitioning followed by two-dimensional electrophoresis provided data on the three physicochemical properties most commonly exploited during DSP for each HCP: pI (isoelectric point), molecular weight, and surface hydrophobicity. The protein separation behaviors of two alternative expression host extracts (corn germ and E. coli) were characterized. A multivariate random forest (MVRF) statistical methodology was then applied to the database of characterized proteins creating a tool for predicting the AEX behavior of a mixture of proteins. The accuracy of the MVRF method was determined by calculating a root mean squared error value for each database. This measure never exceeded a value of 0.045 (fraction of protein populating each of the multiple separation fractions) for AEX. © 2016 American Institute of Chemical Engineers Biotechnol. Prog., 32:1453-1463, 2016.
Joo, Sung-Ho; Shin, Dong Ju; Oh, Chang Hyun; Wang, Jei-Pil; Shin, Shun Myung
2016-11-15
Cobalt and manganese have been the subject of individual separation studies because their fields of application are different. However, this study shows that high-value products can be manufactured in the form of a cobalt-manganese-bromide (CMB) liquid catalyst by simultaneously recovering cobalt and manganese. Na-bis-(2,4,4-tri-methyl-pentyl)phosphinic acid was employed in order to manufacture the CMB liquid catalyst from the spent catalyst generated from petroleum chemistry processes. The pH-isotherm, degree of saponification of solvent and separation factor values were investigated. ΔpH50 and separation factor values show that Co and Mn can be separated from impurities such as Mg and Ca. Further, the extraction stages and organic/aqueous ratio isotherms were investigated using counter-current simulation extraction batch tests. To prepare CMB from a loaded organic phase obtained in a stripping study using hydrogen bromide, the Co and Mn were completely stripped and concentrated by a factor of 6 using a 2M hydrogen bromide solution. When compared with manufactured and commercial CMB, the CMB liquid catalyst could be produced by supplying a shortage of Mn in the form of manganese bromide. Finally, the method of manufacture of CMB was subjected to a real pilot plant test. Copyright © 2016. Published by Elsevier B.V.
Nishimoto, Naoki; Ota, Mizuki; Yagahara, Ayako; Ogasawara, Katsuhiko
2016-11-25
After the Fukushima Dai-ichi Nuclear Power Station accident in Japan on March 11, 2011, a large number of comments, both positive and negative, were posted on social media. The objective of this study was to clarify the characteristics of the trend in the number of tweets posted on Twitter, and to estimate how long public concern regarding the accident continued. We surveyed the attenuation period of the first term occurrence related to radiation exposure as a surrogate endpoint for the duration of concern. We retrieved 18,891,284 tweets from Twitter data between March 11, 2011 and March 10, 2012, containing 143 variables in Japanese. We selected radiation, radioactive, Sievert (Sv), Becquerel (Bq), and gray (Gy) as keywords to estimate the attenuation period of public concern regarding radiation exposure. These data, formatted as comma-separated values, were transferred into a Statistical Analysis System (SAS) dataset for analysis, and survival analysis methodology was followed using the SAS LIFETEST procedure. This study was approved by the institutional review board of Hokkaido University and informed consent was waived. A Kaplan-Meier curve was used to show the rate of Twitter users posting a message after the accident that included one or more of the keywords. The term Sv occurred in tweets up to one year after the first tweet. Among the Twitter users studied, 75.32% (880,108/1,168,542) tweeted the word radioactive and 9.20% (107,522/1,168,542) tweeted the term Sv. The first reduction was observed within the first 7 days after March 11, 2011. The means and standard errors (SEs) of the duration from the first tweet on March 11, 2011 were 31.9 days (SE 0.096) for radioactive and 300.6 days (SE 0.181) for Sv. These keywords were still being used at the end of the study period. The mean attenuation period for radioactive was one month, and approximately one year for radiation and radiation units. 
The difference in mean duration between the keywords was attributed to the effect of mass media. Regularly posted messages, such as daily radiation dose reports, were relatively easy to detect from their time and formatted contents. The survival estimation indicated that public concern about the nuclear power plant accident remained after one year. Although the simple plot of the number of tweets did not show clear results, we estimated the mean attenuation period as approximately one month for the keyword radioactive, and found that the keywords were still being used in posts at the end of the study period. Further research is required to quantify the effect of other phrases in social media data. The results of this exploratory study should advance progress in influencing and quantifying the communication of risk. ©Naoki Nishimoto, Mizuki Ota, Ayako Yagahara, Katsuhiko Ogasawara. Originally published in JMIR Public Health and Surveillance (http://publichealth.jmir.org), 25.11.2016.
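The survival estimation described above (the SAS LIFETEST procedure with a Kaplan-Meier curve) amounts to a product-limit estimate over keyword-tweet durations. A minimal, dependency-free sketch of that estimator follows; the data layout (per-user durations with a censoring flag) is a hypothetical simplification of the study's dataset.

```python
from collections import Counter

def kaplan_meier(durations, events):
    """Kaplan-Meier (product-limit) survival curve.

    durations: time from the first tweet to a user's last keyword tweet
    (or to study end); events: 1 if the last tweet was observed, 0 if
    the user was still posting at study end (censored).  Returns a list
    of (time, S(t)) pairs at each observed event time.
    """
    deaths = Counter(t for t, e in zip(durations, events) if e)
    exits = Counter(durations)
    at_risk = len(durations)
    surv, curve = 1.0, []
    for t in sorted(set(durations)):
        d = deaths.get(t, 0)
        if d:
            surv *= 1.0 - d / at_risk   # product-limit step at each event time
            curve.append((t, surv))
        at_risk -= exits[t]             # events and censorings both leave the risk set
    return curve
```

Censored observations (users still tweeting the keyword at study end) reduce the risk set without stepping the curve down, which is how the study could report keywords "still being used at the end of the study period."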
Getman, Dan
2013-09-30
To help guide its future data collection efforts, the DOE GTO funded a data gap analysis in FY2012 to identify high-potential hydrothermal areas where critical data are needed. This analysis was updated in FY2013 and the resulting datasets are represented by this metadata. The original process was published in FY2012 and is available here: https://pangea.stanford.edu/ERE/db/GeoConf/papers/SGW/2013/Esposito.pdf Though there are many types of data that can be used for hydrothermal exploration, five types of exploration data were targeted for this analysis. These data types were selected for their regional reconnaissance potential, and include many of the primary exploration techniques currently used by the geothermal industry. The data types include:
1. well data
2. geologic maps
3. fault maps
4. geochemistry data
5. geophysical data
To determine data coverage, metadata for exploration data (including data type, data status, and coverage information) were collected and catalogued from nodes on the National Geothermal Data System (NGDS). It is the intention of this analysis that the data be updated from this source in a semi-automated fashion as new datasets are added to the NGDS nodes. In addition to this upload, an online tool was developed to allow all geothermal data providers to access this assessment, to directly add metadata themselves, and to view the results of the analysis via maps of data coverage in Geothermal Prospector (http://maps.nrel.gov/gt_prospector). A grid of the contiguous U.S. was created with 88,000 10-km by 10-km grid cells, and each cell was populated with the status of data availability corresponding to the five data types. Using these five data coverage maps and the USGS Resource Potential Map, sites were identified for future data collection efforts. These sites signify both that the USGS has indicated high favorability of occurrence of geothermal resources and that data gaps exist.
The uploaded data are contained in two data files for each data category. The first file contains the grid and is in the SHP (shapefile) format. Each populated grid cell represents a 10-km by 10-km area within which data are known to exist. The second file is a CSV (comma separated value) file that contains all of the individual layers that intersected with the grid. This CSV can be joined with the map to retrieve a list of datasets that are available at any given site. The attributes in the CSV include:
1. grid_id: the id of the grid cell that the data intersect with
2. title: the name of the WFS service that intersected with this grid cell
3. abstract: the description of the WFS service that intersected with this grid cell
4. gap_type: the category of data availability that these data fall within. As the current processing pulls data from NGDS, this category universally represents data that are available in the NGDS and ready for acquisition for analytic purposes.
5. proprietary_type: whether the data are considered proprietary
6. service_type: the type of service
7. base_url: the service URL
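The join described above can be sketched with the standard-library csv module. The column names follow the attribute list in the text; the file path and the choice of which attributes to keep per dataset are illustrative.

```python
import csv
from collections import defaultdict

def datasets_by_cell(csv_path):
    """Index the gap-analysis CSV by grid cell: grid_id -> list of the
    WFS datasets intersecting that 10-km cell.

    Column names follow the attribute list described in the text
    (grid_id, title, abstract, gap_type, proprietary_type,
    service_type, base_url); the path is supplied by the caller.
    """
    index = defaultdict(list)
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            index[row["grid_id"]].append({
                "title": row["title"],
                "gap_type": row["gap_type"],
                "service_type": row["service_type"],
            })
    return index
```

Joining this index against the shapefile's grid_id attribute then yields, for any cell on the map, the list of datasets available at that site.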
Chen, Shanyan; Meng, Fanjun; Chen, Zhenzhou; Tomlinson, Brittany N; Wesley, Jennifer M; Sun, Grace Y; Whaley-Connell, Adam T; Sowers, James R; Cui, Jiankun; Gu, Zezong
2015-01-01
Excessive activation of gelatinases (MMP-2/-9) is a key cause of detrimental outcomes in neurodegenerative diseases. A single-dimension zymography has been widely used to determine gelatinase expression and activity, but this method is inadequate in resolving complex enzyme isoforms, because gelatinase expression and activity could be modified at transcriptional and posttranslational levels. In this study, we investigated gelatinase isoforms under in vitro and in vivo conditions using two-dimensional (2D) gelatin zymography electrophoresis, a protocol allowing separation of proteins based on isoelectric points (pI) and molecular weights. We observed organomercuric chemical 4-aminophenylmercuric acetate-induced activation of MMP-2 isoforms with variant pI values in the conditioned medium of human fibrosarcoma HT1080 cells. Studies with murine BV-2 microglial cells indicated a series of proform MMP-9 spots separated by variant pI values due to stimulation with lipopolysaccharide (LPS). The MMP-9 pI values were shifted after treatment with alkaline phosphatase, suggesting presence of phosphorylated isoforms due to the proinflammatory stimulation. Similar MMP-9 isoforms with variant pI values in the same molecular weight were also found in mouse brains after ischemic and traumatic brain injuries. In contrast, there was no detectable pI differentiation of MMP-9 in the brains of chronic Zucker obese rats. These results demonstrated effective use of 2D zymography to separate modified MMP isoforms with variant pI values and to detect posttranslational modifications under different pathological conditions.
Parallel basal ganglia circuits for voluntary and automatic behaviour to reach rewards
Hikosaka, Okihide
2015-01-01
The basal ganglia control body movements, value processing and decision-making. Many studies have shown that the inputs and outputs of each basal ganglia structure are topographically organized, which suggests that the basal ganglia consist of separate circuits that serve distinct functions. A notable example is the circuits that originate from the rostral (head) and caudal (tail) regions of the caudate nucleus, both of which target the superior colliculus. These two caudate regions encode the reward values of visual objects differently: flexible (short-term) values by the caudate head and stable (long-term) values by the caudate tail. These value signals in the caudate guide the orienting of gaze differently: voluntary saccades by the caudate head circuit and automatic saccades by the caudate tail circuit. Moreover, separate groups of dopamine neurons innervate the caudate head and tail and may selectively guide the flexible and stable learning/memory in the caudate regions. Studies focusing on manual handling of objects also suggest that rostrocaudally separated circuits in the basal ganglia control the action differently. These results suggest that the basal ganglia contain parallel circuits for two steps of goal-directed behaviour: finding valuable objects and manipulating the valuable objects. These parallel circuits may underlie voluntary behaviour and automatic skills, enabling animals (including humans) to adapt to both volatile and stable environments. This understanding of the functions and mechanisms of the basal ganglia parallel circuits may inform the differential diagnosis and treatment of basal ganglia disorders. PMID:25981958
Noise effects on entanglement distribution by separable state
NASA Astrophysics Data System (ADS)
Bordbar, Najmeh Tabe; Memarzadeh, Laleh
2018-02-01
We investigate noise effects on the performance of entanglement distribution by separable state. We consider a realistic situation in which the mediating particle between two distant nodes of the network goes through a noisy channel. For a large class of noise models, we show that the average value of distributed entanglement between two parties is equal to entanglement between particular bipartite partitions of target qubits and exchange qubit in intermediate steps of the protocol. This result is valid for distributing two-qubit/qudit and three-qubit entangled states. In explicit examples of the noise family, we show that there exists a critical value of noise parameter beyond which distribution of distillable entanglement is not possible. Furthermore, we determine how this critical value increases in terms of Hilbert space dimension, when distributing d-dimensional Bell states.
Can Morality be Separated from Religion in the Teaching of Values?
ERIC Educational Resources Information Center
Nucci, Larry; Junker, Linda
The author argues that morality can be taught in public schools without retreating into moral relativism and without compromising our cultural and constitutional principles of freedom of speech and the separation of church and state. The paper draws from philosophy and psychological theory to illustrate that concepts of the moral constitute a…
Optimization of tomato pomace separation using air aspirator system by response surface methodology
USDA-ARS?s Scientific Manuscript database
Tomato pomace contains seeds and peels which are rich in protein and fat, and dietary fiber and lycopene, respectively. It is important to develop a suitable method to separate seeds and peel in tomato pomace for achieving value-added utilization of tomato pomace. The objectives of this research wer...
USDA-ARS?s Scientific Manuscript database
This study was designed to provide updated information on the separable components, cooking yields, and nutrient values of retail cuts from the beef chuck. Ultimately, these data will be used in the United States Department of Agriculture (USDA) Nutrient Data Laboratory’s (NDL) National Nutrient Da...
Instrument for the measurement and determination of chemical pulse column parameters
Marchant, Norman J.; Morgan, John P.
1990-01-01
An instrument for monitoring and measuring pneumatic driving force pulse parameters applied to chemical separation pulse columns obtains real time pulse frequency and root mean square amplitude values, calculates column inch values and compares these values against preset limits to alert column operators to the variations of pulse column operational parameters beyond desired limits.
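The core computation the patent describes (an RMS amplitude checked against preset limits) can be sketched in a few lines. The sample values and limit band below are placeholders, not figures from the patent, and the column-inch conversion is omitted.

```python
import math

def rms(samples):
    """Root-mean-square amplitude of a sampled pulse waveform."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def check_limits(samples, lo, hi):
    """Compare the RMS amplitude against preset limits, returning the
    value and whether it lies inside the allowed band (placeholder limits)."""
    value = rms(samples)
    return value, lo <= value <= hi
```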
Improved method for extracting lanthanides and actinides from acid solutions
Horwitz, E.P.; Kalina, D.G.; Kaplan, L.; Mason, G.W.
1983-07-26
A process for the recovery of actinide and lanthanide values from aqueous acidic solutions uses a new series of neutral bi-functional extractants, the alkyl(phenyl)-N,N-dialkylcarbamoylmethylphosphine oxides. The process is suitable for the separation of actinide and lanthanide values from fission product values found together in high-level nuclear reprocessing waste solutions.
ERIC Educational Resources Information Center
Larsen, Donald E.; Hunter, Joseph E.
2014-01-01
Research conducted by Larsen and Hunter (2013, February) identified a clear pattern in secondary school principals' decision-making related to mandated change: more than half of participants' decisions were based on core values and beliefs, requiring value judgments. Analysis of themes revealed that more than half of administrative decisions…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guney, Ali; Poyraz, M. Ibrahim; Kangal, Olgac, E-mail: kangal@itu.edu.tr
Highlights:
• Both PET and PVC have nearly the same densities.
• The best pH value for optimizing separation is 4.
• Malic acid gave the best results for selective separation of PET and PVC.
Abstract: Plastics have become widely used materials because of their advantages, such as cheapness, endurance, lightness, and hygiene. However, they cause waste and soil pollution and do not easily decompose. Many promising technologies are being investigated for separating mixed thermoplastics, but they are still uneconomical and unreliable. Depending on their surface characteristics, these plastics can be separated from each other by flotation, a useful mineral processing technique noted for its low cost and simplicity. The main objective of this study is to investigate the flotation characteristics of PET and PVC and determine the effect of plasticizer reagents on efficient plastic separation. For that purpose, various parameters such as pH, plasticizer concentration, plasticizer type, conditioning temperature and thermal conditioning were investigated. As a result, PET particles were floated at 95.1% purity and 65.3% efficiency, while PVC particles were obtained at 98.1% purity and 65.3% efficiency.
Evaluation of the Precision of Satellite-Derived Sea Surface Temperature Fields
NASA Astrophysics Data System (ADS)
Wu, F.; Cornillon, P. C.; Guan, L.
2016-02-01
A great deal of attention has been focused on the temporal accuracy of satellite-derived sea surface temperature (SST) fields with little attention being given to their spatial precision. Specifically, the primary measure of the quality of SST fields has been the bias and variance of selected values minus co-located (in space and time) in situ values. Contributing values, determined by the location of the in situ values and the necessity that the satellite-derived values be cloud free, are generally widely separated in space and time hence provide little information related to the pixel-to-pixel uncertainty in the retrievals. But the main contribution to the uncertainty in satellite-derived SST retrievals relates to atmospheric contamination and because the spatial scales of atmospheric features are, in general, large compared with the pixel separation of modern infra-red sensors, the pixel-to-pixel uncertainty is often smaller than the accuracy determined from in situ match-ups. This makes selection of satellite-derived datasets for the study of submesoscale processes, for which the spatial structure of the upper ocean is significant, problematic. In this presentation we present a methodology to characterize the spatial precision of satellite-derived SST fields. The method is based on an examination of the high wavenumber tail of the 2-D spectrum of SST fields in the Sargasso Sea, a low energy region of the ocean close to the track of the MV Oleander, a container ship making weekly roundtrips between New York and Bermuda, with engine intake temperatures sampled every 75 m along track. Important spectral characteristics are the point at which the satellite-derived spectra separate from the Oleander spectra and the spectral slope following separation. In this presentation a number of high resolution 375 m to 10 km SST datasets are evaluated based on this approach.
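The spectral slope following the separation point can be estimated by a least-squares fit in log-log space. A minimal sketch (uniform weighting over the selected wavenumber band is assumed; the band selection itself is left to the caller):

```python
import math

def loglog_slope(wavenumbers, power):
    """Least-squares slope of log10(power) versus log10(wavenumber),
    i.e. the spectral slope of the high-wavenumber tail of an SST
    spectrum.  Inputs are positive and restricted by the caller to the
    band beyond the separation point."""
    xs = [math.log10(k) for k in wavenumbers]
    ys = [math.log10(p) for p in power]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den
```

A pure power law P(k) = k^a returns the exponent a exactly, so flattening of a satellite spectrum relative to the Oleander spectrum shows up as a less negative fitted slope.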
Spectral imaging using clinical megavoltage beams and a novel multi-layer imager
NASA Astrophysics Data System (ADS)
Myronakis, Marios; Fueglistaller, Rony; Rottmann, Joerg; Hu, Yue-Houng; Wang, Adam; Baturin, Paul; Huber, Pascal; Morf, Daniel; Star-Lack, Josh; Berbeco, Ross
2017-12-01
We assess the feasibility of clinical megavoltage (MV) spectral imaging for material and bone separation with a novel multi-layer imager (MLI) prototype. The MLI provides higher detective quantum efficiency and lower noise than conventional electronic portal imagers. Simulated experiments were performed using a validated Monte Carlo model of the MLI to estimate energy absorption and energy separation between the MLI components. Material separation was evaluated experimentally using solid water and aluminum (Al), copper (Cu) and gold (Au) for 2.5 MV, 6 MV and 6 MV flattening filter free (FFF) clinical photon beams. An anthropomorphic phantom with implanted gold fiducials was utilized to further demonstrate bone/gold separation. Weighted subtraction imaging was employed for material and bone separation. The weighting factor (w) was iteratively estimated, with the optimal w value determined by minimization of the relative signal difference (ΔS_R) and signal-difference-to-noise ratio (SDNR) between material (or bone) and the background. Energy separation between layers of the MLI was mainly the result of beam hardening between components, with an average energy separation between 34 and 47 keV depending on the x-ray beam energy. The minimum average energy of the detected spectrum in the phosphor layer was 123 keV in the top layer of the MLI with the 2.5 MV beam. The w values that minimized ΔS_R and SDNR for Al, Cu and Au were 0.89, 0.76 and 0.64 for 2.5 MV; for 6 MV FFF, w was 0.98, 0.93 and 0.77 respectively. Bone suppression in the anthropomorphic phantom resulted in improved visibility of the gold fiducials with the 2.5 MV beam. Optimization of the MLI design is required to achieve optimal separation at clinical MV beam energies.
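Weighted subtraction imaging as described above combines two layer images as S_top − w·S_bottom and scans w for the value that nulls the material-to-background contrast. A toy sketch with nested-list images and hypothetical regions of interest (the ROI layout, pixel values, and candidate grid are all illustrative):

```python
def weighted_subtraction(top, bottom, w):
    """Per-pixel weighted subtraction of two layer images (nested lists)."""
    return [[t - w * b for t, b in zip(rt, rb)] for rt, rb in zip(top, bottom)]

def roi_mean(img, rows, cols):
    """Mean pixel value over a rectangular region of interest."""
    return sum(img[r][c] for r in rows for c in cols) / (len(rows) * len(cols))

def best_weight(top, bottom, mat_roi, bg_roi, candidates):
    """Scan candidate w values for the one minimizing the relative signal
    difference between a material ROI and a background ROI, mirroring
    the iterative estimation of w described in the abstract."""
    def rel_diff(w):
        sub = weighted_subtraction(top, bottom, w)
        m, b = roi_mean(sub, *mat_roi), roi_mean(sub, *bg_roi)
        return abs(m - b) / abs(b) if b else abs(m - b)
    return min(candidates, key=rel_diff)
```

With the beam-hardened bottom layer seeing relatively more material signal than the top, some intermediate w cancels the material while leaving the background, which is the basis of the bone/gold suppression reported above.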
ERIC Educational Resources Information Center
Malpass, Roy S.; Symonds, John D.
Preferences for 92 values, obtained from a survey of cross-cultural studies of values, were obtained from two separate and geographically distant sets of groups consisting of black and white males and females of lower- and middle-Class status. The middle-class black population was of insufficient size to include, however. Value preferences were…
NASA Astrophysics Data System (ADS)
Skrypnyk, T.
2017-08-01
We study the problem of separation of variables for classical integrable Hamiltonian systems governed by non-skew-symmetric, non-dynamical so(3) ⊗ so(3)-valued elliptic r-matrices with spectral parameters. We consider several examples of such models, and perform separation of variables for classical anisotropic one- and two-spin Gaudin-type models in an external magnetic field, and for Jaynes-Cummings-Dicke-type models without the rotating wave approximation.
ON THE SEPARATION OF VANADIUM, MOLYBDENUM AND TUNGSTEN BY MEANS OF PAPER CHROMATOGRAPHY. PART I
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tzou, S.; Liang, S.
1959-02-01
Molybdenum, tungsten, and vanadium are separated by chromatography as per-acids, and then detected with tannin solution. Of the seven solvents tested, n-butanol-hydrogen peroxide-nitric acid mixtures offer the best separations. With the addition of dioxane, the R_F values of these elements increase, while the vanadium and tungsten spots overlap. Formation of the per-acids avoids retention of tungsten at the origin spot and tailing of the vanadium and molybdenum spots. (B.O.G.)
SEPARATION OF PLUTONIUM VALUES FROM URANIUM AND FISSION PRODUCT VALUES
Maddock, A.G.; Booth, A.H.
1960-09-13
Separation of plutonium present in small amounts from neutron irradiated uranium by making use of the phenomenon of chemisorption is described. Plutonium in the tetravalent state is chemically absorbed on a fluoride in solid form. The steps for the separation comprise dissolving the irradiated uranium in nitric acid, oxidizing the plutonium in the resulting solution to the hexavalent state, adding to the solution a soluble calcium salt which by the common ion effect inhibits dissolution of the fluoride by the solution, passing the solution through a bed or column of subdivided calcium fluoride which has been sintered at about 800 °C to remove the chemisorbable fission products, reducing the plutonium in the solution thus obtained to the tetravalent state, and again passing the solution through a similar bed or column of calcium fluoride to selectively absorb the plutonium, which may then be recovered by treating the calcium fluoride with a solution of ammonium oxalate.
Continuous downstream processing for high value biological products: A Review.
Zydney, Andrew L
2016-03-01
There is growing interest in the possibility of developing truly continuous processes for the large-scale production of high value biological products. Continuous processing has the potential to provide significant reductions in cost and facility size while improving product quality and facilitating the design of flexible multi-product manufacturing facilities. This paper reviews the current state-of-the-art in separations technology suitable for continuous downstream bioprocessing, focusing on unit operations that would be most appropriate for the production of secreted proteins like monoclonal antibodies. This includes cell separation/recycle from the perfusion bioreactor, initial product recovery (capture), product purification (polishing), and formulation. Of particular importance are the available options, and alternatives, for continuous chromatographic separations. Although there are still significant challenges in developing integrated continuous bioprocesses, recent technological advances have provided process developers with a number of attractive options for development of truly continuous bioprocessing operations. © 2015 Wiley Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Qian, Jin; Zheng, Hao; Wang, Peifang; Liao, Xiaolin; Wang, Chao; Hou, Jun; Ao, Yanhui; Shen, Mengmeng; Liu, Jingjing; Li, Kun
2017-10-01
In this study we used a dual stable isotope approach (δ18O and δ2H) to assess the ecohydrological separation hypothesis and to identify the seasonal variation in water sources of Ginkgo biloba L. in the riparian zone in the Taihu Lake basin, China. Three study sites located at 5, 10, and 30 m from a river bank were established. From August 2014 to July 2015, samples of rainwater, river water, groundwater, bulk soil water at five soil depths (i.e. 0-30, 30-60, 60-90, 90-120, 120-150 cm), and xylem water of G. biloba, were collected and their δ18O and δ2H values were measured. Generally, the δ18O and δ2H values for xylem water, groundwater, and soil water clustered together and separated from those of river water, suggesting the possible occurrence of ecohydrological separation. However, the line-conditioned excess (lc-excess) values of most xylem water were positive, indicating a mixture of different water sources. Significant correlations were observed between the contributions of precipitation, soil water, and groundwater to water uptake by G. biloba, further supporting ecohydrological connectivity rather than ecohydrological separation. G. biloba switched its major water sources from soil water at 0-60 cm depth and precipitation in the wet summer, to soil water from >90 cm depth and groundwater in the dry winter. The river water was a minor water source for G. biloba, but its contribution was comparatively greater at the site closest to the river bank. Our findings contribute to understanding of plant-soil-water relationships and the water balance, and may provide important information for investigations of nutrient sources and sinks in riparian zones. The present study suggests the need to rethink the application of ecohydrological connectivity and separation in different biomes, especially where river water and groundwater recharge each other over time.
Mn and BTEX Reference Value Arrays (Final Reports)
These final reports are a summary of reference value arrays with critical supporting documentation for the chemicals manganese, benzene, toluene, ethylbenzene, and xylene. Each chemical is covered in a separate document, and each is a statement of the status of the available inha...
Another convex combination of product states for the separable Werner state
DOE Office of Scientific and Technical Information (OSTI.GOV)
Azuma, Hiroo; Ban, Masashi; CREST, Japan Science and Technology Agency, 1-1-9 Yaesu, Chuo-ku, Tokyo 103-0028
2006-03-15
In this paper, we write down the separable Werner state in a two-qubit system explicitly as a convex combination of product states, which is different from the convex combination obtained by Wootters' method. The Werner state in a two-qubit system has a single real parameter and varies from inseparable to separable according to the value of its parameter. We derive a hidden variable model that is induced by our decomposed form for the separable Werner state. From our explicit form of the convex combination of product states, we understand the following: the critical point of the parameter for separability of the Werner state comes from positivity of local density operators of the qubits.
NASA Astrophysics Data System (ADS)
Kudomi, Nobuyuki; Watabe, Hiroshi; Hayashi, Takuya; Iida, Hidehiro
2007-04-01
Cerebral metabolic rate of oxygen (CMRO2), oxygen extraction fraction (OEF) and cerebral blood flow (CBF) images can be quantified using positron emission tomography (PET) by administering (15)O-labelled water (H2(15)O) and oxygen ((15)O2). Conventionally, these images are measured with separate scans for three tracers: C(15)O for CBV, H2(15)O for CBF and (15)O2 for CMRO2, with additional waiting times between the scans to minimize the influence of residual radioactivity from the previous tracer, which results in a relatively long study period. We have proposed a dual tracer autoradiographic (DARG) approach (Kudomi et al 2005), which enables CBF, OEF and CMRO2 to be measured rapidly by sequentially administering H2(15)O and (15)O2 within a short time. Because quantitative CBF and CMRO2 values are sensitive to the arterial input function, an accurate input function is required; a drawback of this approach is that the measured arterial blood time-activity curve (TAC) must be separated into pure water and oxygen input functions in the presence of residual radioactivity from the first injected tracer. This separation previously required frequent manual sampling. The present paper describes two calculation methods, a linear and a model-based method, to separate the measured arterial TAC into its water and oxygen components. To validate these methods, we first generated a blood TAC for the DARG approach by combining the water and oxygen input functions obtained in a series of PET studies on normal human subjects. The combined data were then separated into water and oxygen components by the present methods. CBF and CMRO2 were calculated using the separated input functions and the tissue TAC.
Errors in the CBF and CMRO2 values obtained by the DARG approach remained within the acceptable range (within 5%) when the area under the curve of the second tracer's input function was larger than half that of the first. Bias and deviation in these values were also comparable to those of the conventional method when noise was imposed on the arterial TAC. We conclude that the present calculation-based methods can be used to quantitatively calculate CBF and CMRO2 with the DARG approach.
Holler, P J; Wess, G
2014-01-01
E-point-to-septal-separation (EPSS) and the sphericity index (SI) are echocardiographic parameters recommended in the ESVC-DCM guidelines. However, SI cutoff values to diagnose dilated cardiomyopathy (DCM) have never been evaluated. The aims were to establish reference ranges, calculate cutoff values, and assess the clinical value of SI and EPSS to diagnose DCM in Doberman Pinschers. One hundred seventy-nine client-owned Doberman Pinschers were studied. Three groups were formed in this prospective longitudinal study according to established Holter and echocardiographic criteria using the Simpson method of disks (SMOD): a control group (97 dogs), DCM with echocardiographic changes (75 dogs) and a "last normal" group (n = 7), which included dogs that developed DCM within 1.5 years but were still normal at this time point. In a substudy, dogs with early DCM based upon SMOD values above the reference range but still normal M-mode measurements were selected, to evaluate whether EPSS or SI were abnormal using the established cutoff values. ROC-curve analysis determined <1.65 for the SI (sensitivity 86.8%; specificity 87.6%) and >6.5 mm for EPSS (sensitivity 100%; specificity 99.0%) as optimal cutoff values to diagnose DCM. Both parameters were significantly different between the control group and the DCM group (P < 0.001), but were not abnormal in the "last normal" group. In the substudy, EPSS was abnormal in 13/13 dogs and SI in 2/13 dogs. E-point-to-septal-separation is a valuable additional parameter for the diagnosis of DCM, which can enhance the diagnostic capabilities of M-mode and which performs as well as SMOD. Copyright © 2013 by the American College of Veterinary Internal Medicine.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Palmieri, M.D.; Fritz, J.S.
Metal ions are determined by adding N-methylfurohydroxamic acid to an aqueous sample and then separating the metal chelates by direct injection onto a liquid chromatographic column. Separations on a C8 silica column and a polystyrene-divinylbenzene column are compared, with better separations seen on the polymeric column. The complexes formed at low pH values are cationic and are separated by an ion pairing mechanism. Retention times and selectivity of the metal complexes can be varied by changing the pH. Several metal ions can be separated and quantified; separation conditions, linear calibration curve ranges, and detection limits are presented for Zr(IV), Hf(IV), Fe(III), Nb(V), Al(III), and Sb(III). Interferences due to the presence of other ions in solution are investigated. Finally, an antiperspirant sample is analyzed for zirconium by high-performance liquid chromatography.
Batch extracting process using magnetic-particle held solvents
Nunez, Luis; Vandergrift, George F.
1995-01-01
A process for selectively removing metal values, which may include catalytic values, from a mixture containing same, wherein a magnetic particle is contacted with a liquid solvent which selectively dissolves the metal values so as to absorb the liquid solvent onto the magnetic particle. Thereafter the solvent-containing magnetic particles are contacted with a mixture containing the metal values to transfer the metal values into the solvent carried by the magnetic particles, and the magnetic particles are then magnetically separated. Ion exchange resins may be used as selective solvents.
NASA Astrophysics Data System (ADS)
Dholey, S.
2018-04-01
In this paper, we have investigated numerically the laminar unsteady separated stagnation-point flow and heat transfer of a viscous fluid over a moving flat surface in the presence of a time-dependent free stream velocity which causes the unsteadiness of this flow problem. The plate is assumed to move in the same or opposite direction of the free stream velocity. The flow is therefore governed by the velocity ratio parameter λ (ratio of the plate velocity to the free stream velocity) and the unsteadiness parameter β. When the plate surface moves in the same direction as the free stream velocity (i.e., when λ > 0), the solution of this flow problem continues for any given value of β. On the other hand, when they move in opposite directions (i.e., when λ < 0), the solution does not exist beyond a certain value of λ depending upon the value of β. In this case, separation appears inside the layer only for a negative value of β; for a positive value of β, the boundary layer solution is terminated after a certain distance from the plate surface with an attached flow solution with no point of inflection. The corresponding steady flow (β = 0) case has also been considered and two types of attached flow solutions have been found: one with a point of inflection and the other with no point of inflection, in a definite range of λ (-1.24658 ≤ λ ≤ -1.07). However, this range decreases with an increase in |β| when β < 0. A novel result which arises from the heat transfer analysis is that for a given value of λ(= 0), first the heat transfer rate increases with the increase of the Prandtl number Pr and, after attaining a maximum value, it decreases and finally tends to zero for large values of Pr depending upon the values of β > 0. On the contrary, for a given value of β(≤ 0), the rate of heat transfer increases consistently with the increase of Pr.
Maximal Aortic Valve Cusp Separation and Severity of Aortic Stenosis
Dilu, VP; George, Raju
2017-01-01
Introduction An integrated approach that incorporates two dimensional, M mode and Doppler echocardiographic evaluation has become the standard means for accurate quantification of severity of valvular aortic stenosis. Maximal separation of the aortic valve cusps during systole has been shown to correlate well with the severity of aortic stenosis measured by other echocardiographic parameters. Aim To study the correlation between Maximal Aortic valve Cusp Separation (MACS) and severity of aortic valve stenosis and to find cut-off values of MACS for detecting severe and mild aortic stenosis. Materials and Methods In the present prospective observational study, we have compared the accuracy of MACS distance and the aortic valve area calculated by continuity equation in 59 patients with varying degrees of aortic valve stenosis. Aortic leaflet separation in M mode was identified as the distance between the inner edges of the tips of these structures at mid systole in the parasternal long axis view. Cuspal separation was also measured in 2D echocardiography from the parasternal long axis view and the average of the two values was taken as the MACS. Patients were grouped into mild, moderate and severe aortic stenosis based on the aortic valve area calculated by continuity equation. The resultant data regarding maximal leaflet separation on cross-sectional echocardiogram was then subjected to linear regression analysis in regard to correlation with the peak transvalvular aortic gradient as well as the calculated aortic valve area. A cut-off value for each group was derived using ROC curve. Results There was a strong correlation between MACS and aortic valve area measured by continuity equation and the peak and mean transvalvular aortic gradients. Mean MACS was 6.89 mm in severe aortic stenosis, 9.97 mm in moderate aortic stenosis and 12.36 mm in mild aortic stenosis. 
MACS below 8.25 mm reliably predicted severe aortic stenosis, with high sensitivity, specificity and positive predictive value. MACS above 11.25 mm practically ruled out significant aortic stenosis. Conclusion Measurement of MACS is a simple echocardiographic method to assess the severity of valvular aortic stenosis, with high sensitivity and specificity. MACS can be extremely useful in two clinical situations: as a simple screening tool for assessment of stenosis severity, and as a noninvasive aid to decision making when there is discordance between the other echocardiographic parameters of severity of aortic stenosis. PMID:28764221
Process for the combined removal of SO2 and NOx from flue gas
Chang, Shih-Ger; Liu, David K.; Griffiths, Elizabeth A.; Littlejohn, David
1988-01-01
The present invention in one aspect relates to a process for the simultaneous removal of NOx and SO2 from a fluid stream comprising mixtures thereof, and in another aspect relates to the separation, use and/or regeneration of various chemicals contaminated or spent in the process, and which includes the steps of: (A) contacting the fluid stream at a temperature of between about 105 and 180 °C with a liquid aqueous slurry or solution comprising an effective amount of an iron chelate of an amino acid moiety having at least one --SH group; (B) separating the fluid stream from the particulates formed in step (A) comprising the chelate of the amino acid moiety and fly ash; (C) washing and separating the particulates of step (B) with an aqueous solution having a pH value of between about 5 to 8; (D) subsequently washing and separating the particulates of step (C) with a strongly acidic aqueous solution having a pH value of between about 1 to 3; (E) washing and separating the particulates of step (D) with a basic aqueous solution having a pH value of between about 9 to 12; (F) optionally adding additional amino acid moiety, iron (II) and alkali to the aqueous liquid from step (D) to produce an aqueous solution or slurry similar to that in step (A) having a pH value of between about 4 to 12; and (G) recycling the aqueous slurry of step (F) to the contacting zone of step (A). Steps (D) and (E) can be carried out in the reverse sequence; however, the preferred order is (D) and then (E).
In another preferred embodiment the present invention provides a process for the removal of NOx, SO2 and particulates from a fluid stream which includes the steps of (A) injecting into a reaction zone an aqueous solution itself comprising (i) an amino acid moiety selected from those described above; (ii) iron (II) ion; and (iii) an alkali, wherein the aqueous solution has a pH of between about 4 and 11; followed by solids separation and washing as described in steps (B), (C), (D) and (E) above. The overall process is useful for reducing acid rain components from combustion gas sources.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 5 Administrative Personnel 2 2010-01-01 2010-01-01 false Present Value Conversion Factors for Earlier Commencing Date of Annuities of Current and Former Spouses of Deceased Separated Employees A...—Present Value Conversion Factors for Earlier Commencing Date of Annuities of Current and Former Spouses of...
Physical properties of recycled PET non-woven fabrics for buildings
NASA Astrophysics Data System (ADS)
Üstün Çetin, S.; Tayyar, A. E.
2017-10-01
Recycled fibers have been commonly used in non-woven production technology for engineering applications such as textile and civil engineering. Non-wovens including recycled fibers can be utilized in insulation, roofing and floor separation applications. In this study, physical performance properties such as drape, bending resistance, tensile strength, and breaking elongation of non-woven fabrics consisting of v-PET (virgin) and r-PET (recycled) fibers in five different blend ratios are examined comparatively. The test results indicated that r-PET can be used in non-wovens for civil engineering applications such as insulation, roofing and floor separation, fulfilling acceptable quality levels.
The Chiral Separation Effect in quenched finite-density QCD
NASA Astrophysics Data System (ADS)
Puhr, Matthias; Buividovich, Pavel
2018-03-01
We present results of a study of the Chiral Separation Effect (CSE) in quenched finite-density QCD. Using a recently developed numerical method we calculate the conserved axial current for exactly chiral overlap fermions at finite density for the first time. We compute the anomalous transport coefficient for the CSE in the confining and deconfining phases and investigate possible deviations from the universal value. In both phases we find that non-perturbative corrections to the CSE are absent, and we reproduce the universal value of the transport coefficient within small statistical errors. Our results suggest that the CSE can be used to determine the renormalisation factor of the axial current.
NASA Astrophysics Data System (ADS)
Jabar, A.; Masrour, R.
2018-05-01
The magnetic properties of magnetic bilayers of Kekulene structure separated by a nonmagnetic layer, with Ruderman-Kittel-Kasuya-Yosida (RKKY) exchange interactions and an Ising spin model, have been studied using Monte Carlo simulations. The RKKY interaction between the bilayers of Kekulene is considered for different distances. The transition temperature has been deduced from the partial magnetizations and magnetic susceptibilities for a fixed value of the nonmagnetic layer. The reduced transition temperatures are also deduced from the total magnetization and total magnetic susceptibility for different values of L. The magnetic hysteresis cycles of the systems have been determined.
Studies on unsteady pressure fields in the region of separating and reattaching flows
NASA Astrophysics Data System (ADS)
Govinda Ram, H. S.; Arakeri, V. H.
1990-12-01
Experimental studies on the measurement of pressure fields in the region of separating and reattaching flows behind several two-dimensional fore-bodies and one axisymmetric body are reported. In particular, extensive measurements of mean pressure, surface pressure fluctuation, and pressure fluctuation within the flow were made for a series of two-dimensional fore-body shapes consisting of triangular nose with varying included angle. The measurements from different bodies are compared and one of the important findings is that the maximum values of rms pressure fluctuation levels in the shear layer approaching reattachment are almost equal to the maximum value of the surface fluctuation levels.
Minimal sufficient positive-operator valued measure on a separable Hilbert space
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kuramochi, Yui, E-mail: kuramochi.yui.22c@st.kyoto-u.ac.jp
We introduce a concept of a minimal sufficient positive-operator valued measure (POVM), which is the least redundant POVM among the POVMs that have the equivalent information about the measured quantum system. Assuming the system Hilbert space to be separable, we show that for a given POVM, a sufficient statistic called a Lehmann-Scheffé-Bahadur statistic induces a minimal sufficient POVM. We also show that every POVM has an equivalent minimal sufficient POVM and that such a minimal sufficient POVM is unique up to relabeling neglecting null sets. We apply these results to discrete POVMs and information conservation conditions proposed by the author.
NASA Astrophysics Data System (ADS)
Chung, T. W.; Chen, C. K.; Hsu, S. H.
2017-11-01
Protein concentration processes using filter membranes have a significant advantage in energy saving compared to traditional drying processes. However, fouling over a large membrane area and frequent membrane cleaning increase the energy consumption and operating cost of membrane-based protein concentration. In this study, membrane filtration for protein concentration was conducted and compared with recent protein concentration technology, and the operating factors of the process were analysed. The separation mechanism of membrane filtration is based on the size difference between the membrane pores and the particles being filtered. Darcy's law was applied to relate flux, TMP (transmembrane pressure) and resistance. The effects of membrane pore size, pH value and TMP on the steady-state flux (Jst) and protein rejection (R) were studied. It was observed that Jst increases with decreasing membrane pore size, Jst increases with increasing TMP, and R increases with decreasing solution pH value. Compared with the other variables, the pH value is the most significant variable for the separation of protein from water.
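The Darcy's-law relation invoked above (flux set by the transmembrane pressure divided by viscosity times total hydraulic resistance) can be sketched as follows; the function name and all numerical values are illustrative assumptions, not figures from the study:

```python
# Illustrative sketch (not from the paper): Darcy's-law flux model for
# membrane filtration, J = TMP / (mu * (R_m + R_c)), where R_m is the
# clean-membrane resistance and R_c an added fouling/cake resistance.
def permeate_flux(tmp_pa, viscosity_pa_s, r_membrane, r_cake=0.0):
    """Volumetric permeate flux (m^3 per m^2 per s)."""
    return tmp_pa / (viscosity_pa_s * (r_membrane + r_cake))

# Higher TMP raises the flux; added fouling resistance lowers it.
j_clean = permeate_flux(1.0e5, 1.0e-3, 1.0e12)           # ~1e-4 m/s
j_fouled = permeate_flux(1.0e5, 1.0e-3, 1.0e12, 1.0e12)  # ~5e-5 m/s
print(j_clean, j_fouled)
```

This linear model reproduces the qualitative trend reported in the abstract (flux rising with TMP); the pore-size and pH effects enter through the resistance terms, which here are just placeholder constants.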
NASA Astrophysics Data System (ADS)
Rao, Gottumukkala Venkateswara; Markandeya, R.; Kumar, Rajan
2018-04-01
An attempt has been made to utilise sub-grade iron ore by producing pellet-grade concentrate from Deposit 5, Bacheli Complex, Bailadila, Chhattisgarh, India. The 'as received' Run of Mine (ROM) sample assayed 40.80% Fe and 40.90% SiO2. Mineralogical studies indicated that the main ore mineral is hematite and the lone gangue mineral is quartz, and mineral liberation studies indicated that both are liberated below 100 microns. The stage-crushed and ground sample was subjected to concentration using a Multi Gravity Separator (MGS). Rougher MGS experimental results were optimised to recover the highest possible iron values: a concentrate of 55.80% Fe, at a yield of 61.73% by weight and a recovery of 84.42% of the iron values, was obtained in the rougher MGS concentrate. Further experiments carried out on the rougher MGS concentrate produced a concentrate suitable for commercial-grade pellets, assaying 66.67% Fe and 3.12% SiO2 at a yield of 45.08% by weight and a recovery of 73.67% of the iron values.
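The reported yield, grade and recovery figures are mutually consistent with the standard two-product balance, recovery = yield × (concentrate grade / feed grade). A minimal check (the helper name is ours; the numbers are taken from the abstract, with feed grade 40.80% Fe):

```python
# Standard metallurgical recovery check:
# recovery (%) = yield (wt%) * concentrate grade / feed grade.
def iron_recovery(yield_wt_pct, conc_grade_pct, feed_grade_pct):
    return yield_wt_pct * conc_grade_pct / feed_grade_pct

rougher = iron_recovery(61.73, 55.80, 40.80)  # ~84.4%, matching the abstract
cleaner = iron_recovery(45.08, 66.67, 40.80)  # ~73.7%, matching the abstract
print(round(rougher, 2), round(cleaner, 2))
```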
40 CFR 246.200-5 - Recommended procedures: Methods of separation and collection.
Code of Federal Regulations, 2012 CFR
2012-07-01
... designed to recover high grades of office paper at the source of generation, i.e., the desk, are the... recommended system is the desk-top system because it is designed to maximize recovery of high value material... desk-top system has been designed to minimize these problems. (d) The precise method of separation and...
40 CFR 246.200-5 - Recommended procedures: Methods of separation and collection.
Code of Federal Regulations, 2013 CFR
2013-07-01
... designed to recover high grades of office paper at the source of generation, i.e., the desk, are the... recommended system is the desk-top system because it is designed to maximize recovery of high value material... desk-top system has been designed to minimize these problems. (d) The precise method of separation and...
40 CFR 246.200-5 - Recommended procedures: Methods of separation and collection.
Code of Federal Regulations, 2011 CFR
2011-07-01
... designed to recover high grades of office paper at the source of generation, i.e., the desk, are the... recommended system is the desk-top system because it is designed to maximize recovery of high value material... desk-top system has been designed to minimize these problems. (d) The precise method of separation and...
40 CFR 246.200-5 - Recommended procedures: Methods of separation and collection.
Code of Federal Regulations, 2010 CFR
2010-07-01
... designed to recover high grades of office paper at the source of generation, i.e., the desk, are the... recommended system is the desk-top system because it is designed to maximize recovery of high value material... desk-top system has been designed to minimize these problems. (d) The precise method of separation and...
40 CFR 246.200-5 - Recommended procedures: Methods of separation and collection.
Code of Federal Regulations, 2014 CFR
2014-07-01
... designed to recover high grades of office paper at the source of generation, i.e., the desk, are the... recommended system is the desk-top system because it is designed to maximize recovery of high value material... desk-top system has been designed to minimize these problems. (d) The precise method of separation and...
Heat Transfer at the Reattachment Zone of Separated Laminar Boundary Layers
NASA Technical Reports Server (NTRS)
Chung, Paul M.; Viegas, John R.
1961-01-01
The flow and heat transfer are analyzed at the reattachment zone of two-dimensional separated laminar boundary layers. The fluid is considered to be flowing normal to the wall at reattachment. An approximate expression is derived for the heat transfer in the reattachment region and a calculated value is compared with an experimental measurement.
Health Economics and the Economics of Education: Specialization and Division of Labor.
ERIC Educational Resources Information Center
Bishop, Dawn M.; Hunt-McCool, Janet
1998-01-01
Addresses the separation of human capital studies into distinct fields of education and health. The main difference between the fields may be the ability to measure objectively the value added of health care expenditures, in contrast to the earnings valuation of education. As a result, the two fields (and their literatures) separate theoretically…
Separation and structural analysis of saponins in a bark extract from Quillaja saponaria Molina.
Nord, L I; Kenne, L
1999-07-20
Six major saponins were isolated from a bark extract from Quillaja saponaria Molina. Solid-phase extraction, followed by a two-step reversed-phase HPLC separation procedure with phosphate and ammonium acetate buffers of different pH values, was used. The compounds were characterised using NMR spectroscopy, mass spectrometry and chemical methods.
NASA Technical Reports Server (NTRS)
Stokes, R. L.
1979-01-01
Tests performed to determine accuracy and efficiency of bus separators used in microprocessors are presented. Functional, AC parametric, and DC parametric tests were performed in a Tektronix S-3260 automated test system. All the devices passed the functional tests and yielded nominal values in the parametric test.
METHOD OF PROCESSING MONAZITE SAND
Welt, M.A.; Smutz, M.
1958-08-26
A process is described for recovering thorium, uranium, and rare earth values from monazite sand. The monazite sand is first digested with sulfuric acid and the resulting "monazite sulfate" solution is adjusted to a pH of between 0.4 and 3.0, and oxalate anions are added, causing precipitation of the thorium and the rare earths as the oxalates. The oxalate precipitate is separated from the uranium-containing supernatant solution, and is dried and calcined to the oxides. The thorium and rare earth oxides are then dissolved in nitric acid and the solution is contacted with tributyl phosphate, whereby an organic extract phase containing the cerium and thorium values is obtained, together with an aqueous raffinate containing the other rare earth values. The organic phase is then separated from the aqueous raffinate and the cerium and thorium are back-extracted with an aqueous medium.
A rapid method for soil cement design : Louisiana slope value method : part II : evaluation.
DOT National Transportation Integrated Search
1966-05-01
This report is an evaluation of the recently developed "Louisiana Slope Value Method". The conclusions drawn are based on data from 637 separate samples representing nearly all major soil groups in Louisiana that are suitable for cement stabilizatio...
Halámek, Jan; Zhou, Jian; Halámková, Lenka; Bocharova, Vera; Privman, Vladimir; Wang, Joseph; Katz, Evgeny
2011-11-15
Biomolecular logic systems processing biochemical input signals and producing "digital" YES/NO outputs were developed for the analysis of physiological conditions characteristic of liver injury, soft tissue injury, and abdominal trauma. Injury biomarkers were used as input signals for activating the logic systems. Their normal physiological concentrations were defined as the logic-0 level, while their pathologically elevated concentrations were defined as logic-1 values. Since the input concentrations applied as logic 0 and 1 values were not sufficiently different, the low and high output signals (0 and 1 outputs) were separated by only a short gap, making their discrimination difficult. Coupled enzymatic reactions functioning as a biomolecular signal-processing system with a built-in filter property were therefore developed. The filter process involves a partial back-conversion of the product that yields the optical output signal, but only at low product concentrations, thus allowing proper discrimination between 0 and 1 output values.
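The filter idea, suppressing the output only at low product concentrations so that the gap between logic-0 and logic-1 widens, can be illustrated with a generic sigmoidal (Hill-type) transfer function. This is a toy sketch, not the paper's enzymatic kinetics; the parameters k and n and the raw output levels are arbitrary assumptions:

```python
# Toy sketch of a signal filter: a Hill-type response suppresses low
# outputs (partial back-conversion regime) while passing high outputs,
# sharpening the 0/1 separation. k and n are arbitrary choices.
def hill_filter(x, k=0.5, n=4):
    return x**n / (k**n + x**n)

low, high = 0.35, 0.65  # hypothetical, poorly separated raw outputs
print(round(hill_filter(low), 3), round(hill_filter(high), 3))
# The filtered gap (~0.55) is wider than the raw gap (0.30).
```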
NASA Astrophysics Data System (ADS)
Lv, J. X.; Wang, B. F.; Nie, L. H.; Xu, R. R.; Zhou, J. Y.; Hao, Y. J.
2018-01-01
The simulation process of a whole CNG filling station was established using Aspen Plus V7.2. The separator block (Sep) was used to simulate the desulfurization and dehydration equipment in the gas station, and the flash separator block (Flash2) was used to simulate the gas storage well at the appropriate temperature and environmental pressure. Furthermore, the sensitivity module was used to analyse the behaviour of the dehydration and desulfurization rates, and the residual pH value of the gas storage wells was between 2.2 and 3.3. The results indicated that the effect of water content on pH value is greater than that of hydrogen sulphide in the gas storage well environment, and that the calculation process for the pH value is feasible. Additionally, the simulation provides basic data for subsequent work on the anticorrosion mechanism of gas storage wells and has great potential for practical applications.
Escorza-Treviño, S; Dizon, A E
2000-08-01
Mitochondrial DNA (mtDNA) control-region sequences and microsatellite loci length polymorphisms were used to estimate phylogeographical patterns (historical patterns underlying contemporary distribution), intraspecific population structure and gender-biased dispersal of Phocoenoides dalli dalli across its entire range. One-hundred and thirteen animals from several geographical strata were sequenced over 379 bp of mtDNA, resulting in 58 mtDNA haplotypes. Analysis using F(ST) values (based on haplotype frequencies) and phi(ST) values (based on frequencies and genetic distances between haplotypes) yielded statistically significant separation (bootstrap values P < 0.05) among most of the stocks currently used for management purposes. A minimum spanning network of haplotypes showed two very distinctive clusters, differentially occupied by western and eastern populations, with some common widespread haplotypes. This suggests some degree of phyletic radiation from west to east, superimposed on gene flow. Highly male-biased migration was detected for several population comparisons. Nuclear microsatellite DNA markers (119 individuals and six loci) provided additional support for population subdivision and gender-biased dispersal detected in the mtDNA sequences. Analysis using F(ST) values (based on allelic frequencies) yielded statistically significant separation between some, but not all, populations distinguished by mtDNA analysis. R(ST) values (based on frequencies of and genetic distance between alleles) showed no statistically significant subdivision. Again, highly male-biased dispersal was detected for all population comparisons, suggesting, together with morphological and reproductive data, the existence of sexual selection. Our molecular results argue for nine distinct dalli-type populations that should be treated as separate units for management purposes.
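As a rough illustration of the haplotype-frequency-based statistics mentioned above, a textbook F(ST) estimate can be computed as (H_T - H_S)/H_T from expected heterozygosities. This sketch uses invented two-population frequencies, not the study's data or its exact estimator:

```python
# Toy F_ST sketch from haplotype frequencies (not the study's data):
# H = 1 - sum(p_i^2) is the expected heterozygosity; F_ST compares the
# pooled (total) heterozygosity with the mean within-population value.
def heterozygosity(freqs):
    return 1.0 - sum(p * p for p in freqs)

def fst(pop_a, pop_b):
    h_s = 0.5 * (heterozygosity(pop_a) + heterozygosity(pop_b))
    pooled = [0.5 * (a + b) for a, b in zip(pop_a, pop_b)]
    h_t = heterozygosity(pooled)
    return (h_t - h_s) / h_t

# Two hypothetical populations with shifted haplotype frequencies,
# mimicking the western/eastern clusters described in the abstract.
west = [0.7, 0.2, 0.1]
east = [0.1, 0.2, 0.7]
print(round(fst(west, east), 3))
```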
Kuo, Phillip Hsin; Avery, Ryan; Krupinski, Elizabeth; Lei, Hong; Bauer, Adam; Sherman, Scott; McMillan, Natalie; Seibyl, John; Zubal, George
2013-03-01
A fully automated objective striatal analysis (OSA) program that quantitates dopamine transporter uptake in subjects with suspected Parkinson's disease was applied to images from clinical (123)I-ioflupane studies. The striatal binding ratios or alternatively the specific binding ratio (SBR) of the lowest putamen uptake was computed, and receiver-operating-characteristic (ROC) analysis was applied to 94 subjects to determine the best discriminator using this quantitative method. Ninety-four (123)I-ioflupane SPECT scans were analyzed from patients referred to our clinical imaging department and were reconstructed using the manufacturer-supplied reconstruction and filtering parameters for the radiotracer. Three trained readers conducted independent visual interpretations and reported each case as either normal or showing dopaminergic deficit (abnormal). The same images were analyzed using the OSA software, which locates the striatal and occipital structures and places regions of interest on the caudate and putamen. Additionally, the OSA places a region of interest on the occipital region that is used to calculate the background-subtracted SBR. The lower SBR of the 2 putamen regions was taken as the quantitative report. The 33 normal (bilateral comma-shaped striata) and 61 abnormal (unilateral or bilateral dopaminergic deficit) studies were analyzed to generate ROC curves. Twenty-nine of the scans were interpreted as normal and 59 as abnormal by all 3 readers. For 12 scans, the 3 readers did not unanimously agree in their interpretations (discordant). The ROC analysis, which used the visual-majority-consensus interpretation from the readers as the gold standard, yielded an area under the curve of 0.958 when using 1.08 as the threshold SBR for the lowest putamen. The sensitivity and specificity of the automated quantitative analysis were 95% and 89%, respectively. 
The OSA program delivers SBR quantitative values that have a high sensitivity and specificity, compared with visual interpretations by trained nuclear medicine readers. Such a program could be a helpful aid for readers not yet experienced with (123)I-ioflupane SPECT images and if further adapted and validated may be useful to assess disease progression during pharmaceutical testing of therapies.
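The thresholding step of the analysis above reduces to a one-line rule. A minimal sketch in Python, assuming (as the ROC analysis implies, since dopaminergic deficit lowers striatal uptake) that an SBR below the 1.08 cutoff indicates an abnormal scan; the function name and interface are illustrative, not the OSA program's actual API:

```python
def osa_classify(lowest_putamen_sbr, threshold=1.08):
    """Classify a scan from the lowest-putamen specific binding ratio (SBR).

    Assumption: the abnormal (dopaminergic-deficit) side of the ROC
    threshold is the low-SBR side, since deficits reduce striatal uptake.
    """
    return "abnormal" if lowest_putamen_sbr < threshold else "normal"
```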
The Origins of Agrarianism and the Development of the Self.
ERIC Educational Resources Information Center
Hanson, Victor Davis
1998-01-01
Describes the history of agrarianism and its connection with Western cultural values of private property, civil liberties, constitutional government, separation of power, individualism, and self-reliance. Argues that agrarian history has vast implications beyond just farming, affecting the language, values, and foundations of culture. Discusses…
2017-06-01
Reports an error in "Embodiment as procedures: Physical cleansing changes goal priming effects" by Ping Dong and Spike W. S. Lee (Journal of Experimental Psychology: General, 2017[Apr], Vol 146[4], 592-605). In the article, the following F value and p values in the Results section of Experiment 3 were set incorrectly: the p value p = .925 should be p = .922; the F and p values F(1, 201) = .011, p = .916 should be F(1, 201) = .014, p = .906. (The following abstract of the original article appeared in record 2017-14922-010.) Physical cleansing reduces the influence of numerous psychological experiences, such as guilt from immoral behavior, dissonance from free choice, and good/bad luck from winning/losing. How do these domain-general effects occur? We propose an integrative account of cleansing as an embodied procedure of psychological separation. By separating physical traces from a physical target object (e.g., detaching dirt from hands), cleansing serves as the embodied grounding for the separation of psychological traces from a psychological target object (e.g., dissociating prior experience from the present self). This account predicts that cleansing reduces the accessibility of psychological traces and their consequences for judgments and behaviors. Testing these in the context of goal priming, we find that wiping one's hands (vs. not) decreases the mental accessibility (Experiment 1), behavioral expression (Experiment 2), and judged importance (Experiments 3-4) of previously primed goals (e.g., achievement, saving, fitness). But if a goal is primed after cleansing, its importance gets amplified instead (Experiment 3).
Based on the logic of moderation-of-process, an alternative manipulation that psychologically separates a primed goal from the present self produces the same effects, but critically, the effects vanish once people wipe their hands clean (Experiment 4), consistent with the notion that cleansing functions as an embodied procedure of psychological separation. These findings have implications for the flexibility of goal pursuit. More broadly, our procedural perspective generates novel predictions about the scope and mechanisms of cleansing effects and may help integrate embodied and related phenomena. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Features of separating turbulent boundary layers
NASA Technical Reports Server (NTRS)
Nagabushana, K. A.; Agarwal, Naval K.; Simpson, Roger L.
1988-01-01
In the present study of two strong adverse pressure gradient flows, mean flow and turbulence characteristics are measured, together with frequency spectra, using hot-wire and laser anemometry. In these separating flows, reattachment occurs over a shorter distance than separation. It is noted that the outer flow variables form a unique set of scaling parameters for streamwise power spectra in adverse pressure gradient flows. The inner flow scaling of Perry et al. (1985) for streamwise spectra does not hold in the backflow region unless the value of the downstream-upstream intermittency in the flow is unity.
SULFIDE METHOD PLUTONIUM SEPARATION
Duffield, R.B.
1958-08-12
A process is described for the recovery of plutonium from neutron-irradiated uranium solutions. Such a solution is first treated with a soluble sulfide, causing precipitation of the plutonium and uranium values present, along with those impurities which form insoluble sulfides. The precipitate is then treated with a solution of carbonate ions, which will dissolve the uranium and plutonium present while the fission product sulfides remain unaffected. After separation from the residue, this solution may then be treated by any of the usual methods, such as formation of a lanthanum fluoride precipitate, to effect separation of plutonium from uranium.
ZIRCONIUM PHOSPHATE ADSORPTION METHOD
Russell, E.R.; Adamson, A.S.; Schubert, J.; Boyd, G.E.
1958-11-01
A method is presented for separating plutonium values from fission product values in aqueous acidic solution. This is accomplished by flowing the solution containing such values through a bed of zirconium orthophosphate. Any fission products adsorbed can subsequently be eluted by washing the column with a solution of 2N HNO3 and 0.1N H3PO4. Plutonium values may subsequently be desorbed by contacting the column with a solution of 7N HNO3.
Coastal Fog, South Peruvian Coast at Pisco
NASA Technical Reports Server (NTRS)
2002-01-01
Coastal fog commonly drapes the Peruvian coast. This image captures complex interactions between land, sea, and atmosphere along the southern Peruvian coast. When Shuttle astronauts took the image in February of 2002, the layers of coastal fog and stratus were being progressively scoured away by brisk south to southeast winds. Remnants of the cloud deck banked against the larger, obstructing headlands like Peninsula Paracas and Isla Sangayan, giving the prominent 'white comma' effect. Southerlies also produced ripples of internal gravity waves in the clouds offshore where warm, dry air aloft interacts with a thinning layer of cool, moist air near the sea surface on the outer edge of the remaining cloud bank. South of Peninsula Paracas, the small headlands channeled the clouds into streaks; local horizontal vortices caused by the headlands provided enough lift to give points of origin of the clouds in some bays. Besides the shelter of the peninsula, the Bahia de Pisco appears to be cloud-free due to a dry, offshore flow down the valley of the Rio Ica. The STS-109 crew took image STS109-730-80 in February 2002. The image is provided by the Earth Sciences and Image Analysis Laboratory at Johnson Space Center. Additional images taken by astronauts and cosmonauts can be viewed at the NASA-JSC Gateway to Astronaut Photography of Earth.
The Maryland power plant research program internet resource for precipitation chemistry data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Corio, L.A.; Jones, W.B.; Sherwell, J.
1999-07-01
The Maryland Department of Natural Resources Power Plant Research Program (PPRP) initiated a project in 1998 to make available on the World Wide Web (WWW), precipitation chemistry data from monitoring sites located in the Chesapeake Bay watershed. To that end, PPRP obtained, from various organizations, background information on atmospheric deposition monitoring programs (some of which are still on-going), as well as special studies. For those programs and studies with available precipitation chemistry data of known quality (data were not available for all programs and studies), PPRP obtained, processed, and uploaded the data to its WWW site (www.versar.com/pprp/features/aciddep/aciddep.htm). These data can either be viewed on the web site or downloaded as a zipped file in either comma-delimited or Excel spreadsheet format. PPRP also provides descriptions of the monitoring programs/studies, including information on measurement methods and quality assurance procedures, where available. For the few monitoring programs (e.g., NADP) with existing web sites that allow on-line access to data, PPRP provides links to these sites. PPRP currently is working with the National Oceanic and Atmospheric Administration (NOAA) Air Resources Laboratory (ARL) in a cooperative effort to make more precipitation chemistry data easily available to the scientific community.
Host plant utilization in the comma butterfly: sources of variation and evolutionary implications.
Janz, Niklas; Nylin, Sören; Wedell, Nina
1994-09-01
A major challenge in the study of insect-host plant interactions is to understand how the different aspects of offspring performance interact to produce a preference hierarchy in the ovipositing females. In this paper we investigate host plant preference of the polyphagous butterfly Polygonia c-album (Lepidoptera: Nymphalidae) and compare it with several aspects of the life history of its offspring (growth rate, development time, adult size, survival and female fecundity). Females and offspring were tested on four naturally used host plants (Urtica dioica, Ulmus glabra, Salix caprea, and Betula pubescens). There was substantial individual variation in host plant preference, including reversals in rank order, but the differences were largely confined to differences in the ranking of Urtica dioica and S. caprea. Different aspects of performance on these two plants gave conflicting and complementary results, implying a trade-off between short development time on U. dioica, and larger size and higher fecundity on S. caprea. As all performance components showed low individual variation the large variation in host plant preference was interpreted as due to alternative oviposition strategies on the basis of similar 'performance hierarchies'. This indicates that the larval performance component of host-plant utilization may be more conservative to evolutionary change than the preference of ovipositing females. Possible macro-evolutionary implications of this are discussed.
Yanagida, Saori; Nishizawa, Noriko; Mizoguchi, Kenji; Hatakeyama, Hiromitsu; Fukuda, Satoshi
2015-07-01
Voice onset time (VOT) for word-initial voiceless consonants in adductor spasmodic dysphonia (ADSD) and abductor spasmodic dysphonia (ABSD) patients were measured to determine (1) which acoustic measures differed from the controls and (2) whether acoustic measures were related to the pause or silence between the test word and the preceding word. Forty-eight patients with ADSD and nine patients with ABSD, as well as 20 matched normal controls read a story in which the word "taiyo" (the sun) was repeated three times, each differentiated by the position of the word in the sentence. The target of measurement was the VOT for the word-initial voiceless consonant /t/. When the target syllable appeared in a sentence following a comma, or at the beginning of a sentence following a period, the ABSD patients' VOTs were significantly longer than those of the ADSD patients and controls. Abnormal prolongation of the VOTs was related to the pause or silence between the test word and the preceding word. VOTs in spasmodic dysphonia (SD) may vary according to the SD subtype or speaking conditions. VOT measurement was suggested to be a useful method for quantifying voice symptoms in SD. Copyright © 2015 The Voice Foundation. Published by Elsevier Inc. All rights reserved.
WinClastour—a Visual Basic program for tourmaline formula calculation and classification
NASA Astrophysics Data System (ADS)
Yavuz, Fuat; Yavuz, Vural; Sasmaz, Ahmet
2006-10-01
WinClastour is a Microsoft® Visual Basic 6.0 program that enables the user to enter and calculate structural formulae of tourmaline analyses obtained either by electron-microprobe or wet-chemical analyses. It is developed to predict cation site-allocations at the different structural positions, as well as to estimate mole percent of the end-members of the calcic-, alkali-, and X-site-vacant group tourmalines. Using the different normalization schemes, such as 24.5 oxygens, 31 anions, 15 cations (T + Z + Y), and 6 silicons, the present program classifies tourmaline data based on the classification scheme proposed by Hawthorne and Henry [1999. Classification of the minerals of the tourmaline group. European Journal of Mineralogy 11, 201-215]. The present program also enables the user to evaluate Al-Mg disorder between the Y and Z sites. WinClastour stores all the calculated results in a comma-delimited ASCII file format. Hence, output of the program can be displayed and processed by any other software for general data manipulation and graphing purposes. The compiled program code, together with a test data file and related graphic files designed to produce a high-quality printout from the Grapher program of Golden Software, is approximately 3 Mb as a self-extracting setup file.
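A normalization scheme such as "24.5 oxygens" rescales the analyzed cation proportions so that the implied oxygen total matches a fixed basis. A rough sketch of that single step in Python (this is not WinClastour's code; the helper name and interface are illustrative):

```python
def normalize_to_oxygen_basis(cation_moles, oxygens_per_cation, target_oxygens=24.5):
    """Scale cation proportions so the total implied oxygen equals the basis.

    cation_moles: analyzed molar proportions of each cation.
    oxygens_per_cation: oxygens carried per mole of each cation
    (e.g. 2.0 for Si as SiO2, 1.5 for Al as Al2O3).
    """
    total_oxygen = sum(m * o for m, o in zip(cation_moles, oxygens_per_cation))
    factor = target_oxygens / total_oxygen
    return [m * factor for m in cation_moles]
```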
NASA Astrophysics Data System (ADS)
Zhu, Yong; Newell, Reginald E.
1994-09-01
Filamentary structure is a common feature of atmospheric water vapor transport; the filaments may be termed “atmospheric rivers” because some carry as much water as the Amazon [Newell et al., 1992]. An extratropical cyclone whose central pressure fall averages at least 1 hPa hr-1 for 24 hours is known in meteorology as a “bomb” [Sanders and Gyakum, 1980]. We report here an association between rivers and bombs. When a cyclonic system is penetrated by a river, the cyclonic center moves to be close to the position occupied by the leading edge of the river twelve hours previously and the central pressure falls. If the river then moves away from the cyclone, the central pressure rises. Based on a pilot study of pressure fall and water vapor flux convergence for two winter months, the cause of the explosive deepening appears to be latent heat liberation. This is substantiated by composite maps of seven Atlantic and seven Pacific bombs which show that the flux convergence near the bomb center has a comma cloud signature. The observed association may be useful in forecasting 12-hour direction of motion and pressure change of rapidly developing cyclonic systems; the incorporation of better moisture data into numerical forecasting models may be the reason for the reported increase of skill in the prediction of bombs in recent years.
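The Sanders and Gyakum "bomb" criterion quoted above reduces to a one-line check. A minimal sketch (the full published definition also normalizes the deepening rate to 60° latitude, which is omitted here):

```python
def is_bomb(p_start_hpa, p_end_hpa, hours=24.0):
    """True if central pressure fell at an average rate of at least
    1 hPa per hour over the period (nominally 24 hours)."""
    return (p_start_hpa - p_end_hpa) / hours >= 1.0
```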
Zhu, Jingbo; Liu, Baoyue; Shan, Shibo; Ding, Yanl; Kou, Zinong; Xiao, Wei
2015-08-01
In order to meet the needs of efficient purification of products from natural resources, this paper developed an automatic vacuum liquid chromatographic device (AUTO-VLC) and applied it to the component separation of petroleum ether extracts of Schisandra chinensis (Turcz.) Baill. The device comprised a solvent system, a 10-position distribution valve, a 3-position change valve, dynamic axial compression chromatographic columns of three diameters, and a 10-position fraction valve. A programmable logic controller (PLC, S7-200) was adopted to realize automatic control and monitoring of mobile phase changing, column selection, separation time setting, and fraction collection. The separation results showed that six fractions (S1-S6) of different chemical components were obtained from 100 g of the Schisandra chinensis (Turcz.) Baill. petroleum ether phase by the AUTO-VLC with a 150 mm diameter dynamic axial compression chromatographic column. A new method for screening the VLC separation parameters by multiple-development TLC was developed and confirmed. The initial mobile phase of the AUTO-VLC was selected so that the Rf of all the target compounds ranged from 0 to 0.45 for the first development on the TLC; the gradient elution ratio was selected according to the k value (the slope of the linear function of Rf value versus number of developments on the TLC) and the resolution of the target compounds; and the number of elutions (n) was calculated by the formula n ≈ ΔRf/k. A total of four compounds with purity greater than 85% and 13 other components were separated from S5 under the selected conditions in only 17 h. Therefore, the development of the automatic VLC and its method are significant for the automatic and systematic separation of traditional Chinese medicines.
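The screening rule n ≈ ΔRf/k above is simple arithmetic. A minimal sketch, with a hypothetical helper that rounds up to a whole number of developments:

```python
import math

def developments_needed(delta_rf, k):
    """Estimate the development count via n ~ dRf / k, where k is the
    slope of Rf versus development number measured on TLC."""
    if k <= 0:
        raise ValueError("slope k must be positive")
    return math.ceil(delta_rf / k)
```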
Berndt, M.E.; Seal, R.R.; Shanks, Wayne C.; Seyfried, W.E.
1996-01-01
Hydrogen isotope fractionation factors were measured for coexisting brines and vapors formed by phase separation of NaCl/H2O fluids at temperatures ranging from 399-450 °C and pressures from 277-397 bars. It was found that brines are depleted in D compared to coexisting vapors at all conditions studied. The magnitude of hydrogen isotope fractionation is dependent on the relative amounts of Cl in the two phases and can be empirically correlated to pressure using the following relationship: 1000 ln α(vap-brine) = 2.54(±0.83) + 2.87(±0.69) × log(ΔP), where α(vap-brine) is the fractionation factor and ΔP is a pressure term representing distance from the critical curve in the NaCl/H2O system. The effect of phase separation on hydrogen isotope distribution in subseafloor hydrothermal systems depends on a number of factors, including whether phase separation is induced by heating at depth or by decompression of hydrothermal fluids ascending to the seafloor. Phase separation in most subseafloor systems appears to be a simple process driven by heating of seawater to conditions within the two-phase region, followed by segregation and entrainment of brine or vapor into a seawater-dominated system. Resulting vent fluids exhibit large ranges in Cl concentration with no measurable effect on δD. Possible exceptions to this include hydrothermal fluids venting at Axial and 9°N on the East Pacific Rise. High δD values of low-Cl fluids venting at Axial are consistent with phase separation taking place at relatively shallow levels in the oceanic crust, while negative δD values in some low-Cl fluids venting at 9°N suggest involvement of a magmatic fluid component or phase separation of D-depleted brines derived during previous hydrothermal activity.
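The empirical pressure correlation is straightforward to evaluate. A sketch in Python (function names are illustrative; the ±0.83 and ±0.69 uncertainties on the fitted coefficients are ignored):

```python
import math

def permil_fractionation(delta_p_bar):
    """Per-mil value 1000 ln a(vap-brine) = 2.54 + 2.87 * log10(dP),
    where dP (bar) is the pressure distance from the critical curve
    in the NaCl/H2O system."""
    return 2.54 + 2.87 * math.log10(delta_p_bar)

def alpha_vap_brine(delta_p_bar):
    """The fractionation factor itself: a = exp((1000 ln a) / 1000)."""
    return math.exp(permil_fractionation(delta_p_bar) / 1000.0)
```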
DOE Office of Scientific and Technical Information (OSTI.GOV)
Horttanainen, M., E-mail: mika.horttanainen@lut.fi; Teirasvuo, N.; Kapustina, V.
Highlights: • New experimental data of mixed MSW properties in a Finnish case region. • The share of renewable energy of mixed MSW. • The results were compared with earlier international studies. • The average share of renewable energy was 30% and the average LHVar 19 MJ/kg. • Well operating source separation decreases the renewable energy content of MSW. - Abstract: For the estimation of greenhouse gas emissions from waste incineration it is essential to know the share of the renewable energy content of the combusted waste. The composition and heating value information is generally available, but the renewable energy share or heating values of different fractions of waste have rarely been determined. In this study, data from Finnish studies concerning the composition and energy content of mixed MSW were collected, new experimental data on the compositions, heating values and renewable share of energy were presented and the results were compared to the estimations concluded from earlier international studies. In the town of Lappeenranta in south-eastern Finland, the share of renewable energy ranged between 25% and 34% in the energy content tests implemented for two sample trucks. The heating values of the waste and fractions of plastic waste were high in the samples compared to the earlier studies in Finland. These high values were caused by good source separation and led to a low share of renewable energy content in the waste. The results showed that in mixed municipal solid waste the renewable share of the energy content can be significantly lower than the general assumptions (50–60%) when the source separation of organic waste, paper and cardboard is carried out successfully. The number of samples was however small for making extensive conclusions on the results concerning the heating values and renewable share of energy and additional research is needed for this purpose.
Americium recovery from reduction residues
Conner, W.V.; Proctor, S.G.
1973-12-25
A process for separation and recovery of americium values from container or "bomb" reduction residues comprising dissolving the residues in a suitable acid, adjusting the hydrogen ion concentration to a desired level by adding a base, precipitating the americium as americium oxalate by adding oxalic acid, digesting the solution, separating the precipitate, and thereafter calcining the americium oxalate precipitate to form americium oxide. (Official Gazette)
Separation of Ni and Co by D2EHPA in the Presence of Citrate Ion
NASA Astrophysics Data System (ADS)
Nadimi, Hamed; Haghshenas Fatmehsari, Davoud; Firoozi, Sadegh
2017-10-01
Recycling processes for the recovery of metallic content from electronic wastes are environmentally friendly and economical. This paper reports a method for the recovery and separation of Ni and Co from sulfate solution by the use of D2EHPA. In this regard, the influence of citrate ion, as a carboxylate ligand, on the separation conditions of Ni and Co via D2EHPA (a poorly selective extractant for Ni and Co separation) was examined. It was found that ΔpH0.5(Ni-Co) (the difference between the pH values corresponding to 50 pct extraction of each metallic ion) increases to 1.5 at a citrate concentration of 0.05 M; this ΔpH0.5(Ni-Co) value is much higher than that obtained in the absence of citrate ion (0.1). Fourier transform infrared spectroscopy (FT-IR) indicated that the citrate ion is co-absorbed during absorption of the metallic ions by D2EHPA, meaning that the metal-organic complexes contain Co/Ni and citrate ion. Also, the stoichiometric coefficients of the Ni and Co extraction reactions were proposed by applying the slope analysis method.
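The ΔpH0.5 figure of merit can be read off two measured extraction curves by interpolating the pH at 50 pct extraction. A minimal sketch, assuming extraction increases monotonically with pH; the helper names are illustrative:

```python
def ph_at_half_extraction(ph_points, extraction_pct):
    """Linearly interpolate the pH at 50% extraction from curve samples."""
    pairs = sorted(zip(ph_points, extraction_pct))
    for (p0, e0), (p1, e1) in zip(pairs, pairs[1:]):
        if e0 <= 50.0 <= e1:
            return p0 + (50.0 - e0) * (p1 - p0) / (e1 - e0)
    raise ValueError("50% extraction not bracketed by the data")

def delta_ph_half(ph_ni, ext_ni, ph_co, ext_co):
    """Separation factor pH0.5(Ni) - pH0.5(Co); larger is easier to separate."""
    return ph_at_half_extraction(ph_ni, ext_ni) - ph_at_half_extraction(ph_co, ext_co)
```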
JIGSAW: Joint Inhomogeneity estimation via Global Segment Assembly for Water-fat separation.
Lu, Wenmiao; Lu, Yi
2011-07-01
Water-fat separation in magnetic resonance imaging (MRI) is of great clinical importance, and the key to uniform water-fat separation lies in field map estimation. This work deals with three-point field map estimation, in which water and fat are modelled as two single-peak spectral lines, and field inhomogeneities shift the spectrum by an unknown amount. Due to the simplified spectrum modelling, there exists inherent ambiguity in forming field maps from multiple locally feasible field map values at each pixel. To resolve such ambiguity, spatial smoothness of field maps has been incorporated as a constraint of an optimization problem. However, there are two issues: the optimization problem is computationally intractable and even when it is solved exactly, it does not always separate water and fat images. Hence, robust field map estimation remains challenging in many clinically important imaging scenarios. This paper proposes a novel field map estimation technique called JIGSAW. It extends a loopy belief propagation (BP) algorithm to obtain an approximate solution to the optimization problem. The solution produces locally smooth segments and avoids error propagation associated with greedy methods. The locally smooth segments are then assembled into a globally consistent field map by exploiting the periodicity of the feasible field map values. In vivo results demonstrate that JIGSAW outperforms existing techniques and produces correct water-fat separation in challenging imaging scenarios.
ERIC Educational Resources Information Center
Nash, John J.; Meyer, Jeanne A.; Everson, Barbara
2001-01-01
Rf values in thin-layer chromatography (TLC) depend strongly on the solvent saturation of the atmosphere above the liquid in the TLC developing chamber. Presents an experiment illustrating the potentially dramatic effects on TLC Rf values of not equilibrating the solvent atmosphere during development. (ASK)
Character and Citizenship Education: Conversations between Personal and Societal Values
ERIC Educational Resources Information Center
Sim, Jasmine B.-Y.; Low, Ee Ling
2012-01-01
The theme of this special issue is "Character and Citizenship Education: Conversations between Personal and Societal Values." Character education and citizenship education, taken separately or as a single entity are currently riding high on the political and educational policy agendas of several governments (Arthur, 2003; Berkowitz & Bier, 2007;…
Compound-specific isotope analysis: Questioning the origins of a trichloroethene plume
Eberts, S.M.; Braun, C.; Jones, S.
2008-01-01
Stable carbon isotope ratios of trichloroethene (TCE), cis-1,2-dichloroethene, and trans-1,2-dichloroethene were determined by use of gas chromatography-combustion-isotope ratio mass spectrometry to determine whether compound-specific stable carbon isotopes could be used to help understand the origin and history of a TCE groundwater plume in Fort Worth, TX. Calculated δ13C values for total chlorinated ethenes in groundwater samples, which can approximate the δ13C of a spilled solvent if all degradation products are accounted for, were useful for determining whether separate lobes of the plume resulted from different sources. Most notably, values for one lobe, where tetrachloroethene (PCE) has been detected periodically, were outside the range for manufactured TCE but within the range for manufactured PCE, whereas values for a separate lobe, which is downgradient of reported TCE spills, were within the range for manufactured TCE. Copyright © Taylor & Francis Group, LLC.
Separate Circuitries Encode the Hedonic and Nutritional Values of Sugar
Tellez, Luis A.; Han, Wenfei; Zhang, Xiaobing; Ferreira, Tatiana L.; Perez, Isaac O.; Shammah-Lagnado, Sara J.; van den Pol, Anthony N.; de Araujo, Ivan E.
2016-01-01
Sugar exerts its potent reinforcing effects via both gustatory and post-ingestive pathways. It is however unknown if sweetness and nutritional signals engage segregated brain networks to motivate ingestion. We show in mice that separate basal ganglia circuitries mediate the hedonic and nutritional actions of sugar. We found that, during sugar intake, suppressing hedonic value inhibited dopamine release in ventral but not dorsal striatum, whereas suppressing nutritional value inhibited dopamine release in dorsal but not ventral striatum. Consistently, cell-specific ablation of dopamine-excitable cells in dorsal, but not ventral, striatum inhibited sugar’s ability to drive the ingestion of unpalatable solutions. Conversely, optogenetic stimulation of dopamine-excitable cells in dorsal, but not ventral, striatum substituted for sugar in its ability to drive the ingestion of unpalatable solutions. Our data demonstrate that sugar recruits a distributed dopamine-excitable striatal circuitry that acts to prioritize energy seeking over taste quality. PMID:26807950
Separate circuitries encode the hedonic and nutritional values of sugar.
Tellez, Luis A; Han, Wenfei; Zhang, Xiaobing; Ferreira, Tatiana L; Perez, Isaac O; Shammah-Lagnado, Sara J; van den Pol, Anthony N; de Araujo, Ivan E
2016-03-01
Sugar exerts its potent reinforcing effects via both gustatory and post-ingestive pathways. It is, however, unknown whether sweetness and nutritional signals engage segregated brain networks to motivate ingestion. We found in mice that separate basal ganglia circuitries mediated the hedonic and nutritional actions of sugar. During sugar intake, suppressing hedonic value inhibited dopamine release in ventral, but not dorsal, striatum, whereas suppressing nutritional value inhibited dopamine release in dorsal, but not ventral, striatum. Consistently, cell-specific ablation of dopamine-excitable cells in dorsal, but not ventral, striatum inhibited sugar's ability to drive the ingestion of unpalatable solutions. Conversely, optogenetic stimulation of dopamine-excitable cells in dorsal, but not ventral, striatum substituted for sugar in its ability to drive the ingestion of unpalatable solutions. Our data indicate that sugar recruits a distributed dopamine-excitable striatal circuitry that acts to prioritize energy-seeking over taste quality.
Method for enhanced accuracy in predicting peptides using liquid separations or chromatography
Kangas, Lars J.; Auberry, Kenneth J.; Anderson, Gordon A.; Smith, Richard D.
2006-11-14
A method for predicting the elution time of a peptide in chromatographic and electrophoretic separations by first providing a data set of known elution times of known peptides, then creating a plurality of vectors, each vector having a plurality of dimensions, and each dimension representing the elution time of amino acids present in each of these known peptides from the data set. The elution time of any protein can then be predicted by first creating a vector by assigning dimensional values for the elution time of amino acids of at least one hypothetical peptide, and then calculating a predicted elution time for the vector by performing a multivariate regression of the dimensional values of the hypothetical peptide using the dimensional values of the known peptides. Preferably, the multivariate regression is accomplished by the use of an artificial neural network, and the elution times are first normalized using a transfer function.
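The scheme described above, composition vectors fit against known elution times, can be sketched with ordinary least squares standing in for the patent's preferred artificial neural network. Everything here (names, the toy additive data) is illustrative, not the patented implementation:

```python
import numpy as np

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"  # one vector dimension per residue type

def composition_vector(peptide):
    """20-dimensional count vector of the residues present in a peptide."""
    return np.array([peptide.count(a) for a in AMINO_ACIDS], dtype=float)

def fit_residue_contributions(known_peptides, known_times):
    """Least-squares fit of per-residue elution-time contributions."""
    X = np.stack([composition_vector(p) for p in known_peptides])
    coef, *_ = np.linalg.lstsq(X, np.asarray(known_times, dtype=float), rcond=None)
    return coef

def predict_elution_time(peptide, coef):
    """Predicted elution time as the dot product of counts and contributions."""
    return float(composition_vector(peptide) @ coef)
```

With training peptides whose times are additive in composition, predictions on unseen sequences recover the additive model; real retention prediction needs the nonlinearity the patent supplies via a neural network.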
Batch extracting process using magnetic particle held solvents
Nunez, L.; Vandergrift, G.F.
1995-11-21
A process is described for selectively removing metal values which may include catalytic values from a mixture containing same, wherein a magnetic particle is contacted with a liquid solvent which selectively dissolves the metal values to absorb the liquid solvent onto the magnetic particle. Thereafter the solvent-containing magnetic particles are contacted with a mixture containing the heavy metal values to transfer metal values into the solvent carried by the magnetic particles, and then magnetically separating the magnetic particles. Ion exchange resins may be used for selective solvents. 5 figs.
NASA Technical Reports Server (NTRS)
Zehe, Michael J.; Jaffe, Richard L.
2010-01-01
High-level ab initio calculations have been performed on the exo and endo isomers of gas-phase tetrahydrodicyclopentadiene (THDCPD), a principal component of the jet fuel JP-10, using the Gaussian Gx and Gx(MPx) composite methods, as well as the CBS-QB3 method, and using a variety of isodesmic and homodesmotic reaction schemes. The impetus for this work is to help resolve large discrepancies between literature measurements of the formation enthalpy ΔfH°(298) for exo-THDCPD. We find that use of the isodesmic bond separation reaction C10H16 + 14CH4 → 12C2H6 yields results for the exo isomer (JP-10) in between the two experimentally accepted values, for the composite methods G3(MP2), G3(MP2)//B3LYP, and CBS-QB3. Application of this same isodesmic bond separation scheme to gas-phase adamantane yields a value for ΔfH°(298) within 5 kJ/mol of experiment. Isodesmic bond separation calculations for the endo isomer give a heat of formation in excellent agreement with the experimental measurement. Combining our calculated values for the gas-phase heat of formation with recent measurements of the heat of vaporization yields recommended values for ΔfH°(298)liq of -126.4 and -114.7 kJ/mol for the exo and endo isomers, respectively.
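The bond-separation bookkeeping in the abstract above reduces to simple Hess's-law arithmetic: for C10H16 + 14 CH4 → 12 C2H6, the target heat of formation is ΔfH(C10H16) = 12·ΔfH(C2H6) − 14·ΔfH(CH4) − ΔrxnH. The sketch below uses standard experimental gas-phase values for methane and ethane; the reaction enthalpy argument would come from the ab initio calculation and is left as a placeholder here.

```python
# Hess's-law step behind an isodesmic bond-separation calculation for
# C10H16 + 14 CH4 -> 12 C2H6.
DFH_CH4 = -74.9    # kJ/mol, experimental gas-phase Delta_f H(298) of methane
DFH_C2H6 = -84.0   # kJ/mol, experimental gas-phase Delta_f H(298) of ethane

def dfh_c10h16(delta_rxn_h):
    """Delta_f H(298) of C10H16 given the computed reaction enthalpy
    (kJ/mol), since Delta_rxn H = 12*DfH(C2H6) - DfH(C10H16) - 14*DfH(CH4)."""
    return 12 * DFH_C2H6 - 14 * DFH_CH4 - delta_rxn_h
```

The accuracy of the final number therefore rests on the ab initio reaction enthalpy plus the experimental reference values, which is why the method is benchmarked on adamantane in the abstract.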
Separation of Gadolinium (Gd) using Synergic Solvent Mixed Topo-D2EHPA with Extraction Method.
NASA Astrophysics Data System (ADS)
Effendy, N.; Basuki, K. T.; Biyantoro, D.; Perwira, N. K.
2018-04-01
The main problem in obtaining Gd of high purity is the similarity of its chemical and physical properties to those of other rare earth elements (REE) such as Y and Dy, so separation by an extraction process is necessary. The purposes of this research were to determine the best solvent type, amount of solvent, and feed-to-solvent ratio in the Gd extraction process; to determine the reaction order and rate constant for the decrease of Gd concentration from experimental data of aqueous-phase concentration as a function of time; and to determine the effect of temperature on the rate constant. Experiments were conducted with variations in solvent type, amount of solvent, feed-to-solvent ratio, and extraction time. Based on the calculation results, the best solvent composition for separating the rare earth element Gd in the extraction process is a feed-to-solvent ratio of 1:4 with 15% TOPO and 10% D2EHPA. The separation of Gd by extraction with the 2:1 TOPO-D2EHPA mixed solvent is better than with the single solvents D2EHPA or TOPO because of the synergistic effect. The separation of Gd follows first-order kinetics, and the Arrhenius equation for Gd becomes k = 1.46 × 10⁻⁷ exp(−6.96 kcal/mol / RT).
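The kinetics claimed in the abstract above can be sketched directly: a first-order decay of aqueous-phase Gd concentration with a rate constant from the reported Arrhenius fit, k = 1.46×10⁻⁷ exp(−6.96/RT) with Ea in kcal/mol. This is an illustrative reading of the reported equation, not the authors' calculation; units of the pre-exponential factor are as reported.

```python
# First-order Gd extraction kinetics with the reported Arrhenius fit.
import math

A = 1.46e-7    # pre-exponential factor, as reported
EA = 6.96      # activation energy, kcal/mol
R = 1.987e-3   # gas constant, kcal/(mol*K)

def rate_constant(temp_k):
    """Arrhenius rate constant k(T) = A * exp(-Ea / (R*T))."""
    return A * math.exp(-EA / (R * temp_k))

def gd_concentration(c0, temp_k, t):
    """First-order decay of aqueous-phase Gd concentration over time t."""
    return c0 * math.exp(-rate_constant(temp_k) * t)
```

A positive activation energy means the rate constant grows with temperature, consistent with the abstract's interest in the temperature effect on the rate constant.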
Karaoğul, Eyyüp; Parlar, Perihan; Parlar, Harun; Alma, M. Hakkı
2016-01-01
The main aim of this study was to enrich glycyrrhizic acid ammonium salt, one of the main compounds of licorice roots (Glycyrrhiza glabra L.), by an isoelectric-focused adsorptive bubble separation technique with different foaming agents. In the experiments, four bubble separation parameters were used, with β-lactoglobulin, bovine albumin, and soluble starch as foaming agents, along with runs without additives. The enrichment of glycyrrhizic acid ammonium salt into the foam was influenced by the different additive substances. The results showed that the highest enrichment value, as much as 368.3-fold, was obtained with β-lactoglobulin. The lowest enrichment value (5.9-fold) was determined for the run without additive. After enrichment, each experiment confirmed that glycyrrhizic acid ammonium salt could be quantitatively enriched into the collection vessel by the isoelectric-focused adsorptive bubble separation technique. The transfer of glycyrrhizic acid ammonium salt into the foam from standard solution in the presence of an additive was more efficient than from aqueous licorice extract. PMID:26949562
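The enrichment figures quoted above are a concentration ratio between the collected foam and the starting solution. The tiny sketch below shows that arithmetic; the concentration values are invented purely to reproduce the reported factors.

```python
# Enrichment factor: solute concentration in the collapsed foam divided
# by its concentration in the starting solution (same units cancel).
def enrichment_factor(c_foam, c_initial):
    return c_foam / c_initial

# Hypothetical concentrations chosen to match the reported factors:
with_blg = enrichment_factor(36.83, 0.10)    # beta-lactoglobulin run, ~368.3x
no_additive = enrichment_factor(0.59, 0.10)  # additive-free run, ~5.9x
```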
Design and analysis on sorting blade for automated size-based sorting device
NASA Astrophysics Data System (ADS)
Razali, Zol Bahri; Kader, Mohamed Mydin M. Abdul; Samsudin, Yasser Suhaimi; Daud, Mohd Hisam
2017-09-01
Nowadays, rubbish separation and recycling are a national problem: people dump their rubbish into dumpsites without considering its value if it could be recycled and reused. The authors therefore proposed an automated segregating device, intended to encourage people to separate their rubbish and to value rubbish that can be reused. The automated size-based mechanical segregating device provides significant improvements in efficiency and consistency in the segregating process. The device is designed to make recycling easier and more user friendly, in the hope that more people will take responsibility if it demands less time and effort. This paper discusses the redesign of a blade for the sorting device, as part of developing an efficient automated mechanical sorting device for similar material of different sizes. The machine is able to identify the size of waste and relies on the coil inside the container to separate it out. The detailed design and methodology are described in this paper.