48 CFR 15.101-2 - Lowest price technically acceptable source selection process.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Lowest price technically acceptable source selection process. 15.101-2 Section 15.101-2 Federal Acquisition Regulations System FEDERAL... Processes and Techniques 15.101-2 Lowest price technically acceptable source selection process. (a) The...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-29
...-readiness processes of potential contractors and subcontractors as a part of the source selection process... Supplement (DFARS) subpart 215.3, Source Selection. It amends DFARS 215.304(c) by adding paragraph (iv) to... and subcontractors shall be considered as a part of the source selection process for major defense...
Successful Air Force Source Selections Start with an Effective Empowered Team
responsibilities of the team that will conduct a source selection, there is less guidance and focus on how to create an effective source selection team and...empower that team to successfully complete a source selection. The purpose of this research was to determine if an increased focus on the formation and...empowerment of source selection teams, and not just on processes and procedures, contributes to the efficiency and success of Air Force source selections
Evaluation and selection of open-source EMR software packages based on integrated AHP and TOPSIS.
Zaidan, A A; Zaidan, B B; Al-Haiqi, Ahmed; Kiah, M L M; Hussain, Muzammil; Abdulnabi, Mohamed
2015-02-01
Evaluating and selecting software packages that meet an organization's requirements is a difficult aspect of the software engineering process. Selecting the wrong open-source EMR software package can be costly and may adversely affect business processes and the functioning of the organization. This study aims to evaluate and select open-source EMR software packages based on multi-criteria decision-making. A hands-on study was performed in which a set of open-source EMR software packages were implemented locally on separate virtual machines to examine the systems more closely. Several measures were specified as the evaluation basis, and the systems were selected based on a set of metric outcomes using the integrated Analytic Hierarchy Process (AHP) and TOPSIS. The experimental results showed that GNUmed and OpenEMR provide a better basis, in terms of ranking scores, than the other open-source EMR software packages examined. Copyright © 2014 Elsevier Inc. All rights reserved.
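The ranking step of the AHP/TOPSIS combination described above can be sketched in a few lines. The following is a minimal, illustrative TOPSIS implementation in Python; the function name and the toy decision matrix are hypothetical, and it assumes AHP has already produced the criterion weights (the study's actual criteria and data are not reproduced here).

```python
def topsis_rank(matrix, weights, benefit):
    """Score alternatives by closeness to the ideal solution (TOPSIS).

    matrix  -- rows are alternatives, columns are raw criterion scores
    weights -- criterion weights (e.g. AHP-derived), summing to 1
    benefit -- per criterion: True if higher is better, False if lower is
    """
    n_crit = len(weights)
    # vector-normalize each column, then apply the criterion weights
    norms = [sum(row[j] ** 2 for row in matrix) ** 0.5 for j in range(n_crit)]
    v = [[row[j] / norms[j] * weights[j] for j in range(n_crit)]
         for row in matrix]
    # ideal and anti-ideal points, per criterion direction
    ideal = [max(col) if benefit[j] else min(col)
             for j, col in enumerate(zip(*v))]
    anti = [min(col) if benefit[j] else max(col)
            for j, col in enumerate(zip(*v))]
    scores = []
    for row in v:
        d_pos = sum((x - i) ** 2 for x, i in zip(row, ideal)) ** 0.5
        d_neg = sum((x - a) ** 2 for x, a in zip(row, anti)) ** 0.5
        scores.append(d_neg / (d_pos + d_neg))  # relative closeness in [0, 1]
    return scores
```

A higher score means the alternative is closer to the ideal; ranking the packages is then a matter of sorting by score.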
A Decision Analysis Tool for the Source Selection Process
2006-03-01
THE SOURCE SELECTION PROCESS THESIS Presented to the Faculty Department of Systems and Engineering Management Graduate School of...Engineering and Management Air Force Institute of Technology Air University Air Education and Training Command In Partial Fulfillment of...the Requirements for the Degree of Master of Science in Engineering Management John R. Trumm, BS Captain, USAF March 2006
Discrimination of correlated and entangling quantum channels with selective process tomography
Dumitrescu, Eugene; Humble, Travis S.
2016-10-10
The accurate and reliable characterization of quantum dynamical processes underlies efforts to validate quantum technologies, where discrimination between competing models of observed behaviors inform efforts to fabricate and operate qubit devices. We present a protocol for quantum channel discrimination that leverages advances in direct characterization of quantum dynamics (DCQD) codes. We demonstrate that DCQD codes enable selective process tomography to improve discrimination between entangling and correlated quantum dynamics. Numerical simulations show selective process tomography requires only a few measurement configurations to achieve a low false alarm rate and that the DCQD encoding improves the resilience of the protocol to hidden sources of noise. Lastly, our results show that selective process tomography with DCQD codes is useful for efficiently distinguishing sources of correlated crosstalk from uncorrelated noise in current and future experimental platforms.
NASA Astrophysics Data System (ADS)
Tang, Jian; Qiao, Junfei; Wu, ZhiWei; Chai, Tianyou; Zhang, Jian; Yu, Wen
2018-01-01
Frequency spectral data of mechanical vibration and acoustic signals relate to difficult-to-measure production quality and quantity parameters of complex industrial processes. A selective ensemble (SEN) algorithm can be used to build a soft sensor model of these process parameters by fusing valued information selectively from different perspectives. However, a combination of several optimized ensemble sub-models with SEN cannot guarantee the best prediction model. In this study, we use several techniques to construct a data-driven industrial process parameter model from mechanical vibration and acoustic frequency spectra, based on selective fusion of multi-condition samples and multi-source features. A multi-layer SEN (MLSEN) strategy is used to simulate the domain expert's cognitive process. A genetic algorithm and kernel partial least squares are used to construct the inside-layer SEN sub-model based on each mechanical vibration and acoustic frequency spectral feature subset. Branch-and-bound and adaptive weighted fusion algorithms are integrated to select and combine the outputs of the inside-layer SEN sub-models. Then, the outside-layer SEN is constructed. Thus, "sub-sampling training examples"-based and "manipulating input features"-based ensemble construction methods are integrated, thereby realizing selective information fusion based on multi-condition history samples and multi-source input features. This novel approach is applied to a laboratory-scale ball mill grinding process. A comparison with other methods indicates that the proposed MLSEN approach effectively models mechanical vibration and acoustic signals.
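The adaptive weighted fusion step named in the abstract can be illustrated with a minimal sketch: sub-model predictions are combined with weights inversely proportional to their validation errors, so better sub-models contribute more. The function name, the inverse-error weighting form, and the example values are assumptions for illustration only, not the paper's exact algorithm.

```python
def adaptive_weighted_fusion(preds, errors):
    """Combine sub-model predictions, weighting each inversely to its
    (hypothetical) validation error; weights are normalized to sum to 1."""
    inv = [1.0 / e for e in errors]          # smaller error -> larger weight
    total = sum(inv)
    weights = [x / total for x in inv]
    return sum(w * p for w, p in zip(weights, preds))
```

With equal errors this reduces to a plain average; as one sub-model's error shrinks, the fused output moves toward its prediction.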
The source of dual-task limitations: Serial or parallel processing of multiple response selections?
Marois, René
2014-01-01
Although it is generally recognized that the concurrent performance of two tasks incurs costs, the sources of these dual-task costs remain controversial. The serial bottleneck model suggests that serial postponement of task performance in dual-task conditions results from a central stage of response selection that can only process one task at a time. Cognitive-control models, by contrast, propose that multiple response selections can proceed in parallel, but that serial processing of task performance is predominantly adopted because its processing efficiency is higher than that of parallel processing. In the present study, we empirically tested this proposition by examining whether parallel processing would occur when it was more efficient and financially rewarded. The results indicated that even when parallel processing was more efficient and was incentivized by financial reward, participants still failed to process tasks in parallel. We conclude that central information processing is limited by a serial bottleneck. PMID:23864266
NASA Astrophysics Data System (ADS)
Ranatunga, T.
2017-12-01
Modeling of the fate and transport of fecal bacteria in a watershed is a process-based approach that considers releases from manure, point sources, and septic systems. Overland transport with water and sediments, infiltration into soils, transport in the vadose zone and groundwater, die-off and growth processes, and in-stream transport are considered as the other major processes in bacteria simulation. This presentation will discuss a simulation of fecal indicator bacteria source loading and in-stream conditions of a non-tidal watershed (Cedar Bayou Watershed) in South Central Texas using two models: the Spatially Explicit Load Enrichment Calculation Tool (SELECT) and the Soil and Water Assessment Tool (SWAT). Furthermore, it will discuss a probable approach to bacteria source load reduction in order to meet the water quality standards in the streams. The selected watershed is listed by the Texas Commission on Environmental Quality (TCEQ) as having levels of fecal indicator bacteria that pose a risk for contact recreation and wading. The SELECT modeling approach was used to estimate the bacteria source loading from land categories. The major bacteria sources considered were failing septic systems, discharges from wastewater treatment facilities, excreta from livestock (cattle, horses, sheep, and goats), excreta from wildlife (feral hogs and deer), pet waste (mainly from dogs), and runoff from urban surfaces. The estimated source loads from the SELECT model were input to the SWAT model to simulate bacteria transport over land and in-stream. The calibrated SWAT model was then used to estimate in-stream indicator bacteria concentrations for future years based on regional land use, population, and household forecasts (up to 2040). Based on the reductions required to meet the in-stream water quality standards, the corresponding required source load reductions were estimated.
Jonkman, L M; Kenemans, J L; Kemner, C; Verbaten, M N; van Engeland, H
2004-07-01
This study was aimed at investigating whether attention-deficit hyperactivity disorder (ADHD) children suffer from specific early selective attention deficits in the visual modality with the aid of event-related brain potentials (ERPs). Furthermore, brain source localization was applied to identify brain areas underlying possible deficits in selective visual processing in ADHD children. A two-channel visual color selection task was administered to 18 ADHD and 18 control subjects in the age range of 7-13 years and ERP activity was derived from 30 electrodes. ADHD children exhibited lower perceptual sensitivity scores resulting in poorer target selection. The ERP data suggested an early selective-attention deficit as manifested in smaller frontal positive activity (frontal selection positivity; FSP) in ADHD children around 200 ms whereas later occipital and fronto-central negative activity (OSN and N2b; 200-400 ms latency) appeared to be unaffected. Source localization explained the FSP by posterior-medial equivalent dipoles in control subjects, which may reflect the contribution of numerous surrounding areas. ADHD children have problems with selective visual processing that might be caused by a specific early filtering deficit (absent FSP) occurring around 200 ms. The neural sources underlying these problems have to be further identified. Source localization also suggested abnormalities in the 200-400 ms time range, pertaining to the distribution of attention-modulated activity in lateral frontal areas.
Attention and Trust Biases in the Design of Augmented Reality Displays
2000-04-01
storage, selective attention, and their mutual constraints within the human information processing system. Psychological Bulletin, 104(2), 163-191...the pilots’ attention at the cost of processing other information in the far domain beyond the symbology, i.e., attentional tunneling (Fadden et al...need to select between two sources of information, attention is allocated to the one which facilitates the user’s task. When only a single source of
Examining Student Research Choices and Processes in a Disintermediated Searching Environment
ERIC Educational Resources Information Center
Rempel, Hannah Gascho; Buck, Stefanie; Deitering, Anne-Marie
2013-01-01
Students today perform research in a disintermediated environment, which often allows them to struggle directly with the process of selecting research tools and choosing scholarly sources. The authors conducted a qualitative study with twenty students, using structured observations to ascertain the processes students use to select databases and…
NASA Astrophysics Data System (ADS)
Ranatunga, T.
2016-12-01
Modeling of the fate and transport of fecal bacteria in a watershed is generally a process-based approach that considers releases from manure, point sources, and septic systems. Overland transport with water and sediments, infiltration into soils, transport in the vadose zone and groundwater, die-off and growth processes, and in-stream transport are considered as the other major processes in bacteria simulation. This presentation will discuss a simulation of fecal indicator bacteria (E. coli) source loading and in-stream conditions of a non-tidal watershed (Cedar Bayou Watershed) in South Central Texas using two models: the Spatially Explicit Load Enrichment Calculation Tool (SELECT) and the Soil and Water Assessment Tool (SWAT). Furthermore, it will discuss a probable approach to bacteria source load reduction in order to meet the water quality standards in the streams. The selected watershed is listed by the Texas Commission on Environmental Quality (TCEQ) as having levels of fecal indicator bacteria that pose a risk for contact recreation and wading. The SELECT modeling approach was used to estimate the bacteria source loading from land categories. The major bacteria sources considered were failing septic systems, discharges from wastewater treatment facilities, excreta from livestock (cattle, horses, sheep, and goats), excreta from wildlife (feral hogs and deer), pet waste (mainly from dogs), and runoff from urban surfaces. The estimated source loads were input to the SWAT model in order to simulate transport over land and in-stream conditions. The calibrated SWAT model was then used to estimate in-stream indicator bacteria concentrations for future years based on H-GAC's regional land use, population, and household projections (up to 2040). Based on the in-stream reductions required to meet the water quality standards, the corresponding required source load reductions were estimated.
Code of Federal Regulations, 2011 CFR
2011-01-01
... notified in writing and provided with the specific reasons for the rejection. (c) Selection for processing... applicants qualifying for a veterans preference. After selection for processing, loans are funded on a first...-approved Mutual Self-Help project or loans that will leverage funding or financing from other sources. (5...
Code of Federal Regulations, 2010 CFR
2010-01-01
... notified in writing and provided with the specific reasons for the rejection. (c) Selection for processing... applicants qualifying for a veterans preference. After selection for processing, loans are funded on a first...-approved Mutual Self-Help project or loans that will leverage funding or financing from other sources. (5...
One approach to predictive modeling of biological contamination of recreational waters and drinking water sources involves applying process-based models that consider microbial sources, hydrodynamic transport, and microbial fate. Fecal indicator bacteria such as enterococci have ...
NASA Astrophysics Data System (ADS)
Torres Astorga, Romina; Velasco, Hugo; Dercon, Gerd; Mabit, Lionel
2017-04-01
Soil erosion and the associated sediment transportation and deposition processes are key environmental problems in Central Argentinian watersheds. Several land use practices, such as intensive grazing and crop cultivation, are considered likely to significantly increase land degradation and soil/sediment erosion processes. Characterized by highly erodible soils, the sub-catchment Estancia Grande (12.3 km2), located 23 km north-east of San Luis, has been investigated using sediment source fingerprinting techniques to identify critical hot spots of land degradation. The authors created 4 artificial mixtures using known quantities of the most representative sediment sources of the studied catchment. The first mixture was made using four rotation crop soil sources. The second and third mixtures were created using different proportions of 4 different soil sources, including soils from a feedlot, a rotation crop, a walnut forest, and a grazing soil. The last tested mixture contained the same sources as the third mixture but with the addition of a fifth soil source (i.e. a native bank soil). The Energy Dispersive X-Ray Fluorescence (EDXRF) analytical technique was used to reconstruct the source sediment proportions of the original mixtures. Besides using traditional methods of fingerprint selection such as the Kruskal-Wallis H-test and Discriminant Function Analysis (DFA), the authors used the actual source proportions in the mixtures and selected, from the subset of tracers that passed the statistical tests, specific elemental tracers that were in agreement with the expected mixture contents. The selection process ended with testing in a mixing model all possible combinations of the reduced number of tracers obtained. Alkaline earth metals, especially strontium (Sr) and barium (Ba), were identified as the most effective fingerprints and provided a reduced Mean Absolute Error (MAE) of approximately 2% when reconstructing the 4 artificial mixtures.
This study demonstrates that the EDXRF fingerprinting approach performed very well in reconstructing our original mixtures, especially in identifying and quantifying the contribution of the 4 rotation crop soil sources in the first mixture.
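The mixture-reconstruction step described above — finding source proportions whose combined tracer signature best matches a measured mixture — can be sketched as simple linear unmixing. The following brute-force version assumes linear, conservative mixing; the function name, step size, and tracer values are hypothetical and far simpler than the EDXRF data used in the study.

```python
from itertools import product

def unmix(sources, mixture, step=0.05):
    """Estimate source proportions minimizing squared tracer residuals.

    sources -- per-source tracer concentrations, e.g. [[Sr, Ba], ...]
    mixture -- measured tracer concentrations of the mixed sediment
    Assumes tracers mix linearly and behave conservatively.
    """
    n = len(sources)
    steps = int(round(1 / step))
    best_err, best_props = float("inf"), None
    # enumerate proportion vectors on the simplex (proportions sum to 1)
    for combo in product(range(steps + 1), repeat=n - 1):
        if sum(combo) > steps:
            continue
        props = [c * step for c in combo]
        props.append(1 - sum(props))
        err = 0.0
        for t, target in enumerate(mixture):
            pred = sum(p * s[t] for p, s in zip(props, sources))
            err += (pred - target) ** 2
        if err < best_err:
            best_err, best_props = err, props
    return best_props
```

In practice a mixing model would also weight tracers and report uncertainty; the grid search here is only meant to show the objective being minimized.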
Records Management Handbook; Source Data Automation Equipment Guide.
ERIC Educational Resources Information Center
National Archives and Records Service (GSA), Washington, DC. Office of Records Management.
A detailed guide to selecting appropriate source data automation equipment is presented. Source data automation equipment is used to prepare data for electronic data processing or computerized recordkeeping. The guide contains specifications, performance data, cost, and pictures of the major types of machines used in source data automation.…
NASA Astrophysics Data System (ADS)
Allgaier, Joachim
2011-06-01
Media accounts of reality have the potential to influence public opinion and decision making processes. Therefore, who has and who does not have access to the media and can make their voice heard is a crucial question with serious political consequences. In this article, it is investigated whether the speciality of journalists influences their source selection procedures. The coverage of science in schools is an interesting example, since it can be covered by specialized science or education correspondents, but also by general news reporters. A public controversy in the UK about the inclusion of creationism in a school is used to identify which types of sources were selected by various journalists. The focus is upon the selection of sources and whether journalists with different specialties consider various sources relevant and credible. A content analysis of articles featuring this controversy is combined with an analysis of correspondents' strategies for selecting sources, based on interviews with them. The findings suggest that, compared to journalists who specialize in education issues, science correspondents employ a narrower scope when seeking sources. This might have important consequences for the representation of views on science education in the media.
Subliminally and consciously induced cognitive conflicts interact at several processing levels.
Stock, Ann-Kathrin; Friedrich, Julia; Beste, Christian
2016-12-01
Controlled behavior is susceptible to conflicts that can emerge from subliminal or consciously processed information. While research suggests that both sources of conflicting information may interact in their modulation of controlled behavior, it has remained unclear which cognitive sub-processes involved in controlled behavior are affected by this interaction; i.e., at which processing level subliminally and consciously induced response conflicts interact in modulating controlled behavior. Moreover, we investigated whether this interaction of subliminally and consciously induced response conflicts was due to a nexus between the two types of conflict, such as a common cognitive process or factor. For this, n = 38 healthy young subjects completed a paradigm which combines subliminal primes and consciously perceived flankers while an electroencephalogram (EEG) was recorded. We show that the interaction of subliminal and conscious sources of conflict is not restricted to the response selection level (N2) but can already be shown at the earliest stages of perceptual and attentional processing (P1). While the degree of early attentional processing of subliminal information seems to depend on the absence of consciously perceived response conflicts, conflicts during the stage of response selection may be either reduced or enhanced by subliminal priming. Moreover, the results showed that even though the two different sources of conflict interact at the response selection level, they clearly originate from two distinct processes that interact before they detrimentally affect cognitive control. Copyright © 2016 Elsevier Ltd. All rights reserved.
LPTA Versus Tradeoff: Analysis of Contract Source Selection Strategies and Performance Outcomes
2016-06-01
methodologies contracting professionals employ to acquire what the DOD needs. Contracting professionals may use lowest price technically acceptable (LPTA) and...contract management process, source selection, lowest price technically acceptable, tradeoff...use lowest price technically acceptable (LPTA) and tradeoff strategies to procure requirements to maximize the overall best value to the government
Comprehending and Learning from Internet Sources: Processing Patterns of Better and Poorer Learners
ERIC Educational Resources Information Center
Goldman, Susan R.; Braasch, Jason L. G.; Wiley, Jennifer; Graesser, Arthur C.; Brodowinska, Kamila
2012-01-01
Readers increasingly attempt to understand and learn from information sources they find on the Internet. Doing so highlights the crucial role that evaluative processes play in selecting and making sense of the information. In a prior study, Wiley et al. (2009, Experiment 1) asked undergraduates to perform a web-based inquiry task about volcanoes…
Competitive Parallel Processing For Compression Of Data
NASA Technical Reports Server (NTRS)
Diner, Daniel B.; Fender, Antony R. H.
1990-01-01
Momentarily-best compression algorithm selected. Proposed competitive-parallel-processing system compresses data for transmission in channel of limited band-width. Likely application for compression lies in high-resolution, stereoscopic color-television broadcasting. Data from information-rich source like color-television camera compressed by several processors, each operating with different algorithm. Referee processor selects momentarily-best compressed output.
Developing Source Selection Evaluation Criteria and Standards for Reliability and Maintainability.
1985-09-01
of early investment in R&M engineering must be carried into the source selection process. The R&M engineering policy...contained therein. Furthermore, the views expressed in the document are those of the author(s) and do not necessarily reflect the views of the School of...THESIS Presented to the Faculty of the School of Systems and Logistics of the Air Force Institute of
Secure relay selection based on learning with negative externality in wireless networks
NASA Astrophysics Data System (ADS)
Zhao, Caidan; Xiao, Liang; Kang, Shan; Chen, Guiquan; Li, Yunzhou; Huang, Lianfen
2013-12-01
In this paper, we formulate relay selection as a Chinese restaurant game. A secure relay selection strategy is proposed for a wireless network, where multiple source nodes send messages to their destination nodes via several relay nodes, which have different processing and transmission capabilities as well as different security properties. The relay selection utilizes a learning-based algorithm for the source nodes to reach their best responses in the Chinese restaurant game. In particular, the relay selection takes into account the negative externality of relay sharing among the source nodes, which learn the capabilities and security properties of relay nodes according to the current signals and the signal history. Simulation results show that this strategy improves the user utility and the overall security performance in wireless networks. In addition, the relay strategy is robust against signal errors and deviations of some users from the desired actions.
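As a toy illustration of the negative externality the abstract describes — a relay's value to a source node falling as more nodes share it — the following sketch computes one node's best-response relay choice. The utility form capacity / (1 + number of co-users), the function name, and the example capacities are assumptions for illustration, not the paper's actual game model.

```python
def best_response(capacities, choices, me):
    """Pick the relay maximizing this node's utility, where utility is
    relay capacity divided by (1 + number of other nodes using it) --
    a toy form of the negative externality of relay sharing."""
    best_relay, best_utility = None, float("-inf")
    for relay, cap in enumerate(capacities):
        others = sum(1 for node, c in enumerate(choices)
                     if node != me and c == relay)
        utility = cap / (1 + others)
        if utility > best_utility:
            best_relay, best_utility = relay, utility
    return best_relay
```

Iterating best responses node by node until no choice changes gives a simple equilibrium-seeking dynamic: once one node occupies the strongest relay, a second node may prefer an uncongested weaker one.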
NASA Technical Reports Server (NTRS)
1990-01-01
This document describes the machine readable version of the Selected Compact Radio Source Catalog as it is currently being distributed from the international network of astronomical data centers. It is intended to enable users to read and process the computerized catalog. The catalog contains 233 strong, compact extragalactic radio sources having identified optical counterparts. The machine version contains the same data as the published catalog and includes source identifications, equatorial positions at J2000.0 and their mean errors, object classifications, visual magnitudes, redshift, 5-GHz flux densities, and comments.
Resonance ionization laser ion sources for on-line isotope separators (invited).
Marsh, B A
2014-02-01
A Resonance Ionization Laser Ion Source (RILIS) is today considered an essential component of the majority of Isotope Separator On Line (ISOL) facilities; there are seven laser ion sources currently operational at ISOL facilities worldwide and several more are under development. The ionization mechanism is a highly element selective multi-step resonance photo-absorption process that requires a specifically tailored laser configuration for each chemical element. For some isotopes, isomer selective ionization may even be achieved by exploiting the differences in hyperfine structures of an atomic transition for different nuclear spin states. For many radioactive ion beam experiments, laser resonance ionization is the only means of achieving an acceptable level of beam purity without compromising isotope yield. Furthermore, by performing element selection at the location of the ion source, the propagation of unwanted radioactivity downstream of the target assembly is reduced. Whilst advances in laser technology have improved the performance and reliability of laser ion sources and broadened the range of suitable commercially available laser systems, many recent developments have focused rather on the laser/atom interaction region in the quest for increased selectivity and/or improved spectral resolution. Much of the progress in this area has been achieved by decoupling the laser ionization from competing ionization processes through the use of a laser/atom interaction region that is physically separated from the target chamber. A new application of gas catcher laser ion source technology promises to expand the capabilities of projectile fragmentation facilities through the conversion of otherwise discarded reaction fragments into high-purity low-energy ion beams. A summary of recent RILIS developments and the current status of laser ion sources worldwide is presented.
How attention gates social interactions.
Capozzi, Francesca; Ristic, Jelena
2018-05-25
Social interactions are at the core of social life. However, humans selectively choose their exchange partners and do not engage in all available opportunities for social encounters. In this review, we argue that attentional systems play an important role in guiding the selection of social interactions. Supported by both classic and emerging literature, we identify and characterize the three core processes (perception, interpretation, and evaluation) that interact with attentional systems to modulate selective responses to social environments. Perceptual processes facilitate attentional prioritization of social cues. Interpretative processes link attention with understanding of cues' social meanings and agents' mental states. Evaluative processes determine the perceived value of the source of social information. The interplay between attention and these three routes of processing places attention in a powerful role to manage the selection of the vast amount of social information that individuals encounter on a daily basis and, in turn, gate the selection of social interactions. © 2018 New York Academy of Sciences.
[Implementation of quality of care indicators for third-level public hospitals in Mexico].
Saturno-Hernández, Pedro Jesús; Martínez-Nicolás, Ismael; Poblano-Verástegui, Ofelia; Vértiz-Ramírez, José de Jesús; Suárez-Ortiz, Erasto Cosme; Magaña-Izquierdo, Manuel; Kawa-Karasik, Simón
2017-01-01
To select, pilot test, and implement a set of indicators for tertiary public hospitals. Quali-quantitative study in four stages: identification of indicators used internationally; selection and prioritization by utility, feasibility, and reliability; exploration of the quality of sources of information in six hospitals; and pilot testing of feasibility and reliability, with a follow-up measurement. From 143 indicators, 64 were selected and eight were prioritized. The exploration revealed deficient sources of information. In the pilot, three indicators were feasible but with limited reliability. Workshops were conducted to improve records and sources of information; nine hospitals reported measurements for one quarter. Eight priority indicators could not be measured immediately due to limitations in the data sources needed for their construction. It is necessary to improve the mechanisms for registering and processing data in this group of hospitals.
Side Streams of Plant Food Processing As a Source of Valuable Compounds: Selected Examples.
Schieber, Andreas
2017-02-28
Industrial processing of plant-derived raw materials generates enormous amounts of by-products. On one hand, these by-products constitute a serious disposal issue because they often emerge seasonally and are prone to microbial decay. On the other hand, they are an abundant source of valuable compounds, in particular secondary plant metabolites and cell wall materials, which may be recovered and used to functionalize foods and replace synthetic additives with ingredients of natural origin. This review covers 150 references and presents select studies performed between 2001 and 2016 on the recovery, characterization, and application of valuable constituents from grape pomace, apple pomace, potato peels, tomato pomace, carrot pomace, onion peels, by-products of citrus, mango, banana, and pineapple processing, side streams of olive oil production, and cereal by-products. The criteria used were economic importance, amounts generated, relevance of side streams as a source of valuable compounds, and reviews already published. Despite a plethora of studies carried out on the utilization of side streams, relatively few processes have yet found industrial application.
Free and Open Source GIS Tools: Role and Relevance in the Environmental Assessment Community
The presence of an explicit geographical context in most environmental decisions can complicate assessment and selection of management options. These decisions typically involve numerous data sources, complex environmental and ecological processes and their associated models, ris...
USDA-ARS?s Scientific Manuscript database
A total of seven source fiber types were selected for use in the manufacturing of nonwoven roll goods: polyester; polypropylene; rayon; greige cotton from two sources; mechanically cleaned greige cotton; and scoured and bleached cotton. The microbial burden of each source fiber was measured as a pr...
Orejas, Jaime; Pfeuffer, Kevin P; Ray, Steven J; Pisonero, Jorge; Sanz-Medel, Alfredo; Hieftje, Gary M
2014-11-01
Ambient desorption/ionization (ADI) sources coupled to mass spectrometry (MS) offer outstanding analytical features: direct analysis of real samples without sample pretreatment, combined with the selectivity and sensitivity of MS. Since ADI sources typically work in the open atmosphere, ambient conditions can affect the desorption and ionization processes. Here, the effects of internal source parameters and ambient humidity on the ionization processes of the flowing atmospheric pressure afterglow (FAPA) source are investigated. The interaction of reagent ions with a range of analytes is studied in terms of sensitivity and based upon the processes that occur in the ionization reactions. The results show that internal parameters which lead to higher gas temperatures afforded higher sensitivities, although fragmentation is also affected. In the case of humidity, only extremely dry conditions led to higher sensitivities, while fragmentation remained unaffected.
Targeted versus statistical approaches to selecting parameters for modelling sediment provenance
NASA Astrophysics Data System (ADS)
Laceby, J. Patrick
2017-04-01
One effective field-based approach to modelling sediment provenance is the source fingerprinting technique. Arguably, one of the most important steps for this approach is selecting the appropriate suite of parameters or fingerprints used to model source contributions. Accordingly, approaches to selecting parameters for sediment source fingerprinting will be reviewed. Thereafter, opportunities and limitations of these approaches and some future research directions will be presented. For properties to be effective tracers of sediment, they must discriminate between sources whilst behaving conservatively. Conservative behaviour is characterized by constancy in sediment properties: the properties of sediment sources should remain constant through sediment detachment, transportation and deposition processes, or at the very least vary in a predictable and measurable way. One approach to selecting conservative properties for sediment source fingerprinting is to identify targeted tracers, such as caesium-137, that provide specific source information (e.g. surface versus subsurface origins). A second approach is to use statistical tests to select an optimal suite of conservative properties capable of modelling sediment provenance. In general, statistical approaches use a combination of discrimination statistics (e.g. Kruskal-Wallis H-test, Mann-Whitney U-test) and parameter-selection statistics (e.g. Discriminant Function Analysis or Principal Component Analysis). The challenge is that modelling sediment provenance is often not straightforward, and there is increasing debate in the literature surrounding the most appropriate approach to selecting elements for modelling.
Moving forward, it would be beneficial if researchers test their results with multiple modelling approaches, artificial mixtures, and multiple lines of evidence to provide secondary support to their initial modelling results. Indeed, element selection can greatly impact modelling results and having multiple lines of evidence will help provide confidence when modelling sediment provenance.
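The statistical screening step can be illustrated with a minimal sketch (the tracer values are invented, and in practice a library routine such as scipy's `kruskal` would be used): a Kruskal-Wallis H statistic ranks candidate tracer properties by how well they discriminate between source groups.

```python
from itertools import chain

def kruskal_h(groups):
    """Kruskal-Wallis H statistic for k groups of tracer measurements.

    Ties receive average ranks; the tie-correction factor is omitted
    for brevity. Larger H means stronger source discrimination.
    """
    data = sorted(chain.from_iterable(groups))
    ranks = {}
    i = 0
    while i < len(data):
        j = i
        while j < len(data) and data[j] == data[i]:
            j += 1
        ranks[data[i]] = (i + 1 + j) / 2.0  # average of 1-based ranks i+1..j
        i = j
    n = len(data)
    rank_term = sum(sum(ranks[x] for x in g) ** 2 / len(g) for g in groups)
    return 12.0 / (n * (n + 1)) * rank_term - 3.0 * (n + 1)

# Hypothetical tracer concentrations measured in two source groups:
h_good = kruskal_h([[10, 11, 12], [20, 21, 22]])  # well-separated sources
h_poor = kruskal_h([[10, 12, 14], [11, 13, 15]])  # interleaved sources
```

A tracer would pass this screen when H exceeds the chi-squared critical value (about 3.84 for two sources at the 5% level), after which a parameter-selection statistic such as Discriminant Function Analysis would pick the final suite.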
48 CFR 15.101-1 - Tradeoff process.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Tradeoff process. 15.101-1 Section 15.101-1 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION CONTRACTING METHODS AND CONTRACT TYPES CONTRACTING BY NEGOTIATION Source Selection Processes and Techniques 15.101-1...
Optical selection and collection of DNA fragments
Roslaniec, Mary C.; Martin, John C.; Jett, James H.; Cram, L. Scott
1998-01-01
Optical selection and collection of DNA fragments. The present invention includes the optical selection and collection of large (>µg) quantities of clonable, chromosome-specific DNA from a sample of chromosomes. Chromosome selection is based on selective, irreversible photoinactivation of unwanted chromosomal DNA. Although more general procedures may be envisioned, the invention is demonstrated by processing chromosomes in a conventional flow cytometry apparatus, but where no droplets are generated. All chromosomes in the sample are first stained with at least one fluorescent analytic dye and bonded to a photochemically active species which can render chromosomal DNA unclonable if activated. After passing through analyzing light beam(s), unwanted chromosomes are irradiated using light which is absorbed by the photochemically active species, thereby causing photoinactivation. As desired chromosomes pass this photoinactivation point, the inactivating light source is deflected by an optical modulator; hence, desired chromosomes are not photoinactivated and remain clonable. The selection and photoinactivation processes take place on a microsecond timescale. By eliminating droplet formation, chromosome selection rates 50 times greater than those possible with conventional chromosome sorters may be obtained. Thus, usable quantities of clonable DNA from any source thereof may be collected.
Beste, Christian; Mückschel, Moritz; Rosales, Raymond; Domingo, Aloysius; Lee, Lillian; Ng, Arlene; Klein, Christine; Münchau, Alexander
2018-07-01
Cognitive control is relevant when distracting information induces behavioral conflicts. Such conflicts can be produced consciously and by subliminally processed information. Interestingly, both sources of conflict interact suggesting that they share neural mechanisms. Here, we ask whether conjoint effects between different sources of conflict are modulated by microstructural basal ganglia dysfunction. To this end, we carried out an electroencephalography study and examined event-related potentials (ERPs) including source localization using a combined flanker-subliminal priming task in patients with X-linked dystonia Parkinsonism (XDP) and a group of healthy controls. XDP in its early stages is known to predominantly affect the basal ganglia striosomes. The results suggest that conjoint effects between subliminal and conscious sources of conflicts are modulated by the striosomes and were stronger in XDP patients. The neurophysiological data indicate that this effect is related to modulations in conflict monitoring and response selection (N2 ERP) mechanisms engaging the anterior cingulate cortex. Bottom-up perceptual gating, attentional selection, and motor response activation processes in response to the stimuli (P1, N1, and lateralized readiness potential ERPs) were unaffected. Taken together, these data indicate that striosomes modulate the processing of conscious and subliminal sources of conflict suggesting that microstructural basal ganglia properties are relevant for cognitive control.
Naturally occurring 32Si and low-background silicon dark matter detectors
Orrell, John L.; Arnquist, Isaac J.; Bliss, Mary; ...
2018-02-10
Here, the naturally occurring radioisotope 32Si represents a potentially limiting background in future dark matter direct-detection experiments. We investigate sources of 32Si and the vectors by which it comes to reside in silicon crystals used for fabrication of radiation detectors. We infer that the 32Si concentration in commercial single-crystal silicon is likely variable, dependent upon the specific geologic and hydrologic history of the source (or sources) of silicon “ore” and the details of the silicon-refinement process. The silicon production industry is large, highly segmented by refining step, and multifaceted in terms of final product type, from which we conclude that production of 32Si-mitigated crystals requires both targeted silicon material selection and a dedicated refinement-through-crystal-production process. We review options for source material selection, including quartz from an underground source and silicon isotopically reduced in 32Si. To quantitatively evaluate the 32Si content in silicon metal and precursor materials, we propose analytic methods employing chemical processing and radiometric measurements. Ultimately, it appears feasible to produce silicon detectors with low levels of 32Si, though significant assay method development is required to validate this claim and thereby enable a quality assurance program during an actual controlled silicon-detector production cycle.
Naturally occurring 32Si and low-background silicon dark matter detectors
NASA Astrophysics Data System (ADS)
Orrell, John L.; Arnquist, Isaac J.; Bliss, Mary; Bunker, Raymond; Finch, Zachary S.
2018-05-01
The naturally occurring radioisotope 32Si represents a potentially limiting background in future dark matter direct-detection experiments. We investigate sources of 32Si and the vectors by which it comes to reside in silicon crystals used for fabrication of radiation detectors. We infer that the 32Si concentration in commercial single-crystal silicon is likely variable, dependent upon the specific geologic and hydrologic history of the source (or sources) of silicon "ore" and the details of the silicon-refinement process. The silicon production industry is large, highly segmented by refining step, and multifaceted in terms of final product type, from which we conclude that production of 32Si-mitigated crystals requires both targeted silicon material selection and a dedicated refinement-through-crystal-production process. We review options for source material selection, including quartz from an underground source and silicon isotopically reduced in 32Si. To quantitatively evaluate the 32Si content in silicon metal and precursor materials, we propose analytic methods employing chemical processing and radiometric measurements. Ultimately, it appears feasible to produce silicon detectors with low levels of 32Si, though significant assay method development is required to validate this claim and thereby enable a quality assurance program during an actual controlled silicon-detector production cycle.
Image Fusion Algorithms Using Human Visual System in Transform Domain
NASA Astrophysics Data System (ADS)
Vadhi, Radhika; Swamy Kilari, Veera; Samayamantula, Srinivas Kumar
2017-08-01
The endeavor of digital image fusion is to combine the important visual parts from various sources to improve the visual quality of the image. The fused image has higher visual quality than any of the source images. In this paper, Human Visual System (HVS) weights are used in the transform domain to select appropriate information from the various source images and so obtain a fused image. Two main steps are involved. First, the DWT is applied to the registered source images. Then, qualitative sub-bands are identified using HVS weights. Qualitative sub-bands are thus selected from the different sources to form a high-quality HVS-based fused image. The quality of the HVS-based fused image is evaluated with general fusion metrics. The results show superiority over state-of-the-art multi-resolution transforms (MRT) such as the Discrete Wavelet Transform (DWT), Stationary Wavelet Transform (SWT), Contourlet Transform (CT), and Non-Subsampled Contourlet Transform (NSCT) using the maximum-selection fusion rule.
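The coefficient-selection idea can be sketched in plain Python, with a one-level Haar transform standing in for the DWT and a max-absolute rule standing in for the HVS weighting (both simplifying assumptions, not the authors' method):

```python
def haar2d(img):
    """One-level 2D Haar transform (rows, then columns).
    Approximation (LL) lands in the top-left quadrant; detail
    sub-bands (LH, HL, HH) fill the rest."""
    h, w = len(img), len(img[0])
    tmp = []
    for row in img:
        lo = [(row[2 * i] + row[2 * i + 1]) / 2.0 for i in range(w // 2)]
        hi = [(row[2 * i] - row[2 * i + 1]) / 2.0 for i in range(w // 2)]
        tmp.append(lo + hi)
    out = [[0.0] * w for _ in range(h)]
    for j in range(w):
        col = [tmp[i][j] for i in range(h)]
        for i in range(h // 2):
            out[i][j] = (col[2 * i] + col[2 * i + 1]) / 2.0
            out[h // 2 + i][j] = (col[2 * i] - col[2 * i + 1]) / 2.0
    return out

def ihaar2d(c):
    """Exact inverse of haar2d (columns, then rows)."""
    h, w = len(c), len(c[0])
    tmp = [[0.0] * w for _ in range(h)]
    for j in range(w):
        for i in range(h // 2):
            lo, hi = c[i][j], c[h // 2 + i][j]
            tmp[2 * i][j] = lo + hi
            tmp[2 * i + 1][j] = lo - hi
    out = []
    for row in tmp:
        r = [0.0] * w
        for i in range(w // 2):
            lo, hi = row[i], row[w // 2 + i]
            r[2 * i] = lo + hi
            r[2 * i + 1] = lo - hi
        out.append(r)
    return out

def fuse(img_a, img_b):
    """Average the approximation sub-band; keep the max-absolute
    coefficient in the detail sub-bands (stand-in for HVS weights)."""
    ca, cb = haar2d(img_a), haar2d(img_b)
    h, w = len(ca), len(ca[0])
    f = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            if i < h // 2 and j < w // 2:  # LL quadrant: average
                f[i][j] = (ca[i][j] + cb[i][j]) / 2.0
            else:  # detail: take the stronger coefficient
                f[i][j] = ca[i][j] if abs(ca[i][j]) >= abs(cb[i][j]) else cb[i][j]
    return ihaar2d(f)

img_a = [[10.0] * 4 for _ in range(4)]               # flat region
img_b = [[0.0, 0.0, 80.0, 80.0] for _ in range(4)]   # vertical edge
fused = fuse(img_a, img_b)
```

Replacing the Haar pair with SWT, CT or NSCT coefficients changes only the transform step; the selection rule is unchanged.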
NASA Astrophysics Data System (ADS)
Saad, S. M.; Shakaff, A. Y. M.; Saad, A. R. M.; Yusof, A. M.; Andrew, A. M.; Zakaria, A.; Adom, A. H.
2017-03-01
There are various sources influencing indoor air quality (IAQ) that can emit dangerous gases such as carbon monoxide (CO), carbon dioxide (CO2), ozone (O3) and particulate matter. These gases are usually safe to breathe if emitted in safe quantities, but if their amounts exceed safe levels they may be hazardous to human beings, especially children and people with asthma. Therefore, a smart indoor air quality monitoring system (IAQMS) is needed that is able to tell the occupants which sources trigger the indoor air pollution. In this project, an IAQMS able to classify the sources influencing IAQ has been developed. This IAQMS applies a classification method based on a Probabilistic Neural Network (PNN). It is used to classify the sources of indoor air pollution under five conditions: ambient air, human activity, presence of chemical products, presence of food and beverages, and presence of fragrance. To obtain the best classification accuracy, several feature-selection methods based on data pre-processing were analysed to discriminate among the sources. The output of each data pre-processing method was used as the input to the neural network. The results show that PNN analysis with data pre-processing gives a good classification accuracy of 99.89% and is able to classify the sources influencing IAQ at a high classification rate.
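A minimal Parzen-window PNN of the kind the abstract describes can be sketched as follows; the toy feature vectors and the two class labels are assumptions for illustration, not the study's sensor data:

```python
import math

class PNN:
    """Minimal Parzen-window probabilistic neural network.
    Each class score is the mean Gaussian kernel between the query
    point and that class's stored training patterns."""

    def __init__(self, sigma=0.5):
        self.sigma = sigma
        self.patterns = {}

    def fit(self, X, y):
        for x, label in zip(X, y):
            self.patterns.setdefault(label, []).append(x)
        return self

    def predict(self, x):
        two_s2 = 2.0 * self.sigma ** 2
        scores = {}
        for label, pts in self.patterns.items():
            scores[label] = sum(
                math.exp(-sum((a - b) ** 2 for a, b in zip(x, p)) / two_s2)
                for p in pts
            ) / len(pts)
        return max(scores, key=scores.get)

# Toy 2-D features standing in for pre-processed gas-sensor readings:
model = PNN(sigma=1.0).fit(
    [[0.0, 0.0], [0.5, 0.2], [5.0, 5.0], [5.2, 4.8]],
    ["ambient air", "fragrance" if False else "ambient air", "fragrance", "fragrance"][:4]
    if False else ["ambient air", "ambient air", "fragrance", "fragrance"],
)
```

The smoothing parameter sigma plays the role the pre-processing comparison plays in the study: it controls how sharply the network discriminates among sources.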
LC-MS/MS Identification of Species-Specific Muscle Peptides in Processed Animal Proteins.
Marchis, Daniela; Altomare, Alessandra; Gili, Marilena; Ostorero, Federica; Khadjavi, Amina; Corona, Cristiano; Ru, Giuseppe; Cappelletti, Benedetta; Gianelli, Silvia; Amadeo, Francesca; Rumio, Cristiano; Carini, Marina; Aldini, Giancarlo; Casalone, Cristina
2017-12-06
An innovative analytical strategy has been applied to identify signature peptides able to distinguish among processed animal proteins (PAPs) derived from bovine, pig, fish, and milk products. Proteomics was first used to elucidate the proteome of each source. Starting from the identified proteins and using a funnel based approach, a set of abundant and well characterized peptides with suitable physical-chemical properties (signature peptides) and specific for each source was selected. An on-target LC-ESI-MS/MS method (MRM mode) was set up using standard peptides and was then applied to selectively identify the PAP source and also to distinguish proteins from bovine carcass and milk proteins. We believe that the method described meets the request of the European Commission which has developed a strategy for gradually lifting the "total ban" toward "species to species ban", therefore requiring official methods for species-specific discrimination in feed.
Survey of Munitions Response Technologies
2006-06-01
[Table-of-contents fragments: 3.3.4 Digital Data Processing; 4.0 Source Data and Methods; 6.1.6 DGM versus Mag and Flag Processes; 6.1.7 Translation to...] Signatures, surface clutter, variances in operator technique, target selection, and data processing all degrade and affect optimum performance.
Entrofy: Participant Selection Made Easy
NASA Astrophysics Data System (ADS)
Huppenkothen, Daniela
2016-03-01
Selecting participants for a workshop out of a much larger applicant pool can be a difficult task, especially when the goal is diversifying over a range of criteria (e.g. academic seniority, research field, skill levels, gender, etc.). In this talk I present our tool, Entrofy, aimed at aiding organizers in this task. Entrofy is an open-source tool using a maximum-entropy-based algorithm that selects a set of participants from the applicant pool such that a pre-defined range of criteria is globally maximized. This approach allows for a potentially more transparent and less biased selection process while encouraging organizers to think deeply about the goals and the process of their participant selection.
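The selection idea can be illustrated with a simplified greedy sketch (this is not Entrofy's actual maximum-entropy algorithm, and the candidate records and target fractions are invented):

```python
def greedy_select(candidates, targets, k):
    """Greedily build a cohort of size k: at each step add the
    candidate that most reduces the total gap between the cohort's
    category fractions and the target fractions."""
    def gap(cohort):
        total = 0.0
        for tag, want in targets.items():
            have = sum(1 for c in cohort if tag in c["tags"]) / len(cohort)
            total += abs(want - have)
        return total

    selected, pool = [], list(candidates)
    for _ in range(k):
        best = min(pool, key=lambda c: gap(selected + [c]))
        selected.append(best)
        pool.remove(best)
    return selected

# Invented applicant pool with one diversity axis (seniority):
applicants = [
    {"id": 1, "tags": {"senior"}},
    {"id": 2, "tags": {"senior"}},
    {"id": 3, "tags": {"junior"}},
]
picked = greedy_select(applicants, {"senior": 0.5, "junior": 0.5}, 2)
```

Real criteria would add more tags per applicant (field, skills, gender), and the gap function would weight them as the organizers choose, which is exactly the deliberate goal-setting the abstract encourages.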
Binaural frequency selectivity in humans.
Verhey, Jesko L; van de Par, Steven
2018-01-23
Several behavioural studies in humans have shown that listening to sounds with two ears, that is, binaural hearing, provides the human auditory system with extra information on the sound source that is not available when sounds are perceived through only one ear, that is, monaurally. Binaural processing involves the analysis of phase and level differences between the two ear signals. As monaural cochlear processing (in each ear) precedes the neural stages responsible for binaural processing, it is reasonable to assume that properties of the cochlea may also be observed in binaural processing. A main characteristic of cochlear processing is its frequency selectivity. In psychoacoustics, there is an ongoing discussion on the frequency selectivity of the binaural auditory system. While some psychoacoustic experiments seem to indicate poorer frequency selectivity of the binaural system than that of monaural processing, others seem to indicate the same frequency selectivity for monaural and binaural processing. This study provides an overview of these seemingly contradictory results and the different explanations that have been offered to account for them. © 2018 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.
Science Fair Projects. LC Science Tracer Bullet. TB 07-6
ERIC Educational Resources Information Center
Howland, Joyce, Comp.
2007-01-01
Selected sources in this bibliography provide guidance to students, parents, and teachers throughout the process of planning, developing, implementing and competing in science fair activities. Sources range in suitability from elementary to high school levels. This guide updates "Library of Congress Science Tracer Bullet" 01-4. More specialized…
NASA Astrophysics Data System (ADS)
Staszak, Katarzyna; Wieszczycka, Karolina
2018-04-01
Potential sources of metals from energy industries are discussed. The discussion is organized around the two main metal-containing wastes from power plants: ashes and slags from the combustion process, and spent catalysts from the selective catalytic NOx reduction process with ammonia, known as SCR. The compositions, methods of metal recovery (based mainly on leaching), and their further applications are presented. Solid coal-combustion wastes are sources of various compounds such as silica, alumina, iron oxide, and calcium. In the case of spent SCR catalysts, mainly two metals are considered: vanadium and tungsten, the basic components of industrial catalysts.
Fugitive Dust Emissions: Development of a Real-time Monitor
2011-10-01
the mechanical disturbance of soils which injects particles into the air. Common sources of FD include vehicles driving on unpaved roads...agricultural tilling, and heavy construction operations. For these sources the dust-generation process is caused by two basic physical phenomena...visibility, source apportionment, etc. The PM10 standard set by the U.S. Environmental Protection Agency in 1987 is an example of size-selective
Recommendation of ruthenium source for sludge batch flowsheet studies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Woodham, W.
Included herein is a preliminary analysis of previously-generated data from sludge batches 7a, 7b, 8, and 9 sludge simulant and real-waste testing, performed to recommend a form of ruthenium for future sludge batch simulant testing under the nitric-formic flowsheet. Focus is given to reactions present in the Sludge Receipt and Adjustment Tank cycle, given that this cycle historically produces the most changes in chemical composition during Chemical Process Cell processing. Data is presented and analyzed for several runs performed under the nitric-formic flowsheet, with consideration given to effects on the production of hydrogen gas, nitrous oxide gas, consumption of formate, conversion of nitrite to nitrate, and the removal and recovery of mercury during processing. Additionally, a brief discussion is given to the effect of ruthenium source selection under the nitric-glycolic flowsheet. An analysis of data generated from scaled demonstration testing, sludge batch 9 qualification testing, and antifoam degradation testing under the nitric-glycolic flowsheet is presented. Experimental parameters of interest under the nitric-glycolic flowsheet include N2O production, glycolate destruction, conversion of glycolate to formate and oxalate, and the conversion of nitrite to nitrate. To date, the number of real-waste experiments that have been performed under the nitric-glycolic flowsheet is insufficient to provide a complete understanding of the effects of ruthenium source selection in simulant experiments with regard to fidelity to real-waste testing. Therefore, a determination of comparability between the two ruthenium sources as employed under the nitric-glycolic flowsheet is made based on available data in order to inform ruthenium source selection for future testing under the nitric-glycolic flowsheet.
Cong, Fengyu; Puoliväli, Tuomas; Alluri, Vinoo; Sipola, Tuomo; Burunat, Iballa; Toiviainen, Petri; Nandi, Asoke K; Brattico, Elvira; Ristaniemi, Tapani
2014-02-15
Independent component analysis (ICA) has often been used to decompose fMRI data, mostly for resting-state, block and event-related designs, due to its outstanding advantages. For fMRI data recorded during free-listening experiences, only a few exploratory studies have applied ICA. For processing the fMRI data elicited by a 512-s piece of modern tango, an FFT-based band-pass filter was used to further pre-process the fMRI data and remove sources of no interest and noise. Then, a fast model-order selection method was applied to estimate the number of sources. Next, both individual ICA and group ICA were performed. Subsequently, ICA components whose temporal courses were significantly correlated with musical features were selected. Finally, for individual ICA, components common across the majority of participants were found by diffusion map and spectral clustering. The spatial maps extracted by the new ICA approach that were common across most participants evidenced slightly right-lateralized activity within and surrounding the auditory cortices, and were found to be associated with the musical features. Compared with the conventional ICA approach, more participants were found to have the common spatial maps extracted by the new ICA approach. Conventional model-order selection methods underestimated the true number of sources in the conventionally pre-processed fMRI data for individual ICA. Pre-processing the fMRI data with a reasonable band-pass digital filter can greatly benefit the subsequent model-order selection and ICA for fMRI data from naturalistic paradigms. Diffusion map and spectral clustering are straightforward tools for finding common ICA spatial maps. Copyright © 2013 Elsevier B.V. All rights reserved.
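The band-pass pre-processing step can be sketched as follows; a naive O(n²) DFT replaces the FFT for brevity, and the signal and bin ranges are illustrative assumptions, not the study's data:

```python
import cmath
import math

def dft(x):
    """Naive discrete Fourier transform (stand-in for an FFT)."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
            for k in range(n)]

def idft(X):
    """Inverse DFT; returns the real part of the reconstruction."""
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * math.pi * k * t / n) for k in range(n)).real / n
            for t in range(n)]

def bandpass(x, lo_bin, hi_bin):
    """Zero every DFT bin whose folded (two-sided) frequency index lies
    outside [lo_bin, hi_bin], then invert back to the time domain."""
    n = len(x)
    X = dft(x)
    for k in range(n):
        if not (lo_bin <= min(k, n - k) <= hi_bin):
            X[k] = 0
    return idft(X)

# A two-component test signal: 3 cycles + 12 cycles over n samples.
n = 64
x = [math.sin(2 * math.pi * 3 * t / n) + math.sin(2 * math.pi * 12 * t / n)
     for t in range(n)]
filtered = bandpass(x, 2, 5)  # keeps only the 3-cycle component
```

In the study's setting the retained band would correspond to the temporal frequencies of interest in the stimulus, and everything outside it (noise and sources of no interest) is discarded before model-order selection and ICA.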
Microorganism mediated liquid fuels
DOE Office of Scientific and Technical Information (OSTI.GOV)
Troiano, Richard
Herein disclosed is a method for producing liquid hydrocarbon product, the method comprising disintegrating a hydrocarbon source; pretreating the disintegrated hydrocarbon source; solubilizing the disintegrated hydrocarbon source to form a slurry comprising a reactant molecule of the hydrocarbon source; admixing a biochemical liquor into the slurry, wherein the biochemical liquor comprises at least one conversion enzyme configured to facilitate bond selective photo-fragmentation of said reactant molecule of the hydrocarbon source, to form liquid hydrocarbons via enzyme assisted bond selective photo-fragmentation, wherein said conversion enzyme comprises reactive sites configured to restrict said reactant molecule such that photo-fragmentation favorably targets a preselected internal bond of said reactant molecule; separating the liquid hydrocarbons from the slurry, wherein contaminants remain in the slurry; and enriching the liquid hydrocarbons to form a liquid hydrocarbon product. Various aspects of such method/process are also discussed.
Acoustic emission measurements of aerospace materials and structures
NASA Technical Reports Server (NTRS)
Sachse, Wolfgang; Gorman, Michael R.
1993-01-01
A development status evaluation is given for aerospace applications of AE location, detection, and source characterization. Attention is given to the neural-like processing of AE signals for graphite/epoxy. It is recommended that development efforts for AE make connections between the material failure process and source dynamics, and study the effects of composite material anisotropy and inhomogeneity on the propagation of AE waves. Broadband, as well as frequency- and wave-mode selective sensors, need to be developed.
48 CFR 2015.305 - Proposal evaluation.
Code of Federal Regulations, 2013 CFR
2013-10-01
... METHODS AND CONTRACT TYPES CONTRACTING BY NEGOTIATION Source Selection Processes and Techniques 2015.305 Proposal evaluation. The contracting officer may provide offerors' cost proposals and supporting financial...
International Data on Radiological Sources
DOE Office of Scientific and Technical Information (OSTI.GOV)
Martha Finck; Margaret Goldberg
2010-07-01
ABSTRACT The mission of radiological dispersal device (RDD) nuclear forensics is to identify the provenance of nuclear and radiological materials used in RDDs and to aid law enforcement in tracking nuclear materials and routes. The application of databases to radiological forensics is to match RDD source material to a source model in the database, provide guidance regarding a possible second device, and aid the FBI by providing a short list of manufacturers and distributors and, ultimately, the last legal owner of the source. The Argonne/Idaho National Laboratory RDD attribution database is a powerful technical tool in radiological forensics. The database (1267 unique vendors) includes all sealed sources and devices registered in the U.S., is complemented by data from the IAEA Catalogue, and is supported by rigorous in-lab characterization of selected sealed sources regarding physical form, radiochemical composition, and age-dating profiles. Close working relationships with global partners in the commercial sealed sources industry provide invaluable technical information and expertise in the development of signature profiles. These profiles are critical to the down-selection of potential candidates in either pre- or post-event RDD attribution. The down-selection process includes a match between an interdicted (or detonated) source and a model in the database linked to one or more manufacturers and distributors.
Niche construction, sources of selection and trait coevolution.
Laland, Kevin; Odling-Smee, John; Endler, John
2017-10-06
Organisms modify and choose components of their local environments. This 'niche construction' can alter ecological processes, modify natural selection and contribute to inheritance through ecological legacies. Here, we propose that niche construction initiates and modifies the selection directly affecting the constructor, and on other species, in an orderly, directed and sustained manner. By dependably generating specific environmental states, niche construction co-directs adaptive evolution by imposing a consistent statistical bias on selection. We illustrate how niche construction can generate this evolutionary bias by comparing it with artificial selection. We suggest that it occupies the middle ground between artificial and natural selection. We show how the perspective leads to testable predictions related to: (i) reduced variance in measures of responses to natural selection in the wild; (ii) multiple trait coevolution, including the evolution of sequences of traits and patterns of parallel evolution; and (iii) a positive association between niche construction and biodiversity. More generally, we submit that evolutionary biology would benefit from greater attention to the diverse properties of all sources of selection.
NASA Astrophysics Data System (ADS)
Al-Jumaili, Safaa Kh.; Pearson, Matthew R.; Holford, Karen M.; Eaton, Mark J.; Pullin, Rhys
2016-05-01
An easy to use, fast to apply, cost-effective, and very accurate non-destructive testing (NDT) technique for damage localisation in complex structures is key for the uptake of structural health monitoring systems (SHM). Acoustic emission (AE) is a viable technique that can be used for SHM and one of the most attractive features is the ability to locate AE sources. The time of arrival (TOA) technique is traditionally used to locate AE sources, and relies on the assumption of constant wave speed within the material and uninterrupted propagation path between the source and the sensor. In complex structural geometries and complex materials such as composites, this assumption is no longer valid. Delta T mapping was developed in Cardiff in order to overcome these limitations; this technique uses artificial sources on an area of interest to create training maps. These are used to locate subsequent AE sources. However operator expertise is required to select the best data from the training maps and to choose the correct parameter to locate the sources, which can be a time consuming process. This paper presents a new and improved fully automatic delta T mapping technique where a clustering algorithm is used to automatically identify and select the highly correlated events at each grid point whilst the "Minimum Difference" approach is used to determine the source location. This removes the requirement for operator expertise, saving time and preventing human errors. A thorough assessment is conducted to evaluate the performance and the robustness of the new technique. In the initial test, the results showed excellent reduction in running time as well as improved accuracy of locating AE sources, as a result of the automatic selection of the training data. Furthermore, because the process is performed automatically, this is now a very simple and reliable technique due to the prevention of the potential source of error related to manual manipulation.
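The "Minimum Difference" matching step can be sketched as follows, with an idealized synthetic training map standing in for maps built from artificial AE sources (the sensor positions, unit wave speed, and grid are assumptions for illustration):

```python
def locate_min_difference(training_map, measured):
    """Return the training-grid node whose stored delta-T signature is
    closest (sum of absolute differences) to the measured signature."""
    def cost(node):
        sig = training_map[node]
        return sum(abs(sig[pair] - measured[pair]) for pair in measured)
    return min(training_map, key=cost)

# Hypothetical setup: three sensors at (0,0), (10,0) and (0,10) on a
# plate, unit wave speed, so each delta-T is a path-length difference.
def delta_t(x, y):
    d0 = (x * x + y * y) ** 0.5
    d1 = ((x - 10) ** 2 + y * y) ** 0.5
    d2 = (x * x + (y - 10) ** 2) ** 0.5
    return {("s0", "s1"): d1 - d0, ("s0", "s2"): d2 - d0}

# Training map over an 11 x 11 grid of artificial-source positions.
training = {(x, y): delta_t(x, y) for x in range(11) for y in range(11)}
estimate = locate_min_difference(training, delta_t(2, 3))
```

In the real technique the training signatures come from Hsu-Nielsen-style artificial sources on the structure itself, which is what lets the method absorb anisotropy and geometry effects that defeat constant-wave-speed TOA location; the clustering stage described above would filter the training events before this matching step.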
Knowledge and Processes That Predict Proficiency in Digital Literacy
ERIC Educational Resources Information Center
Bulger, Monica E.; Mayer, Richard E.; Metzger, Miriam J.
2014-01-01
Proficiency in digital literacy refers to the ability to read and write using online sources, and includes the ability to select sources relevant to the task, synthesize information into a coherent message, and communicate the message with an audience. The present study examines the determinants of digital literacy proficiency by asking 150…
Quantitative estimation of source complexity in tsunami-source inversion
NASA Astrophysics Data System (ADS)
Dettmer, Jan; Cummins, Phil R.; Hawkins, Rhys; Jakir Hossen, M.
2016-04-01
This work analyses tsunami waveforms to infer the spatiotemporal evolution of sea-surface displacement (the tsunami source) caused by earthquakes or other sources. Since the method considers sea-surface displacement directly, no assumptions about the fault or seafloor deformation are required. While this approach cannot address seismic aspects of rupture, it greatly simplifies tsunami source estimation, making it much less dependent on subjective fault and deformation assumptions and yielding a more accurate sea-surface displacement evolution in the source region. The spatial discretization is by wavelet decomposition represented by a trans-dimensional (trans-D) Bayesian tree structure. Wavelet coefficients are sampled by a reversible-jump algorithm, and additional coefficients are included only when required by the data. Source complexity is therefore consistent with the data information (parsimonious), and the method can adapt locally in both time and space. Since the source complexity is unknown and adapts locally, no regularization is required, resulting in more meaningful displacement magnitudes. By estimating displacement uncertainties in a Bayesian framework we can study the effect of parametrization choice on the source estimate. Uncertainty arises from observation errors and from limitations of the parametrization in fully explaining the observations. As a result, parametrization choice is closely related to uncertainty estimation and profoundly affects inversion results, so parametrization selection should be included in the inference process. Our inversion method is therefore based on Bayesian model selection, which makes the choice of parametrization data driven: the trans-D model for the spatio-temporal discretization includes model selection naturally and efficiently in the inference by sampling probabilistically over parametrizations.
The trans-D process results in better uncertainty estimates, since the parametrization adapts parsimoniously (in both time and space) to the local data resolving power, and uncertainty about the parametrization choice is included in the uncertainty estimates. We apply the method to the tsunami waveforms recorded for the great 2011 Japan tsunami. All data are recorded on high-quality sensors (ocean-bottom pressure sensors, GPS gauges, and DART buoys). The sea-surface Green's functions are computed by JAGURS and include linear dispersion effects. By treating the noise level at each gauge as unknown, individual gauge contributions to the source estimate are appropriately and objectively weighted. The results show previously unreported detail of the source, quantify uncertainty spatially, and produce excellent data fits. The source estimate shows an elongated peak trench-ward of the hypocentre that closely follows the trench, indicating significant sea-floor deformation near the trench. Also notable is a bi-modal (negative to positive) displacement feature in the northern part of the source near the trench. The feature has ~2 m amplitude and is clearly resolved by the data with low uncertainties.
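The per-gauge weighting idea mentioned in the abstract can be illustrated with a much-simplified empirical sketch: inverse-variance weighting from data-minus-prediction residuals. The actual method infers the noise levels inside the Bayesian hierarchy rather than estimating them up front, and all names here are hypothetical:

```python
import numpy as np

def gauge_weights(residuals_per_gauge):
    """Empirical per-gauge weighting: estimate each gauge's noise variance from
    its residuals and weight by the inverse variance, so noisier gauges
    contribute less to the misfit. A simplification of treating each gauge's
    noise level as an unknown in a full Bayesian treatment."""
    sigmas2 = np.array([np.var(r, ddof=1) for r in residuals_per_gauge])
    w = 1.0 / sigmas2
    return w / w.sum()   # normalise so the weights sum to 1

quiet = np.array([0.01, -0.02, 0.015, -0.005])  # low-noise gauge residuals (m)
noisy = np.array([0.5, -0.7, 0.6, -0.4])        # high-noise gauge residuals (m)
w = gauge_weights([quiet, noisy])
print(w[0] > w[1])  # the quiet gauge receives the larger weight -> True
```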
VizieR Online Data Catalog: RASS-6dFGS catalogue (Mahony+, 2010)
NASA Astrophysics Data System (ADS)
Mahony, E. K.; Croom, S. M.; Boyle, B. J.; Edge, A. C.; Mauch, T.; Sadler, E. M.
2014-09-01
Objects were selected such that the dominant source of X-ray emission originates from an AGN. The target list was selected from the southern sources (δ<=0°) of the RBSC, a total of 9578 sources. Sources were then checked for optical identifications via a visual inspection process using Digitized Sky Survey (DSS) images. The majority of the optical positions were taken from the United States Naval Observatory (USNO) database, with the remainder taken from either the Automated Plate Measuring (APM) or DSS catalogues. Positions from these latter catalogues were used when the USNO appeared to give an incorrect position according to the DSS images. Optical magnitudes were taken from the USNO-A2.0 catalogue (Monet 1998, Cat. I/252). (2 data files).
Aging affects the interaction between attentional control and source memory: an fMRI study.
Dulas, Michael R; Duarte, Audrey
2014-12-01
Age-related source memory impairments may be due, at least in part, to deficits in executive processes mediated by the PFC at both study and test. Behavioral work suggests that providing environmental support at encoding, such as directing attention toward item-source associations, may improve source memory and reduce age-related deficits in the recruitment of these executive processes. The present fMRI study investigated the effects of directed attention and aging on source memory encoding and retrieval. At study, participants were shown pictures of objects. They were either asked to attend to the objects and their color (source) or to their size. At test, participants determined if objects were seen before, and if so, whether they were the same color as previously. Behavioral results showed that direction of attention improved source memory for both groups; however, age-related deficits persisted. fMRI results revealed that, across groups, direction of attention facilitated medial temporal lobe-mediated contextual binding processes during study and attenuated right PFC postretrieval monitoring effects at test. However, persistent age-related source memory deficits may be related to increased recruitment of medial anterior PFC during encoding, indicative of self-referential processing, as well as underrecruitment of lateral anterior PFC-mediated relational processes. Taken together, this study suggests that, even when supported, older adults may fail to selectively encode goal-relevant contextual details supporting source memory performance.
Adaptive illumination source for multispectral vision system applied to material discrimination
NASA Astrophysics Data System (ADS)
Conde, Olga M.; Cobo, Adolfo; Cantero, Paulino; Conde, David; Mirapeix, Jesús; Cubillas, Ana M.; López-Higuera, José M.
2008-04-01
A multispectral system based on a monochrome camera and an adaptive illumination source is presented in this paper. Its preliminary application is focused on material discrimination for the food and beverage industries, where monochrome, color and infrared imaging have been successfully applied to this task. This work proposes a different approach, in which the relevant wavelengths for the required discrimination task are selected in advance using a Sequential Forward Floating Selection (SFFS) algorithm. A light source based on Light Emitting Diodes (LEDs) at these wavelengths is then used to sequentially illuminate the material under analysis, and the resulting images are captured by a CCD camera with spectral response across the entire range of the selected wavelengths. Finally, the resulting multispectral planes are processed using a Spectral Angle Mapping (SAM) algorithm, whose output is the desired material classification. Among other advantages, this approach of controlled and specific illumination produces multispectral imaging with a simple monochrome camera, and cold illumination restricted to specific relevant wavelengths, which is desirable for the food and beverage industry. The proposed system has been tested successfully for the automatic detection of foreign objects in the tobacco processing industry.
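The SAM classification step is standard and can be sketched compactly: each pixel's spectral vector is compared to reference material spectra by the angle between them, and the smallest angle wins. The toy library values below are invented for illustration:

```python
import numpy as np

def spectral_angle(pixel, reference):
    """Spectral Angle Mapper distance: the angle (radians) between a pixel's
    spectral vector and a reference spectrum; a small angle means the spectra
    have a similar shape regardless of overall brightness."""
    cos = np.dot(pixel, reference) / (np.linalg.norm(pixel) * np.linalg.norm(reference))
    return np.arccos(np.clip(cos, -1.0, 1.0))

def classify_sam(pixel, library):
    """Assign the pixel to the library material with the smallest spectral angle."""
    angles = {name: spectral_angle(pixel, ref) for name, ref in library.items()}
    return min(angles, key=angles.get)

# Toy 3-band library (e.g., reflectance at the LED-selected wavelengths)
library = {"tobacco": np.array([0.2, 0.6, 0.8]),
           "plastic": np.array([0.9, 0.3, 0.1])}
print(classify_sam(np.array([0.25, 0.55, 0.75]), library))  # "tobacco"
```

Because SAM compares spectral shape rather than magnitude, it is relatively robust to uneven illumination, which suits a system where LED intensity may vary across the scene.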
Intellectual Functioning and Aging: A Selected Bibliography. Technical Bibliographies on Aging.
ERIC Educational Resources Information Center
Schaie, K. Warner; Zelinski, Elizabeth M.
The selected bibliography contains about 400 references taken from a keysort file of more than 45,000 references, compiled from commercially available databases and published sources, relevant to gerontology. Those of questionable accuracy were checked or deleted during the verification process. Most references are in English and were selected…
X-33 Telemetry Best Source Selection, Processing, Display, and Simulation Model Comparison
NASA Technical Reports Server (NTRS)
Burkes, Darryl A.
1998-01-01
The X-33 program requires the use of multiple telemetry ground stations to cover the launch, ascent, transition, descent, and approach phases for the flights from Edwards AFB to landings at Dugway Proving Grounds, UT and Malmstrom AFB, MT. This paper will discuss the X-33 telemetry requirements and design, including information on fixed and mobile telemetry systems, best source selection, and support for Range Safety Officers. A best source selection system will be utilized to automatically determine the best source based on the frame synchronization status of the incoming telemetry streams. These systems will be used to select the best source at the landing sites and at NASA Dryden Flight Research Center to determine the overall best source between the launch site, intermediate sites, and landing site sources. The best source at the landing sites will be decommutated to display critical flight safety parameters for the Range Safety Officers. The overall best source will be sent to Lockheed Martin's Operational Control Center at Edwards AFB for performance monitoring by X-33 program personnel and for monitoring of critical flight safety parameters by the primary Range Safety Officer. The real-time telemetry data (received signal strength, etc.) from each of the primary ground stations will also be compared during each mission with simulation data generated using the Dynamic Ground Station Analysis software program. An overall assessment of the accuracy of the model will occur after each mission. Acknowledgment: The work described in this paper was NASA supported through cooperative agreement NCC8-115 with Lockheed Martin Skunk Works.
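The selection logic described above, choosing among streams by frame-synchronization status, can be sketched in a few lines. The quality metric below (fraction of recent frames received in sync) is a hypothetical stand-in; the abstract only says selection is "based on the frame synchronization status":

```python
def best_source(streams):
    """Pick the telemetry stream with the best frame-sync quality.

    streams: list of (name, lock_ratio) pairs, where lock_ratio is the
    fraction of recent frames received in frame sync (1.0 = solid lock).
    Returns the name of the stream with the highest lock ratio."""
    return max(streams, key=lambda s: s[1])[0]

# Hypothetical site qualities during one flight phase
sites = [("launch", 0.97), ("intermediate", 0.82), ("landing", 0.99)]
print(best_source(sites))  # "landing"
```

A two-tier arrangement like the one in the paper would apply this twice: once per site among its local receivers, then once centrally among the per-site winners.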
Code of Federal Regulations, 2011 CFR
2011-01-01
... innovative designs, materials, and equipment such as daylighting, passive solar heating, and heat recovery... select the fuel source for the HVAC systems, service hot water, and process loads from available...
48 CFR 2015.304 - Evaluation factors.
Code of Federal Regulations, 2010 CFR
2010-10-01
... METHODS AND CONTRACT TYPES CONTRACTING BY NEGOTIATION Source Selection Processes and Techniques 2015.304... numerically weighted are conflict of interest, estimated cost, and “go/no go” evaluation factors. ...
NASA Astrophysics Data System (ADS)
Chen, Dong; Shang-Hong, Zhao; MengYi, Deng
2018-03-01
The multiple crystal heralded source with post-selection (MHPS), originally introduced to improve the single-photon character of the heralded source, has specific applications for quantum information protocols. In this paper, by combining decoy-state measurement-device-independent quantum key distribution (MDI-QKD) with the spontaneous parametric downconversion process, we present a modified MDI-QKD scheme with an MHPS, for which two architectures are proposed: a symmetric scheme and an asymmetric scheme. The symmetric scheme, which is linked by photon switches in a log-tree structure, is adopted to overcome the limitation of the current low efficiency of m-to-1 optical switches. The asymmetric scheme, which has a chained structure, is used to cope with the scalability issue that the symmetric scheme suffers as the number of crystals increases. Numerical simulations show that our modified scheme has clear advantages in both transmission distance and key generation rate compared to the original MDI-QKD with a weak coherent source or a traditional heralded source with post-selection. Furthermore, recent advances in integrated photonics suggest that, if built into a single chip, the MHPS might be a practical alternative source for quantum key distribution tasks requiring single photons.
Palladium-Catalyzed Borylation of Primary Alkyl Bromides
Joshi-Pangu, Amruta; Ma, Xinghua; Diane, Mohamed; Iqbal, Sidra; Kribs, Robert J.; Huang, Richard; Wang, Chao-Yuan
2012-01-01
A mild Pd-catalyzed process for the borylation of alkyl bromides has been developed using bis(pinacolato)diboron as a boron source. This process accommodates the use of a wide range of functional groups on the alkyl bromide substrate. Primary bromides react with complete selectivity in the presence of a secondary bromide. The generality of this approach is demonstrated by its extension to the use of alkyl iodides and alkyl tosylates, as well as borylation reactions employing bis(neopentyl glycolato)diboron as the boron source. PMID:22774861
de Chastelaine, Marianne; Friedman, David; Cycowicz, Yael M
2007-08-01
Improvement in source memory performance throughout childhood is thought to be mediated by the development of executive control. As postretrieval control processes may be better time-locked to the recognition response rather than the retrieval cue, the development of processes underlying source memory was investigated with both stimulus- and response-locked event-related potentials (ERPs). These were recorded in children, adolescents, and adults during a recognition memory exclusion task. Green- and red-outlined pictures were studied, but were tested in black outline. The test requirement was to endorse old items shown in one study color ("targets") and to reject new items along with old items shown in the alternative study color ("nontargets"). Source memory improved with age. All age groups retrieved target and nontarget memories as reflected by reliable parietal episodic memory (EM) effects, a stimulus-locked ERP correlate of recollection. Response-locked ERPs to targets and nontargets diverged in all groups prior to the response, although this occurred at an increasingly earlier time point with age. We suggest these findings reflect the implementation of attentional control mechanisms to enhance target memories and facilitate response selection with the greatest and least success, respectively, in adults and children. In adults only, response-locked ERPs revealed an early-onsetting parietal negativity for nontargets, but not for targets. This was suggested to reflect adults' ability to consistently inhibit prepotent target responses for nontargets. The findings support the notion that the development of source memory relies on the maturation of control processes that serve to enhance accurate selection of task-relevant memories.
48 CFR 873.113 - Exchanges with offerors.
Code of Federal Regulations, 2010 CFR
2010-10-01
... judgment. Clarifications, communications, and discussions, as provided for in the FAR, are concepts not... take place throughout the source selection process. Exchanges may start in the planning stages and...
Almeida-Warren, Katarina; Sommer, Volker; Piel, Alex K; Pascual-Garrido, Alejandra
2017-10-01
Chimpanzee termite fishing has been studied for decades, yet the selective processes preceding the manufacture of fishing tools remain largely unexplored. We investigate raw material selection and potential evidence of forward planning in the chimpanzees of Issa valley, western Tanzania. Using traditional archaeological methods, we surveyed the location of plants from which chimpanzees sourced raw material to manufacture termite fishing tools, relative to targeted mounds. We measured raw material abundance to test for availability and selection. Statistics included chi-squared, two-tailed Wilcoxon, and Kruskal-Wallis tests. Issa chimpanzees manufactured extraction tools only from bark, despite availability of other suitable materials (e.g., twigs), and selected particular plant species as raw material sources, which they often also exploit for food. Most plants were sourced 1-16 m away from the mound, with a maximum of 33 m. The line of sight from the targeted mound was obscured for a quarter of these plants. The exclusive use of bark tools despite availability of other suitable materials indicates a possible cultural preference. The fact that Issa chimpanzees select specific plant species and travel some distance to source them suggests some degree of selectivity and, potentially, forward planning. Our results have implications for the reconstruction of early hominin behaviors, particularly with regard to the use of perishable tools, which remain archaeologically invisible. © 2017 Wiley Periodicals, Inc.
2001-12-01
SWDIV's current organizational structure is a direct result of a December 1995 customer survey, which revealed that SWDIV's customers were not happy with...its services. They wanted SWDIV to be better, faster, cheaper, and easier to use. As a result of this customer feedback, and various OPNAV and...process of delivering products and services to SWDIV's customers. The RET was tasked with accomplishing the following three things: 1. Focus
Emotion processing in the visual brain: a MEG analysis.
Peyk, Peter; Schupp, Harald T; Elbert, Thomas; Junghöfer, Markus
2008-06-01
Recent functional magnetic resonance imaging (fMRI) and event-related brain potential (ERP) studies provide empirical support for the notion that emotional cues guide selective attention. Extending this line of research, whole head magneto-encephalogram (MEG) was measured while participants viewed in separate experimental blocks a continuous stream of either pleasant and neutral or unpleasant and neutral pictures, presented for 330 ms each. Event-related magnetic fields (ERF) were analyzed after intersubject sensor coregistration, complemented by minimum norm estimates (MNE) to explore neural generator sources. Both streams of analysis converge by demonstrating the selective emotion processing in an early (120-170 ms) and a late time interval (220-310 ms). ERF analysis revealed that the polarity of the emotion difference fields was reversed across early and late intervals suggesting distinct patterns of activation in the visual processing stream. Source analysis revealed the amplified processing of emotional pictures in visual processing areas with more pronounced occipito-parieto-temporal activation in the early time interval, and a stronger engagement of more anterior, temporal, regions in the later interval. Confirming previous ERP studies showing facilitated emotion processing, the present data suggest that MEG provides a complementary look at the spread of activation in the visual processing stream.
Age-related differences in agenda-driven monitoring of format and task information
Mitchell, Karen J.; Ankudowich, Elizabeth; Durbin, Kelly A.; Greene, Erich J.; Johnson, Marcia K.
2013-01-01
Age-related source memory deficits may arise, in part, from changes in the agenda-driven processes that control what features of events are relevant during remembering. Using fMRI, we compared young and older adults on tests assessing source memory for format (picture, word) or encoding task (self-, other-referential), as well as on old-new recognition. Behaviorally, relative to old-new recognition, older adults showed disproportionate and equivalent deficits on both source tests compared to young adults. At encoding, both age groups showed expected activation associated with format in posterior visual processing areas, and with task in medial prefrontal cortex. At test, the groups showed similar selective, agenda-related activity in these representational areas. There were, however, marked age differences in the activity of control regions in lateral and medial prefrontal cortex and lateral parietal cortex. Results of correlation analyses were consistent with the idea that young adults had greater trial-by-trial agenda-driven modulation of activity (i.e., greater selectivity) than did older adults in representational regions. Thus, under selective remembering conditions where older adults showed clear differential regional activity in representational areas depending on type of test, they also showed evidence of disrupted frontal and parietal function and reduced item-by-item modulation of test-appropriate features. This pattern of results is consistent with an age-related deficit in the engagement of selective reflective attention. PMID:23357375
Digital selective growth of a ZnO nanowire array by large scale laser decomposition of zinc acetate.
Hong, Sukjoon; Yeo, Junyeob; Manorotkul, Wanit; Kang, Hyun Wook; Lee, Jinhwan; Han, Seungyong; Rho, Yoonsoo; Suh, Young Duk; Sung, Hyung Jin; Ko, Seung Hwan
2013-05-07
We develop a digital direct writing method for large-scale ZnO NW micro-patterned growth by selective laser decomposition of zinc acetate. By replacing bulk heating with a scanning focused laser as a fully digital local heat source, zinc acetate crystallites can be selectively activated as a ZnO seed pattern to grow ZnO nanowires locally over a large area. Together with the selective laser sintering process of metal nanoparticles, more than 10,000 UV sensors have been fabricated on a 4 cm × 4 cm glass substrate, demonstrating all-solution-processible, all-laser, mask-less digital fabrication of electronic devices, including the active layer and metal electrodes, without any conventional vacuum deposition, photolithographic process, premade mask, or high-temperature or vacuum environment.
Source Evaluation of Domain Experts and Novices during Web Search
ERIC Educational Resources Information Center
Brand-Gruwel, S.; Kammerer, Y.; van Meeuwen, L.; van Gog, T.
2017-01-01
Nowadays, almost everyone uses the World Wide Web (WWW) to search for information of any kind. In education, students frequently use the WWW for selecting information to accomplish assignments such as writing an essay or preparing a presentation. The evaluation of sources and information is an important sub-skill in this process. But many students…
NASA Astrophysics Data System (ADS)
Yang, Lurong; Wang, Xinyu; Mendoza-Sanchez, Itza; Abriola, Linda M.
2018-04-01
Sequestered mass in low permeability zones has been increasingly recognized as an important source of organic chemical contamination that acts to sustain downgradient plume concentrations above regulated levels. However, few modeling studies have investigated the influence of this sequestered mass and associated (coupled) mass transfer processes on plume persistence in complex dense nonaqueous phase liquid (DNAPL) source zones. This paper employs a multiphase flow and transport simulator (a modified version of the modular transport simulator MT3DMS) to explore the two- and three-dimensional evolution of source zone mass distribution and near-source plume persistence for two ensembles of highly heterogeneous DNAPL source zone realizations. Simulations reveal the strong influence of subsurface heterogeneity on the complexity of DNAPL and sequestered (immobile/sorbed) mass distribution. Small zones of entrapped DNAPL are shown to serve as a persistent source of low concentration plumes, difficult to distinguish from other (sorbed and immobile dissolved) sequestered mass sources. Results suggest that the presence of DNAPL tends to control plume longevity in the near-source area; for the examined scenarios, a substantial fraction (43.3-99.2%) of plume life was sustained by DNAPL dissolution processes. The presence of sorptive media and the extent of sorption non-ideality are shown to greatly affect predictions of near-source plume persistence following DNAPL depletion, with plume persistence varying one to two orders of magnitude with the selected sorption model. Results demonstrate the importance of sorption-controlled back diffusion from low permeability zones and reveal the importance of selecting the appropriate sorption model for accurate prediction of plume longevity. Large discrepancies for both DNAPL depletion time and plume longevity were observed between 2-D and 3-D model simulations. 
Differences between 2- and 3-D predictions increased in the presence of sorption, especially for the case of non-ideal sorption, demonstrating the limitations of employing 2-D predictions for field-scale modeling.
ERIC Educational Resources Information Center
Bowers, P. G.; And Others
A study investigated whether a visual selective attention deficit, with its presumed basis in slow visual processing, was related to the same underlying problem as a phonological recoding deficit, or whether the two were independent sources of reading disability. Subjects were children aged 7 to 15 referred to a university clinic (the Waterloo Child Assessment…
Code of Federal Regulations, 2010 CFR
2010-01-01
... a conventional simulation tool, of the Proposed Design. A life cycle cost analysis shall be used to select the fuel source for the HVAC systems, service hot water, and process loads from available...
A role for relaxed selection in the evolution of the language capacity
Deacon, Terrence W.
2010-01-01
Explaining the extravagant complexity of the human language and our competence to acquire it has long posed challenges for natural selection theory. To answer his critics, Darwin turned to sexual selection to account for the extreme development of language. Many contemporary evolutionary theorists have invoked incredibly lucky mutation or some variant of the assimilation of acquired behaviors to innate predispositions in an effort to explain it. Recent evodevo approaches have identified developmental processes that help to explain how complex functional synergies can evolve by Darwinian means. Interestingly, many of these developmental mechanisms bear a resemblance to aspects of Darwin's mechanism of natural selection, often differing only in one respect (e.g., form of duplication, kind of variation, competition/cooperation). A common feature is an interplay between processes of stabilizing selection and processes of relaxed selection at different levels of organism function. These may play important roles in the many levels of evolutionary process contributing to language. Surprisingly, the relaxation of selection at the organism level may have been a source of many complex synergistic features of the human language capacity, and may help explain why so much language information is “inherited” socially. PMID:20445088
48 CFR 873.113 - Exchanges with offerors.
Code of Federal Regulations, 2011 CFR
2011-10-01
... take place throughout the source selection process. Exchanges may start in the planning stages and... best value pool (see 873.114). The purpose of exchanges is to ensure there is mutual understanding...
Moenickes, Sylvia; Höltge, Sibylla; Kreuzig, Robert; Richter, Otto
2011-12-01
Fate monitoring data on anaerobic transformation of the benzimidazole anthelmintics flubendazole (FLU) and fenbendazole (FEN) in liquid pig manure, and on aerobic transformation and sorption in soil and manured soil under laboratory conditions, were used for corresponding fate modeling. Processes considered were reversible and irreversible sequestration, mineralization, and metabolization, from which a set of up to 50 different models, both nested and concurrent, was assembled. Five criteria served for model selection after parameter fitting: the coefficient of determination, modeling efficiency, a likelihood ratio test, an information criterion, and a determinability measure. From the set of models selected, processes were classified as essential or sufficient. This strategy to identify process dominance was corroborated through application to data from analogous experiments for sulfadiazine and a comparison with established fate models for this substance. For both FLU and FEN, model selection performed well, including indication of weak data support where observed. For FLU, reversible and irreversible sequestration in a nonextractable fraction was determined. In particular, the extractable and the nonextractable fraction were equally sufficient sources for irreversible sequestration. For FEN, reversible formation of the extractable sulfoxide metabolite and reversible sequestration of both the parent and the metabolite were generally dominant. Similar to FLU, irreversible sequestration in the nonextractable fraction was determined, for which the extractable and the nonextractable fraction were equally sufficient sources. Formation of the sulfone metabolite was determined as irreversible, originating from the first metabolite. Copyright © 2011 Elsevier B.V. All rights reserved.
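Information-criterion-based selection among nested kinetic models, one of the five criteria the abstract names, can be illustrated with a minimal sketch. The model names, residual sums of squares, and parameter counts below are invented for illustration, not from the study:

```python
import math

def aic(rss, n, k):
    """Akaike information criterion from a least-squares fit:
    n data points, k fitted parameters, rss = residual sum of squares.
    Lower is better; the 2*k term penalises extra parameters."""
    return n * math.log(rss / n) + 2 * k

# Hypothetical fits of three nested fate models to the same n = 40 data points
n = 40
candidates = {"sequestration only":             (12.0, 2),
              "sequestration + mineralisation":  (6.5, 3),
              "full model (+ metabolisation)":   (6.3, 5)}

scores = {name: aic(rss, n, k) for name, (rss, k) in candidates.items()}
best = min(scores, key=scores.get)
print(best)  # here the extra parameters of the full model do not pay off
```

Combining such a criterion with fit statistics and a parameter-determinability check, as the study does, guards against selecting a model that fits well only because it is overparameterised.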
Comparison of NOM character in selected Australian and Norwegian drinking waters.
Fabris, Rolando; Chow, Christopher W K; Drikas, Mary; Eikebrokk, Bjørnar
2008-09-01
Observations from many countries around the world during the past 10-20 years indicate increasing natural organic matter (NOM) concentration levels in water sources, due to issues such as global warming, changes in soil acidification, increased drought severity and more intensive rain events. In addition to the trend towards increasing NOM concentration, the character of NOM can vary with source and time (season). The great seasonal variability and the trend towards elevated NOM concentration levels impose challenges to the water industry and the water treatment facilities in terms of operational optimisation and proper process control. The aim of this investigation was to compare selected raw and conventionally treated drinking water sources from different hemispheres with regard to NOM character which may lead to better understanding of the impact of source water on water treatment. Results from the analyses of selected Norwegian and Australian water samples showed that Norwegian NOM exhibited greater humic nature, indicating a stronger bias of allochthonous versus autochthonous organic origin. Similarly, Norwegian source waters had higher average molecular weights than Australian waters. Following coagulation treatment, the organic character of the recalcitrant NOM in both countries was similar. Differences in organic character of these source waters after treatment were found to be related to treatment practice rather than origin of the source water. The characterisation techniques employed also enabled identification of the coagulation processes which were not necessarily optimised for dissolved organic carbon (DOC) removal. The reactivity with chlorine as well as trihalomethane formation potential (THMFP) of the treated waters showed differences in behaviour between Norwegian and Australian sources that appeared to be related to residual higher molecular weight organic material. 
By evaluation of changes in specific molecular weight regions and disinfection parameters before and after treatment, correlations were found that relate treatment strategy to chlorine demand and DBP formation.
Apparatus and method for stabilization or oxidation of polymeric materials
Paulauskas, Felix L [Knoxville, TN; Sherman, Daniel M [Knoxville, TN
2010-01-19
An apparatus for treating polymeric materials comprises a treatment chamber adapted to maintain a selected atmosphere at a selected temperature; a means for supporting the polymeric material within the chamber; and, a source of ozone-containing gas, which decomposes at the selected temperature yielding at least one reactive oxidative species whereby the polymer is stabilized and cross linked through exposure to the oxidative species in the chamber at the selected temperature. The ozone may be generated by a plasma discharge or by various chemical processes. The apparatus may be configured for either batch-type or continuous-type processing. The apparatus and method are especially useful for preparing polymer fibers, particularly PAN fibers, for later carbonization treatments as well as to make flame-retardant fabrics.
van der Stelt, O; van der Molen, M; Boudewijn Gunning, W; Kok, A
2001-10-01
In order to gain insight into the functional and macroanatomical loci of visual selective processing deficits that may be basic to attention-deficit hyperactivity disorder (ADHD), the present study examined multi-channel event-related potentials (ERPs) recorded from 7- to 11-year-old boys clinically diagnosed as having ADHD (n=24) and age-matched healthy control boys (n=24) while they performed a visual (color) selective attention task. The spatio-temporal dynamics of several ERP components related to attention to color were characterized using topographic profile analysis, topographic mapping of the ERP and associated scalp current density distributions, and spatio-temporal source potential modeling. Boys with ADHD showed a lower target hit rate, a higher false-alarm rate, and a lower perceptual sensitivity than controls. Also, whereas color attention induced in the ERPs from controls a characteristic early frontally maximal selection positivity (FSP), ADHD boys displayed little or no FSP. Similarly, ADHD boys manifested P3b amplitude decrements that were partially lateralized (i.e., maximal at left temporal scalp locations) as well as affected by maturation. These results indicate that ADHD boys suffer from deficits at both relatively early (sensory) and late (semantic) levels of visual selective information processing. The data also support the hypothesis that the visual selective processing deficits observed in the ADHD boys originate from deficits in the strength of activation of a neural network comprising prefrontal and occipito-temporal brain regions. This network seems to be actively engaged during attention to color and may contain the major intracerebral generating sources of the associated scalp-recorded ERP components.
Mysterud, Atle; Tryjanowski, Piotr; Panek, Marek
2006-01-01
Harvesting represents a major source of mortality in many deer populations. The extent to which harvesting is selective for specific traits is important for understanding contemporary evolutionary processes. In addition, since such data are frequently used in life-history studies, it is important to know the pattern of selectivity as a source of bias. Recently, it was demonstrated that different hunting methods selected for different weights in red deer (Cervus elaphus), but little insight was offered into why this occurs. In this study, we show that foreign trophy stalkers select for larger antlers when hunting roe deer (Capreolus capreolus) than local hunters do, but that close to half of the difference in selectivity was due to foreigners hunting earlier in the season and in locations with larger males. The relationship between antler size and age was nevertheless fairly similar whether deer were shot by foreign or local hunters. PMID:17148307
Neural and behavioral correlates of selective stopping: Evidence for a different strategy adoption.
Sánchez-Carmona, Alberto J; Albert, Jacobo; Hinojosa, José A
2016-10-01
The present study examined the neural and behavioral correlates of selective stopping, a form of inhibition that has scarcely been investigated. The selectivity of the inhibitory process is needed when individuals have to deal with an environment filled with multiple stimuli, some of which require inhibition and some of which do not. The stimulus-selective stop-signal task has been used to explore this issue assuming that all participants interrupt their ongoing responses selectively to stop but not to ignore signals. However, recent behavioral evidence suggests that some individuals do not carry out the task as experimenters expect, since they seemed to interrupt their response non-selectively to both signals. In the present study, we detected and controlled the cognitive strategy adopted by participants (n=57) when they performed a stimulus-selective stop-signal task before comparing brain activation between conditions. In order to determine both the onset and the end of the response cancellation process underlying each strategy and to fully take advantage of the precise temporal resolution of event-related potentials, we used a mass univariate approach. Source localization techniques were also employed to estimate the neural underpinnings of the effects observed at the scalp level. Our results from scalp and source level analysis support the behavioral-based strategy classification. Specific effects were observed depending on the strategy adopted by participants. Thus, when contrasting successful stop versus ignore conditions, increased activation was only evident for subjects who were classified as using a strategy whereby the response interruption process was selective to stop trials. This increased activity was observed during the P3 time window in several left-lateralized brain regions, including middle and inferior frontal gyri, as well as parietal and insular cortices. 
By contrast, in those participants who used a strategy characterized by stopping non-selectively, no activation differences between successful stop and ignore conditions were observed at the estimated time at which response interruption process occurs. Overall, results from the current study highlight the importance of controlling for the different strategies adopted by participants to perform selective stopping tasks before analyzing brain activation patterns. Copyright © 2016 Elsevier Inc. All rights reserved.
On Developing Independent Critical Thinking: What We Can Learn from Studies of the Research Process.
ERIC Educational Resources Information Center
Stotsky, Sandra
1991-01-01
Examines studies of what students do as they select and narrow a topic, locate sources, sift through these sources, and develop a central research question or thesis statement. Notes limitations of a related body of research focusing on other kinds of academic writing. Raises conceptual and methodological issues for researchers to address in…
ERIC Educational Resources Information Center
Wass, Christopher; Pizzo, Alessandro; Sauce, Bruno; Kawasumi, Yushi; Sturzoiu, Tudor; Ree, Fred; Otto, Tim; Matzel, Louis D.
2013-01-01
A common source of variance (i.e., "general intelligence") underlies an individual's performance across diverse tests of cognitive ability, and evidence indicates that the processing efficacy of working memory may serve as one such source of common variance. One component of working memory, selective attention, has been reported to…
United States Navy Contracting Officer Warranting Process
2011-03-01
by 30% or more of the respondents: Contract Law, Cost Analysis, Market Research, Contract Source Selection, Simplified Acquisition Procedures, and...that the majority of AOs found the following course at least somewhat important: Contract Law, Cost Analysis, Market Research, Contract Source...the budget and appropriation cycle 4. Ethics and conduct standards 5. Basic contract laws and regulations 6. Socio-economic requirements in
Automated cross-identifying radio to infrared surveys using the LRPY algorithm: a case study
NASA Astrophysics Data System (ADS)
Weston, S. D.; Seymour, N.; Gulyaev, S.; Norris, R. P.; Banfield, J.; Vaccari, M.; Hopkins, A. M.; Franzen, T. M. O.
2018-02-01
Cross-identifying complex radio sources with optical or infrared (IR) counterparts in surveys such as the Australia Telescope Large Area Survey (ATLAS) has traditionally been performed manually. However, with new surveys from the Australian Square Kilometre Array Pathfinder detecting many tens of millions of radio sources, such an approach is no longer feasible. This paper presents new software (LRPY - Likelihood Ratio in PYTHON) to automate the process of cross-identifying radio sources with catalogues at other wavelengths. LRPY implements the likelihood ratio (LR) technique with a modification to account for two galaxies contributing to a sole measured radio component. We demonstrate LRPY by applying it to ATLAS DR3 and a Spitzer-based multiwavelength fusion catalogue, identifying 3848 matched sources via our LR-based selection criteria. A subset of 1987 sources have flux density values in all IRAC bands, which allows us to use criteria to distinguish between active galactic nuclei (AGNs) and star-forming galaxies (SFGs). We find that 936 radio sources (≈47 per cent) meet both the Lacy and Stern AGN selection criteria. Of the matched sources, 295 have spectroscopic redshifts, and we examine the radio-to-IR flux ratio versus redshift, proposing an AGN selection criterion below the Elvis radio-loud AGN limit for this dataset. Taking the union of all three AGN selection criteria, we identify 956 sources as AGNs (≈48 per cent). From this dataset, we find a decreasing fraction of AGNs at lower radio flux densities, consistent with other results in the literature.
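The likelihood ratio technique underlying this kind of cross-matching can be sketched as follows. This is a minimal illustration, not LRPY itself: the Gaussian positional model f(r), the threshold of 0.8, and all numeric values (offsets, q(m), background density n(m)) are assumptions chosen for the example.

```python
import math

def likelihood_ratio(r_arcsec, sigma_arcsec, q_m, n_m):
    """LR = q(m) * f(r) / n(m) for one candidate counterpart.

    f(r) is modelled here as a 2-D Gaussian of the combined positional
    uncertainty; q_m is the magnitude distribution of true counterparts
    and n_m the surface density of background sources at that magnitude.
    """
    f_r = (1.0 / (2.0 * math.pi * sigma_arcsec ** 2)) * math.exp(
        -r_arcsec ** 2 / (2.0 * sigma_arcsec ** 2))
    return q_m * f_r / n_m

def select_matches(candidates, lr_threshold=0.8):
    """Keep candidates whose LR exceeds a chosen selection threshold."""
    return [c for c in candidates if c["lr"] >= lr_threshold]

# Two illustrative candidates: a close counterpart and a distant one,
# evaluated against assumed magnitude/background densities.
cands = [
    {"id": "IR-1", "lr": likelihood_ratio(0.5, 1.0, q_m=0.2, n_m=1e-3)},
    {"id": "IR-2", "lr": likelihood_ratio(4.0, 1.0, q_m=0.2, n_m=1e-3)},
]
print([c["id"] for c in select_matches(cands)])
```

With these assumed numbers the nearby candidate is retained and the distant one rejected; in practice q(m) and n(m) are estimated from the catalogues themselves.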
ERIC Educational Resources Information Center
Isbell, Elif; Wray, Amanda Hampton; Neville, Helen J.
2016-01-01
Selective attention, the ability to enhance the processing of particular input while suppressing the information from other concurrent sources, has been postulated to be a foundational skill for learning and academic achievement. The neural mechanisms of this foundational ability are both vulnerable and enhanceable in children from lower…
NASA Astrophysics Data System (ADS)
Fanti, C.; Fanti, R.; Zanichelli, A.; Dallacasa, D.; Stanghellini, C.
2011-04-01
Context. Compact steep-spectrum radio sources and giga-hertz peaked spectrum radio sources (CSS/GPS) are generally considered to be mostly young radio sources. In recent years we studied at many wavelengths a sample of these objects selected from the B3-VLA catalog: the B3-VLA CSS sample. Only ≈60% of the sources were optically identified. Aims: We aim to increase the number of optical identifications and study the properties of the host galaxies of young radio sources. Methods: We cross-correlated the B3-VLA CSS sample with the Sloan Digital Sky Survey (SDSS), DR7, and complemented the SDSS photometry with available GALEX (DR 4/5 and 6) and near-IR data from UKIRT and 2MASS. Results: We obtained new identifications and photometric redshifts for eight faint galaxies, one quasar, and two quasar candidates. Overall we have 27 galaxies with SDSS photometry in five bands, for which we derived the ultraviolet-optical spectral energy distribution (UV-O-SED). We extended our investigation to additional CSS/GPS sources selected from the literature. Most of the galaxies show an excess of ultraviolet (UV) radiation compared with the UV-O-SED of local radio-quiet ellipticals. We found a strong dependence of the UV excess on redshift and analyzed it assuming that it is generated either by the nucleus (hidden quasar) or by a young stellar population (YSP). We also compare the UV-O-SEDs of our CSS/GPS sources with those of a selection of large-size (LSO) powerful radio sources from the literature. Conclusions: If the UV excess is mainly caused by a YSP, our conclusion is that it is the result of the merger process that also triggered the onset of the radio source, with some time delay. We do not see evidence for a major contribution from a YSP triggered by the radio source itself. Appendices A-G are only available in electronic form at http://www.aanda.org
Autothermal hydrogen storage and delivery systems
Pez, Guido Peter [Allentown, PA]; Cooper, Alan Charles [Macungie, PA]; Scott, Aaron Raymond [Allentown, PA]
2011-08-23
Processes are provided for the storage and release of hydrogen by means of dehydrogenation of hydrogen carrier compositions where at least part of the heat of dehydrogenation is provided by a hydrogen-reversible selective oxidation of the carrier. Autothermal generation of hydrogen is achieved wherein sufficient heat is provided to sustain the at least partial endothermic dehydrogenation of the carrier at reaction temperature. The at least partially dehydrogenated and at least partially selectively oxidized liquid carrier is regenerated in a catalytic hydrogenation process where apart from an incidental employment of process heat, gaseous hydrogen is the primary source of reversibly contained hydrogen and the necessary reaction energy.
Niina, Megumi; Okamura, Jun-ya; Wang, Gang
2015-10-01
Scalp event-related potential (ERP) studies have demonstrated larger N170 amplitudes when subjects view faces compared to items from object categories. Extensive attempts have been made to clarify face selectivity and hemispheric dominance for face processing. The purpose of this study was to investigate hemispheric differences in N170s activated by human faces and non-face objects, as well as the extent of overlap of their sources. ERPs were recorded from 20 subjects while they viewed human face and non-face images. N170s obtained during the presentation of human faces appeared earlier and with larger amplitude than for images of other categories. Further source analysis with a two-dipole model revealed that the locations of face and object processing largely overlapped in the left hemisphere. Conversely, the source for face processing in the right hemisphere was located more anteriorly than the source for object processing. The results suggest that the neuronal circuits for face and object processing are largely shared in the left hemisphere, with more distinct circuits in the right hemisphere. Copyright © 2015 Elsevier B.V. All rights reserved.
Estimation of mercury emission from different sources to atmosphere in Chongqing, China.
Wang, Dingyong; He, Lei; Wei, Shiqiang; Feng, Xinbin
2006-08-01
This investigation presents a first assessment of the contribution to the regional mercury budget from anthropogenic and natural sources in Chongqing, an important industrial region in southwest China. The emissions of mercury to the atmosphere from anthropogenic sources in the region were estimated through indirect approaches, i.e. using the commonly accepted emission factors method, which is based on annual process throughputs or consumption for these sources. The natural mercury emissions were estimated from selected natural sources by the dynamic flux chamber technique. The results indicated that the anthropogenic mercury emissions totaled approximately 8.85 tons (t); more than 50% of this total originated from coal combustion and 23.7% from industrial processes (including cement production, metal smelting and the chemical industry). The natural emissions represented approximately 17% of total emissions (1.78 t yr(-1)). The total mercury emission to the atmosphere in Chongqing in 2001 was 10.63 t.
A Scalable, Open Source Platform for Data Processing, Archiving and Dissemination
2016-01-01
Object Oriented Data Technology (OODT) big data toolkit developed by NASA and the Work-flow INstance Generation and Selection (WINGS) scientific work...to several challenging big data problems and demonstrated the utility of OODT-WINGS in addressing them. Specific demonstrated analyses address i...source software, Apache, Object Oriented Data Technology, OODT, semantic work-flows, WINGS, big data, work-flow management
Geostationary Carbon Process Mapper (GCPM)
NASA Technical Reports Server (NTRS)
Key, Richard; Sander, Stanley; Eldering, Annmarie; Miller, Charles; Frankenberg, Christian; Natraj, Vijay; Rider, David; Blavier, Jean-Francois; Bekker, Dmitriy; Wu, Yen-Hung
2012-01-01
Geostationary Carbon Process Mapper (GCPM) is an earth science mission to measure key atmospheric trace gases related to climate change and human activity. Understanding of the sources and sinks of CO2 is currently limited by the frequency of observations and uncertainty in vertical transport. GCPM improves this situation by making simultaneous high-resolution measurements of CO2, CH4, CF, and CO in the near-IR, many times per day, and is thus able to investigate processes with time scales of minutes to hours. CO2, CH4, CF, and CO were selected because their combination provides the information needed to disentangle natural and anthropogenic sources/sinks. Quasi-continuous monitoring effectively eliminates atmospheric transport uncertainties from source/sink inversion modeling. GCPM will have one instrument (GeoFTS), hosted on a commercial communications satellite, planned for two years of operation. GCPM will affordably advance the understanding of observed carbon-cycle variability, improving future climate projections.
Developing a framework for energy technology portfolio selection
NASA Astrophysics Data System (ADS)
Davoudpour, Hamid; Ashrafi, Maryam
2012-11-01
Today, the increased consumption of energy in the world, in addition to the risk of quick exhaustion of fossil resources, has forced industrial firms and organizations to utilize energy technology portfolio management tools, viewed both as a process of diversification of energy sources and as optimal use of available energy sources. Furthermore, the rapid development of technologies, their increasing complexity and variety, and market dynamics have made the task of technology portfolio selection difficult. Given the high level of competitiveness, organizations need to strategically allocate their limited resources to the best subset of possible candidates. This paper presents the results of developing a mathematical model for energy technology portfolio selection at an R&D center that maximizes support of the organization's strategy and values. The model balances the cost and benefit of the entire portfolio.
Scott, Finlay; Jardim, Ernesto; Millar, Colin P; Cerviño, Santiago
2016-01-01
Estimating fish stock status is very challenging given the many sources and high levels of uncertainty surrounding the biological processes (e.g. natural variability in the demographic rates), model selection (e.g. choosing growth or stock assessment models) and parameter estimation. Incorporating multiple sources of uncertainty in a stock assessment allows advice to better account for the risks associated with proposed management options, promoting decisions that are more robust to such uncertainty. However, a typical assessment only reports the model fit and variance of estimated parameters, thereby underreporting the overall uncertainty. Additionally, although multiple candidate models may be considered, only one is selected as the 'best' result, effectively rejecting the plausible assumptions behind the other models. We present an applied framework to integrate multiple sources of uncertainty in the stock assessment process. The first step is the generation and conditioning of a suite of stock assessment models that contain different assumptions about the stock and the fishery. The second step is the estimation of parameters, including fitting of the stock assessment models. The final step integrates across all of the results to reconcile the multi-model outcome. The framework is flexible enough to be tailored to particular stocks and fisheries and can draw on information from multiple sources to implement a broad variety of assumptions, making it applicable to stocks with varying levels of data availability. The Iberian hake stock in International Council for the Exploration of the Sea (ICES) Divisions VIIIc and IXa is used to demonstrate the framework, starting from length-based stock and indices data. Process and model uncertainty are considered through the growth, natural mortality, fishing mortality, survey catchability and stock-recruitment relationship. Estimation uncertainty is included as part of the fitting process.
Simple model averaging is used to integrate across the results and produce a single assessment that considers the multiple sources of uncertainty.
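The model-averaging step described above can be sketched in a few lines. This is an illustrative toy, not the paper's implementation: the model names, the spawning-stock-biomass values and the equal weighting are all assumptions chosen for the example.

```python
def model_average(estimates, weights=None):
    """Combine per-model point estimates by weighted averaging.

    estimates: dict mapping model name -> point estimate
    weights:   dict mapping model name -> weight (e.g. equal, or based
               on an information criterion); defaults to equal weights.
    """
    if weights is None:
        weights = {m: 1.0 for m in estimates}
    total = sum(weights[m] for m in estimates)
    return sum(estimates[m] * weights[m] for m in estimates) / total

# Three hypothetical assessment models with different natural-mortality
# assumptions, each giving a different biomass estimate (tonnes).
ssb = {"M_low": 52_000.0, "M_mid": 47_500.0, "M_high": 41_000.0}
print(model_average(ssb))  # equal weights reduce to the simple mean
```

Unequal weights (e.g. reflecting model plausibility) can be passed via the `weights` argument; the structure of the calculation is unchanged.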
The connectome mapper: an open-source processing pipeline to map connectomes with MRI.
Daducci, Alessandro; Gerhard, Stephan; Griffa, Alessandra; Lemkaddem, Alia; Cammoun, Leila; Gigandet, Xavier; Meuli, Reto; Hagmann, Patric; Thiran, Jean-Philippe
2012-01-01
Researchers working in the field of global connectivity analysis using diffusion magnetic resonance imaging (MRI) can count on a wide selection of software packages for processing their data, with methods ranging from the reconstruction of the local intra-voxel axonal structure to the estimation of the trajectories of the underlying fibre tracts. However, each package is generally task-specific and uses its own conventions and file formats. In this article we present the Connectome Mapper, a software pipeline aimed at helping researchers through the tedious process of organising, processing and analysing diffusion MRI data to perform global brain connectivity analyses. Our pipeline is written in Python and is freely available as open-source at www.cmtk.org.
Applications of absorption spectroscopy using quantum cascade lasers.
Zhang, Lizhu; Tian, Guang; Li, Jingsong; Yu, Benli
2014-01-01
Infrared laser absorption spectroscopy (LAS) is a promising modern technique for sensing trace gases with high sensitivity, selectivity, and high time resolution. Mid-infrared quantum cascade lasers, operating in a pulsed or continuous wave mode, have potential as spectroscopic sources because of their narrow linewidths, single mode operation, tunability, high output power, reliability, low power consumption, and compactness. This paper reviews some important developments in modern laser absorption spectroscopy based on the use of quantum cascade laser (QCL) sources. Among the various laser spectroscopic methods, this review is focused on selected absorption spectroscopy applications of QCLs, with particular emphasis on molecular spectroscopy, industrial process control, combustion diagnostics, and medical breath analysis.
Effects of high-dose ethanol intoxication and hangover on cognitive flexibility.
Wolff, Nicole; Gussek, Philipp; Stock, Ann-Kathrin; Beste, Christian
2018-01-01
The effects of high-dose ethanol intoxication on cognitive flexibility processes are not well understood, and processes related to hangover after intoxication have remained even more elusive. Similarly, it is unknown to what extent the complexity of cognitive flexibility processes is affected by intoxication and hangover effects. We performed a neurophysiological study applying high-density electroencephalography (EEG) recording to analyze event-related potentials (ERPs) and perform source localization in a task switching paradigm which varied the complexity of task switching by means of memory demands. The results show that high-dose ethanol intoxication only affects task switching (i.e. cognitive flexibility processes) when memory processes are required to control task switching mechanisms, suggesting that even high doses of ethanol compromise cognitive processes when they are highly demanding. The EEG and source localization data show that these effects unfold by modulating response selection processes in the anterior cingulate cortex. Perceptual and attentional selection processes as well as working memory processes were only unspecifically modulated. In all subprocesses examined, there were no differences between the sober and hangover states, suggesting a fast recovery of cognitive flexibility after high-dose ethanol intoxication. We assume that the gamma-aminobutyric acid (GABAergic) system accounts for the observed effects, which can hardly be explained by the dopaminergic system. © 2016 Society for the Study of Addiction.
Photo-oxidation of PAHs with calcium peroxide as a source of the hydroxyl radicals
NASA Astrophysics Data System (ADS)
Kozak, Jolanta; Włodarczyk-Makuła, Maria
2018-02-01
The efficiency of removal of selected PAHs from pretreated coking wastewater using CaO2, Fenton reagent (FeSO4) and UV radiation is presented in this article. The investigations were carried out using coking wastewater originating from a biological, industrial wastewater treatment plant. At the beginning of the experiment, calcium peroxide (CaO2) powder, as a source of hydroxyl radicals (OH•), and Fenton reagent were added to the samples of wastewater. The samples were then exposed to UV rays for 360 s. The process was carried out at pH 3.5-3.8. After the photo-oxidation process a decrease in the PAH concentration was observed. The removal efficiency of the selected hydrocarbons ranged from 89 to 98%. The effectiveness of PAH degradation was directly proportional to the calcium peroxide dose.
Wang, Dongsheng; Xing, Linan; Xie, Jiankun; Chow, Christopher W K; Xu, Zhizhen; Zhao, Yanmei; Drikas, Mary
2010-09-01
China has a very complex water supply system which relies on many rivers and lakes. As population and economic development increase, water quality is greatly impacted by anthropogenic processes. This seriously affects the character of the dissolved organic matter (DOM) and imposes operational challenges on water treatment facilities in terms of process optimization. The aim of this investigation was to compare selected drinking water sources (raw) with different DOM character, and the respective treated waters after coagulation, using simple organic characterization techniques to obtain a better understanding of the impact of source water quality on water treatment. Results from the analyses of selected water samples showed that the dissolved organic carbon (DOC) of polluted waters is generally higher than that of un-polluted waters, but the specific UV absorbance value shows the opposite trend. After resolving the high performance size exclusion chromatography (HPSEC) peak components of the source waters using peak fitting, the twelve waters studied could be divided into two main groups (micro-polluted and un-polluted) using cluster analysis. The DOM removal efficiency (treatability) of these waters was compared using four coagulants. For water sources allocated to the un-polluted group, traditional coagulants (Al2(SO4)3 and FeCl3) achieved better removal. High performance polyaluminum chloride, a new type of composite coagulant, performed very well and more efficiently for polluted waters. After peak fitting the HPSEC chromatogram of each of the treated waters, the average removal efficiency of the profiles can be calculated, and these values correspond well with DOC and UV removal. This provides a convenient tool to assess coagulation removal and coagulant selection. Copyright (c) 2010 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vaughan, D.
A compilation of data is presented. Included are properties of the elements, electron binding energies, characteristic x-ray energies, fluorescence yields for K and L shells, Auger energies, energy levels for hydrogen-, helium-, and neonlike ions, scattering factors and mass absorption coefficients, and transmission bands of selected filters. Also included are selected reprints on scattering processes, x-ray sources, optics, x-ray detectors, and synchrotron radiation facilities. (WRF)
2017-09-14
one such study, AOPs were investigated for the removal of organophosphorus pesticides in wastewater by selecting and optimizing oxidation processes...micropollutants (primarily pharmaceuticals, personal care products, and pesticides) in four different river water sources (Colorado River, Passaic...the National Institutes of Health PubChem data repository (National Institutes of Health 2016). Additional chemical properties were also selected for
NASA Astrophysics Data System (ADS)
Hong, JaeSub; van den Berg, Maureen; Schlegel, Eric M.; Grindlay, Jonathan E.; Koenig, Xavier; Laycock, Silas; Zhao, Ping
2005-12-01
We describe the X-ray analysis procedure of the ongoing Chandra Multiwavelength Plane (ChaMPlane) Survey and report the initial results from the analysis of 15 selected anti-Galactic center observations (90deg
DBCC Software as Database for Collisional Cross-Sections
NASA Astrophysics Data System (ADS)
Moroz, Daniel; Moroz, Paul
2014-10-01
Interactions of species such as atoms, radicals, molecules, electrons, and photons in plasmas used for materials processing can be very complex, and many of them can be described in terms of collisional cross-sections. Researchers involved in plasma simulations must select reasonable cross-sections for collisional processes and implement them in their simulation codes in order to simulate plasmas correctly. However, collisional cross-section data are difficult to obtain, and, for some collisional processes, the cross-sections are still not known. Data on collisional cross-sections can be obtained from numerous sources, including numerical calculations, experiments, journal articles, conference proceedings, scientific reports, various universities' websites, and national labs and centers specifically devoted to collecting cross-section data. The cross-section data received from different sources can be partial, corresponding to limited energy ranges, or may even be in disagreement. The DBCC software package was designed to help researchers collect, compare, and select cross-sections, some of which can be constructed from others or chosen as defaults. This is important, as different researchers may place trust in different cross-sections or in different sources. We will discuss the details of DBCC and demonstrate how it works and why it is beneficial to researchers working on plasma simulations.
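Comparing cross-section tables that cover different, partially overlapping energy ranges is essentially an interpolation-onto-a-common-grid problem. The sketch below illustrates that step only; it is not DBCC code, and the two tables, the linear interpolation choice and all numbers are assumptions for the example.

```python
import bisect

def sigma_at(table, energy):
    """Linearly interpolate a cross-section table at a given energy.

    table: list of (energy, sigma) pairs sorted by energy.
    Returns None outside the tabulated range (no extrapolation).
    """
    energies = [e for e, _ in table]
    if energy < energies[0] or energy > energies[-1]:
        return None
    i = bisect.bisect_left(energies, energy)
    if energies[i] == energy:
        return table[i][1]
    (e0, s0), (e1, s1) = table[i - 1], table[i]
    return s0 + (s1 - s0) * (energy - e0) / (e1 - e0)

def compare_sources(a, b, grid):
    """Put two cross-section tables on a common energy grid so their
    values can be compared point by point (None = outside coverage)."""
    return [(e, sigma_at(a, e), sigma_at(b, e)) for e in grid]

# Two illustrative tables for the same hypothetical collision process,
# covering different energy ranges and disagreeing slightly in the overlap.
src_a = [(1.0, 0.0), (5.0, 2.0), (10.0, 4.0)]
src_b = [(5.0, 2.2), (10.0, 4.1), (50.0, 3.0)]
print(compare_sources(src_a, src_b, [5.0, 7.5, 10.0]))
```

In a real tool the comparison grid, interpolation scheme (often log-log for cross-sections) and handling of gaps would all be configurable.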
Locating the source of spreading in temporal networks
NASA Astrophysics Data System (ADS)
Huang, Qiangjuan; Zhao, Chengli; Zhang, Xue; Yi, Dongyun
2017-02-01
The topological structure of many real networks changes with time. Locating the sources of a temporal network is thus a challenging problem, as the enormous size of many real networks makes it infeasible to observe the state of all nodes. In this paper, we propose an algorithm to solve this problem, named the backward temporal diffusion process. The proposed algorithm calculates the shortest temporal distance to locate the transmission source. We assume that the spreading process can be modeled as a simple diffusion process or by consensus dynamics. To improve the location accuracy, we also adopt four strategies for selecting which nodes should be observed, by ranking their importance in the temporal network. Our paper proposes a highly accurate method for locating the source in temporal networks and is, to the best of our knowledge, a frontier work in this field. Moreover, our framework has important significance for controlling the transmission of diseases or rumors and for formulating immediate immunization strategies.
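The core idea of shortest-temporal-distance source location can be sketched with earliest-arrival times on a contact sequence. This is an illustrative simplification, not the paper's algorithm: the tiny contact list, the observer set and the squared-error scoring rule are all assumptions for the example.

```python
import math

def earliest_arrival(temporal_edges, source, t_start=0):
    """Earliest-arrival ('shortest temporal distance') times from `source`.

    temporal_edges: list of (u, v, t) contacts, treated as undirected.
    A node reached by time t can pass the spread along any contact
    occurring at or after t; processing contacts in time order suffices.
    """
    arrival = {source: t_start}
    for u, v, t in sorted(temporal_edges, key=lambda e: e[2]):
        for a, b in ((u, v), (v, u)):
            if a in arrival and arrival[a] <= t and t < arrival.get(b, math.inf):
                arrival[b] = t
    return arrival

def locate_source(temporal_edges, nodes, observed):
    """Rank candidate sources by squared error between their predicted
    earliest-arrival times and the times observed at observer nodes."""
    def score(s):
        arr = earliest_arrival(temporal_edges, s)
        return sum((arr.get(o, math.inf) - t_obs) ** 2
                   for o, t_obs in observed.items())
    return min(nodes, key=score)

# Toy temporal network: contacts (u, v, time); true source is "a".
edges = [("a", "b", 1), ("b", "c", 2), ("c", "d", 3), ("a", "d", 5)]
observed = {"b": 1, "c": 2, "d": 3}  # observer nodes and infection times
print(locate_source(edges, ["a", "b", "c", "d"], observed))
```

The single time-ordered pass is valid because an arrival discovered via a contact at time t can never unlock an earlier contact.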
NASA Technical Reports Server (NTRS)
Acton, C. H., Jr.; Ohtakay, H.
1975-01-01
Optical navigation uses spacecraft television pictures of a target body against a known star background in a process which relates the spacecraft trajectory to the target body. This technology was used in the Mariner-Venus-Mercury mission, with the optical data processed in near-real-time, simulating a mission critical environment. Optical data error sources were identified, and a star location error analysis was carried out. Several methods for selecting limb crossing coordinates were used, and a limb smear compensation was introduced. Omission of planetary aberration corrections was the source of large optical residuals.
NASA Astrophysics Data System (ADS)
Koiter, A. J.; Owens, P. N.; Petticrew, E. L.; Lobb, D. A.
2013-10-01
Sediment fingerprinting is a technique that is increasingly being used to improve the understanding of sediment dynamics within river basins. At present, one of the main limitations of the technique is the ability to link sediments back to their sources, due to the non-conservative nature of many sediment properties. The processes that occur between the sediment source locations and the point of collection downstream are not well understood or quantified, and currently represent a black box in the sediment fingerprinting approach. The literature on sediment fingerprinting tends to assume that there is a direct connection between sources and sinks, while much of the broader environmental sedimentology literature identifies that numerous chemical, biological and physical transformations and alterations can occur as sediment moves through the landscape. The focus of this paper is on the processes that drive particle size and organic matter selectivity and biological, geochemical and physical transformations, and how understanding these processes can be used to guide sampling protocols, fingerprint selection and data interpretation. The application of statistical approaches without consideration of how unique sediment fingerprints have developed, and how robust they are within the environment, is a major limitation of many recent studies. This review summarises the current information, identifies areas that need further investigation and provides recommendations for sediment fingerprinting that should be considered for adoption in future studies if the full potential and utility of the approach are to be realised.
48 CFR 2015.303 - Responsibilities.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 48 Federal Acquisition Regulations System 6 2010-10-01 2010-10-01 true Responsibilities. 2015.303 Section 2015.303 Federal Acquisition Regulations System NUCLEAR REGULATORY COMMISSION CONTRACTING METHODS AND CONTRACT TYPES CONTRACTING BY NEGOTIATION Source Selection Processes and Techniques 2015.303...
THE MAXIMUM POWER PRINCIPLE: AN EMPIRICAL INVESTIGATION
The maximum power principle is a potential guide to understanding the patterns and processes of ecosystem development and sustainability. The principle predicts the selective persistence of ecosystem designs that capture a previously untapped energy source. This hypothesis was in...
Choosing a Geothermal as an HVAC System.
ERIC Educational Resources Information Center
Lensenbigler, John D.
2002-01-01
Describes the process of selecting and installing geothermal water source heat pumps for new residence halls at Johnson Bible College in Knoxville, Tennessee, including choosing the type of geothermal design, contractors, and interior equipment, and cost and payback. (EV)
Quality of Information Approach to Improving Source Selection in Tactical Networks
2017-02-01
consider the performance of this process based on metrics relating to quality of information: accuracy, timeliness, completeness and reliability. These...that are indicators that the network is meeting these quality requirements. We study effective data rate, social distance, link integrity and the...utility of information as metrics within a multi-genre network to determine the quality of information of its available sources. This paper proposes a
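The quality-of-information attributes named in the record above lend themselves to a simple weighted-scoring sketch for picking the best available source. The weights, attribute values, and source names below are invented for illustration and are not from the paper:

```python
# Hypothetical weighted QoI scoring; all numbers and names are illustrative.

def qoi_score(attrs, weights):
    """Weighted sum of normalized QoI attributes, each in [0, 1]."""
    return sum(weights[k] * attrs[k] for k in weights)

def select_source(sources, weights):
    """Pick the source whose QoI score is highest."""
    return max(sources, key=lambda name: qoi_score(sources[name], weights))

weights = {"accuracy": 0.4, "timeliness": 0.3, "completeness": 0.2, "reliability": 0.1}
sources = {
    "sensor_A": {"accuracy": 0.9, "timeliness": 0.5, "completeness": 0.8, "reliability": 0.7},
    "sensor_B": {"accuracy": 0.7, "timeliness": 0.9, "completeness": 0.6, "reliability": 0.9},
}
best = select_source(sources, weights)   # sensor_B scores 0.76 vs 0.74
```

A real tactical-network implementation would estimate these attributes online (e.g., timeliness from message age) rather than take them as constants.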
NASA Astrophysics Data System (ADS)
Wang, Ji; Zhang, Ru; Yan, Yuting; Dong, Xiaoqiang; Li, Jun Ming
2017-05-01
Hazardous gas leaks in the atmosphere can cause significant economic losses in addition to environmental hazards, such as fires and explosions. A three-stage hazardous gas leak source localization method was developed that uses movable and stationary gas concentration sensors. The method calculates a preliminary source inversion with a modified genetic algorithm (MGA), which can cross over with eliminated individuals from the population after selection of the best candidate. The method then determines a search zone using Markov Chain Monte Carlo (MCMC) sampling, utilizing a partial evaluation strategy. The leak source is then accurately localized using a modified guaranteed-convergence particle swarm optimization algorithm with several poorly performing individuals, following selection of the most successful individual with dynamic updates. The first two stages are based on data collected by stationary sensors, and the last stage is based on data from movable robots with sensors. The measurement error adaptability and the effect of the leak source location were analyzed. The test results showed that this three-stage localization process can localize a leak source within 1.0 m of the source for different leak source locations, with a measurement-error standard deviation smaller than 2.0.
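The MCMC search-zone stage described above can be sketched with a toy forward model. Everything below (the sensor layout, the inverse-square dispersion stand-in, the noise level) is invented for the example; it is not the authors' atmospheric dispersion model or algorithm:

```python
import math
import random

# Illustrative Metropolis-Hastings sketch: infer a 2-D leak position from
# sensor concentrations under an invented inverse-square "dispersion" model.

random.seed(0)

SENSORS = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0), (10.0, 10.0), (5.0, 8.0)]
TRUE_SOURCE = (3.0, 4.0)
STRENGTH = 1.0          # assumed known emission rate
SIGMA = 0.002           # assumed measurement-noise standard deviation

def forward(src):
    """Toy dispersion model: concentration falls off as 1 / (d^2 + 1)."""
    return [STRENGTH / ((src[0] - sx) ** 2 + (src[1] - sy) ** 2 + 1.0)
            for sx, sy in SENSORS]

MEASURED = forward(TRUE_SOURCE)   # noise-free synthetic observations

def log_likelihood(src):
    return -sum((m, p) and (m - p) ** 2 for m, p in zip(MEASURED, forward(src))) / (2 * SIGMA ** 2)

def mcmc_localize(n_iter=6000, burn_in=2000, step=0.3):
    """Random-walk Metropolis-Hastings; returns the posterior-mean position."""
    x = (5.0, 5.0)                      # start at the domain centre
    ll = log_likelihood(x)
    kept = []
    for i in range(n_iter):
        cand = (x[0] + random.gauss(0, step), x[1] + random.gauss(0, step))
        cand_ll = log_likelihood(cand)
        if cand_ll - ll > math.log(random.random()):   # accept/reject step
            x, ll = cand, cand_ll
        if i >= burn_in:
            kept.append(x)
    return (sum(p[0] for p in kept) / len(kept),
            sum(p[1] for p in kept) / len(kept))

est = mcmc_localize()   # posterior mean lands near the true source (3, 4)
```

The spread of the kept samples is what would delimit the "search zone" handed to the final refinement stage.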
NASA Astrophysics Data System (ADS)
Wolicka, Dorota; Borkowski, Andrzej
2014-03-01
Sulphidogenous microorganism communities were isolated from soil polluted by crude oil. The study was focused on determining the influence of 1) copper (II) concentration on the activity of selected microorganism communities and 2) the applied electron donor on the course and evolution of mineral-forming processes under conditions favouring growth of sulphate-reducing bacteria (SRB). The influence of copper concentration on the activity of selected microorganism communities and the type of mineral phases formed was determined during experiments in which copper (II) chloride at concentrations of 0.1, 0.2, 0.5 and 0.7 g/L was added to SRB cultures. The experiments were performed in two variants: with ethanol (4 g/L) or lactate (4 g/L) as the sole carbon source. In order to determine the taxonomic composition of the selected microorganism communities, the 16S rRNA method was used. Results of this analysis confirmed the presence of Desulfovibrio, Desulfohalobium, Desulfotalea, Thermotoga, Solibacter, Gramella, Anaeromyxobacter and Myxococcus sp. in the stationary cultures. The post-culture sediments contained covelline (CuS) and digenite (Cu9S5 ). Based on the results, it can be stated that the type of carbon source applied during incubation plays a crucial role in determining the mineral composition of the post-culture sediments. Thus, regardless of the amount of copper ion introduced to a culture with lactate as the sole carbon source, no copper sulphide was observed in the post-culture sediments. Cultures with ethanol as the sole carbon source, on the other hand, yielded covelline or digenite in all post-culture sediments.
1994-12-01
...INTRODUCTION In simplified terms, these acquisition concepts will be familiar: best value source selection, processes and metrics, continuous improvement, MIL-STD-1379D, the systems approach to training, concurrent... proposed processes and metrics are placed in the contract in a training... identification and correction of errors are critical to software product correctness and quality. Correcting
48 CFR 15.102 - Oral presentations.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Oral presentations. 15.102 Section 15.102 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION CONTRACTING METHODS AND CONTRACT TYPES CONTRACTING BY NEGOTIATION Source Selection Processes and Techniques 15.102 Oral...
48 CFR 1615.170 - Applicability.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 48 Federal Acquisition Regulations System 6 2011-10-01 2011-10-01 false Applicability. 1615.170 Section 1615.170 Federal Acquisition Regulations System OFFICE OF PERSONNEL MANAGEMENT FEDERAL EMPLOYEES... Source Selection Processes and Techniques. 1615.170 Applicability. FAR subpart 15.1 has no practical...
48 CFR 2115.170 - Applicability.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 48 Federal Acquisition Regulations System 6 2011-10-01 2011-10-01 false Applicability. 2115.170 Section 2115.170 Federal Acquisition Regulations System OFFICE OF PERSONNEL MANAGEMENT, FEDERAL EMPLOYEES... BY NEGOTIATION Source Selection Processes and Techniques 2115.170 Applicability. FAR subpart 15.1 has...
48 CFR 1615.170 - Applicability.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 48 Federal Acquisition Regulations System 6 2010-10-01 2010-10-01 true Applicability. 1615.170 Section 1615.170 Federal Acquisition Regulations System OFFICE OF PERSONNEL MANAGEMENT FEDERAL EMPLOYEES... Source Selection Processes and Techniques. 1615.170 Applicability. FAR subpart 15.1 has no practical...
48 CFR 1615.170 - Applicability.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 48 Federal Acquisition Regulations System 6 2014-10-01 2014-10-01 false Applicability. 1615.170 Section 1615.170 Federal Acquisition Regulations System OFFICE OF PERSONNEL MANAGEMENT FEDERAL EMPLOYEES... Source Selection Processes and Techniques 1615.170 Applicability. FAR subpart 15.1 has no practical...
48 CFR 1615.170 - Applicability.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 48 Federal Acquisition Regulations System 6 2012-10-01 2012-10-01 false Applicability. 1615.170 Section 1615.170 Federal Acquisition Regulations System OFFICE OF PERSONNEL MANAGEMENT FEDERAL EMPLOYEES... Source Selection Processes and Techniques 1615.170 Applicability. FAR subpart 15.1 has no practical...
48 CFR 2115.170 - Applicability.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 48 Federal Acquisition Regulations System 6 2012-10-01 2012-10-01 false Applicability. 2115.170 Section 2115.170 Federal Acquisition Regulations System OFFICE OF PERSONNEL MANAGEMENT, FEDERAL EMPLOYEES... BY NEGOTIATION Source Selection Processes and Techniques 2115.170 Applicability. FAR subpart 15.1 has...
48 CFR 2115.170 - Applicability.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 48 Federal Acquisition Regulations System 6 2014-10-01 2014-10-01 false Applicability. 2115.170 Section 2115.170 Federal Acquisition Regulations System OFFICE OF PERSONNEL MANAGEMENT, FEDERAL EMPLOYEES... BY NEGOTIATION Source Selection Processes and Techniques 2115.170 Applicability. FAR subpart 15.1 has...
Strobach, Tilo; Torsten, Schubert
2017-01-01
In dual-task situations, interference between two simultaneous tasks impairs performance. With practice, however, this impairment can be reduced. To identify mechanisms leading to a practice-related improvement in sensorimotor dual tasks, the present review applied the following general hypothesis: Sources that impair dual-task performance at the beginning of practice are associated with mechanisms for the reduction of dual-task impairment at the end of practice. The following types of processes provide sources for the occurrence of this impairment: (a) capacity-limited processes within the component tasks, such as response-selection or motor response stages, and (b) cognitive control processes independent of these tasks and thus operating outside of component-task performance. Dual-task practice studies show that, under very specific conditions, capacity-limited processes within the component tasks are automatized with practice, reducing the interference between two simultaneous tasks. Further, there is evidence that response-selection stages are shortened with practice. Thus, capacity limitations at these stages are sources for dual-task costs at the beginning of practice and are overcome with practice. However, there is no evidence demonstrating the existence of practice-related mechanisms associated with capacity-limited motor-response stages. Further, during practice, there is an acquisition of executive control skills for an improved allocation of limited attention resources to two tasks as well as some evidence supporting the assumption of improved task coordination. These latter mechanisms are associated with sources of dual-task interference operating outside of component task performance at the beginning of practice and also contribute to the reduction of dual-task interference at its end. PMID:28439319
Parameters in selective laser melting for processing metallic powders
NASA Astrophysics Data System (ADS)
Kurzynowski, Tomasz; Chlebus, Edward; Kuźnicka, Bogumiła; Reiner, Jacek
2012-03-01
The paper presents results of studies on Selective Laser Melting. SLM is an additive manufacturing technology which may be used to process almost all metallic materials in the form of powder. The energy sources, mainly fiber lasers and/or Nd:YAG lasers with similar characteristics and wavelengths of 1.06-1.08 microns, are suited primarily to processing metallic powder materials with high absorption of laser radiation. The paper presents results for selected variable parameters (laser power, scanning time, scanning strategy) and fixed parameters such as the protective atmosphere (argon, nitrogen, helium), temperature, and the type and shape of the powder material. The thematic scope is very broad, so the work was focused on optimizing the process of selective laser micrometallurgy for producing fully dense parts. The density is closely linked with two other conditions: discontinuity of the microstructure (microcracks) and stability (repeatability) of the process. Materials used for the research were stainless steel 316L (AISI), tool steel H13 (AISI), and titanium alloy Ti6Al7Nb (ISO 5832-11). Studies were performed with a scanning electron microscope, a light microscope, a confocal microscope and a μCT scanner.
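Although the abstract does not state it, parameter studies of this kind are often summarized with the volumetric energy density E = P / (v · h · t), a lumped quantity combining laser power P, scan speed v, hatch spacing h, and layer thickness t. A minimal sketch with invented parameter values:

```python
# Volumetric energy density for SLM (illustrative values, not from the paper).
# E [J/mm^3] = P / (v * h * t), with laser power P [W], scan speed v [mm/s],
# hatch spacing h [mm], and layer thickness t [mm].

def energy_density(power_w, speed_mm_s, hatch_mm, layer_mm):
    return power_w / (speed_mm_s * hatch_mm * layer_mm)

ed = energy_density(200.0, 800.0, 0.12, 0.03)   # ~69.4 J/mm^3
```

Holding E roughly constant while varying P and v is one common way to explore the fully-dense processing window.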
Patient's decision making in selecting a hospital for elective orthopaedic surgery.
Moser, Albine; Korstjens, Irene; van der Weijden, Trudy; Tange, Huibert
2010-12-01
The admission to a hospital for elective surgery, like arthroplasty, can be planned ahead. The elective nature of arthroplasty and the increasing stimulus of the public to critically select a hospital raise the issue of how patients actually make such decisions. The aim of this paper is to describe the decision-making process of selecting a hospital as experienced by people who underwent elective joint arthroplasty and to understand what factors influenced the decision-making process. Qualitative descriptive study with 18 participants who had a hip or knee replacement within the last 5 years. Data were gathered from eight individual interviews and four focus group interviews and analysed by content analysis. Three categories that influenced the selection of a hospital were revealed: information sources, criteria in decision making and decision-making styles within the GP-patient relationship. Various contextual aspects influenced the decision-making process. Most participants gave higher priority to the selection of a medical specialist than to the selection of a hospital. Selecting a hospital for arthroplasty is extremely complex. The decision-making process is a highly individualized process because patients have to consider and assimilate a diversity of aspects, which are relevant to their specific situation. Our findings support the model of shared decision making, which indicates that general practitioners should be attuned to the distinct needs of each patient at various moments during the decision making, taking into account personal, medical and contextual factors. © 2010 Blackwell Publishing Ltd.
Pretreatment Solution for Water Recovery Systems
NASA Technical Reports Server (NTRS)
Muirhead, Dean (Inventor)
2018-01-01
Chemical pretreatments are used to produce usable water by treating a water source with a chemical pretreatment that contains a hexavalent chromium compound and an acid to generate a treated water source, wherein the concentration of sulfate compounds in the acid is negligible, and wherein the treated water source remains substantially free of precipitates after the addition of the chemical pretreatment. Other methods include reducing the pH in urine to be distilled for potable water extraction by pretreating the urine before distillation with a pretreatment solution comprising one or more acid sources selected from a group consisting of phosphoric acid, hydrochloric acid, and nitric acid, wherein the urine remains substantially precipitate free after the addition of the pretreatment solution. Another method described comprises a process for reducing precipitation in urine to be processed for water extraction by mixing the urine with a pretreatment solution comprising a hexavalent chromium compound and phosphoric acid.
Residency selection process: description and annotated bibliography.
Aaron, P R; Frye, T L
1979-01-01
Specialty and residency training choices of medical students will affect the quality, mode, and geographic location of their future practice; the importance of such choices should not be underestimated. Medical school librarians have largely ignored the opportunity to interact with both medical students and medical school officials in providing sources needed to assist these career decisions, and for the most part students and administrators have ignored the opportunity to utilize the medical library in this process. This article presents an overview of the processes and procedures in which third- and fourth-year medical students are involved in selecting specialty and residency training, and provides a detailed description of the resources which the medical student should consult in order to make thoughtful, informed career decisions. The article urges medical school advisers and medical librarians to work as partners in providing information on specialty and residency selection to medical students. PMID:385087
Selection and application of microbial source tracking tools for water-quality investigations
Stoeckel, Donald M.
2005-01-01
Microbial source tracking (MST) is a complex process that includes many decision-making steps. Once a contamination problem has been defined, the potential user of MST tools must thoroughly consider study objectives before deciding upon a source identifier, a detection method, and an analytical approach to apply to the problem. Regardless of which MST protocol is chosen, underlying assumptions can affect the results and interpretation. It is crucial to incorporate tests of those assumptions in the study quality-control plan to help validate results and facilitate interpretation. Detailed descriptions of MST objectives, protocols, and assumptions are provided in this report to assist in selection and application of MST tools for water-quality investigations. Several case studies illustrate real-world applications of MST protocols over a range of settings, spatial scales, and types of contamination. Technical details of many available source identifiers and detection methods are included as appendixes. By use of this information, researchers should be able to formulate realistic expectations for the information that MST tools can provide and, where possible, successfully execute investigations to characterize sources of fecal contamination to resource waters.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Palmer, M.E.
1997-12-05
This V and V Report includes analysis of two revisions of the DMS [data management system] System Requirements Specification (SRS) and the Preliminary System Design Document (PSDD); the source code for the DMS Communication Module (DMSCOM) messages; the source code for selected DMS screens; and the code for the BWAS Simulator. BDM Federal analysts used a series of matrices to: compare the requirements in the SRS to the specifications found in the System Design Document (SDD), to ensure the design supports the business functions; compare the discrete parts of the SDD with each other, to ensure that the design is consistent and cohesive; compare the source code of the DMS Communication Module with the specifications, to ensure that the resultant messages will support the design; compare the source code of selected screens to the specifications, to ensure that the resultant system screens will support the design; and compare the source code of the BWAS Simulator with the requirements to interface with DMS messages and data transfers relating to the BWAS operations.
Emission of Polychlorinated Naphthalenes during Thermal Related Processes
NASA Astrophysics Data System (ADS)
Liu, Guorui; Zheng, Minghui; Du, Bing; Liu, Wenbin; Zhang, Bing; Xiao, Ke
2010-05-01
Due to the structural similarity of polychlorinated naphthalenes (PCNs) to dioxins, PCNs exhibit toxicological properties similar to dioxins (Olivero-Verbel et al., 2004). Based on their high toxicity, persistence, bioaccumulation, and long-distance transmission, PCNs were also selected as a candidate POP for the UN-ECE (United Nations Economic Commission for Europe) POP protocol (Lerche et al., 2002). In addition, some studies suggested that PCNs contributed a greater proportion of the dioxin-like activity in some locations than polychlorinated biphenyls (PCBs) and polychlorinated dibenzo-p-dioxins/dibenzofurans (PCDD/Fs) contributed (Kannan et al., 1998). However, data on the identification and quantitation of PCN sources are very scarce compared with PCDD/Fs. Understanding the emission levels and developing the emission inventory of PCNs is important for regulatory and source reduction purposes. In this study, several potential sources were preliminarily investigated for PCN release. The coking process (CP), iron ore sintering (IOS), and electric arc furnace steelmaking units (AF) were selected due to their large scale of industrial production in China. Municipal solid waste incineration (MSWI) and medical waste incineration (MWI) were also investigated because of the possible high concentration of PCNs in stack gas. Two plants were investigated for each thermal-related process, except for MWI, for which one incinerator was investigated. The stack gas samples were collected by an automatic isokinetic sampling system (Isostack Basic, TCR TECORA, Milan, Italy). An isotope-dilution high-resolution gas chromatography coupled with high-resolution mass spectrometry (HRGC/HRMS) technique was used for the identification and quantitation of PCN congeners. The concentrations of PCNs from the selected thermal processes were determined in this study.
The average concentrations of total PCNs were 26 ng Nm-3 for CP, 65 ng Nm-3 for IOS, 720 ng Nm-3 for AF, 443 ng Nm-3 for MSWI, and 45 ng Nm-3 for MWI. The emission factors of PCNs were derived, and the average values were 6 μg tonne-1 for CP, 42 μg tonne-1 for IOS, 2980 μg tonne-1 for AF, 1354 μg tonne-1 for MSWI, and 937 μg tonne-1 for MWI, which could be helpful for estimating the annual emission amounts of PCNs from the investigated sources. However, since the number of plants investigated for each process is limited, there might be large uncertainties when developing the PCN emission inventory. From the data obtained in this preliminary investigation, it can be seen that the investigated sources warrant further attention with respect to PCN emission. Further investigation on PCN release from thermal-related processes is still in progress. References: Kannan, K., Imagawa, T., Blankenship, A.L., et al., 1998. Isomer-specific analysis and toxic evaluation of polychlorinated naphthalenes in soil, sediment, and biota collected near the site of a former chlor-alkali plant. Environ Sci Technol 32, 2507-2514. Lerche, D., van de Plassche, E., Schwegler, A., et al., 2002. Selecting chemical substances for the UN-ECE POP protocol. Chemosphere 47, 617-630. Olivero-Verbel, J., Vivas-Reyes, R., Pacheco-Londono, et al., 2004. Discriminant analysis for activation of the aryl hydrocarbon receptor by polychlorinated naphthalenes. J Mol Struc-Theochem 678, 157-161.
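The link between the reported stack-gas concentrations and the derived emission factors is essentially concentration times flue-gas volume generated per tonne of product. The flue-gas volume below is invented for illustration, since the abstract reports only the resulting values:

```python
# Hedged sketch of the emission-factor arithmetic implied by the abstract.
# EF [ug/tonne] = concentration [ng/Nm^3] * flue-gas volume [Nm^3/tonne] / 1000.

def emission_factor(conc_ng_per_nm3, flue_gas_nm3_per_tonne):
    """Return the emission factor in micrograms of PCNs per tonne of product."""
    return conc_ng_per_nm3 * flue_gas_nm3_per_tonne / 1000.0  # ng -> ug

# Example: 65 ng/Nm^3 with an assumed 1000 Nm^3 of flue gas per tonne
ef = emission_factor(65.0, 1000.0)   # 65.0 ug/tonne
```

Multiplying an emission factor by annual production then yields the annual emission estimate the abstract mentions.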
Method for localizing and isolating an errant process step
Tobin, Jr., Kenneth W.; Karnowski, Thomas P.; Ferrell, Regina K.
2003-01-01
A method for localizing and isolating an errant process includes the steps of retrieving from a defect image database a selection of images, each having image content similar to the image content extracted from a query image depicting a defect, and each having corresponding defect characterization data. A conditional probability distribution of the defect having occurred in a particular process step is derived from the defect characterization data. The process step that is the most probable source of the defect according to the derived conditional probability distribution is then identified. A method for process step defect identification includes the steps of characterizing anomalies in a product, the anomalies detected by an imaging system. A query image of a product defect is then acquired. A particular characterized anomaly is then correlated with the query image. An errant process step is then associated with the correlated image.
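The core of the patented idea, estimating P(process step | defect) from the characterization data of the retrieved similar images and reporting the most probable step, can be sketched as follows. The step names and retrieval results are invented for illustration:

```python
from collections import Counter

# Minimal sketch: given the process-step labels attached to defect images
# retrieved as similar to the query image, form the empirical conditional
# distribution over steps and take its argmax as the errant step.

def errant_step(retrieved_steps):
    """Return (most probable step, empirical distribution over steps)."""
    counts = Counter(retrieved_steps)
    total = sum(counts.values())
    dist = {step: n / total for step, n in counts.items()}
    best = max(dist, key=dist.get)
    return best, dist

# Hypothetical process-step labels of the six most similar database images
steps = ["etch", "etch", "litho", "etch", "deposit", "litho"]
best, dist = errant_step(steps)   # best == "etch", P(etch) == 0.5
```

A production system would weight each retrieved image by its similarity to the query rather than counting all matches equally.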
Discovering highly obscured AGN with the Swift-BAT 100-month survey
NASA Astrophysics Data System (ADS)
Marchesi, Stefano; Ajello, Marco; Comastri, Andrea; Cusumano, Giancarlo; La Parola, Valentina; Segreto, Alberto
2017-01-01
In this talk, I present a new technique to find highly obscured AGN using the 100-month Swift-BAT survey. I will show the results of the combined Chandra and BAT spectral analysis in the 0.3-150 keV band of seven Seyfert 2 galaxies selected from the 100-month BAT catalog. We selected nearby (z<0.03) sources lacking a ROSAT counterpart and never previously observed in the 0.3-10 keV energy range. All the objects are significantly obscured, having NH>1E23 cm-2 at a >99% confidence level, and one to three sources are candidate Compton-thick Active Galactic Nuclei (CT-AGN), i.e., have NH>1E24 cm-2. Since the selection criteria we adopted have been extremely effective in detecting highly obscured AGN, further observations of these and other Seyfert 2 galaxies selected from the BAT 100-month catalog will allow us to create a statistically significant sample of highly obscured AGN, thereby improving our understanding of the physics of the obscuration processes.
48 CFR 2015.300 - Scope of subpart.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 48 Federal Acquisition Regulations System 6 2010-10-01 2010-10-01 true Scope of subpart. 2015.300 Section 2015.300 Federal Acquisition Regulations System NUCLEAR REGULATORY COMMISSION CONTRACTING METHODS AND CONTRACT TYPES CONTRACTING BY NEGOTIATION Source Selection Processes and Techniques 2015.300 Scope...
48 CFR 15.100 - Scope of subpart.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Scope of subpart. 15.100 Section 15.100 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION CONTRACTING METHODS AND CONTRACT TYPES CONTRACTING BY NEGOTIATION Source Selection Processes and Techniques 15.100 Scope...
48 CFR 15.101 - Best value continuum.
Code of Federal Regulations, 2011 CFR
2011-10-01
....101 Section 15.101 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION CONTRACTING METHODS AND CONTRACT TYPES CONTRACTING BY NEGOTIATION Source Selection Processes and Techniques 15.101... cost or price may vary. For example, in acquisitions where the requirement is clearly definable and the...
7 CFR 1780.36 - Approving official review.
Code of Federal Regulations, 2010 CFR
2010-01-01
... funding sources, the approval official, after consultation with applicant, may submit a request for an... be used to determine the applications selected for further development and funding. After completing... priority scores for further processing. When authorizing the development of an application for funding, the...
A Selected Bibliography on Microbiological Laboratory Design.
ERIC Educational Resources Information Center
Laboratory Design Notes, 1967
1967-01-01
Reference sources on microbiological laboratory design are cited. Subjects covered include--(1) policies and general requirements, (2) ventilated cabinets, (3) animal isolation equipment, (4) air handling, ventilation, and filtration, (5) germicidal ultraviolet irradiation, (6) aerosol test facilities, (7) process production of microorganisms, and…
Riva-Murray, Karen; Burns, Douglas A.
2016-01-01
The U.S. Geological Survey has compiled a list of existing data sets, from selected sources, containing mercury (Hg) concentration data in fish and macroinvertebrate samples that were collected from flowing waters of New York State from 1970 through 2014. Data sets selected for inclusion in this report were limited to those that contain fish and (or) macroinvertebrate data that were collected across broad areas, cover relatively long time periods, and (or) were collected as part of a broader-scale (e.g. national) study or program. In addition, all data sets listed were collected, processed, and analyzed with documented methods, and contain critical sample information (e.g. fish species, fish size, Hg species) that is needed to analyze and interpret the reported Hg concentration data. Fourteen data sets, all from state or federal agencies, are listed in this report, along with selected descriptive information regarding each data source and data set contents. Together, these 14 data sets contain Hg and related data for more than 7,000 biological samples collected from more than 700 unique stream and river locations between 1970 and 2014.
2009-02-15
Magnon-scattered light generally experiences a 90° rotation in polarization from the incident beam. The wave-vector selective BLS measurements...filters, phase locked microwave pulse sources, microwave and millimeter wave devices such as isolators, circulators, phase shifters, secure signal..."Wave vector selective Brillouin light scattering measurements and analysis," C. L. Ordoñez-Romero, B. A. Kalinikos, P. Krivosik, Wei Tong, P
Fabrication of selective-area growth InGaN LED by mixed-source hydride vapor-phase epitaxy
NASA Astrophysics Data System (ADS)
Bae, Sung Geun; Jeon, Injun; Jeon, Hunsoo; Kim, Kyoung Hwa; Yang, Min; Yi, Sam Nyung; Lee, Jae Hak; Ahn, Hyung Soo; Yu, Young Moon; Sawaki, Nobuhiko; Kim, Suck-Whan
2018-01-01
We prepared InGaN light-emitting diodes (LEDs) with the active layers grown from a mixed source of Ga-In-N materials on an n-type GaN substrate by a selective-area growth method and three fabrication steps: photolithography, epitaxial layer growth, and metallization. The preparation followed a previously developed experimental process using a mixed-source hydride vapor-phase epitaxy (HVPE) apparatus, which consisted of a multi-graphite boat filled on the inside with the mixed source, insulating against the high temperature and controlling the growth rate of the epilayers, and a radio-frequency (RF) heating coil for heating to a high temperature (T > 900 °C) and for easy control of temperature outside the source zone. Two types of LEDs were prepared, with In compositions of 11.0 and 6.0% in the InGaN active layer, and room-temperature electroluminescence measurements exhibited a main peak corresponding to the In composition at either 420 or 390 nm. The consecutive growth of InGaN LEDs by the mixed-source HVPE method provides a technique for the production of LEDs with a wide range of In compositions in the active layer.
3-D interactive visualisation tools for Hi spectral line imaging
NASA Astrophysics Data System (ADS)
van der Hulst, J. M.; Punzo, D.; Roerdink, J. B. T. M.
2017-06-01
Upcoming HI surveys will deliver such large datasets that automated processing using the full 3-D information to find and characterize HI objects is unavoidable. Full 3-D visualization is an essential tool for enabling qualitative and quantitative inspection and analysis of the 3-D data, which is often complex in nature. Here we present SlicerAstro, an open-source extension of 3DSlicer, a multi-platform open source software package for visualization and medical image processing, which we developed for the inspection and analysis of HI spectral line data. We describe its initial capabilities, including 3-D filtering, 3-D selection and comparative modelling.
[Source-monitoring deficits in schizophrenia: review and pharmacotherapeutic implications].
Juhász, Levente Zsolt; Bartkó, György
2007-03-01
The disturbance of source-monitoring is one of the various impairments in cognitive functioning observed in schizophrenic patients. The process of source-monitoring allows individuals to distinguish self-generated thoughts and behaviours from those generated by others. The aim of the present study is to review the general psychological definition of source memory and source-monitoring and its neurological basis, as well as the models proposed to explain source-monitoring deficits. The relationship between source-monitoring deficits and psychopathological symptoms, as well as the effect of antipsychotic treatment on source-monitoring disturbances, is introduced. There is evidence suggesting that a selective source-monitoring deficit is involved in the occurrence of auditory hallucinations. The disturbance of prospective memory may unfavorably influence compliance. Administration of antipsychotics in general can improve source-monitoring deficits. The neuropsychiatric perspective provides a more accurate and comprehensive understanding of schizophrenia.
Personalizing Protein Nourishment
DALLAS, DAVID C.; SANCTUARY, MEGAN R.; QU, YUNYAO; KHAJAVI, SHABNAM HAGHIGHAT; VAN ZANDT, ALEXANDRIA E.; DYANDRA, MELISSA; FRESE, STEVEN A.; BARILE, DANIELA; GERMAN, J. BRUCE
2016-01-01
Proteins are not equally digestible—their proteolytic susceptibility varies by their source and processing method. Incomplete digestion increases colonic microbial protein fermentation (putrefaction), which produces toxic metabolites that can induce inflammation in vitro and have been associated with inflammation in vivo. Individual humans differ in protein digestive capacity based on phenotypes, particularly disease states. To avoid putrefaction-induced intestinal inflammation, protein sources and processing methods must be tailored to the consumer’s digestive capacity. This review explores how food processing techniques alter protein digestibility and examines how physiological conditions alter digestive capacity. Possible solutions to improving digestive function or matching low digestive capacity with more digestible protein sources are explored. Beyond the ileal digestibility measurements of protein digestibility, less invasive, quicker and cheaper techniques for monitoring the extent of protein digestion and fermentation are needed to personalize protein nourishment. Biomarkers of protein digestive capacity and efficiency can be identified with the toolsets of peptidomics, metabolomics, microbial sequencing and multiplexed protein analysis of fecal and urine samples. By monitoring individual protein digestive function, the protein component of diets can be tailored via protein source and processing selection to match individual needs to minimize colonic putrefaction and, thus, optimize gut health. PMID:26713355
A pilot study to characterize fine particles in the environment of an automotive machining facility.
Sioutas, C
1999-04-01
The main goal of this study was to characterize fine particles (i.e., smaller than about 3 microns) in an automotive machining environment. The Toledo Machining Plant of Chrysler Corporation was selected for this purpose. The effect of local mechanical processes as aerosol sources was a major part of this investigation. To determine the size-dependent mass concentration of particles in the plant, the Micro-Orifice Uniform Deposit Impactor (MOUDI Model 100, MSP Corp., Minneapolis, Minnesota) was used. The MOUDI was placed at central locations in departments with sources inside the plant, so that the information obtained on the size distribution realistically represents the aerosol to which plant workers are exposed. Sampling was conducted over a 4-day period, during three periods per day, each matching a work shift. A special effort was made to place the MOUDI at a central location of a department with relatively homogeneous particle sources. The selected sampling sites included welding, grinding, steel machining, and heat treating processes. The average 24-hour mass concentrations of particles smaller than 3.2 microns in aerodynamic diameter were 167.8, 103.9, 201.7, and 112.7 micrograms/m3 for the welding, grinding, mild steel, and heat treating processes, respectively. Finally, the mass median diameters for the welding, heat treatment, machining, and grinding operations were approximately 0.5, 0.5, 0.6, and 0.8 micron, respectively.
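Mass median diameters like those reported above are typically obtained by interpolating the cumulative mass distribution measured across the impactor stages to the 50% point. A minimal sketch of that calculation, using hypothetical stage cut diameters and collected masses (the function name and data are assumptions, not values from the study):

```python
import math

def mass_median_diameter(cut_diameters_um, stage_masses_ug):
    """Interpolate the mass median diameter (MMD) from cascade-impactor
    stage data: log-linear interpolation of the cumulative mass fraction
    at the 50% point."""
    total = sum(stage_masses_ug)
    cumulative = 0.0
    prev_frac, prev_d = 0.0, None
    # stages ordered from smallest to largest cut diameter
    for d, m in sorted(zip(cut_diameters_um, stage_masses_ug)):
        frac = (cumulative + m) / total
        if frac >= 0.5 and prev_d is not None:
            # log-linear interpolation between the two bracketing stages
            t = (0.5 - prev_frac) / (frac - prev_frac)
            return math.exp(math.log(prev_d) + t * (math.log(d) - math.log(prev_d)))
        cumulative += m
        prev_frac, prev_d = frac, d
    return prev_d

# hypothetical stage data (cut diameter in microns, collected mass in micrograms)
print(round(mass_median_diameter([0.18, 0.32, 0.56, 1.0, 1.8, 3.2],
                                 [10, 25, 40, 20, 10, 5]), 2))  # → 0.42
```

With the 50% point bracketed by the 0.32 and 0.56 micron stages, the log-linear interpolation here reduces to the geometric mean of the two cut diameters.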
Bovo, Barbara; Carlot, Milena; Fontana, Federico; Lombardi, Angiolella; Soligo, Stefano; Giacomini, Alessio; Corich, Viviana
2015-04-01
Nowadays grape marc represents one of the main by-products of winemaking. Many southern European countries valorize this lignocellulosic waste through fermentation and distillation for industrial alcoholic beverage production. The storage of marcs is a crucial phase in the distillation process, due to the physicochemical transformations ascribed to microbial activity. Among the methods adopted by distillers to improve the quality of spirits, the use of selected yeasts has not been explored so far; therefore, in this work we evaluated selection criteria for Saccharomyces cerevisiae strains for grape marc fermentation. The proposed selection procedure included three steps: characterization of phenotypic traits, evaluation of selected strains on pasteurised grape marc at lab scale (100 g), and pilot-scale fermentation (350 kg). This selection process was applied to 104 strains isolated from grape marcs of different origins and technological treatments. Among physiological traits, the β-glucosidase activity level, as a quality trait, seems to be only partially involved in increasing varietal flavour. More effective in describing yeast impact on distillate quality is the higher alcohols/esters ratio, which indicates a strain's ability to increase positive flavours. Finally, in evaluating grape marc as a source of selected yeasts, industrial treatment rather than varietal origin seems to shape strain technological and quality traits. Copyright © 2014 Elsevier Ltd. All rights reserved.
Turinawe, Emmanueil B
2016-01-01
With the renewed call for community participation in health interventions after the Alma Ata Declaration, interest has been raised in volunteer community health workers (CHWs) acting as representatives of local communities. The present study interrogates the dynamic interface between local communities and the government in the selection of CHW volunteers in a rural community. Data were collected through participant observation of community events, 35 in-depth interviews, 20 focus groups and 15 informal conversations. A review of documents about Luwero district was also an important source of data. Ambiguous national guidelines and poor supervision of the selection process enabled the powerful community leaders to influence the selection of village health teams (VHTs). Intended to achieve community involvement, the selection process produced a disconnect in the local community where many members saw the selected VHTs as having been 'taken away'. Community involvement in the selection of VHTs took a form that, instead of empowering the local community, reinforced the responsibility of those in power and thus maintained the asymmetrical status quo.
Open source software in a practical approach for post processing of radiologic images.
Valeri, Gianluca; Mazza, Francesco Antonino; Maggi, Stefania; Aramini, Daniele; La Riccia, Luigi; Mazzoni, Giovanni; Giovagnoni, Andrea
2015-03-01
The purpose of this paper is to evaluate the use of open source software (OSS) to process DICOM images. We selected 23 programs for Windows and 20 programs for Mac from 150 possible OSS programs, including DICOM viewers and various tools (converters, DICOM header editors, etc.). The programs selected all meet basic requirements such as free availability, stand-alone operation, a graphical user interface, ease of installation, and advanced features beyond simple image display. The data import, data export, metadata, 2D viewer, 3D viewer, supported platform and usability capabilities of each selected program were evaluated on a scale ranging from 1 to 10 points. Twelve programs received a score of eight or higher. Among them, five obtained a score of 9: 3D Slicer, MedINRIA, MITK 3M3, VolView and VR Render; OsiriX received 10. OsiriX appears to be the only program able to perform all the operations taken into consideration, similar to a workstation equipped with proprietary software, allowing the analysis and interpretation of images in a simple and intuitive way. OsiriX is a DICOM PACS workstation for medical imaging and software for image processing for medical research, functional imaging, 3D imaging, confocal microscopy and molecular imaging. This application is also a good tool for teaching activities because it facilitates the attainment of learning objectives among students and other specialists.
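The per-criterion 1-10 scoring described in the abstract can be aggregated into an overall program ranking. A minimal sketch of such an aggregation; the individual criterion scores below are illustrative assumptions, not the study's actual score sheet:

```python
# hypothetical per-criterion scores (1-10) for a few of the reviewed programs
scores = {
    "OsiriX":    {"import": 10, "export": 10, "metadata": 10, "2D": 10,
                  "3D": 10, "platform": 10, "usability": 10},
    "3D Slicer": {"import": 9, "export": 9, "metadata": 9, "2D": 9,
                  "3D": 10, "platform": 8, "usability": 9},
    "VolView":   {"import": 9, "export": 8, "metadata": 9, "2D": 9,
                  "3D": 10, "platform": 9, "usability": 9},
}

def overall(program_scores):
    # aggregate as the rounded mean of the criterion scores
    vals = list(program_scores.values())
    return round(sum(vals) / len(vals))

ranking = sorted(scores, key=lambda p: overall(scores[p]), reverse=True)
print([(p, overall(scores[p])) for p in ranking])
```

How the review actually combined the seven criteria into one score is not specified in the abstract; the rounded mean here is just one plausible scheme.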
Technologies for Upgrading Light Water Reactor Outlet Temperature
DOE Office of Scientific and Technical Information (OSTI.GOV)
Daniel S. Wendt; Piyush Sabharwall; Vivek Utgikar
Nuclear energy could potentially be utilized in hybrid energy systems to produce synthetic fuels and feedstocks from indigenous carbon sources such as coal and biomass. First generation nuclear hybrid energy system (NHES) technology will most likely be based on conventional light water reactors (LWRs). However, LWRs provide thermal energy at temperatures of approximately 300°C, while the desired temperatures for many chemical processes are much higher. To realize the benefits of nuclear hybrid energy systems with the current LWR fleet, selection and development of a complementary temperature upgrading technology is necessary. This paper provides an initial assessment of technologies that may be well suited to LWR outlet temperature upgrading for powering elevated-temperature industrial and chemical processes during periods of off-peak power demand. Chemical heat transformers (CHTs) are a technology with the potential to meet LWR temperature upgrading requirements for NHESs. CHTs utilize chemical heat of reaction to change the temperature at which selected heat sources supply or consume thermal energy. CHTs could directly utilize LWR heat output without intermediate mechanical or electrical power conversion operations and the associated thermodynamic losses. CHT thermal characteristics are determined by selection of the chemical working pair and operating conditions. This paper discusses the chemical working pairs applicable to LWR outlet temperature upgrading and the CHT operating conditions required for providing process heat in NHES applications.
Method for materials deposition by ablation transfer processing
Weiner, Kurt H.
1996-01-01
A method in which a thin layer of semiconducting, insulating, or metallic material is transferred by ablation from a source substrate, coated uniformly with a thin layer of said material, to a target substrate, where said material is desired, with a pulsed, high-intensity, patternable beam of energy. The use of a patternable beam allows area-selective ablation from the source substrate, resulting in additive deposition of the material onto the target substrate, which may require a very low percentage of the area to be covered. Since material is placed only where it is required, material waste can be minimized by reusing the source substrate for depositions on multiple target substrates. Due to the use of a pulsed, high-intensity energy source, the target substrate remains at low temperature during the process, so low-temperature, low-cost transparent glass or plastic can be used as the target substrate. The method can be carried out at atmospheric pressure and room temperature, thus eliminating the vacuum systems normally required in materials deposition processes. This invention has particular application in the flat panel display industry, where it minimizes materials waste and the associated costs.
Fabrication of polycrystalline thin films by pulsed laser processing
Mitlitsky, Fred; Truher, Joel B.; Kaschmitter, James L.; Colella, Nicholas J.
1998-02-03
A method for fabricating polycrystalline thin films on low-temperature (or high-temperature) substrates which uses processing temperatures that are low enough to avoid damage to the substrate, and then transiently heating select layers of the thin films with at least one pulse of a laser or other homogenized beam source. The pulse length is selected so that the layers of interest are transiently heated to a temperature which allows recrystallization and/or dopant activation while maintaining the substrate at a temperature which is sufficiently low to avoid damage to the substrate. This method is particularly applicable in the fabrication of solar cells.
Fabrication of polycrystalline thin films by pulsed laser processing
Mitlitsky, F.; Truher, J.B.; Kaschmitter, J.L.; Colella, N.J.
1998-02-03
A method is disclosed for fabricating polycrystalline thin films on low-temperature (or high-temperature) substrates which uses processing temperatures that are low enough to avoid damage to the substrate, and then transiently heating select layers of the thin films with at least one pulse of a laser or other homogenized beam source. The pulse length is selected so that the layers of interest are transiently heated to a temperature which allows recrystallization and/or dopant activation while maintaining the substrate at a temperature which is sufficiently low to avoid damage to the substrate. This method is particularly applicable in the fabrication of solar cells. 1 fig.
USDA-ARS?s Scientific Manuscript database
Lignin depolymerization to aromatic monomers with high yields and selectivity is essential for the economic feasibility of many lignin-valorization strategies within integrated biorefining processes. Importantly, the quality and properties of the lignin source play an essential role in impacting the...
Progress in cryopreservation of dormant winter buds of selected tree species
USDA-ARS?s Scientific Manuscript database
In cryopreservation of germplasm, using dormant winter buds (DB) as source plant materials is economically favorable over tissue culture options (TC). Processing DB does not require aseptic conditions and involved cryopreservation procedures. Although, the DB cryopreservation method has been known f...
Commodity-based Approach for Evaluating the Value of Freight Moving on Texas’ Roadway Network
DOT National Transportation Integrated Search
2017-12-10
The researchers took a commodity-based approach to evaluate the value of a list of selected commodities moved on the Texas freight network. This approach takes advantage of commodity-specific data sources and modeling processes. It provides a unique ...
Treatment of power utilities exhaust
Koermer, Gerald [Basking Ridge, NJ
2012-05-15
Provided is a process for treating nitrogen oxide-containing exhaust produced by a stationary combustion source by the catalytic reduction of nitrogen oxide in the presence of a reductant comprising hydrogen, followed by ammonia selective catalytic reduction to further reduce the nitrogen oxide level in the exhaust.
Markusova, Valentina
2012-01-01
The aim of the paper is to give an overview of the leading information-processing institution in Russia and Eastern Europe, namely the All-Russian Institute for Scientific and Technical Information (VINITI) of the Russian Academy of Sciences. The structure of Russian science is different from that in Western Europe and the US. The main aim of VINITI is to collect, process and disseminate scientific information in various fields of science and technology, published in 70 countries in 40 languages and selected from books, journals, conference proceedings, and patents. Special attention is given to journal selection and to deposited manuscripts (a kind of grey literature), an important source for Russian research. VINITI has created the largest database containing about 30 million records dating back to 1980. About 80,000-100,000 new records are added monthly. VINITI publishes the Abstract Journal (AJ) on 19 fields of science, including medicine, covering about a million publications annually. Two thirds of these records are from foreign sources and 36.7% from Russian sources. PMID:23322964
Markusova, Valentina
2012-06-01
The aim of the paper is to give an overview of the leading information-processing institution in Russia and Eastern Europe, namely the All-Russian Institute for Scientific and Technical Information (VINITI) of the Russian Academy of Sciences. The structure of Russian science is different from that in Western Europe and the US. The main aim of VINITI is to collect, process and disseminate scientific information in various fields of science and technology, published in 70 countries in 40 languages and selected from books, journals, conference proceedings, and patents. Special attention is given to journal selection and to deposited manuscripts (a kind of grey literature), an important source for Russian research. VINITI has created the largest database containing about 30 million records dating back to 1980. About 80,000-100,000 new records are added monthly. VINITI publishes the Abstract Journal (AJ) on 19 fields of science, including medicine, covering about a million publications annually. Two thirds of these records are from foreign sources and 36.7% from Russian sources.
Innovative Alternative Technologies to Extract Carotenoids from Microalgae and Seaweeds
Poojary, Mahesha M.; Barba, Francisco J.; Aliakbarian, Bahar; Donsì, Francesco; Pataro, Gianpiero; Dias, Daniel A.; Juliano, Pablo
2016-01-01
Marine microalgae and seaweeds (macroalgae) represent a sustainable source of various bioactive natural carotenoids, including β-carotene, lutein, astaxanthin, zeaxanthin, violaxanthin and fucoxanthin. Recently, the large-scale production of carotenoids from algal sources has gained significant interest for commercial and industrial applications in health, nutrition, and cosmetics. Although conventional processing technologies, based on solvent extraction, offer a simple approach to isolating carotenoids, they suffer several inherent limitations, including low efficiency (extraction yield), low selectivity (purity), high solvent consumption, and long treatment times, which have led to advancements in the search for innovative extraction technologies. This comprehensive review summarizes the recent trends in the extraction of carotenoids from microalgae and seaweeds through the assistance of different innovative techniques, such as pulsed electric fields, liquid pressurization, supercritical fluids, subcritical fluids, microwaves, ultrasounds, and high-pressure homogenization. In particular, the review critically analyzes the technologies, characteristics, advantages, and shortcomings of the different innovative processes, highlighting the differences in terms of yield, selectivity, and economic and environmental sustainability. PMID:27879659
Innovative Alternative Technologies to Extract Carotenoids from Microalgae and Seaweeds.
Poojary, Mahesha M; Barba, Francisco J; Aliakbarian, Bahar; Donsì, Francesco; Pataro, Gianpiero; Dias, Daniel A; Juliano, Pablo
2016-11-22
Marine microalgae and seaweeds (macroalgae) represent a sustainable source of various bioactive natural carotenoids, including β-carotene, lutein, astaxanthin, zeaxanthin, violaxanthin and fucoxanthin. Recently, the large-scale production of carotenoids from algal sources has gained significant interest for commercial and industrial applications in health, nutrition, and cosmetics. Although conventional processing technologies, based on solvent extraction, offer a simple approach to isolating carotenoids, they suffer several inherent limitations, including low efficiency (extraction yield), low selectivity (purity), high solvent consumption, and long treatment times, which have led to advancements in the search for innovative extraction technologies. This comprehensive review summarizes the recent trends in the extraction of carotenoids from microalgae and seaweeds through the assistance of different innovative techniques, such as pulsed electric fields, liquid pressurization, supercritical fluids, subcritical fluids, microwaves, ultrasounds, and high-pressure homogenization. In particular, the review critically analyzes the technologies, characteristics, advantages, and shortcomings of the different innovative processes, highlighting the differences in terms of yield, selectivity, and economic and environmental sustainability.
Uncapher, Melina R.; Rugg, Michael D.
2009-01-01
Not all of what is experienced is remembered later. Behavioral evidence suggests that the manner in which an event is processed influences which aspects of the event will later be remembered. The present experiment investigated the neural correlates of ‘selective encoding’, or the mechanisms that support the encoding of some elements of an event in preference to others. Event-related functional magnetic resonance imaging (fMRI) data were acquired while volunteers selectively attended to one of two different contextual features of study items (color or location). A surprise memory test for the items and both contextual features was subsequently administered to determine the influence of selective attention on the neural correlates of contextual encoding. Activity in several cortical regions indexed later memory success selectively for color or location information, and this encoding-related activity was enhanced by selective attention to the relevant feature. Critically, a region in the hippocampus responded selectively to attended source information (whether color or location), demonstrating encoding-related activity for attended but not for nonattended source features. Together, the findings suggest that selective attention modulates the magnitude of activity in cortical regions engaged by different aspects of an event, and hippocampal encoding mechanisms seem to be sensitive to this modulation. Thus, the information that is encoded into a memory representation is biased by selective attention, and this bias is mediated by cortico-hippocampal interactions. PMID:19553466
Uncapher, Melina R; Rugg, Michael D
2009-06-24
Not all of what is experienced is remembered later. Behavioral evidence suggests that the manner in which an event is processed influences which aspects of the event will later be remembered. The present experiment investigated the neural correlates of "selective encoding," or the mechanisms that support the encoding of some elements of an event in preference to others. Event-related MRI data were acquired while volunteers selectively attended to one of two different contextual features of study items (color or location). A surprise memory test for the items and both contextual features was subsequently administered to determine the influence of selective attention on the neural correlates of contextual encoding. Activity in several cortical regions indexed later memory success selectively for color or location information, and this encoding-related activity was enhanced by selective attention to the relevant feature. Critically, a region in the hippocampus responded selectively to attended source information (whether color or location), demonstrating encoding-related activity for attended but not for nonattended source features. Together, the findings suggest that selective attention modulates the magnitude of activity in cortical regions engaged by different aspects of an event, and hippocampal encoding mechanisms seem to be sensitive to this modulation. Thus, the information that is encoded into a memory representation is biased by selective attention, and this bias is mediated by cortical-hippocampal interactions.
Siemann, Julia; Herrmann, Manfred; Galashan, Daniela
2016-08-01
Usually, incongruent flanker stimuli provoke conflict processing whereas congruent flankers should facilitate task performance. Various behavioral studies have reported improved or even absent conflict processing with correctly oriented selective attention. In the present study we attempted to reinvestigate these behavioral effects and to disentangle the neuronal activity patterns underlying the attentional cueing effect by combining the high temporal resolution of electroencephalography (EEG) with the spatial resolution of functional magnetic resonance imaging (fMRI). Data from 20 participants were acquired in separate sessions for each method. We expected the conflict-related N200 event-related potential (ERP) component and areas associated with flanker processing to show validity-specific modulations. Additionally, the spatio-temporal dynamics during cued flanker processing were examined using an fMRI-constrained source analysis approach. In the ERP data we found early differences in flanker processing between validity levels. An early centro-parietal relative positivity for incongruent stimuli occurred only with valid cueing during the N200 time window, while a subsequent fronto-central negativity was specific to invalidly cued interference processing. The source analysis additionally pointed to separate neural generators of these effects. Regional sources in visual areas were involved in conflict processing with valid cueing, while a regional source in the anterior cingulate cortex (ACC) seemed to contribute to the ERP differences with invalid cueing. Moreover, the ACC and precentral gyrus demonstrated an early and a late phase of congruency-related activity differences with invalid cueing. We discuss the first effect as reflecting conflict detection and response activation, while the latter more likely originated from conflict monitoring and control processes during response competition. Copyright © 2016 Elsevier Inc. All rights reserved.
Auditory Attentional Control and Selection during Cocktail Party Listening
Hill, Kevin T.
2010-01-01
In realistic auditory environments, people rely on both attentional control and attentional selection to extract intelligible signals from a cluttered background. We used functional magnetic resonance imaging to examine auditory attention to natural speech under such high processing-load conditions. Participants attended to a single talker in a group of 3, identified by the target talker's pitch or spatial location. A catch-trial design allowed us to distinguish activity due to top-down control of attention versus attentional selection of bottom-up information in both the spatial and spectral (pitch) feature domains. For attentional control, we found a left-dominant fronto-parietal network with a bias toward spatial processing in dorsal precentral sulcus and superior parietal lobule, and a bias toward pitch in inferior frontal gyrus. During selection of the talker, attention modulated activity in left intraparietal sulcus when using talker location and in bilateral but right-dominant superior temporal sulcus when using talker pitch. We argue that these networks represent the sources and targets of selective attention in rich auditory environments. PMID:19574393
Wu, Qihua; Shi, Honglan; Ma, Yinfa; Adams, Craig; Eichholz, Todd; Timmons, Terry; Jiang, Hua
2015-01-01
N-Nitrosamines are potent mutagenic and carcinogenic emerging water disinfection by-products (DBPs). The most effective strategy to control the formation of these DBPs is minimizing their precursors in source water. Secondary and tertiary amines are the dominant precursors of N-nitrosamine formation during the drinking water disinfection process. Therefore, screening for and removing these amines in source water is essential for preventing the formation of N-nitrosamines. A rapid, simple, and sensitive ultrafast liquid chromatography-tandem mass spectrometry (UFLC-MS/MS) method was developed in this study to determine seven amines, including dimethylamine, ethylmethylamine, diethylamine, dipropylamine, trimethylamine, 3-(dimethylaminomethyl)indole, and 4-dimethylaminoantipyrine, as major precursors of N-nitrosamines in drinking water systems. No sample preparation is needed except a simple filtration. Separation and detection can be achieved in 11 min per sample. The method detection limits of the selected amines range from 0.02 μg/L to 1 μg/L, except for ethylmethylamine (5 μg/L), and good calibration linearity was achieved. The developed method was applied to determine the selected precursors in source water and drinking water samples collected from the Midwest region of the United States. In most water samples, the concentrations of the selected precursors of N-nitrosamines were below their method detection limits. Dimethylamine was detected in some water samples at concentrations up to 25.4 μg/L. Copyright © 2014 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Havasy, C.K.; Quach, T.K.; Bozada, C.A.
1995-12-31
This work describes the development of a single-layer integrated-metal field effect transistor (SLIMFET) process for a high-performance 0.2 μm AlGaAs/InGaAs pseudomorphic high electron mobility transistor (PHEMT). The process is compatible with MMIC fabrication and minimizes process variations, cycle time, and cost. It uses non-alloyed ohmic contacts, a selective gate-recess etching process, and a single gate/source/drain metal deposition step to form both Schottky and ohmic contacts at the same time.
IR-thermography for Quality Prediction in Selective Laser Deburring
NASA Astrophysics Data System (ADS)
Möller, Mauritz; Conrad, Christian; Haimerl, Walter; Emmelmann, Claus
Selective Laser Deburring (SLD) is an innovative edge-refinement process being developed at the Laser Zentrum Nord (LZN) in Hamburg. It offers wear-free processing of defined radii and bevels at the edges, as well as the possibility of deburring several materials with the same laser source. Sheet metal parts in various applications need to be post-processed to remove sharp edges and burrs remaining from the initial production process. Thus, SLD will provide an extended degree of automation for the next generation of manufacturing facilities. This paper investigates the dependence between the deburring result and the temperature field during and after processing. To this end, the surface temperature near the deburred edge is monitored with IR thermography. Different strategies for using the IR information for quality assurance are discussed. Additional experiments are performed to rate the accuracy of the quality prediction method in different deburring applications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Riza, Nabeel Agha; Perez, Frank
A remote temperature sensing system includes a light source selectively producing light at two different wavelengths and a sensor device having an optical path length that varies as a function of temperature. The sensor receives light emitted by the light source and redirects the light along the optical path length. The system also includes a detector receiving redirected light from the sensor device and generating respective signals indicative of respective intensities of received redirected light corresponding to respective wavelengths of light emitted by the light source. The system also includes a processor processing the signals generated by the detector to calculate a temperature of the device.
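The record describes the processor step only abstractly; how the two intensity signals are turned into a temperature depends on the sensor's calibration. A hedged sketch, assuming a monotonic calibrated mapping from the two-wavelength intensity ratio to temperature (the function name and table values are invented for illustration):

```python
import bisect

# hypothetical calibration table: intensity ratio I(λ1)/I(λ2) vs. temperature (°C)
CAL_RATIO = [0.80, 0.90, 1.00, 1.10, 1.20]
CAL_TEMP = [20.0, 45.0, 70.0, 95.0, 120.0]

def temperature_from_intensities(i_lambda1, i_lambda2):
    """Map the two-wavelength intensity ratio to temperature by linear
    interpolation over the calibration table (clamped at the ends)."""
    r = i_lambda1 / i_lambda2
    if r <= CAL_RATIO[0]:
        return CAL_TEMP[0]
    if r >= CAL_RATIO[-1]:
        return CAL_TEMP[-1]
    j = bisect.bisect_left(CAL_RATIO, r)
    t = (r - CAL_RATIO[j - 1]) / (CAL_RATIO[j] - CAL_RATIO[j - 1])
    return CAL_TEMP[j - 1] + t * (CAL_TEMP[j] - CAL_TEMP[j - 1])

print(temperature_from_intensities(1.05, 1.0))  # ratio 1.05 maps to 82.5 °C
```

Using the ratio of the two intensities, rather than either intensity alone, is what makes such a scheme insensitive to overall source power and link losses.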
Production of γ-aminobutyric acid by microorganisms from different food sources.
Hudec, Jozef; Kobida, Ľubomír; Čanigová, Margita; Lacko-Bartošová, Magdaléna; Ložek, Otto; Chlebo, Peter; Mrázová, Jana; Ducsay, Ladislav; Bystrická, Judita
2015-04-01
γ-Aminobutyric acid (GABA) is a potentially bioactive component of foods and pharmaceuticals. The aim of this study was to screen lactic acid bacteria belonging to the Czech Collection of Microorganisms, and microorganisms (yeast and bacteria) from 10 different food sources, for GABA production by fermentation in broth or in plant and animal products. Under an aerobic atmosphere, very low selectivity of GABA production (from 0.8% to 1.3%) was obtained using yeast and filamentous fungi, while higher selectivity (from 6.5% to 21.0%) was obtained with bacteria. The use of anaerobic conditions, combined with the addition of coenzyme (pyridoxal-5-phosphate) and salts (CaCl2, NaCl), led to the detection of a low concentration of the GABA precursor. Simultaneously, using an optimal temperature of 33 °C, a pH of 6.5 and bacteria from banana (Pseudomonadaceae and Enterobacteriaceae families), a surprisingly high selectivity of GABA production was obtained. A positive impact of fenugreek sprouts on the proteolytic process and on GABA production from plant material as a source of the GABA precursor was identified. Lactic acid bacteria can be used for the production of new plant and animal GABA-rich products from different natural sources containing the GABA precursor. © 2014 Society of Chemical Industry.
2009-06-01
their respective owners. The findings of this report are not to be construed as an official Department of the Army position unless so designated by... Reliable Sandberg bluegrass, a selected-class germplasm (Waldron et al. 2006c) Yakima western yarrow, a source-identified germplasm (Waldron ...contaminated lands. Although restoration is outside the scope of these guidelines, land managers may find the guide useful for planning the restoration process
QaaS (quality as a service) model for web services using big data technologies
NASA Astrophysics Data System (ADS)
Ahmad, Faisal; Sarkar, Anirban
2017-10-01
Quality of service (QoS) determines service usability and utility, both of which influence the service selection process. QoS varies from one service provider to another, and each web service has its own methodology for evaluating QoS. The lack of a transparent QoS evaluation model makes service selection challenging. Moreover, most QoS evaluation processes do not consider historical data, which not only helps in obtaining more accurate QoS values but also supports future prediction, recommendation and knowledge discovery. QoS-driven service selection demands a model where QoS can be provided as a service to end users. This paper proposes a layered QaaS (quality as a service) model, along the same lines as PaaS and software as a service, where users provide QoS attributes as inputs and the model returns services satisfying the user's QoS expectations. This paper covers all the key aspects in this context, such as the selection of data sources and the transformation, evaluation, classification and storage of QoS. The paper uses server logs as the source for evaluating QoS values, a common methodology for their evaluation, and big data technologies for their transformation and analysis. This paper also establishes that Spark outperforms Pig with respect to the evaluation of QoS from logs.
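As a rough illustration of the log-driven QoS evaluation the paper describes, the sketch below derives two common QoS attributes, availability and mean response time, per service from simplified access-log records. The log format and field choices are assumptions; a production pipeline would perform the same aggregation in Spark over real server logs:

```python
from collections import defaultdict

# hypothetical access-log lines: service name, HTTP status, response time (ms)
LOG = [
    "weather 200 120",
    "weather 200 80",
    "weather 500 900",
    "geocode 200 40",
]

def qos_from_log(lines):
    """Derive two simple QoS attributes per service from a server log:
    availability (fraction of non-5xx calls) and mean response time."""
    stats = defaultdict(lambda: {"calls": 0, "ok": 0, "ms": 0.0})
    for line in lines:
        service, status, ms = line.split()
        s = stats[service]
        s["calls"] += 1
        s["ok"] += int(status) < 500  # count successful calls
        s["ms"] += float(ms)
    return {name: {"availability": s["ok"] / s["calls"],
                   "mean_response_ms": s["ms"] / s["calls"]}
            for name, s in stats.items()}

print(qos_from_log(LOG))
```

A service selector could then filter this dictionary against user-supplied thresholds, which is essentially the "QoS attributes as inputs" interface the QaaS model proposes.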
Howard, David M; Pong-Wong, Ricardo; Knap, Pieter W; Kremer, Valentin D; Woolliams, John A
2018-05-10
Optimal contributions selection (OCS) provides animal breeders with a framework for maximising genetic gain for a predefined rate of inbreeding. Simulation studies have indicated that the source of the selective advantage of OCS is derived from breeding decisions being more closely aligned with estimates of Mendelian sampling terms ([Formula: see text]) of selection candidates, rather than estimated breeding values (EBV). This study represents the first attempt to assess the source of the selective advantage provided by OCS using a commercial pig population and by testing three hypotheses: (1) OCS places more emphasis on [Formula: see text] compared to EBV for determining which animals were selected as parents, (2) OCS places more emphasis on [Formula: see text] compared to EBV for determining which of those parents were selected to make a long-term genetic contribution (r), and (3) OCS places more emphasis on [Formula: see text] compared to EBV for determining the magnitude of r. The population studied also provided an opportunity to investigate the convergence of r over time. Selection intensity limited the number of males available for analysis, but females provided some evidence that the selective advantage derived from applying an OCS algorithm resulted from greater weighting being placed on [Formula: see text] during the process of decision-making. Male r were found to converge initially at a faster rate than female r, with approximately 90% convergence achieved within seven generations across both sexes. This study of commercial data provides some support to results from theoretical and simulation studies that the source of selective advantage from OCS comes from [Formula: see text]. The implication that genomic selection (GS) improves estimation of [Formula: see text] should allow for even greater genetic gains for a predefined rate of inbreeding, once the synergistic benefits of combining OCS and GS are realised.
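The Mendelian sampling term the abstract denotes [Formula: see text] is conventionally estimated as an animal's EBV minus its parent average. A minimal sketch of that calculation (the EBV values are invented for illustration):

```python
def mendelian_sampling_term(ebv, sire_ebv, dam_ebv):
    """Estimated Mendelian sampling deviation: the part of an animal's
    EBV not explained by the average of its parents' EBVs."""
    return ebv - 0.5 * (sire_ebv + dam_ebv)

# (own EBV, sire EBV, dam EBV) -- illustrative candidates
candidates = {
    "A": (12.0, 10.0, 8.0),   # modest parents, large sampling deviation
    "B": (11.0, 14.0, 12.0),  # elite parents, negative sampling deviation
}
for name, (ebv, sire, dam) in candidates.items():
    print(name, mendelian_sampling_term(ebv, sire, dam))
```

Candidate A's deviation is 12.0 − (10.0 + 8.0)/2 = 3.0, while B's is negative despite the higher parent average; the study's hypothesis is that OCS effectively weights selection and contributions toward animals like A.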
Studies of the use of heat from high temperature nuclear sources for hydrogen production processes
NASA Technical Reports Server (NTRS)
Farbman, G. H.
1976-01-01
Future uses of hydrogen and hydrogen production processes that can meet the demand for hydrogen in the coming decades were considered. To do this, a projection was made of the market for hydrogen through the year 2000. Four hydrogen production processes were selected from among water electrolysis, fossil-based, and thermochemical water decomposition systems, and evaluated, using a consistent set of ground rules, in terms of relative performance, economics, resource requirements, and technology status.
Runaway cultural niche construction
Rendell, Luke; Fogarty, Laurel; Laland, Kevin N.
2011-01-01
Cultural niche construction is a uniquely potent source of selection on human populations, and a major cause of recent human evolution. Previous theoretical analyses have not, however, explored the local effects of cultural niche construction. Here, we use spatially explicit coevolutionary models to investigate how cultural processes could drive selection on human genes by modifying local resources. We show that cultural learning, expressed in local niche construction, can trigger a process with dynamics that resemble runaway sexual selection. Under a broad range of conditions, cultural niche-constructing practices generate selection for gene-based traits and hitchhike to fixation through the build-up of statistical associations between practice and trait. This process can occur even when the cultural practice is costly, or is subject to counteracting transmission biases, or the genetic trait is selected against. Under some conditions a secondary hitchhiking occurs, through which genetic variants that enhance the capability for cultural learning are also favoured by similar dynamics. We suggest that runaway cultural niche construction could have played an important role in human evolution, helping to explain why humans are simultaneously the species with the largest relative brain size, the most potent capacity for niche construction and the greatest reliance on culture. PMID:21320897
Chapter 3: Design of the Saber-Tooth Project.
ERIC Educational Resources Information Center
Ward, Phillip
1999-01-01
Used data from interviews, surveys, and document analysis to describe the methods and reform processes of the Saber Tooth Project, examining selection of sites; demographics (school sites, teachers, data sources, and project assumptions); and project phases (development, planning, implementation, and support). The project's method of reform was…
Effects of a hydrodynamic process on extraction of carotenoids from tomato
USDA-ARS?s Scientific Manuscript database
We evaluated a proprietary sonoporation method that was introduced with the hope of increasing accessibility of phytonutrients in fruits and vegetables. Two important commodities were selected: tomato, a major source of carotenoids, notably lycopene, in the diet of the Western world; and Citrus, o...
NASA Technical Reports Server (NTRS)
Morris, A. Terry
1999-01-01
This paper examines various sources of error in MIT's improved top oil temperature rise over ambient temperature model and estimation process. The sources of error are the current parameter estimation technique, quantization noise, and post-processing of the transformer data. Results from this paper will show that an output error parameter estimation technique should be selected to replace the current least squares estimation technique. The output error technique obtained accurate predictions of transformer behavior, revealed the best error covariance, obtained consistent parameter estimates, and provided for valid and sensible parameters. This paper will also show that the output error technique should be used to minimize errors attributed to post-processing (decimation) of the transformer data. Models used in this paper are validated using data from a large transformer in service.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McLoughlin, K.
2016-01-11
The overall aim of this project is to develop a software package, called MetaQuant, that can determine the constituents of a complex microbial sample and estimate their relative abundances by analysis of metagenomic sequencing data. The goal for Task 1 is to create a generative model describing the stochastic process underlying the creation of sequence read pairs in the data set. The stages in this generative process include the selection of a source genome sequence for each read pair, with probability dependent on its abundance in the sample. The other stages describe the evolution of the source genome from its nearest common ancestor with a reference genome, breakage of the source DNA into short fragments, and the errors in sequencing the ends of the fragments to produce read pairs.
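The Task 1 generative process — abundance-weighted selection of a source genome, fragmentation, and end-sequencing with errors — can be sketched as a toy simulator. All parameters, the substitution-only error model, and the omission of reverse-complementing the second read are simplifying assumptions, not MetaQuant's actual model:

```python
import random

def simulate_read_pairs(genomes, abundances, n_pairs, frag_len=200,
                        read_len=50, err_rate=0.01, seed=1):
    """Toy generative model: pick a source genome with probability
    proportional to its abundance, break out a random fragment, and
    sequence both ends with independent substitution errors."""
    rng = random.Random(seed)
    names = list(genomes)
    weights = [abundances[n] for n in names]

    def sequence(read):
        bases = "ACGT"
        return "".join(
            rng.choice(bases.replace(b, "")) if rng.random() < err_rate else b
            for b in read)

    pairs = []
    for _ in range(n_pairs):
        name = rng.choices(names, weights=weights)[0]
        g = genomes[name]
        start = rng.randrange(len(g) - frag_len + 1)
        frag = g[start:start + frag_len]
        # second read taken forward from the fragment end for simplicity
        pairs.append((name, sequence(frag[:read_len]), sequence(frag[-read_len:])))
    return pairs

genomes = {"g1": "ACGT" * 100, "g2": "GGCCTA" * 70}
pairs = simulate_read_pairs(genomes, {"g1": 0.9, "g2": 0.1}, n_pairs=200)
print(sum(1 for name, _, _ in pairs if name == "g1"), "of 200 pairs from g1")
```

Inference in MetaQuant runs this model in reverse: given the observed read pairs, estimate the abundance weights that generated them.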
48 CFR 715.370 - Alternative source selection procedures.
Code of Federal Regulations, 2010 CFR
2010-10-01
... selection procedures. 715.370 Section 715.370 Federal Acquisition Regulations System AGENCY FOR INTERNATIONAL DEVELOPMENT CONTRACTING METHODS AND CONTRACT TYPES CONTRACTING BY NEGOTIATION Source Selection 715.370 Alternative source selection procedures. The following selection procedures may be used, when...
Papp, John F.
2014-01-01
Post-beneficiation processing plants (generally called smelters and refineries) for 3TG mineral ores and concentrates were identified by company and industry association representatives as being the link in the 3TG mineral supply chain through which these minerals can be traced to their source of origin (mine). The determination of the source of origin is critical to the development of a complete and transparent conflict-free mineral supply chain. Tungsten processing plants were the subject of the first fact sheet in this series published by USGS NMIC in August 2014. Background information about historical conditions and multinational stakeholders’ voluntary due diligence guidance for minerals from conflict-affected and high-risk areas is presented in the tungsten fact sheet. This fact sheet, the second in a series about 3TG minerals, focuses on the tantalum supply chain by listing selected processors that produced tantalum materials commercially worldwide during 2013–14. It does not provide any information regarding the sources of material processed in these facilities.
Nevers, Meredith; Byappanahalli, Muruleedhara; Phanikumar, Mantha S.; Whitman, Richard L.
2016-01-01
Mathematical models have been widely applied to surface waters to estimate rates of settling, resuspension, flow, dispersion, and advection in order to calculate movement of particles that influence water quality. Of particular interest are the movement, survival, and persistence of microbial pathogens or their surrogates, which may contaminate recreational water, drinking water, or shellfish. Most models devoted to microbial water quality have been focused on fecal indicator organisms (FIO), which act as a surrogate for pathogens and viruses. Process-based modeling and statistical modeling have been used to track contamination events to source and to predict future events. The use of these two types of models requires different levels of expertise and input: process-based models rely on theoretical physical constructs to explain present conditions and biological distribution, while data-based statistical models use extant paired data to do the same. The selection of the appropriate model and interpretation of results are critical to proper use of these tools in microbial source tracking. Integration of the modeling approaches could provide insight for tracking and predicting contamination events in real time. A review of modeling efforts reveals that process-based modeling has great promise for microbial source tracking; further, combining the understanding of physical processes influencing FIO contamination developed with process-based models and molecular characterization of the population by gene-based (i.e., biological) or chemical markers may be an effective approach for locating sources and remediating contamination in order to better protect human health.
[Evaluation of treatment technology of odor pollution source in petrochemical industry].
Mu, Gui-Qin; Sui, Li-Hua; Guo, Ya-Feng; Ma, Chuan-Jun; Yang, Wen-Yu; Gao, Yang
2013-12-01
Using an environmental technology assessment system, we put forward an evaluation index system for treatment technologies for typical odor pollution sources in the petroleum refining process, and applied it in the assessment of industrial technologies. The best available techniques were then selected for the emissions of the refinery sewage treatment plant, the headspace gas of acidic water tanks, the headspace gas of cold coke drums/intermediate oil tanks/dirty oil tanks, the exhaust of oxidative sweetening, and the vapors from oil loading and unloading.
Ground control requirements for precision processing of ERTS images
Burger, Thomas C.
1973-01-01
With the successful flight of the ERTS-1 satellite, orbital-height images are available for precision processing into products such as 1:1,000,000-scale photomaps and enlargements up to 1:250,000 scale. In order to keep positional error below 100 meters, control points for the precision processing must be carefully selected and clearly definable on the photos in both X and Y. Coordinates of selected control points measured on existing 7½- and 15-minute standard maps provide sufficient accuracy for any space imaging system thus far defined. This procedure references the points to accepted horizontal and vertical datums. Maps as small as 1:250,000 scale can be used as source material for coordinates, but to maintain the desired accuracy, maps of 1:100,000 and larger scale should be used when available.
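The scale dependence of control-point coordinates follows from a simple conversion: a measurement error on the source map corresponds to a ground distance proportional to the scale denominator, which is why larger-scale maps keep the error budget comfortably under 100 m. A minimal sketch (the 0.2 mm measurement error is an illustrative figure, not from the paper):

```python
def ground_error_m(plot_error_mm, scale_denominator):
    """Ground distance (m) corresponding to a plotting/measurement
    error (mm) on a map of the given scale: mm * denominator / 1000."""
    return plot_error_mm * scale_denominator / 1000.0

# A 0.2 mm measurement error at several common source-map scales:
for scale in (24_000, 100_000, 250_000):
    print(f"1:{scale}: {ground_error_m(0.2, scale):.1f} m on the ground")
```

At 1:250,000 the same 0.2 mm error already amounts to 50 m of ground distance, half the stated 100 m budget, which motivates the paper's preference for 1:100,000 and larger-scale sources.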
Use of Information: Getting to the Heart of the Matter
ERIC Educational Resources Information Center
Eisenberg, Michael B.
2005-01-01
The Big6 approach to information problem solving is widely used by students in the US. Use of Information is the 4th stage and marks a shift in focus from selecting and accessing information sources to using the information itself in a process that involves "critical thinking."
Review and Analysis of Curricula for Occupations in Metalworking.
ERIC Educational Resources Information Center
Snyder, Thomas R.; Butler, Roy L.
To provide curriculum specialists and practitioners with specific information on the source and quality of instructional materials for the metal trades area and to suggest areas of needed curriculum development, a selected review and analysis of previously processed Educational Resources Information Center (ERIC) documents was made.…
Personal Variables and Bias in Educational Decision-Making.
ERIC Educational Resources Information Center
Huebner, E. Scott; And Others
1984-01-01
Findings regarding the influence of four potential sources of bias (sex, socioeconomic status, race, physical attractiveness) upon the decision-making stages of the assessment process are selectively reviewed. It is concluded that, though further research is needed, convincing evidence of bias in later stages of decision making has yet to be…
2008-06-01
Transportation Systems * The Worldwide Air Transportation and Air Traffic Control System * The Worldwide Web and the Underlying Internet * Automobile Production...their use in automobiles as a way to reduce gasoline consumption, increase fuel mileage, and reduce harmful emissions. They represent a power source that
Psychosocial Factors in Activity Selection, Activity Perseverance, and Performance Achievement.
ERIC Educational Resources Information Center
Singer, Robert N.
In order to understand what motivates children to participate in or avoid engaging in physical activities, it is necessary to understand something about motivation. There are two sources of motivation. Intrinsic motivation comes from drives, psychological and physiological processes, and needs such as the desire for achievement and…
SCORE A: A Student Research Paper Writing Strategy.
ERIC Educational Resources Information Center
Korinek, Lori; Bulls, Jill A.
1996-01-01
A mnemonic strategy for writing a research paper is explained. "SCORE A" reminds the student to select a subject, create categories, obtain sources, read and take notes, evenly organize the information, and apply process writing steps. Implementation of the strategy with five eighth graders with learning disabilities is reported. (DB)
Homework Motivation and Preferences of Turkish Students
ERIC Educational Resources Information Center
Iflazoglu, Ayten; Hong, Eunsook
2012-01-01
Turkish students' motivation sources, organisational approaches, physical needs and environmental and interpersonal preferences during the homework process were examined in 1776 students in Grades 5-8 from 10 randomly selected schools in two districts of a major urban city in Turkey. These constructs were examined to determine grade, gender,…
NASA Astrophysics Data System (ADS)
Dubey, Satish Kumar; Singh Mehta, Dalip; Anand, Arun; Shakher, Chandra
2008-01-01
We demonstrate simultaneous topography and tomography of latent fingerprints using full-field swept-source optical coherence tomography (OCT). The swept-source OCT system comprises a superluminescent diode (SLD) as broad-band light source, an acousto-optic tunable filter (AOTF) as frequency tuning device, and a compact, nearly common-path interferometer. Both the amplitude and the phase map of the interference fringe signal are reconstructed. Optical sectioning of the latent fingerprint sample is obtained by selective Fourier filtering and the topography is retrieved from the phase map. Interferometry, selective filtering, low coherence and hence better resolution are some of the advantages of the proposed system over the conventional fingerprint detection techniques. The present technique is non-invasive in nature and does not require any physical or chemical processing. Therefore, the quality of the sample does not alter and hence the same fingerprint can be used for other types of forensic test. Exploitation of low-coherence interferometry for fingerprint detection itself provides an edge over other existing techniques as fingerprints can even be lifted from low-reflecting surfaces. The proposed system is very economical and compact.
NASA Astrophysics Data System (ADS)
Khojasteh, Malak; Kresin, Vitaly V.
2016-12-01
We describe the production of size selected manganese nanoclusters using a dc magnetron sputtering/aggregation source. Since nanoparticle production is sensitive to a range of overlapping operating parameters (in particular, the sputtering discharge power, the inert gas flow rates, and the aggregation length) we focus on a detailed map of the influence of each parameter on the average nanocluster size. In this way it is possible to identify the main contribution of each parameter to the physical processes taking place within the source. The discharge power and argon flow supply the atomic vapor, and argon also plays the crucial role in the formation of condensation nuclei via three-body collisions. However, neither the argon flow nor the discharge power have a strong effect on the average nanocluster size in the exiting beam. Here the defining role is played by the source residence time, which is governed by the helium supply and the aggregation path length. The size of mass selected nanoclusters was verified by atomic force microscopy of deposited particles.
Anderson, Charles
2015-03-24
Post-beneficiation processing plants (generally called smelters and refineries) for 3TG mineral ores and concentrates were identified by company and industry association representatives as being a link in the 3TG mineral supply chain through which these minerals can be traced to their source of origin (mine). The determination of the source of origin is critical to the development of a complete and transparent conflict-free mineral supply chain. Tungsten processing plants were the subject of the first fact sheet in this series published by the USGS NMIC in August 2014. Background information about historical conditions and multinational stakeholders’ voluntary due diligence guidance for minerals from conflict-affected and high-risk areas was presented in the tungsten fact sheet. Tantalum processing plants were the subject of the second fact sheet in this series published by the USGS NMIC in December 2014. This fact sheet, the third in the series about 3TG minerals, focuses on the tin supply chain by listing selected processors that produced tin materials commercially worldwide during 2013–14. It does not provide any information regarding the sources of the material processed in these facilities.
Source recognition by stimulus content in the MTL.
Park, Heekyeong; Abellanoza, Cheryl; Schaeffer, James; Gandy, Kellen
2014-03-17
Source memory is considered to be the cornerstone of episodic memory that enables us to discriminate similar but different events. In the present fMRI study, we investigated whether neural correlates of source retrieval differed by stimulus content in the medial temporal lobe (MTL) when the item and context had been integrated as a perceptually unitized entity. Participants were presented with a list of items either in verbal or pictorial form overlaid on a colored square and instructed to integrate both the item and context into a single image. At test, participants judged the study status of test items and the color in which studied items were presented. Source recognition invariant of stimulus content elicited retrieval activity in both the left anterior hippocampus extending to the perirhinal cortex and the right posterior hippocampus. Word-selective source recognition was related to activity in the left perirhinal cortex, whereas picture-selective source recognition was identified in the left posterior hippocampus. Neural activity sensitive to novelty detection common to both words and pictures was found in the left anterior and right posterior hippocampus. Novelty detection selective to words was associated with the left perirhinal cortex, while activity sensitive to new pictures was identified in the bilateral hippocampus and adjacent MTL cortices, including the parahippocampal, entorhinal, and perirhinal cortices. These findings provide further support for the integral role of the hippocampus both in source recognition and in detection of new stimuli across stimulus content. Additionally, novelty effects in the MTL reveal the integral role of the MTL cortex as the interface for processing new information. Collectively, the present findings demonstrate the importance of the MTL for both previously experienced and novel events. Copyright © 2014 Elsevier B.V. All rights reserved.
Computationally efficient thermal-mechanical modelling of selective laser melting
NASA Astrophysics Data System (ADS)
Yang, Yabin; Ayas, Can
2017-10-01
Selective laser melting (SLM) is a powder-based additive manufacturing (AM) method for producing high-density metal parts with complex topology. However, part distortions and the accompanying residual stresses deteriorate the mechanical reliability of SLM products. Modelling of the SLM process is anticipated to be instrumental for understanding and predicting the development of the residual stress field during the build process. However, SLM process modelling requires determination of the heat transients within the part being built, which is coupled to a mechanical boundary value problem to calculate displacement and residual stress fields. Thermal models associated with SLM are typically complex and computationally demanding. In this paper, we present a simple semi-analytical thermal-mechanical model, developed for SLM, that represents the effect of laser scanning vectors with line heat sources. The temperature field within the part being built is obtained by superposition of the temperature field associated with line heat sources in a semi-infinite medium and a complementary temperature field which accounts for the actual boundary conditions. An analytical solution for a line heat source in a semi-infinite medium is first described, followed by the numerical procedure used for finding the complementary temperature field. This analytical description of the line heat sources is able to capture the steep temperature gradients in the vicinity of the laser spot, which is typically tens of micrometers wide. In turn, the semi-analytical thermal model allows for a relatively coarse discretisation of the complementary temperature field. The temperature history determined is used to calculate the thermal strain induced in the SLM part. Finally, a mechanical model governed by an elastic-plastic constitutive rule with isotropic hardening is used to predict the residual stresses.
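The paper's own superposition scheme is not reproduced here, but the kernel behind it can be sketched with the standard instantaneous line-source solution in an infinite medium, ΔT = Q/(4πkt)·exp(−r²/(4αt)), doubled by the method of images when the source lies on an adiabatic free surface. The material constants below are steel-like illustrative values, not the paper's:

```python
import math

def line_source_temp_rise(q_per_len, r, t, k=20.0, alpha=5e-6):
    """Temperature rise (K) at radial distance r (m), time t (s) after an
    instantaneous line source releases q_per_len (J/m) in an infinite
    medium with conductivity k (W/m.K) and diffusivity alpha (m^2/s)."""
    return q_per_len / (4.0 * math.pi * k * t) * math.exp(-r * r / (4.0 * alpha * t))

def surface_temp_rise(q_per_len, r, t, **kw):
    """Adiabatic free surface via the method of images: the mirror source
    doubles the infinite-medium solution for a source on the surface."""
    return 2.0 * line_source_temp_rise(q_per_len, r, t, **kw)

# Steep near-spot gradient: compare radii of tens vs hundreds of microns.
for r in (20e-6, 200e-6):
    print(f"r = {r * 1e6:.0f} um: dT = {surface_temp_rise(10.0, r, 1e-4):.1f} K")
```

Summing such kernels over the scan-vector firing times gives the analytical part of the temperature field; the complementary field correcting the true boundary conditions is what the paper computes numerically.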
NASA Technical Reports Server (NTRS)
Mckee, James W.
1988-01-01
This final report describes the accomplishments of the General Purpose Intelligent Sensor Interface task of the Applications of Artificial Intelligence to Space Station grant for the period from October 1, 1987 through September 30, 1988. Portions of the First Biannual Report not revised will not be included but only referenced. The goal is to develop an intelligent sensor system that will simplify the design and development of expert systems using sensors of the physical phenomena as a source of data. This research will concentrate on the integration of image processing sensors and voice processing sensors with a computer designed for expert system development. The result of this research will be the design and documentation of a system in which the user will not need to be an expert in such areas as image processing algorithms, local area networks, image processor hardware selection or interfacing, television camera selection, voice recognition hardware selection, or analog signal processing. The user will be able to access data from video or voice sensors through standard LISP statements without any need to know about the sensor hardware or software.
Laine, R.M.; Hirschon, A.S.; Wilson, R.B. Jr.
1987-12-29
A process is described for the preparation of a multimetallic catalyst for the hydrodenitrogenation of an organic feedstock, which process comprises: (a) forming a precatalyst itself comprising: (1) a first metal compound selected from compounds of nickel, cobalt or mixtures thereof; (2) a second metal compound selected from compounds of chromium, molybdenum, tungsten, or mixtures thereof; and (3) an inorganic support; (b) heating the precatalyst of step (a) with a source of sulfide in a first non-oxidizing gas at a temperature and for a time effective to presulfide the precatalyst; (c) adding in a second non-oxidizing gas to the sulfided precatalyst of step (b) an organometallic transition metal moiety selected from compounds of iridium, rhodium, iron, ruthenium, tungsten or mixtures thereof for a time and at a temperature effective to chemically combine the metal components; and (d) optionally heating the chemically combined catalyst of step (b) in vacuum at a temperature and for a time effective to remove residual volatile organic materials. 12 figs.
NASA Astrophysics Data System (ADS)
Neuhoff, John G.
2003-04-01
Increasing acoustic intensity is a primary cue to looming auditory motion. Perceptual overestimation of increasing intensity could provide an evolutionary selective advantage by specifying that an approaching sound source is closer than actual, thus affording advanced warning and more time than expected to prepare for the arrival of the source. Here, multiple lines of converging evidence for this evolutionary hypothesis are presented. First, it is shown that intensity change specifying accelerating source approach changes in loudness more than equivalent intensity change specifying decelerating source approach. Second, consistent with evolutionary hunter-gatherer theories of sex-specific spatial abilities, it is shown that females have a significantly larger bias for rising intensity than males. Third, using functional magnetic resonance imaging in conjunction with approaching and receding auditory motion, it is shown that approaching sources preferentially activate a specific neural network responsible for attention allocation, motor planning, and translating perception into action. Finally, it is shown that rhesus monkeys also exhibit a rising intensity bias by orienting longer to looming tones than to receding tones. Together these results illustrate an adaptive perceptual bias that has evolved because it provides a selective advantage in processing looming acoustic sources. [Work supported by NSF and CDC.]
Supply chain optimization for pediatric perioperative departments.
Davis, Janice L; Doyle, Robert
2011-09-01
Economic challenges compel pediatric perioperative departments to reduce nonlabor supply costs while maintaining the quality of patient care. Optimization of the supply chain introduces a framework for decision making that drives fiscally responsible decisions. The cost-effective supply chain is driven by implementing a value analysis process for product selection, being mindful of product sourcing decisions to reduce supply expense, creating logistical efficiency that will eliminate redundant processes, and managing inventory to ensure product availability. The value analysis approach is an analytical methodology for product selection that involves product evaluation and recommendation based on consideration of clinical benefit, overall financial impact, and revenue implications. Copyright © 2011 AORN, Inc. Published by Elsevier Inc. All rights reserved.
Bovine Milk as a Source of Functional Oligosaccharides for Improving Human Health
Zivkovic, Angela M.; Barile, Daniela
2011-01-01
Human milk oligosaccharides are complex sugars that function as selective growth substrates for specific beneficial bacteria in the gastrointestinal system. Bovine milk is a potentially excellent source of commercially viable analogs of these unique molecules. However, bovine milk has a much lower concentration of these oligosaccharides than human milk, and the majority of the molecules are simpler in structure than those found in human milk. Specific structural characteristics of milk-derived oligosaccharides are crucial to their ability to selectively enrich beneficial bacteria while inhibiting or being less than ideal substrates for undesirable and pathogenic bacteria. Thus, if bovine milk products are to provide human milk–like benefits, it is important to identify specific dairy streams that can be processed commercially and cost-effectively and that can yield specific oligosaccharide compositions that will be beneficial as new food ingredients or supplements to improve human health. Whey streams have the potential to be commercially viable sources of complex oligosaccharides that have the structural resemblance and diversity of the bioactive oligosaccharides in human milk. With further refinements to dairy stream processing techniques and functional testing to identify streams that are particularly suitable for enriching beneficial intestinal bacteria, the future of oligosaccharides isolated from dairy streams as a food category with substantiated health claims is promising. PMID:22332060
Integrated optics technology study
NASA Technical Reports Server (NTRS)
Chen, B.; Findakly, T.; Innarella, R.
1982-01-01
The status and near-term potential of materials and processes available for the fabrication of single-mode integrated electro-optical components are discussed. Issues discussed include host material and orientation, waveguide formation, optical loss mechanisms, wavelength selection, polarization effects and control, laser to integrated optics coupling, fiber optic waveguide to integrated optics coupling, sources, and detectors. Recommendations of the best materials, technology, and processes for the fabrication of integrated optical components for communications and fiber gyro applications are given.
Sediment fingerprinting experiments to test the sensitivity of multivariate mixing models
NASA Astrophysics Data System (ADS)
Gaspar, Leticia; Blake, Will; Smith, Hugh; Navas, Ana
2014-05-01
Sediment fingerprinting techniques provide insight into the dynamics of sediment transfer processes and support for catchment management decisions. As questions being asked of fingerprinting datasets become increasingly complex, validation of model output and sensitivity tests are increasingly important. This study adopts an experimental approach to explore the validity and sensitivity of mixing model outputs for materials with contrasting geochemical and particle size composition. The experiments reported here focused on (i) the sensitivity of model output to different fingerprint selection procedures and (ii) the influence of source material particle size distributions on model output. Five soils with significantly different geochemistry, soil organic matter and particle size distributions were selected as experimental source materials. A total of twelve sediment mixtures were prepared in the laboratory by combining different quantified proportions of the < 63 µm fraction of the five source soils, i.e. assuming no fluvial sorting of the mixture. The geochemistry of all source and mixture samples (5 source soils and 12 mixed soils) was analysed using X-ray fluorescence (XRF). Tracer properties were selected from 18 elements for which mass concentrations were found to be significantly different between sources. Sets of fingerprint properties that discriminate target sources were selected using a range of different independent statistical approaches (e.g. Kruskal-Wallis test, Discriminant Function Analysis (DFA), Principal Component Analysis (PCA), or correlation matrix). Summary results for the use of the mixing model with the different sets of fingerprint properties for the twelve mixed soils were reasonably consistent with the known initial mixing percentages.
Given the experimental nature of the work and dry mixing of materials, conservative geochemical behavior was assumed for all elements, even for those that might be disregarded in aquatic systems (e.g. P). In general, the best fits between actual and modeled proportions were found using a set of nine tracer properties (Sr, Rb, Fe, Ti, Ca, Al, P, Si, K) that were derived using DFA coupled with a multivariate stepwise algorithm, with errors between real and estimated values that did not exceed 6.7 % and values of GOF above 94.5 %. The second set of experiments aimed to explore the sensitivity of model output to variability in the particle size of source materials assuming that a degree of fluvial sorting of the resulting mixture took place. Most particle size correction procedures assume grain size effects are consistent across sources and tracer properties, which is not always the case. Consequently, the < 40 µm fraction of selected soil mixtures was analysed to simulate the effect of selective fluvial transport of finer particles and the results were compared to those for source materials. Preliminary findings from this experiment demonstrate the sensitivity of the numerical mixing model outputs to different particle size distributions of source material and the variable impact of fluvial sorting on end member signatures used in mixing models. The results suggest that particle size correction procedures require careful scrutiny in the context of variable source characteristics.
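The multivariate mixing model described above estimates the proportions in which source soils combine to produce a mixture from their tracer concentrations. As an illustrative sketch (not the authors' code), the unmixing step can be posed as non-negative least squares with a heavily weighted row enforcing that proportions sum to one; all names and values below are hypothetical:

```python
import numpy as np
from scipy.optimize import nnls

def unmix(source_tracers, mixture, weight=100.0):
    """Estimate source proportions from tracer concentrations.

    source_tracers : (n_tracers, n_sources) mean tracer concentration per source
    mixture        : (n_tracers,) tracer concentrations measured in the mixture
    """
    n_sources = source_tracers.shape[1]
    # Append a heavily weighted row enforcing sum(proportions) == 1
    A = np.vstack([source_tracers, weight * np.ones(n_sources)])
    b = np.append(mixture, weight)
    props, _ = nnls(A, b)  # non-negativity is built into NNLS
    return props

# Hypothetical example: 4 tracers, 3 sources, known mixing proportions
sources = np.array([[1.0, 5.0, 2.0],
                    [3.0, 1.0, 4.0],
                    [2.0, 2.0, 1.0],
                    [0.5, 3.0, 3.0]])
true_props = np.array([0.2, 0.3, 0.5])
estimated = unmix(sources, sources @ true_props)
```

In practice, fingerprinting studies typically normalize tracer concentrations and weight residuals by within-source variability; the goodness-of-fit values quoted in the abstract compare modeled and measured mixture properties in that weighted sense.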
Source apportionment and location by selective wind sampling and Positive Matrix Factorization.
Venturini, Elisa; Vassura, Ivano; Raffo, Simona; Ferroni, Laura; Bernardi, Elena; Passarini, Fabrizio
2014-10-01
In order to determine the pollution sources in a suburban area and identify the main direction of their origin, PM2.5 was collected with samplers coupled with a wind select sensor and then subjected to Positive Matrix Factorization (PMF) analysis. In each sample, soluble ions, organic carbon, elemental carbon, levoglucosan, metals, and Polycyclic Aromatic Hydrocarbons (PAHs) were determined. PMF results identified six main sources affecting the area: natural gas home appliances, motor vehicles, regional transport, biomass combustion, manufacturing activities, and secondary aerosol. The connection of factor temporal trends with other parameters (i.e., temperature, PM2.5 concentration, and photochemical processes) confirms factor attributions. PMF analysis indicated that the main source of PM2.5 in the area is secondary aerosol. This should be mainly due to regional contributions, owing to both the secondary nature of the source itself and the higher concentration registered in inland air masses. The motor vehicle emission source contribution is also important. This source likely has a prevalent local origin. The most toxic determined components, i.e., PAHs, Cd, Pb, and Ni, are mainly due to vehicular traffic. Even if this is not the main source in the study area, it is the one of greatest concern. The application of PMF analysis to PM2.5 collected with this new sampling technique made it possible to obtain more detailed results on the sources affecting the area compared to a classical PMF analysis.
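Positive Matrix Factorization decomposes the samples-by-species concentration matrix into non-negative factor contributions and factor profiles. The sketch below is a plain, unweighted non-negative matrix factorization via multiplicative updates; real PMF (e.g. the EPA PMF tool) additionally weights each residual by its measurement uncertainty, which this toy version omits:

```python
import numpy as np

def nmf(X, k, iters=2000, seed=0, eps=1e-12):
    """Unweighted NMF by multiplicative updates: X (n_samples, n_species)
    is approximated by G @ F with G (contributions) and F (profiles) >= 0."""
    rng = np.random.default_rng(seed)
    n, m = X.shape
    G = rng.random((n, k))
    F = rng.random((k, m))
    for _ in range(iters):
        # Lee-Seung updates for the Frobenius-norm objective
        F *= (G.T @ X) / (G.T @ G @ F + eps)
        G *= (X @ F.T) / (G @ F @ F.T + eps)
    return G, F

# Synthetic data with a known non-negative rank-2 structure
rng = np.random.default_rng(1)
X = rng.random((20, 2)) @ rng.random((2, 6))
G, F = nmf(X, k=2)
rel_err = np.linalg.norm(X - G @ F) / np.linalg.norm(X)
```

The rows of F would correspond to candidate source profiles (here purely illustrative); attributing them to physical sources, as the study does, requires the chemical interpretation and auxiliary data described in the abstract.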
Selective attention in normal and impaired hearing.
Shinn-Cunningham, Barbara G; Best, Virginia
2008-12-01
A common complaint among listeners with hearing loss (HL) is that they have difficulty communicating in common social settings. This article reviews how normal-hearing listeners cope in such settings, especially how they focus attention on a source of interest. Results of experiments with normal-hearing listeners suggest that the ability to selectively attend depends on the ability to analyze the acoustic scene and to form perceptual auditory objects properly. Unfortunately, sound features important for auditory object formation may not be robustly encoded in the auditory periphery of HL listeners. In turn, impaired auditory object formation may interfere with the ability to filter out competing sound sources. Peripheral degradations are also likely to reduce the salience of higher-order auditory cues such as location, pitch, and timbre, which enable normal-hearing listeners to select a desired sound source out of a sound mixture. Degraded peripheral processing is also likely to increase the time required to form auditory objects and focus selective attention so that listeners with HL lose the ability to switch attention rapidly (a skill that is particularly important when trying to participate in a lively conversation). Finally, peripheral deficits may interfere with strategies that normal-hearing listeners employ in complex acoustic settings, including the use of memory to fill in bits of the conversation that are missed. Thus, peripheral hearing deficits are likely to cause a number of interrelated problems that challenge the ability of HL listeners to communicate in social settings requiring selective attention.
Ben-David, Boaz M; Tewari, Anita; Shakuf, Vered; Van Lieshout, Pascal H H M
2014-01-01
Selective attention, an essential part of daily activity, is often impaired in people with Alzheimer's disease (AD). Usually, it is measured by the color-word Stroop test. However, there is no universal agreement on whether performance on the Stroop task changes significantly in AD patients; or if so, whether an increase in Stroop effects reflects a decrease in selective attention, a slowing in generalized speed of processing (SOP), or is the result of degraded color vision. The current study investigated the impact of AD on Stroop performance and its potential sources in a meta-analysis and mathematical modeling of 18 studies, comparing 637 AD patients with 977 healthy age-matched participants. We found a significant increase in Stroop effects for AD patients across studies. This AD-related change was associated with a slowing in SOP. However, after correcting for a bias in the distribution of latencies, SOP could only explain a moderate portion of the total variance (25%). Moreover, we found strong evidence for an AD-related increase in the latency difference between naming the font color and reading color-neutral stimuli (r2 = 0.98). This increase in the dimensional imbalance between color-naming and word-reading was found to explain a significant portion of the AD-related increase in Stroop effects (r2 = 0.87), hinting at a possible sensory source. In conclusion, our analysis highlights the importance of controlling for sensory degradation and SOP when testing cognitive performance and, specifically, selective attention in AD patients. We also suggest possible measures and tools to better test for selective attention in AD.
Active Galactic Nuclei at All Wavelengths and from All Angles
NASA Astrophysics Data System (ADS)
Padovani, Paolo
2017-11-01
AGN are quite unique astronomical sources emitting over more than 20 orders of magnitude in frequency, with different electromagnetic bands providing windows on different sub-structures and their physics. They come in a large number of flavors only partially related to intrinsic differences. I highlight here the types of sources selected in different bands, the relevant selection effects and biases, and the underlying physical processes. I then look at the "big picture" by describing the most important parameters one needs to describe the variety of AGN classes and by discussing AGN at all frequencies in terms of their sky surface density. I conclude with a look at the most pressing open issues and the main new facilities, which will flood us with new data to tackle them.
High temperature, minimally invasive optical sensing modules
Riza, Nabeel Agha [Oviedo, FL; Perez, Frank [Tujunga, CA
2008-02-05
A remote temperature sensing system includes a light source selectively producing light at two different wavelengths and a sensor device having an optical path length that varies as a function of temperature. The sensor receives light emitted by the light source and redirects the light along the optical path length. The system also includes a detector receiving redirected light from the sensor device and generating respective signals indicative of respective intensities of received redirected light corresponding to respective wavelengths of light emitted by the light source. The system also includes a processor processing the signals generated by the detector to calculate a temperature of the device.
Frison, Nicola; Katsou, Evina; Malamis, Simos; Oehmen, Adrian; Fatone, Francesco
2015-09-15
Polyhydroxyalkanoates (PHAs) from activated sludge and renewable organic material can become an alternative product to traditional plastics since they are biodegradable and are produced from renewable sources. In this work, the selection of PHA-storing bacteria was integrated with the side-stream treatment of nitrogen removal via nitrite from sewage sludge reject water. A novel process was developed and applied where the alternation of aerobic-feast and anoxic-famine conditions accomplished the selection of PHA-storing biomass and nitrogen removal via nitrite. Two configurations were examined: in configuration 1 the ammonium conversion to nitrite occurred in the same reactor in which the PHA selection process occurred, while in configuration 2 two separate reactors were used. The results showed that the selection of PHA-storing biomass was successful in both configurations, while the nitrogen removal efficiency was much higher (almost 90%) in configuration 2. The PHA selection degree was evaluated by the volatile fatty acid (VFA) uptake rate (-qVFAs) and the PHA production rate (qPHA), which were 239 ± 74 and 89 ± 7 mg of COD per gram of active biomass (Xa) per hour, respectively. The characterization of the biopolymer recovered after the accumulation step showed that it was composed of 3-hydroxybutyrate (3HB) (60%) and 3-hydroxyvalerate (3HV) (40%). The properties associated with the produced PHA suggest that it is suitable for thermoplastic processing.
Bluschke, A; Roessner, V; Beste, C
2016-04-01
Attention-deficit/hyperactivity disorder (ADHD) is one of the most prevalent neuropsychiatric disorders in childhood. Besides inattention and hyperactivity, impulsivity is the third core symptom, leading to diverse and serious problems. However, the neuronal mechanisms underlying impulsivity in ADHD are still not fully understood. This is especially the case for patients with the ADHD combined subtype (ADHD-C), who are characterized by symptoms of both inattention and hyperactivity/impulsivity. Combining high-density electroencephalography (EEG) recordings with source localization analyses, we examined which information processing stages are dysfunctional in ADHD-C (n = 20) compared with controls (n = 18). Patients with ADHD-C made more impulsive errors in a Go/No-go task than healthy controls. Neurophysiologically, different subprocesses from perceptual gating to attentional selection, resource allocation and response selection are altered in this patient group. Perceptual gating, stimulus-driven attention selection and resource allocation processes were more pronounced in ADHD-C, are related to activation differences in parieto-occipital networks and suggest attentional filtering deficits. However, only response selection processes, associated with medial prefrontal networks, predicted impulsive errors in ADHD-C. Although the clinical picture of ADHD-C is complex and a multitude of processing steps are altered, only a subset of processes seems to directly modulate impulsive behaviour. The present findings improve the understanding of mechanisms underlying impulsivity in patients with ADHD-C and might help to refine treatment algorithms focusing on impulsivity.
Propagative selection of tilted array patterns in directional solidification
NASA Astrophysics Data System (ADS)
Song, Younggil; Akamatsu, Silvère; Bottin-Rousseau, Sabine; Karma, Alain
2018-05-01
We investigate the dynamics of tilted cellular/dendritic array patterns that form during directional solidification of a binary alloy when a preferred-growth crystal axis is misoriented with respect to the temperature gradient. In situ experimental observations and phase-field simulations in thin samples reveal the existence of a propagative source-sink mechanism of array spacing selection that operates on larger space and time scales than the competitive growth at play during the initial solidification transient. For tilted arrays, tertiary branching at the diverging edge of the sample acts as a source of new cells with a spacing that can be significantly larger than the initial average spacing. A spatial domain of large spacing then invades the sample propagatively. It thus yields a uniform spacing everywhere, selected independently of the initial conditions, except in a small region near the converging edge of the sample, which acts as a sink of cells. We propose a discrete geometrical model that describes the large-scale evolution of the spatial spacing profile based on the local dependence of the cell drift velocity on the spacing. We also derive a nonlinear advection equation that predicts the invasion velocity of the large-spacing domain, and sheds light on the fundamental nature of this process. The models also account for more complex spacing modulations produced by an irregular dynamics at the source, in good quantitative agreement with both phase-field simulations and experiments. This basic knowledge provides a theoretical basis to improve the processing of single crystals or textured polycrystals for advanced materials.
Method for materials deposition by ablation transfer processing
Weiner, K.H.
1996-04-16
A method in which a thin layer of semiconducting, insulating, or metallic material is transferred by ablation from a source substrate, coated uniformly with a thin layer of said material, to a target substrate, where said material is desired, with a pulsed, high intensity, patternable beam of energy. The use of a patternable beam allows area-selective ablation from the source substrate, resulting in additive deposition of the material onto the target substrate, which may require a very low percentage of the area to be covered. Since material is placed only where it is required, material waste can be minimized by reusing the source substrate for depositions on multiple target substrates. Due to the use of a pulsed, high intensity energy source, the target substrate remains at low temperature during the process, and thus low-temperature, low-cost transparent glass or plastic can be used as the target substrate. The method can be carried out at atmospheric pressure and room temperature, thus eliminating the vacuum systems normally required in materials deposition processes. This invention has particular application in the flat panel display industry, as well as in minimizing materials waste and associated costs. 1 fig.
NASA Astrophysics Data System (ADS)
Sajil Kumar, P. J.; Jegathambal, P.; James, E. J.
2014-12-01
This paper presents the results of investigations on groundwater nitrate contamination in the Dharapuram area of Tamil Nadu in south India as a primary step to initiate denitrification. Groundwater samples were collected from 26 selected locations during the pre-monsoon season in July 2010 and analysed for nitrate and other water quality parameters. Two important water types were identified, viz. Ca-Na-HCO3 and mixed Ca-Mg-Cl. It was found that the majority of samples possess a high nitrate concentration; 57 % of samples exceeded the permissible limit of the Indian (45 mg/L) and WHO (50 mg/L) drinking water standards. The spatial distribution map of NO3 showed that the major contamination occurs in the SW and NW parts of the study area. This result was in agreement with the corresponding land-use pattern of the study area. A denitrification process at greater depths was evident from the negative correlation between NO3 and well depth. The sources and controlling factors of high nitrate were investigated using cross plots of NO3 with other selected hydrochemical parameters. A positive correlation for NO3 was observed with EC, K, Cl and SO4. This analysis was capable of differentiating the various sources of nitrate in groundwater. The major sources of nitrate contamination are identified as areas of high fertilizer application, sewage and animal-waste dumping yards. Regulation of these pollutant sources with an appropriate and cost-effective denitrification process can restore the water quality in this area.
Moutsatsos, Ioannis K; Hossain, Imtiaz; Agarinis, Claudia; Harbinski, Fred; Abraham, Yann; Dobler, Luc; Zhang, Xian; Wilson, Christopher J; Jenkins, Jeremy L; Holway, Nicholas; Tallarico, John; Parker, Christian N
2017-03-01
High-throughput screening generates large volumes of heterogeneous data that require a diverse set of computational tools for management, processing, and analysis. Building integrated, scalable, and robust computational workflows for such applications is challenging but highly valuable. Scientific data integration and pipelining facilitate standardized data processing, collaboration, and reuse of best practices. We describe how Jenkins-CI, an "off-the-shelf," open-source, continuous integration system, is used to build pipelines for processing images and associated data from high-content screening (HCS). Jenkins-CI provides numerous plugins for standard compute tasks, and its design allows the quick integration of external scientific applications. Using Jenkins-CI, we integrated CellProfiler, an open-source image-processing platform, with various HCS utilities and a high-performance Linux cluster. The platform is web-accessible, facilitates access and sharing of high-performance compute resources, and automates previously cumbersome data and image-processing tasks. Imaging pipelines developed using the desktop CellProfiler client can be managed and shared through a centralized Jenkins-CI repository. Pipelines and managed data are annotated to facilitate collaboration and reuse. Limitations with Jenkins-CI (primarily around the user interface) were addressed through the selection of helper plugins from the Jenkins-CI community.
Molecular Diagnostics of the Interstellar Medium and Star Forming Regions
NASA Astrophysics Data System (ADS)
Hartquist, T. W.; Dalgarno, A.
1996-03-01
Selected examples of the use of observationally inferred molecular level populations and chemical compositions in the diagnosis of interstellar sources and processes important in them (and in other diffuse astrophysical sources) are given. The sources considered include the interclump medium of a giant molecular cloud, dark cores which are the progenitors of star formation, material responding to recent star formation and which may form further stars, and stellar ejecta (including those of supernovae) about to merge with the interstellar medium. The measurement of the microwave background, mixing of material between different nuclear burning zones in evolved stars and turbulent boundary layers (which are present in and influence the structures and evolution of all diffuse astrophysical sources) are treated.
Occipital TMS at phosphene detection threshold captures attention automatically.
Rangelov, Dragan; Müller, Hermann J; Taylor, Paul C J
2015-04-01
Strong stimuli may capture attention automatically, suggesting that attentional selection is determined primarily by physical stimulus properties. The mechanisms underlying capture remain controversial, in particular, whether feedforward subcortical processes are its main source. Also, it remains unclear whether only physical stimulus properties determine capture strength. Here, we demonstrate strong capture in the absence of feedforward input to subcortical structures such as the superior colliculus, by using transcranial magnetic stimulation (TMS) over occipital visual cortex as an attention cue. This implies that the feedforward sweep through subcortex is not necessary for capture to occur but rather provides an additional source of capture. Furthermore, seen cues captured attention more strongly than (physically identical) unseen cues, suggesting that the momentary state of the nervous system modulates attentional selection. In summary, we demonstrate the existence of several sources of attentional capture, and that both physical stimulus properties and the state of the nervous system influence capture. Copyright © 2015 Elsevier Inc. All rights reserved.
Image steganalysis using Artificial Bee Colony algorithm
NASA Astrophysics Data System (ADS)
Sajedi, Hedieh
2017-09-01
Steganography is the science of secure communication where the presence of the communication cannot be detected, while steganalysis is the art of discovering the existence of the secret communication. Processing a huge amount of information takes extensive execution time and computational resources. As a result, a preprocessing phase that moderates the execution time and computational load is needed. In this paper, we propose a new feature-based blind steganalysis method for detecting stego images from cover (clean) images in JPEG format. In this regard, we present a feature selection technique based on an improved Artificial Bee Colony (ABC) algorithm. The ABC algorithm is inspired by honeybees' social behaviour in their search for food sources. The proposed method is wrapper-based: candidate feature subsets are evaluated by classifier performance and the dimension of the selected feature vector. The experiments are performed using two large data sets of JPEG images. Experimental results demonstrate the effectiveness of the proposed steganalysis technique compared to other existing techniques.
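For reference, the core ABC search loop (employed, onlooker, and scout phases) can be sketched on a toy continuous objective. The paper adapts this search to binary feature masks with a classifier-based wrapper fitness, which is not reproduced here; everything below is a generic, illustrative sketch of standard ABC:

```python
import numpy as np

def abc_minimize(f, dim, bounds, n_food=10, limit=20, iters=300, seed=0):
    """Minimize f over [lo, hi]^dim with a basic Artificial Bee Colony loop."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    foods = rng.uniform(lo, hi, (n_food, dim))       # food sources = candidate solutions
    fit = np.array([f(x) for x in foods])
    trials = np.zeros(n_food)

    def local_search(i):
        # Perturb one coordinate toward/away from a random partner source
        k, j = rng.integers(n_food), rng.integers(dim)
        cand = foods[i].copy()
        cand[j] += rng.uniform(-1, 1) * (foods[i][j] - foods[k][j])
        cand = np.clip(cand, lo, hi)
        fc = f(cand)
        if fc < fit[i]:
            foods[i], fit[i], trials[i] = cand, fc, 0
        else:
            trials[i] += 1

    for _ in range(iters):
        for i in range(n_food):                      # employed bees
            local_search(i)
        quality = fit.max() - fit + 1e-12            # onlooker bees: quality-proportional
        quality /= quality.sum()
        for _ in range(n_food):
            local_search(rng.choice(n_food, p=quality))
        for i in range(n_food):                      # scout bees: abandon stale sources
            if trials[i] > limit:
                foods[i] = rng.uniform(lo, hi, dim)
                fit[i] = f(foods[i])
                trials[i] = 0
    best = fit.argmin()
    return foods[best], fit[best]

best_x, best_f = abc_minimize(lambda x: float((x ** 2).sum()),
                              dim=3, bounds=(-5.0, 5.0), seed=1)
```

In the feature-selection setting, each "food source" would instead be a feature mask, and `f` would combine classification error with the number of selected features.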
2008 Fuel Cell Technologies Market Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
DOE
Fuel cells are electrochemical devices that combine hydrogen and oxygen to produce electricity, water, and heat. Unlike batteries, fuel cells continuously generate electricity, as long as a source of fuel is supplied. Moreover, fuel cells do not burn fuel, making the process quiet, pollution-free and two to three times more efficient than combustion. Fuel cell systems can be a truly zero-emission source of electricity, if the hydrogen is produced from non-polluting sources. Global concerns about climate change, energy security, and air pollution are driving demand for fuel cell technology. More than 630 companies and laboratories in the United States are investing $1 billion a year in fuel cells or fuel cell component technologies. This report provides an overview of trends in the fuel cell industry and markets, including product shipments, market development, and corporate performance. It also provides snapshots of select fuel cell companies, including general business strategy and market focus, as well as financial information for select publicly-traded companies.
Validation and calibration of structural models that combine information from multiple sources.
Dahabreh, Issa J; Wong, John B; Trikalinos, Thomas A
2017-02-01
Mathematical models that attempt to capture structural relationships between their components and combine information from multiple sources are increasingly used in medicine. Areas covered: We provide an overview of methods for model validation and calibration and survey studies comparing alternative approaches. Expert commentary: Model validation entails a confrontation of models with data, background knowledge, and other models, and can inform judgments about model credibility. Calibration involves selecting parameter values to improve the agreement of model outputs with data. When the goal of modeling is quantitative inference on the effects of interventions or forecasting, calibration can be viewed as estimation. This view clarifies issues related to parameter identifiability and facilitates formal model validation and the examination of consistency among different sources of information. In contrast, when the goal of modeling is the generation of qualitative insights about the modeled phenomenon, calibration is a rather informal process for selecting inputs that result in model behavior that roughly reproduces select aspects of the modeled phenomenon, and cannot be equated to an estimation procedure. Current empirical research on validation and calibration methods consists primarily of methodological appraisals or case studies of alternative techniques and cannot address the numerous complex and multifaceted methodological decisions that modelers must make. Further research is needed on different approaches for developing and validating complex models that combine evidence from multiple sources.
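When calibration is viewed as estimation, the task reduces to choosing parameter values that minimize a discrepancy between model outputs and calibration targets. A minimal sketch with a hypothetical two-parameter exponential model and made-up target data (not from the article):

```python
import numpy as np
from scipy.optimize import least_squares

def model(params, t):
    """Hypothetical structural model: outcome = r * exp(-d * t)."""
    r, d = params
    return r * np.exp(-d * t)

# Made-up calibration targets (e.g. observed prevalence at follow-up times)
t_obs = np.array([0.0, 1.0, 2.0, 4.0])
y_obs = np.array([0.30, 0.22, 0.16, 0.09])

# Calibration-as-estimation: least-squares fit of parameters to targets
fit = least_squares(lambda p: model(p, t_obs) - y_obs, x0=[0.5, 0.5])
r_hat, d_hat = fit.x
```

Framing it this way makes standard estimation machinery available: residual diagnostics serve as a form of validation, and poor identifiability shows up as a flat or degenerate objective surface.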
Resiliency of the Multiscale Retinex Image Enhancement Algorithm
NASA Technical Reports Server (NTRS)
Rahman, Zia-Ur; Jobson, Daniel J.; Woodell, Glenn A.
1998-01-01
The multiscale retinex with color restoration (MSRCR) continues to prove itself in extensive testing to be a very versatile automatic image enhancement algorithm that simultaneously provides dynamic range compression, color constancy, and color rendition. However, issues remain with regard to the resiliency of the MSRCR to different image sources and arbitrary image manipulations which may have been applied prior to retinex processing. In this paper we define these areas of concern, provide experimental results, and examine the effects of commonly occurring image manipulations on retinex performance. In virtually all cases the MSRCR is highly resilient to the effects of both image source variations and commonly encountered prior image processing. Significant artifacts are primarily observed for the case of selective color channel clipping in large dark zones in an image. These issues are of concern in the processing of digital image archives and other applications where there is neither control over the image acquisition process, nor knowledge about any processing done on the data beforehand.
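The retinex core (without the color-restoration step of the full MSRCR) averages, over several spatial scales, the log ratio of the image to its Gaussian-smoothed surround. A minimal single-channel sketch, assuming SciPy is available; the scale choices below are illustrative, not the authors' published parameters:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def multiscale_retinex(img, sigmas=(15, 80, 250), eps=1e-6):
    """Multiscale retinex on one channel: mean over scales of
    log(image) - log(Gaussian surround). img: 2-D array, values > 0."""
    img = img.astype(float) + eps
    out = np.zeros_like(img)
    for sigma in sigmas:
        surround = gaussian_filter(img, sigma)
        out += np.log(img) - np.log(surround + eps)
    return out / len(sigmas)

# Illustrative input: a smooth brightness gradient
img = np.linspace(0.1, 1.0, 64 * 64).reshape(64, 64)
out = multiscale_retinex(img)
```

The log-ratio form makes clear why the paper's clipping artifact arises: pixels clipped to near zero in large dark zones drive `log(image)` toward large negative values that the surround term cannot compensate.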
López-Blanco, Rafael; Gilbert-López, Bienvenida; Rojas-Jiménez, Rubén; Robles-Molina, José; Ramos-Martos, Natividad; García-Reyes, Juan F; Molina-Díaz, Antonio
2016-05-15
The presence of BTEXS (benzene, toluene, ethylbenzene, xylenes and styrene) in virgin olive oils can be attributed to environmental contamination, but also to biological processes during oil lipogenesis (styrene). In this work, the processing factor of BTEXS from olives to olive oil during its production was evaluated at lab-scale with an Abencor system. Benzene showed the lowest processing factor (15%), whereas toluene and xylenes showed an intermediate behavior (with 40-60% efficiency), and ethylbenzene and styrene were completely transferred (100%). In addition, an attempt to examine the contribution of potential sources to the contamination of olives with BTEXS was carried out for the first time. Two types of olive samples were classified according to their proximity to the contamination source (road). Although higher levels of BTEXS were found in samples close to roads, the concentrations were relatively low and do not constitute a major contribution to the BTEXS usually detected in olive oil. Copyright © 2015 Elsevier Ltd. All rights reserved.
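The processing factor reported here is, in essence, a transfer mass balance from fruit to oil. A hedged sketch of that bookkeeping (the function name and the mass-basis convention are illustrative assumptions, not the paper's exact calculation):

```python
def processing_factor(conc_olives, conc_oil, oil_yield):
    """Fraction of an analyte's mass transferred from olives into oil.

    conc_olives, conc_oil: analyte concentrations (same units, per kg of matrix)
    oil_yield: kg of oil obtained per kg of olives (e.g. ~0.2 for a lab run)
    """
    return (conc_oil * oil_yield) / conc_olives
```

Under these conventions, a compound at 10 units/kg in olives that ends up at 50 units/kg in oil from a 20% oil yield has a processing factor of 1.0 (complete transfer), the behavior reported above for ethylbenzene and styrene, while benzene's 15% implies most of its mass stays in the pomace and water phases or is lost during malaxation.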
48 CFR 873.116 - Source selection decision.
Code of Federal Regulations, 2014 CFR
2014-10-01
... Source selection decision. (a) An integrated comparative assessment of proposals should be performed... source selection team, or advisory boards or panels, may conduct comparative analysis(es) of proposals...
48 CFR 873.116 - Source selection decision.
Code of Federal Regulations, 2013 CFR
2013-10-01
... Source selection decision. (a) An integrated comparative assessment of proposals should be performed... source selection team, or advisory boards or panels, may conduct comparative analysis(es) of proposals...
48 CFR 873.116 - Source selection decision.
Code of Federal Regulations, 2011 CFR
2011-10-01
... Source selection decision. (a) An integrated comparative assessment of proposals should be performed... source selection team, or advisory boards or panels, may conduct comparative analysis(es) of proposals...
48 CFR 873.116 - Source selection decision.
Code of Federal Regulations, 2012 CFR
2012-10-01
... Source selection decision. (a) An integrated comparative assessment of proposals should be performed... source selection team, or advisory boards or panels, may conduct comparative analysis(es) of proposals...
48 CFR 873.116 - Source selection decision.
Code of Federal Regulations, 2010 CFR
2010-10-01
... Source selection decision. (a) An integrated comparative assessment of proposals should be performed... source selection team, or advisory boards or panels, may conduct comparative analysis(es) of proposals...
Goodnough, L Henry; Dinuoscio, Gregg J; Ferguson, James W; Williams, Trevor; Lang, Richard A; Atit, Radhika P
2014-02-01
The cranial bones and dermis differentiate from mesenchyme beneath the surface ectoderm. Fate selection in cranial mesenchyme requires the canonical Wnt effector molecule β-catenin, but the relative contribution of Wnt ligand sources in this process remains unknown. Here we show Wnt ligands are expressed in cranial surface ectoderm and underlying supraorbital mesenchyme during dermal and osteoblast fate selection. Using conditional genetics, we eliminate secretion of all Wnt ligands from cranial surface ectoderm or undifferentiated mesenchyme, to uncover distinct roles for ectoderm- and mesenchyme-derived Wnts. Ectoderm Wnt ligands induce osteoblast and dermal fibroblast progenitor specification while initiating expression of a subset of mesenchymal Wnts. Mesenchyme Wnt ligands are subsequently essential during differentiation of dermal and osteoblast progenitors. Finally, ectoderm-derived Wnt ligands provide an inductive cue to the cranial mesenchyme for the fate selection of dermal fibroblast and osteoblast lineages. Thus two sources of Wnt ligands perform distinct functions during osteoblast and dermal fibroblast formation.
BioImageXD: an open, general-purpose and high-throughput image-processing platform.
Kankaanpää, Pasi; Paavolainen, Lassi; Tiitta, Silja; Karjalainen, Mikko; Päivärinne, Joacim; Nieminen, Jonna; Marjomäki, Varpu; Heino, Jyrki; White, Daniel J
2012-06-28
BioImageXD puts open-source computer science tools for three-dimensional visualization and analysis into the hands of all researchers, through a user-friendly graphical interface tuned to the needs of biologists. BioImageXD has no restrictive licenses or undisclosed algorithms and enables publication of precise, reproducible and modifiable workflows. It allows simple construction of processing pipelines and should enable biologists to perform challenging analyses of complex processes. We demonstrate its performance in a study of integrin clustering in response to selected inhibitors.
The "Eye Avoidance" Hypothesis of Autism Face Processing
ERIC Educational Resources Information Center
Tanaka, James W.; Sung, Andrew
2016-01-01
Although a growing body of research indicates that children with autism spectrum disorder (ASD) exhibit selective deficits in their ability to recognize facial identities and expressions, the source of their face impairment is, as yet, undetermined. In this paper, we consider three possible accounts of the autism face deficit: (1) the holistic…
ERIC Educational Resources Information Center
Primi, Ricardo
2002-01-01
Created two geometric inductive reasoning matrix tests by manipulating four sources of complexity orthogonally. Results for 313 undergraduates show that fluid intelligence is most strongly associated with the part of the central executive component of working memory that is related to controlled attention processing and selective encoding. (SLD)
Photosynthesis and growth of selected scotch pine seed sources
John C. Gordon; Gordon E. Gatherum
1968-01-01
A number of problems related to the culture of Scotch pine (Pinus sylvestris L.) arose following the increased planting of this species in Iowa. Therefore, a program of controlled-environment experiments to determine the effects of genetic and environmental factors on physiological processes important to the culture of Scotch pine was begun by the...
1988-03-01
operators' recommendations, certain select individuals are trained and used as SPC technicians. "POKA-YOKE" or mistake proofing from Shigeo Shingo's "Zero...Quality Control: Source Inspection and the POKA-YOKE System" has been locally applied to operators' processes with great success. This pre-control
Selected Legal Issues in Catholic Schools.
ERIC Educational Resources Information Center
Shaughnessy, Mary Angela
This book examines legal issues that affect Catholic high schools. Chapter 1 discusses sources of the law and how fairness and due process, federal and state statutes, and various guidelines shape the law. Tort law, corporal punishment, search and seizure, defamation of character, and negligence are covered in chapter 2. Chapter 3 details issues…
Foundations that Provide Support for Human Services: A Selected List.
ERIC Educational Resources Information Center
Smith, Bertha, Comp.
Lists of foundations can aid the user in securing funding sources for projects in the areas of health, education, community development and/or social services. The user is cautioned that grantsmanship is a competitive process; grants application procedures vary from one foundation to another. In the absence of any specific guidelines, important…
Preparation of α,β-unsaturated carboxylic acids and anhydrides
Spivey, James Jerry; Gogate, Makarand Ratnakav; Zoeller, Joseph Robert; Tustin, Gerald Charles
1998-01-01
Disclosed is a process for the preparation of α,β-unsaturated carboxylic acids and anhydrides thereof which comprises contacting formaldehyde or a source of formaldehyde with a carboxylic anhydride in the presence of a catalyst comprising mixed oxides of vanadium, phosphorus and, optionally, a third component selected from titanium, aluminum or, preferably, silicon.
Preparation of α,β-unsaturated carboxylic acids and anhydrides
Spivey, J.J.; Gogate, M.R.; Zoeller, J.R.; Tustin, G.C.
1998-01-20
Disclosed is a process for the preparation of α,β-unsaturated carboxylic acids and anhydrides thereof which comprises contacting formaldehyde or a source of formaldehyde with a carboxylic anhydride in the presence of a catalyst comprising mixed oxides of vanadium, phosphorus and, optionally, a third component selected from titanium, aluminum or, preferably, silicon.
ERIC Educational Resources Information Center
Dick, Anthony Steven
2012-01-01
Two experiments examined processes underlying cognitive inflexibility in set-shifting tasks typically used to assess the development of executive function in children. Adult participants performed a Flexible Item Selection Task (FIST) that requires shifting from categorizing by one dimension (e.g., color) to categorizing by a second orthogonal…
ERIC Educational Resources Information Center
Jorgensen, Earl; Mabry, Edward A.
During the past decade, the influence of electronically recorded music and the message it transmits have caused media scholars to reexamine and modify the theories upon which the basic process of communication is dependent. While the five primary functions (source, transmitter, channel, receiver, and destination) remain unchanged, an additional…
Photoacoustic Spectroscopy with Quantum Cascade Lasers for Trace Gas Detection
Elia, Angela; Di Franco, Cinzia; Lugarà, Pietro Mario; Scamarcio, Gaetano
2006-01-01
Various applications, such as pollution monitoring, toxic-gas detection, non-invasive medical diagnostics and industrial process control, require sensitive and selective detection of gas traces with concentrations in the parts in 10⁹ (ppb) and sub-ppb range. The recent development of quantum-cascade lasers (QCLs) has given a new aspect to infrared laser-based trace gas sensors. In particular, single mode distributed feedback QCLs are attractive spectroscopic sources because of their excellent properties in terms of narrow linewidth, average power and room temperature operation. In combination with these laser sources, photoacoustic spectroscopy offers the advantages of high sensitivity and selectivity, a compact sensor platform, fast time-response and user-friendly operation. This paper reports recent developments on quantum cascade laser-based photoacoustic spectroscopy for trace gas detection. In particular, different applications of a photoacoustic trace gas sensor employing a longitudinal resonant cell with a detection limit on the order of a hundred ppb of ozone and ammonia are discussed. We also report two QC laser-based photoacoustic sensors for the detection of nitric oxide, for environmental pollution monitoring and medical diagnostics, and hexamethyldisilazane, for applications in the semiconductor manufacturing process.
Radioactive waste management treatments: A selection for the Italian scenario
DOE Office of Scientific and Technical Information (OSTI.GOV)
Locatelli, G.; Mancini, M.; Sardini, M.
2012-07-01
The increased attention to radioactive waste management is one of the most peculiar aspects of the nuclear sector, considering both reactors and non-power sources. The aim of this paper is to present the state of the art of treatments for radioactive waste management all over the world in order to derive guidelines for radioactive waste management in the Italian scenario. Starting with an overview of the international situation, it analyses the different sources, amounts, treatments, and social and economic impacts, looking at countries with different industrial backgrounds, energy policies, geography and population. It lists all these treatments and selects the most reasonable according to technical, economic and social criteria. In particular, a double scenario is discussed (to be considered in case of small quantities of nuclear waste): the use of regional, centralized, off-site processing facilities, which accept waste from many nuclear plants, and the use of mobile systems, which can be transported among multiple nuclear sites for processing campaigns. At the end, the treatments suitable for the Italian scenario are presented, providing simplified work-flows and guidelines. (authors)
Challenges of UV light processing of low UVT foods and beverages
NASA Astrophysics Data System (ADS)
Koutchma, Tatiana
2010-08-01
Ultraviolet (UV) technology holds promise as a low-cost non-thermal alternative to heat pasteurization of liquid foods and beverages. However, its application for foods is still limited due to low UV transmittance (LUVT). LUVT foods have a diverse range of chemical (pH, Brix, Aw), physical (density and viscosity) and optical properties (absorbance and scattering) that are critical for system and process designs. The commercially available UV sources tested for foods include low and medium pressure mercury lamps (LPM and MPM), excimer and pulsed lamps (PUV). The LPM and excimer lamps are monochromatic sources, whereas the emission of MPM and PUV is polychromatic. The optimized design of UV systems and UV sources with parameters matched to specific product spectra has the potential to make UV treatments of LUVT foods more effective and will support their further commercialization. In order to select a UV source for a specific food application, processing effects on nutritional, quality, sensorial and safety markers have to be evaluated. This paper will review the current status of UV technology for food processing along with regulatory requirements. A discussion of approaches and results of measurements of chemico-physical and optical properties of various foods (fresh juices, milk, liquid whey proteins and sweeteners) that are critical for UV process and system design will follow. Available UV sources did not prove totally effective, either resulting in low microbial reduction or in UV over-dosing of the product, thereby leading to sensory changes. Beam shaping of UV light presents new opportunities to improve dosage uniformity and the delivery of UV photons in LUVT foods.
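For low-UVT liquids, the depth-averaged fluence rate follows directly from Beer-Lambert attenuation, which is why absorbance dominates dose delivery. A sketch under idealized collimated-beam assumptions (a perfectly mixed, non-scattering layer, conditions real LUVT foods only approximate):

```python
import math

def avg_fluence_rate(i0, absorbance_per_cm, depth_cm):
    """Depth-averaged fluence rate (same units as i0) for a collimated beam.

    Beer-Lambert: I(z) = i0 * 10**(-A*z). Averaging over 0..d gives
    i0 * (1 - 10**(-A*d)) / (A * d * ln 10).
    """
    a = absorbance_per_cm * depth_cm
    if a < 1e-9:  # optically thin limit: essentially no attenuation
        return i0
    return i0 * (1.0 - 10.0 ** (-a)) / (a * math.log(10.0))
```

A juice with absorbance 10 cm⁻¹ in a 1 cm channel receives on average only about 4% of the surface fluence rate, which illustrates numerically why thin-film reactors or the beam-shaping approaches mentioned above are needed to avoid simultaneously under-treating the bulk and over-dosing the surface.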
Social-cognitive processes in preschoolers' selective trust: three cultures compared.
Lucas, Amanda J; Lewis, Charlie; Pala, F Cansu; Wong, Katie; Berridge, Damon
2013-03-01
Research on preschoolers' selective learning has mostly been conducted in English-speaking countries. We compared the performance of Turkish preschoolers (who are exposed to a language with evidential markers), Chinese preschoolers (known to be advanced in executive skills), and English preschoolers on an extended selective trust task (N = 144). We also measured children's executive function skills and their ability to attribute false belief. Overall we found a Turkish (rather than a Chinese) advantage in selective trust and a relationship between selective trust and false belief (rather than executive function). This is the first evidence that exposure to a language that obliges speakers to state the sources of their knowledge may sensitize preschoolers to informant reliability. It is also the first demonstration of an association between false belief and selective trust. Together these findings suggest that effective selective learning may progress alongside children's developing capacity to assess the knowledge of others.
Torabifard, Mina; Arjmandi, Reza; Rashidi, Alimorad; Nouri, Jafar; Mohammadfam, Iraj
2018-01-10
The health and environmental effects of chemical processes can be assessed during the initial stage of their production. In this paper, the Chemical Screening Tool for Exposure and Environmental Release (ChemSTEER) software was used to compare the health and environmental risks of spray pyrolysis and wet chemical techniques for the fabrication of nanostructured metal oxide on a semi-industrial scale with a capacity of 300 kg/day in Iran. The pollution sources identified in each production process were pairwise compared in Expert Choice software using indicators including respiratory damage, skin damage, and environmental damages including air, water, and soil pollution. The synthesis of nanostructured zinc oxide using the wet chemical technique (with 0.523 wt%) leads to lower health and environmental risks compared to when spray pyrolysis is used (with 0.477 wt%). The health and environmental risk assessment of nanomaterial production processes can help select safer processes, modify the operation conditions, and select or modify raw materials that can help eliminate the risks.
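The pairwise comparisons performed in Expert Choice reduce, in AHP, to extracting a priority vector from a reciprocal comparison matrix. A minimal sketch using the common geometric-mean approximation to the principal eigenvector (the two-alternative matrix below is illustrative, not the study's data):

```python
import numpy as np

def ahp_priorities(pairwise):
    # Geometric mean of each row, normalized to sum to 1, approximates
    # the principal eigenvector of a reciprocal comparison matrix.
    gm = np.prod(pairwise, axis=1) ** (1.0 / pairwise.shape[1])
    return gm / gm.sum()

# Illustrative: alternative A judged twice as preferable as B.
w = ahp_priorities(np.array([[1.0, 2.0], [0.5, 1.0]]))  # weights ≈ [2/3, 1/3]
```

With several criteria (respiratory damage, skin damage, air, water, and soil pollution, as in the paper), the same computation is applied once per criterion and once to the criteria themselves, and the final scores are the weighted sums, which is how scalar rankings like 0.523 versus 0.477 arise.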
Scandium recovery from slags after oxidized nickel ore processing
NASA Astrophysics Data System (ADS)
Smyshlyaev, Denis; Botalov, Maxim; Bunkov, Grigory; Rychkov, Vladimir; Kirillov, Evgeny; Kirillov, Sergey; Semenishchev, Vladimir
2017-09-01
One of the possible sources of scandium production, waste (slags) from the processing of oxidized nickel ores, has been considered in the present research work. The hydrometallurgical method has been selected as the primary route for scandium extraction. Different reagents for the leaching of scandium, such as sulfuric acid, various carbonate salts and fluorides, have been tested. Sulfuric acid has been recognized as the optimal leaching reagent. A sulfuric acid concentration of 100 g L⁻¹ allowed recovering up to 97% of scandium.
USDA-ARS?s Scientific Manuscript database
The objective was to quantify the effect of marketing group (MG) on the variability of primal quality. Pigs (N=7,684) were slaughtered in 3 MGs from 8 barns. Pigs were from genetic selection programs focused on lean growth (L; group 1 n=1,131; group 2 n=1,466; group 3 n=1,030) or superior meat qua...
The search and selection of assisted living facilities by elders and family.
Castle, Nicholas G; Sonon, Kristen E
2007-08-01
In this study, we examine factors associated with the search, selection, and satisfaction of residents and family members in assisted living. Data were collected from 375 residents of 25 assisted living facilities matched with 375 family members. We conducted face-to-face interviews with the residents to determine: (1) the principal decision maker; (2) the process of searching for a facility; (3) the factors crucial to facility selection; (4) the time frame from the relocation decision to relocation; and (5) satisfaction with selection. Similar questions were asked of family members, using a mail survey. Residents described themselves as extremely influential in 39% of cases when searching for a facility, and in 27% of cases when selecting a facility. Quality, cost, and location were the most influential factors for both residents and family members in selecting a facility. Almost all residents and family would use different search and selection processes if they had to select a facility again, and almost all were dissatisfied with the sources of information available. Consumers and policy makers both favor the use of assisted living settings; but, the information available to choose a setting is far from ideal, and may represent a barrier to both consumer and policy makers' agendas.
Processing Uav and LIDAR Point Clouds in Grass GIS
NASA Astrophysics Data System (ADS)
Petras, V.; Petrasova, A.; Jeziorska, J.; Mitasova, H.
2016-06-01
Today's methods of acquiring Earth surface data, namely lidar and unmanned aerial vehicle (UAV) imagery, non-selectively collect or generate large amounts of points. Point clouds from different sources vary in their properties such as number of returns, density, or quality. We present a set of tools with applications for different types of point clouds obtained by a lidar scanner, the structure from motion technique (SfM), and a low-cost 3D scanner. To take advantage of the vertical structure of multiple return lidar point clouds, we demonstrate tools to process them using 3D raster techniques which allow, for example, the development of custom vegetation classification methods. Dense point clouds obtained from UAV imagery, often containing redundant points, can be decimated using various techniques before further processing. We implemented and compared several decimation techniques in regard to their performance and the final digital surface model (DSM). Finally, we will describe the processing of a point cloud from a low-cost 3D scanner, namely Microsoft Kinect, and its application for interaction with physical models. All the presented tools are open source and integrated in GRASS GIS, a multi-purpose open source GIS with remote sensing capabilities. The tools integrate with other open source projects, specifically the Point Data Abstraction Library (PDAL), the Point Cloud Library (PCL), and the OpenKinect libfreenect2 library, to benefit from the open source point cloud ecosystem. The implementation in GRASS GIS ensures long-term maintenance and reproducibility by the scientific community but also by the original authors themselves.
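One of the simplest decimation strategies for redundant UAV point clouds is grid-based thinning: keep a single point per occupied cell. A minimal NumPy sketch (the cell size and keep-first rule are arbitrary illustrative choices; the GRASS GIS tools offer richer options, such as every-nth and count-based decimation):

```python
import numpy as np

def grid_decimate(points, cell_size):
    """Keep the first point encountered in each occupied XY grid cell.

    points: (n, 3) array of x, y, z coordinates
    cell_size: edge length of the square thinning cell
    """
    cells = np.floor(points[:, :2] / cell_size).astype(np.int64)
    # np.unique over rows returns the index of the first point in each cell.
    _, keep = np.unique(cells, axis=0, return_index=True)
    return points[np.sort(keep)]
```

Choosing the cell size trades point count against DSM fidelity, which is the comparison the paper carries out across decimation techniques.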
The feminist approach in the decision-making process for treatment of women with breast cancer.
Szumacher, Ewa
2006-09-01
The principal aim of this review was to investigate a feminist approach to the decision-making process for women with breast cancer. Empirical research into patient preferences for being informed about and participating in healthcare decisions has some limitations because it is mostly quantitative and designed within the dominant medical culture. Indigenous medical knowledge and alternative medical treatments are not widely accepted because of the lack of confirmed efficacy of such treatments in evidence-based literature. While discussing their treatment options with oncologists, women with breast cancer frequently express many concerns regarding treatment side effects, and sometimes decline conventional treatment when the risks are too high. A search of all relevant literary sources, including PubMed, ERIC, Medline, and the Ontario Institute for Studies in Education at the University of Toronto, was conducted. The key words for selection of the articles were "feminism," "decision-making," "patients preferences for treatment," and "breast cancer." Fifty-one literary sources were selected. The review was divided into the following themes: (1) limitations of the patient decision-making process in conventional medicine; (2) participation of native North American patients in healthcare decisions; (3) towards a feminist approach to breast cancer; and (4) towards a feminist theory of breast cancer. This article discusses the importance of a feminist approach to the decision-making process for treatment of patients with breast cancer. As the literature suggests, the needs of minority patients are not completely fulfilled in Western medical culture. Introducing feminist theory into evidence-based medicine will help patients to be better informed about treatment choices and will assist them to select treatment according to their own beliefs and values.
Wang, S F; Zhan, S Y
2016-07-01
Electronic healthcare databases have become an important source for active surveillance of drug safety in the era of big data. The traditional epidemiology research designs are needed to confirm the association between drug use and adverse events based on these datasets, and the selection of the comparative control is essential to each design. This article aims to explain the principle and application of each type of control selection, introduce the methods and parameters for method comparison, and describe the latest achievements in the batch processing of control selection, which would provide important methodological reference for the use of electronic healthcare databases to conduct post-marketing drug safety surveillance in China.
Multiple source/multiple target fluid transfer apparatus
Turner, Terry D.
1997-01-01
A fluid transfer apparatus includes: a) a plurality of orifices for connection with fluid sources; b) a plurality of orifices for connection with fluid targets; c) a set of fluid source conduits and fluid target conduits associated with the orifices; d) a pump fluidically interposed between the source and target conduits to transfer fluid therebetween; e) a purge gas conduit in fluid communication with the fluid source conduits, fluid target conduits and pump to receive and pass a purge gas under pressure; f) a solvent conduit in fluid communication with the fluid source conduits, fluid target conduits and pump to receive and pass solvent, the solvent conduit including a solvent valve; g) pump control means for controlling operation of the pump; h) purge gas valve control means for controlling operation of the purge gas valve to selectively impart flow of purge gas to the fluid source conduits, fluid target conduits and pump; i) solvent valve control means for controlling operation of the solvent valve to selectively impart flow of solvent to the fluid source conduits, fluid target conduits and pump; and j) source and target valve control means for controlling operation of the fluid source conduit valves and the fluid target conduit valves to selectively impart passage of fluid between a selected one of the fluid source conduits and a selected one of the fluid target conduits through the pump and to enable passage of solvent or purge gas through selected fluid source conduits and selected fluid target conduits.
Multiple source/multiple target fluid transfer apparatus
Turner, T.D.
1997-08-26
A fluid transfer apparatus includes: (a) a plurality of orifices for connection with fluid sources; (b) a plurality of orifices for connection with fluid targets; (c) a set of fluid source conduits and fluid target conduits associated with the orifices; (d) a pump fluidically interposed between the source and target conduits to transfer fluid there between; (e) a purge gas conduit in fluid communication with the fluid source conduits, fluid target conduits and pump to receive and pass a purge gas under pressure; (f) a solvent conduit in fluid communication with the fluid source conduits, fluid target conduits and pump to receive and pass solvent, the solvent conduit including a solvent valve; (g) pump control means for controlling operation of the pump; (h) purge gas valve control means for controlling operation of the purge gas valve to selectively impart flow of purge gas to the fluid source conduits, fluid target conduits and pump; (i) solvent valve control means for controlling operation of the solvent valve to selectively impart flow of solvent to the fluid source conduits, fluid target conduits and pump; and (j) source and target valve control means for controlling operation of the fluid source conduit valves and the fluid target conduit valves to selectively impart passage of fluid between a selected one of the fluid source conduits and a selected one of the fluid target conduits through the pump and to enable passage of solvent or purge gas through selected fluid source conduits and selected fluid target conduits. 6 figs.
Reconfigurable pipelined processor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saccardi, R.J.
1989-09-19
This patent describes a reconfigurable pipelined processor for processing data. It comprises: a plurality of memory devices for storing bits of data; a plurality of arithmetic units for performing arithmetic functions with the data; cross bar means for connecting the memory devices with the arithmetic units for transferring data therebetween; at least one counter connected with the cross bar means for providing a source of addresses to the memory devices; at least one variable tick delay device connected with each of the memory devices and arithmetic units; and means for providing control bits to the variable tick delay device for variably controlling the input and output operations thereof to selectively delay the memory devices and arithmetic units to align the data for processing in a selected sequence.
Mushroom-free selective epitaxial growth of Si, SiGe and SiGe:B raised sources and drains
NASA Astrophysics Data System (ADS)
Hartmann, J. M.; Benevent, V.; Barnes, J. P.; Veillerot, M.; Lafond, D.; Damlencourt, J. F.; Morvan, S.; Prévitali, B.; Andrieu, F.; Loubet, N.; Dutartre, D.
2013-05-01
We have evaluated various Cyclic Selective Epitaxial Growth/Etch (CSEGE) processes in order to grow "mushroom-free" Si and SiGe:B Raised Sources and Drains (RSDs) on each side of ultra-short gate length Extra-Thin Silicon-On-Insulator (ET-SOI) transistors. The 750 °C, 20 Torr Si CSEGE process we have developed (5 chlorinated growth steps with four HCl etch steps in-between) yielded excellent crystalline quality, typically 18 nm thick Si RSDs. Growth was conformal along the Si₃N₄ sidewall spacers, without any poly-Si mushrooms on top of unprotected gates. We have then evaluated on blanket 300 mm Si(001) wafers the feasibility of a 650 °C, 20 Torr SiGe:B CSEGE process (5 chlorinated growth steps with four HCl etch steps in-between, as for Si). As expected, the deposited thickness decreased as the total HCl etch time increased. This came hand in hand with an unforeseen (i) decrease of the mean Ge concentration (from 30% down to 26%) and (ii) increase of the substitutional B concentration (from 2 × 10²⁰ cm⁻³ up to 3 × 10²⁰ cm⁻³). They were due to fluctuations of the Ge concentration and of the atomic B concentration [B] in such layers (drop of the Ge% and increase of [B] at etch step locations). Such blanket layers were a bit rougher than layers grown using a single epitaxy step, but nevertheless of excellent crystalline quality. Transposition of our CSEGE process onto patterned ET-SOI wafers did not yield the expected results. HCl etch steps indeed helped in partly or totally removing the poly-SiGe:B mushrooms on top of the gates. This was however at the expense of the crystalline quality and 2D nature of the ~45 nm thick Si₀.₇Ge₀.₃:B recessed sources and drains selectively grown on each side of the imperfectly protected poly-Si gates. The only solution we have so far identified that yields a smaller amount of mushrooms while preserving the quality of the S/D is to increase the HCl flow during the growth steps.
Natural Resource Information System. Volume 1: Overall description
NASA Technical Reports Server (NTRS)
1972-01-01
A prototype computer-based Natural Resource Information System was designed which could store, process, and display data of maximum usefulness to land management decision making. The system includes graphic input and display, the use of remote sensing as a data source, and it is useful at multiple management levels. A survey established current decision making processes and functions, information requirements, and data collection and processing procedures. The applications of remote sensing data and processing requirements were established. Processing software was constructed and a data base established using high-altitude imagery and map coverage of selected areas of SE Arizona. Finally a demonstration of system processing functions was conducted utilizing material from the data base.
TRACC: An open source software for processing sap flux data from thermal dissipation probes
Ward, Eric J.; Domec, Jean-Christophe; King, John; ...
2017-05-02
Here, thermal dissipation probes (TDPs) have become a widely used method of monitoring plant water use in recent years. The use of TDPs requires calibration to a theoretical zero-flow value (ΔT0), usually based upon the assumption that at least some nighttime measurements represent zero-flow conditions. Fully automating the processing of data from TDPs is made exceedingly difficult by errors arising from many sources. However, it is desirable to minimize variation arising from different researchers' processing of data, and thus a common platform for processing data, including editing raw data and determining ΔT0, is useful and increases the transparency and replicability of TDP-based research. Here, we present the TDP data processing software TRACC (Thermal dissipation Review Assessment Cleaning and Conversion) to serve this purpose. TRACC is an open-source software package written in the language R, using graphical presentation of data and on-screen prompts with yes/no or simple numerical responses. It allows the user to select several important options, such as calibration coefficients and the exclusion of nights when vapor pressure deficit does not approach zero. Although it is designed for users with no coding experience, the outputs of TRACC could be easily incorporated into more complex models or software.
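The ΔT0 value that TRACC helps determine enters the standard TDP calibration of Granier, which converts the probe temperature difference into sap flux density. A hedged Python sketch of that conversion (the coefficients are the widely cited original Granier values; species-specific recalibrations, which the software's selectable coefficients accommodate, will differ):

```python
def sap_flux_density(dT, dT0, a=118.99e-6, b=1.231):
    """Granier-style calibration: u = a * K**b (m^3 m^-2 s^-1),
    where K = (dT0 - dT) / dT is the dimensionless flow index
    and dT0 is the zero-flow temperature difference."""
    K = (dT0 - dT) / dT
    return a * K ** b
```

Because K depends on ΔT0 nonlinearly, small errors in the zero-flow baseline propagate into every flux estimate, which is why a shared, transparent procedure for determining ΔT0 matters as much as the raw measurements.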
DOE Office of Scientific and Technical Information (OSTI.GOV)
This Interim Record of Decision (ROD) presents the selected remedial action for vadose zone soil at Site 24 at Marine Corps Air Station (MCAS) El Toro, located in El Toro, California. The selected remedy for remediation of soil at Site 24 is soil vapor extraction (SVE), the US EPA presumptive remedy for VOC-contaminated soil. The process uses a vacuum to pull VOC-contaminated vapors from the soil through SVE wells.
Mohammed, Yassene; Domański, Dominik; Jackson, Angela M; Smith, Derek S; Deelder, André M; Palmblad, Magnus; Borchers, Christoph H
2014-06-25
One challenge in Multiple Reaction Monitoring (MRM)-based proteomics is to select the most appropriate surrogate peptides to represent a target protein. We present here a software package to automatically generate these most appropriate surrogate peptides for an LC/MRM-MS analysis. Our method integrates information about the proteins, their tryptic peptides, and the suitability of these peptides for MRM, which is available online in UniProtKB, NCBI's dbSNP, ExPASy, PeptideAtlas, PRIDE, and GPMDB. The scoring algorithm reflects our knowledge in choosing the best candidate peptides for MRM, based on the uniqueness of the peptide in the targeted proteome, its physicochemical properties, and whether it has previously been observed. The modularity of the workflow allows further extension and additional selection criteria to be incorporated. We have developed a simple Web interface where the researcher provides the protein accession number, the subject organism, and peptide-specific options. Currently, the software is designed for human and mouse proteomes, but additional species can easily be added. Our software improved peptide selection by eliminating human error, considering multiple data sources and all of the isoforms of the protein, and resulted in faster peptide selection: approximately 50 proteins per hour, compared to 8 per day. Compiling a list of optimal surrogate peptides for target proteins to be analyzed by LC/MRM-MS has been a cumbersome process, in which expert researchers retrieved information from different online repositories and used their own reasoning to find the most appropriate peptides. Our scientific workflow automates this process by integrating information from different data sources including UniProt, Global Proteome Machine, NCBI's dbSNP, and PeptideAtlas, simulating the researchers' reasoning, and incorporating their knowledge of how to select the best proteotypic peptides for an MRM analysis.
The developed software can help to standardize the selection of peptides, eliminate human error, and increase productivity. Copyright © 2014 Elsevier B.V. All rights reserved.
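The kind of rule-based scoring such a workflow encodes can be sketched as follows; the weights, cutoffs, and function are illustrative assumptions, not the published algorithm:

```python
def score_peptide(seq, unique_in_proteome, previously_observed):
    """Toy surrogate-peptide score; higher is a better MRM candidate."""
    score = 0.0
    if unique_in_proteome:
        score += 3.0            # peptide must map to a single protein
    if previously_observed:
        score += 2.0            # e.g. already seen in PeptideAtlas / GPMDB
    if 7 <= len(seq) <= 25:
        score += 1.0            # length range favorable for LC/MRM-MS
    if not any(r in seq for r in "MC"):
        score += 1.0            # avoid Met/Cys, prone to modification
    if "KP" not in seq and "RP" not in seq:
        score += 0.5            # trypsin cleaves poorly before proline
    return score

# Rank candidate tryptic peptides for one protein (toy data).
peptides = [("LVNEVTEFAK", True, True), ("MCSSLR", False, False)]
best = max(peptides, key=lambda p: score_peptide(*p))
```

Real selection criteria would also weigh physicochemical properties and known sequence variants (dbSNP), as the abstract describes.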
Naser, Mohamed A.; Patterson, Michael S.
2011-01-01
Reconstruction algorithms are presented for two-step solutions of the bioluminescence tomography (BLT) and fluorescence tomography (FT) problems. In the first step, a continuous wave (cw) diffuse optical tomography (DOT) algorithm is used to reconstruct the tissue optical properties, assuming known anatomical information provided by x-ray computed tomography or other methods. Minimization problems are formed based on L1-norm objective functions, where normalized values for the light fluence rates and the corresponding Green's functions are used. Then an iterative minimization solution shrinks the permissible regions where sources are allowed by selecting points with higher probability to contribute to the source distribution. Throughout this process the permissible region shrinks from the entire object to just a few points. The optimum reconstructed bioluminescence and fluorescence distributions are chosen to be the results of the iteration corresponding to the permissible region where the objective function has its global minimum. This provides efficient BLT and FT reconstruction algorithms without the need for a priori information about the bioluminescence sources or the fluorophore concentration. Multiple small sources and large distributed sources can be reconstructed with good accuracy for the location and the total source power for BLT, and for the total number of fluorophore molecules for FT. For non-uniform distributed sources, the size and magnitude become degenerate due to the degrees of freedom available for possible solutions. However, increasing the number of data points by increasing the number of excitation sources can improve the accuracy of reconstruction for non-uniform fluorophore distributions. PMID:21326647
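The permissible-region idea can be sketched as below. This is an assumed simplification, not the paper's algorithm: ordinary least squares with non-negativity clipping stands in for the L1-norm minimization, and the shrink fraction is arbitrary.

```python
import numpy as np

def shrink_region(G, y, n_iter=5, keep_frac=0.5):
    """Iteratively fit source strengths x so that G @ x ~ y, shrinking the
    set of voxels ("permissible region") allowed to hold sources each pass.
    G: (n_meas, n_voxels) Green's-function matrix; y: measured data."""
    region = np.arange(G.shape[1])            # start: every voxel permissible
    x = np.zeros(G.shape[1])
    for _ in range(n_iter):
        sub, *_ = np.linalg.lstsq(G[:, region], y, rcond=None)
        sub = np.clip(sub, 0, None)           # source power cannot be negative
        x = np.zeros(G.shape[1])
        x[region] = sub
        k = max(1, int(len(region) * keep_frac))
        region = region[np.argsort(sub)[-k:]] # keep only the strongest points
    return x
```

With an identity Green's matrix the iteration collapses, as expected, onto the voxel carrying the signal.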
syris: a flexible and efficient framework for X-ray imaging experiments simulation.
Faragó, Tomáš; Mikulík, Petr; Ershov, Alexey; Vogelgesang, Matthias; Hänschke, Daniel; Baumbach, Tilo
2017-11-01
An open-source framework for conducting a broad range of virtual X-ray imaging experiments, syris, is presented. The simulated wavefield created by a source propagates through an arbitrary number of objects until it reaches a detector. The objects in the light path and the source are time-dependent, which enables simulations of dynamic experiments, e.g. four-dimensional time-resolved tomography and laminography. The high-level interface of syris is written in Python and its modularity makes the framework very flexible. The computationally demanding parts behind this interface are implemented in OpenCL, which enables fast calculations on modern graphics processing units. The combination of flexibility and speed opens new possibilities for studying novel imaging methods and systematic search of optimal combinations of measurement conditions and data processing parameters. This can help to increase the success rates and efficiency of valuable synchrotron beam time. To demonstrate the capabilities of the framework, various experiments have been simulated and compared with real data. To show the use case of measurement and data processing parameter optimization based on simulation, a virtual counterpart of a high-speed radiography experiment was created and the simulated data were used to select a suitable motion estimation algorithm; one of its parameters was optimized in order to achieve the best motion estimation accuracy when applied on the real data. syris was also used to simulate tomographic data sets under various imaging conditions which impact the tomographic reconstruction accuracy, and it is shown how the accuracy may guide the selection of imaging conditions for particular use cases.
Dual sensitivity mode system for monitoring processes and sensors
Wilks, Alan D.; Wegerich, Stephan W.; Gross, Kenneth C.
2000-01-01
A method and system for analyzing a source of data. The system and method involves initially training a system using a selected data signal, calculating at least two levels of sensitivity using a pattern recognition methodology, activating a first mode of alarm sensitivity to monitor the data source, activating a second mode of alarm sensitivity to monitor the data source and generating a first alarm signal upon the first mode of sensitivity detecting an alarm condition and a second alarm signal upon the second mode of sensitivity detecting an associated alarm condition. The first alarm condition and second alarm condition can be acted upon by an operator and/or analyzed by a specialist or computer program.
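A toy version of the dual-sensitivity idea might look like the following; the simple mean ± k·σ thresholds are an assumption standing in for the patented pattern-recognition training, and the function names are hypothetical:

```python
import statistics

def train(reference):
    """'Train' on a selected reference signal: summarize it by mean and
    standard deviation (a stand-in for the pattern-recognition step)."""
    return statistics.fmean(reference), statistics.pstdev(reference)

def monitor(sample, mu, sigma, k_sensitive=2.0, k_conservative=4.0):
    """Return (first_alarm, second_alarm): a sensitive mode that trips on
    small deviations and a conservative mode that trips on large ones."""
    dev = abs(sample - mu)
    return dev > k_sensitive * sigma, dev > k_conservative * sigma
```

An operator (or a downstream program) could treat the sensitive alarm as an early warning and the conservative alarm as a confirmed fault, mirroring the two modes in the abstract.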
Code of Federal Regulations, 2010 CFR
2010-10-01
..., and marking of contractor bid or proposal information and source selection information. 603.104-4... contractor bid or proposal information and source selection information. (a) The following classes of persons may be authorized to receive contractor bid or proposal information or source selection information by...
Shim, Kyusung; Do, Nhu Tri; An, Beongku
2017-01-01
In this paper, we study the physical layer security (PLS) of opportunistic scheduling for uplink scenarios of multiuser multirelay cooperative networks. To this end, we propose a low-complexity source relay selection scheme with comparable secrecy performance, called the proposed source relay selection (PSRS) scheme. Specifically, the PSRS scheme first selects the least vulnerable source and then selects the relay that maximizes the system secrecy capacity for the given selected source. Additionally, the maximal ratio combining (MRC) technique and the selection combining (SC) technique are considered at the eavesdropper, respectively. Investigating the system performance in terms of secrecy outage probability (SOP), closed-form expressions of the SOP are derived. The developed analysis is corroborated through Monte Carlo simulation. Numerical results show that the PSRS scheme significantly improves the secrecy performance of the system compared to that of the random source relay selection scheme, but does not outperform the optimal joint source relay selection (OJSRS) scheme. However, the PSRS scheme drastically reduces the required amount of channel state information (CSI) estimation compared to that required by the OJSRS scheme, especially in dense cooperative networks. PMID:28212286
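The two-stage selection logic can be sketched as follows. This is an assumed reading of the scheme with toy SNR numbers, using the standard model of secrecy capacity as the positive part of the main-link/eavesdropper-link rate difference:

```python
import math

def secrecy_capacity(snr_main, snr_eve):
    """max(0, log2(1+SNR_main) - log2(1+SNR_eve)), in bits/s/Hz."""
    return max(0.0, math.log2(1 + snr_main) - math.log2(1 + snr_eve))

def psrs(source_to_eve_snr, relay_snr, relay_eve_snr):
    """Stage 1: pick the least vulnerable source (weakest channel to the
    eavesdropper), which needs no relay CSI. Stage 2: for that source only,
    pick the relay maximizing the secrecy capacity."""
    s = min(range(len(source_to_eve_snr)), key=lambda i: source_to_eve_snr[i])
    r = max(range(len(relay_snr[s])),
            key=lambda j: secrecy_capacity(relay_snr[s][j], relay_eve_snr[j]))
    return s, r
```

Because stage 2 only examines relay CSI for the one selected source, the CSI requirement grows linearly rather than with the product of sources and relays, which matches the abstract's point about dense networks.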
Marshall, Tom R; den Boer, Sebastiaan; Cools, Roshan; Jensen, Ole; Fallon, Sean James; Zumer, Johanna M
2018-01-01
Selective attention is reflected neurally in changes in the power of posterior neural oscillations in the alpha (8-12 Hz) and gamma (40-100 Hz) bands. Although a neural mechanism that allows relevant information to be selectively processed has its advantages, it may lead to lucrative or dangerous information going unnoticed. Neural systems are also in place for processing rewarding and punishing information. Here, we examine the interaction between selective attention (left vs. right) and a stimulus's learned value associations (neutral, punished, or rewarded) and how they compete for control of posterior neural oscillations. We found that both attention and stimulus-value associations influenced neural oscillations. Whereas selective attention had comparable effects on alpha and gamma oscillations, value associations had dissociable effects on these neural markers of attention. Salient targets (associated with positive and negative outcomes) hijacked changes in alpha power, increasing hemispheric alpha lateralization when salient targets were attended and decreasing it when they were being ignored. In contrast, hemispheric gamma-band lateralization was specifically abolished by negative distractors. Source analysis indicated occipital generators of both attentional and value effects. Thus, posterior cortical oscillations support both the ability to selectively attend while at the same time retaining the ability to remain sensitive to valuable features in the environment. Moreover, the versatility of our attentional system in responding separately to salient and to merely positively valued stimuli appears to be carried out by separate neural processes reflected in different frequency bands.
Selectivity of a lithium-recovery process based on LiFePO4.
Trócoli, Rafael; Battistel, Alberto; La Mantia, Fabio
2014-08-04
The demand for lithium will increase in the near future to 713,000 tonnes per year. Although lake brines account for 80% of production, existing methods for purification of lithium from this source are expensive, slow, and inefficient. A novel electrochemical process with low energy consumption and the ability to increase the purity of a brine solution to close to 98% with a single-stage galvanostatic cycle is presented. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Sites that Can Produce Left-handed Amino Acids in the Supernova Neutrino Amino Acid Processing Model
NASA Astrophysics Data System (ADS)
Boyd, Richard N.; Famiano, Michael A.; Onaka, Takashi; Kajino, Toshitaka
2018-03-01
The Supernova Neutrino Amino Acid Processing model, which uses electron anti-neutrinos and the magnetic field from a source object such as a supernova to selectively destroy one amino acid chirality, is studied for possible sites that would produce meteoroids with partially left-handed amino acids. Several sites appear to provide the requisite magnetic field intensities and electron anti-neutrino fluxes. These results have obvious implications for the origin of life on Earth.
[Methodologic inconsistency in anamnesis education at medical schools].
Zago, M A
1989-01-01
Some relevant points of the process of obtaining the medical anamnesis and physical examination, and of formulating diagnostic hypotheses, are analyzed. The main methodological features include: preponderance of qualitative data, absence of preselected hypotheses, direct involvement of the observer (physician) with the data source (patient), and selection of hypotheses and changes in the patient during the process. Thus, diagnostic investigation does not follow the paradigm of the quantitative scientific method, rooted in logical positivism, which dominates medical research and education.
Implications of scale-independent habitat specialization on persistence of a rare small mammal
Cleaver, Michael; Klinger, Robert C.; Anderson, Steven T.; Maier, Paul A.; Clark, Jonathan
2015-01-01
We assessed the habitat use patterns of the Amargosa vole Microtus californicus scirpensis, an endangered rodent endemic to wetland vegetation along a 3.5 km stretch of the Amargosa River in the Mojave Desert, USA. Our goals were to: (1) quantify the vole's abundance, occupancy rates and habitat selection patterns along gradients of vegetation cover and spatial scale; (2) identify the processes that likely had the greatest influence on its habitat selection patterns. We trapped voles monthly in six 1 ha grids from January to May 2012 and measured habitat structure at subgrid (225 m²) and trap (1 m²) scales in winter and spring seasons. Regardless of scale, analyses of density, occupancy and vegetation structure consistently indicated that voles occurred in patches of bulrush (Schoenoplectus americanus; Cyperaceae) where cover >50%. The majority of evidence indicates the vole's habitat selectivity is likely driven by bulrush providing protection from intense predation. However, a combination of selective habitat use and limited movement resulted in a high proportion of apparently suitable bulrush patches being unoccupied. This suggests the Amargosa vole's habitat selection behavior confers individual benefits but may not allow the overall population to persist in a changing environment.
Help or Hurt? Why We Select and How We Process Online Social Information about Health
ERIC Educational Resources Information Center
Hocevar, Kristin Page
2017-01-01
Health information is increasingly being shared online not just by credentialed sources such as physicians or health organizations, but also by patients with personal experience with a health concern. This dissertation proposes a new measure of vigilance-avoidance, or tendency to approach or avoid threatening stimuli, in order to understand how…
2005-05-01
...efficiencies similar to those in the private sector. However, along the way, Government and private sector industry have begun to disagree about how PPI is... double that of the private sector due to an evaluation process that is cumbersome, time-consuming, and lacking the efficiencies enjoyed by private
A Selected Bibliography on Employee Attitude Surveys. Special Report.
ERIC Educational Resources Information Center
Blasingame, Margaret C.
This bibliography provides an up-to-date compilation of literature covering all aspects of the employee attitude survey process. It is intended for both researcher and practitioner. A wide variety of sources are represented with a focus on the past 10-12 years of attitude survey research. The 473 citations compiled are categorized under 7 separate…
Soil conservation service tests of Eucalyptus species for windbreaks
Gary L. Young
1983-01-01
The Soil Conservation Service is in the early stages of testing many species of Eucalyptus for windbreaks. Over 260 different species have been collected. The pre-planting selection criteria and process are described, as well as the test conditions and procedures. Some sources of information on the use of the Eucalypts may be misleading through...
USDA-ARS?s Scientific Manuscript database
Tomatoes (Solanum lycopersicum L.) are an important source of nutrients in contemporary diets due to readily available fresh fruit and processed products, their popularity, and the sheer volume consumed. This study is part of a larger project undertaken by the Agricultural Research Service (ARS) to...
Cloudy 94 and Applications to Quasar Emission Line Regions
NASA Technical Reports Server (NTRS)
Ferland, Gary J.
2000-01-01
This review discusses the most recent developments of the plasma simulation code Cloudy and its application to the emission-line regions of quasars. The long-term goal is to develop the tools needed to determine the chemical composition of the emitting gas and the luminosity of the central engine for any emission-line source. Emission lines and the underlying thermal continuum are formed in plasmas that are far from thermodynamic equilibrium. Their thermal and ionization states are the result of a balance of a vast set of microphysical processes. Once produced, radiation must propagate out of the (usually) optically thick source. No analytic solutions are possible, and recourse to numerical simulations is necessary. I am developing the large-scale plasma simulation code Cloudy as an investigative tool for this work, much as an observer might build a spectrometer. This review describes the current version of Cloudy, version 94. It describes improvements made since the release of the previous version, C90. The major recent application has been the development of the "Locally Optimally-Emitting Cloud" (LOC) model of AGN emission-line regions. Powerful selection effects, introduced by the atomic physics and the line formation process, permit individual lines to form most efficiently only near certain selected parameters. These selection effects, together with the presence of gas with a wide range of conditions, are enough to reproduce the spectrum of a typical quasar with little dependence on details. The spectrum actually carries little information as to the identity of the emitters. I view this as a major step forward since it provides a method to handle accidental details at the source, so that we can concentrate on essential information such as the luminosity or chemical composition of the quasar.
[Pharmacological treatment conciliation methodology in patients with multiple conditions].
Alfaro-Lara, Eva Rocío; Vega-Coca, María Dolores; Galván-Banqueri, Mercedes; Nieto-Martín, María Dolores; Pérez-Guerrero, Concepción; Santos-Ramos, Bernardo
2014-02-01
To carry out a bibliographic review in order to identify the different methodologies used along the reconciliation process of drug therapy applicable to polypathological patients. We performed a literature review. Data sources: the bibliographic review (February 2012) included the following databases: PubMed, EMBASE, CINAHL, PsycINFO and the Spanish Medical Index (IME). The different methodologies identified in those databases to measure the conciliation process in polypathological patients, or otherwise elderly or polypharmacy patients, were studied. Study selection: two hundred and seventy-three articles were retrieved, of which 25 were selected. Data extraction: specifically, the level of care, the sources of information, the use of registration forms, the established time frame, the professional in charge, and the registered variables, such as reconciliation errors. Most of the selected studies performed reconciliation when the patient was admitted to the hospital and after hospital discharge. The main sources of information to be highlighted are the patient interview and the medical history. An established time frame is not explicitly stated in most of them, nor is a registration form used. The main professional in charge is the clinical pharmacologist. Apart from home medication, habits of self-medication and phytotherapy are also identified. Common reconciliation errors range from the omission of drugs to interactions with other medicinal products (drug interactions). There is large heterogeneity in the methodologies used for reconciliation. No work addresses the specific figure of the polypathological patient, who precisely requires a standardized methodology due to the complexity and susceptibility to reconciliation errors involved. Copyright © 2012 Elsevier España, S.L. All rights reserved.
The sources of adaptive variation.
Charlesworth, Deborah; Barton, Nicholas H; Charlesworth, Brian
2017-05-31
The role of natural selection in the evolution of adaptive phenotypes has undergone constant probing by evolutionary biologists, employing both theoretical and empirical approaches. As Darwin noted, natural selection can act together with other processes, including random changes in the frequencies of phenotypic differences that are not under strong selection, and changes in the environment, which may reflect evolutionary changes in the organisms themselves. As understanding of genetics developed after 1900, the new genetic discoveries were incorporated into evolutionary biology. The resulting general principles were summarized by Julian Huxley in his 1942 book Evolution: the modern synthesis. Here, we examine how recent advances in genetics, developmental biology and molecular biology, including epigenetics, relate to today's understanding of the evolution of adaptations. We illustrate how careful genetic studies have repeatedly shown that apparently puzzling results in a wide diversity of organisms involve processes that are consistent with neo-Darwinism. They do not support important roles in adaptation for processes such as directed mutation or the inheritance of acquired characters, and therefore no radical revision of our understanding of the mechanism of adaptive evolution is needed. © 2017 The Author(s).
X-Ray Spectral Properties of Seven Heavily Obscured Seyfert 2 Galaxies
NASA Astrophysics Data System (ADS)
Marchesi, S.; Ajello, M.; Comastri, A.; Cusumano, G.; La Parola, V.; Segreto, A.
2017-02-01
We present the combined Chandra and Swift-BAT spectral analysis of seven Seyfert 2 galaxies selected from the Swift-BAT 100-month catalog. We selected nearby (z ≤ 0.03) sources lacking a ROSAT counterpart that had never previously been observed with Chandra in the 0.3-10 keV energy range, and targeted these objects with 10 ks Chandra ACIS-S observations. The X-ray spectral fitting over the 0.3-150 keV energy range allows us to determine that all the objects are significantly obscured, with N_H ≥ 10²³ cm⁻² at a >99% confidence level. Moreover, one to three sources are candidate Compton-thick Active Galactic Nuclei (CT-AGNs; i.e., N_H ≥ 10²⁴ cm⁻²). We also test the recent spectral curvature method developed by Koss et al. to find candidate CT-AGNs, finding good agreement between our results and their predictions. Because the selection criteria we adopted were effective in detecting highly obscured AGNs, further observations of these and other Seyfert 2 galaxies selected from the Swift-BAT 100-month catalog will allow us to create a statistically significant sample of highly obscured AGNs, therefore providing a better understanding of the physics of the obscuration processes.
Hong, Xiangfei; Wang, Yao; Sun, Junfeng; Li, Chunbo; Tong, Shanbao
2017-08-29
Successfully inhibiting a prepotent response tendency requires the attentional detection of signals which cue response cancellation. Although neuroimaging studies have identified important roles of stimulus-driven processing in the attentional detection, the effects of top-down control were scarcely investigated. In this study, scalp EEG was recorded from thirty-two participants during a modified Go/NoGo task, in which a spatial-cueing approach was implemented to manipulate top-down selective attention. We observed classical event-related potential components, including N2 and P3, in the attended condition of response inhibition. While in the ignored condition of response inhibition, a smaller P3 was observed and N2 was absent. The correlation between P3 and CNV during the foreperiod suggested an inhibitory role of P3 in both conditions. Furthermore, source analysis suggested that P3 generation was mainly localized to the midcingulate cortex, and the attended condition showed increased activation relative to the ignored condition in several regions, including inferior frontal gyrus, middle frontal gyrus, precentral gyrus, insula and uncus, suggesting that these regions were involved in top-down attentional control rather than inhibitory processing. Taken together, by segregating electrophysiological correlates of top-down selective attention from those of response inhibition, our findings provide new insights in understanding the neural mechanisms of response inhibition.
Fast coincidence counting with active inspection systems
NASA Astrophysics Data System (ADS)
Mullens, J. A.; Neal, J. S.; Hausladen, P. A.; Pozzi, S. A.; Mihalczo, J. T.
2005-12-01
This paper describes measurements of 2nd- and 3rd-order time coincidence distributions with a GHz processor that synchronously samples 5 or 10 channels of data from radiation detectors near fissile material. Time coincidence distributions are measured on-line between detectors or between detectors and an external stimulating source. Detector-to-detector correlations are useful for passive measurements as well. The processor also measures the number of times n pulses occur in a selectable time window and compares this multiplet distribution to a Poisson distribution as a method of determining the occurrence of fission. The detectors respond to radiation emitted in the fission process induced internally by inherent sources or by external sources such as LINACs or DT generators, either pulsed or steady state with alpha detectors, etc. Data can be acquired from prompt emission during the source pulse, prompt emissions immediately after the source pulse, or delayed emissions between source pulses. These types of time coincidence measurements (occurring on the time scale of the fission chain multiplication processes for nuclear weapons-grade U and Pu) are useful for determining the presence of these fissile materials and quantifying the amount, and are useful for counterterrorism and nuclear material control and accountability. This paper presents the results for a variety of measurements.
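The multiplet test described above can be sketched in an illustrative analysis (not the processor firmware): count how many pulses fall in each fixed time window, tabulate the multiplet distribution, and compare it with the Poisson distribution of the same mean, since fission chains produce an excess of high-multiplicity windows.

```python
import math
from collections import Counter

def multiplet_distribution(pulse_times, window):
    """Count pulses per fixed-width time window; return the multiplet
    distribution {n: number of windows containing n pulses} and the
    total number of windows covering the data."""
    bins = Counter(int(t // window) for t in pulse_times)
    n_windows = int(max(pulse_times) // window) + 1
    counts = Counter(bins.values())
    counts[0] = n_windows - len(bins)   # windows that saw no pulses
    return counts, n_windows

def poisson_pmf(n, mean):
    """Reference Poisson probability of n pulses in a window."""
    return math.exp(-mean) * mean ** n / math.factorial(n)
```

Comparing `counts[n] / n_windows` against `poisson_pmf(n, mean)` for each n flags non-Poisson (correlated) emission.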
NASA Astrophysics Data System (ADS)
Maringanti, Chetan; Chaubey, Indrajeet; Popp, Jennie
2009-06-01
Best management practices (BMPs) are effective in reducing the transport of agricultural nonpoint source pollutants to receiving water bodies. However, selection of BMPs for placement in a watershed requires optimization of the available resources to obtain maximum possible pollution reduction. In this study, an optimization methodology is developed to select and place BMPs in a watershed to provide solutions that are both economically and ecologically effective. This novel approach develops and utilizes a BMP tool, a database that stores the pollution reduction and cost information of different BMPs under consideration. The BMP tool replaces the dynamic linkage of the distributed parameter watershed model during optimization and therefore reduces the computation time considerably. Total pollutant load from the watershed, and net cost increase from the baseline, were the two objective functions minimized during the optimization process. The optimization model, consisting of a multiobjective genetic algorithm (NSGA-II) in combination with a watershed simulation tool (Soil and Water Assessment Tool (SWAT)), was developed and tested for nonpoint source pollution control in the L'Anguille River watershed located in eastern Arkansas. The optimized solutions provided a trade-off between the two objective functions for sediment, phosphorus, and nitrogen reduction. The results indicated that buffer strips were very effective in controlling the nonpoint source pollutants from leaving the croplands. The optimized BMP plans resulted in potential reductions of 33%, 32%, and 13% in sediment, phosphorus, and nitrogen loads, respectively, from the watershed.
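The trade-off the two objective functions produce can be illustrated with a minimal Pareto-front sketch (not SWAT or NSGA-II themselves): each candidate BMP plan carries a pollutant load and a net cost, and the optimizer keeps only the non-dominated plans.

```python
def pareto_front(plans):
    """plans: list of (pollutant_load, net_cost) tuples, both to be minimized.
    Returns the non-dominated subset: a plan is dropped only if some other
    plan is at least as good on both objectives."""
    front = []
    for p in plans:
        dominated = any(q[0] <= p[0] and q[1] <= p[1] and q != p
                        for q in plans)
        if not dominated:
            front.append(p)
    return front
```

NSGA-II builds on exactly this non-dominated sorting, adding crowding-distance selection and genetic operators to search the space of plans.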
NASA Technical Reports Server (NTRS)
Gordon, Pierce E. C.; Colozza, Anthony J.; Hepp, Aloysius F.; Heller, Richard S.; Gustafson, Robert; Stern, Ted; Nakamura, Takashi
2011-01-01
Oxygen production from lunar raw materials is critical for sustaining a manned lunar base but is very power intensive. Solar concentrators are a well-developed technology for harnessing the Sun's energy to heat regolith to high temperatures (over 1375 K). The high temperature and potential material incompatibilities present numerous technical challenges. This study compares and contrasts different solar concentrator designs that have been developed, such as Cassegrains, offset parabolas, compound parabolic concentrators, and secondary concentrators. Differences between concentrators made from lenses and mirrors, and between rigid and flexible concentrators, are also discussed. Possible substrate elements for a rigid mirror concentrator are selected and then compared, using the following (target) criteria: (low) coefficient of thermal expansion, (high) modulus of elasticity, and (low) density. Several potential lunar locations for solar concentrators are compared; environmental and processing-related challenges related to dust and optical surfaces are addressed. This brief technology survey examines various sources of thermal energy that can be utilized for materials processing on the lunar surface. These include heat from nuclear or electric sources and solar concentrators. Options for collecting and transporting thermal energy to processing reactors for each source are examined. Overall system requirements for each thermal source are compared and system limitations, such as maximum achievable temperature, are discussed.
48 CFR 715.370 - Alternative source selection procedures.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Alternative source selection procedures. 715.370 Section 715.370 Federal Acquisition Regulations System AGENCY FOR INTERNATIONAL DEVELOPMENT CONTRACTING METHODS AND CONTRACT TYPES CONTRACTING BY NEGOTIATION Source Selection 715...
Process for forming retrograde profiles in silicon
Weiner, K.H.; Sigmon, T.W.
1996-10-15
A process is disclosed for forming retrograde and oscillatory profiles in crystalline and polycrystalline silicon. The process consists of introducing an n- or p-type dopant into the silicon, or using previously doped silicon, and then exposing the silicon to multiple pulses of a high-intensity laser or other appropriate energy source that melts the silicon for a short duration. Depending on the number of laser pulses directed at the silicon, retrograde profiles with varying peak/surface dopant concentrations are produced. The laser treatment can be performed in air or in vacuum, with the silicon at room temperature or heated to a selected temperature.
Craven, Galen T; Nitzan, Abraham
2018-01-28
Statistical properties of Brownian motion that arise by analyzing, separately, trajectories over which the system energy increases (upside) or decreases (downside) with respect to a threshold energy level are derived. This selective analysis is applied to examine transport properties of a nonequilibrium Brownian process that is coupled to multiple thermal sources characterized by different temperatures. Distributions, moments, and correlation functions of a free particle that occur during upside and downside events are investigated for energy activation and energy relaxation processes and also for positive and negative energy fluctuations from the average energy. The presented results are sufficiently general and can be applied without modification to the standard Brownian motion. This article focuses on the mathematical basis of this selective analysis. In subsequent articles in this series, we apply this general formalism to processes in which heat transfer between thermal reservoirs is mediated by activated rate processes that take place in a system bridging them.
NASA Astrophysics Data System (ADS)
Craven, Galen T.; Nitzan, Abraham
2018-01-01
Statistical properties of Brownian motion that arise by analyzing, separately, trajectories over which the system energy increases (upside) or decreases (downside) with respect to a threshold energy level are derived. This selective analysis is applied to examine transport properties of a nonequilibrium Brownian process that is coupled to multiple thermal sources characterized by different temperatures. Distributions, moments, and correlation functions of a free particle that occur during upside and downside events are investigated for energy activation and energy relaxation processes and also for positive and negative energy fluctuations from the average energy. The presented results are sufficiently general and can be applied without modification to the standard Brownian motion. This article focuses on the mathematical basis of this selective analysis. In subsequent articles in this series, we apply this general formalism to processes in which heat transfer between thermal reservoirs is mediated by activated rate processes that take place in a system bridging them.
Source-independent full waveform inversion of seismic data
Lee, Ki Ha
2006-02-14
A set of seismic trace data is collected in an input data set that is first Fourier transformed in its entirety into the frequency domain. A normalized wavefield is obtained for each trace of the input data set in the frequency domain. Normalization is done with respect to the frequency response of a reference trace selected from the set of seismic trace data. The normalized wavefield is source independent, complex, and dimensionless. The normalized wavefield is shown to be uniquely defined as the normalized impulse response, provided that a certain condition is met for the source. This property allows construction of the inversion algorithm disclosed herein without any source or source-coupling information. The algorithm minimizes the error between the data normalized wavefield and the model normalized wavefield. The methodology is applicable to any 3-D seismic problem, and damping may be easily included in the process.
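A minimal sketch of the normalization idea above, assuming each trace shares the same (unknown) source signature S(w): dividing each trace's spectrum by a reference trace's spectrum cancels S(w), since D_i(w)/D_ref(w) = G_i(w)S(w)/(G_ref(w)S(w)) = G_i(w)/G_ref(w). The function name and the eps guard are illustrative, not from the patent.

```python
import numpy as np

def normalized_wavefield(traces, ref_index=0, eps=1e-12):
    """traces: (n_traces, n_samples) array of time-domain seismic traces.
    Returns the complex, dimensionless normalized wavefield per trace in the
    frequency domain, referenced to the trace at ref_index."""
    spectra = np.fft.rfft(traces, axis=1)   # Fourier transform each trace
    ref = spectra[ref_index]
    return spectra / (ref + eps)            # eps guards against spectral zeros
```

With two traces that are scaled copies of one wavelet, the result no longer depends on the wavelet itself, which is the source independence the inversion exploits.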
An automated multi-scale network-based scheme for detection and location of seismic sources
NASA Astrophysics Data System (ADS)
Poiata, N.; Aden-Antoniow, F.; Satriano, C.; Bernard, P.; Vilotte, J. P.; Obara, K.
2017-12-01
We present a recently developed method - BackTrackBB (Poiata et al. 2016) - that makes it possible to image energy radiation from different seismic sources (e.g., earthquakes, LFEs, tremors) in different tectonic environments using continuous seismic records. The method exploits multi-scale frequency-selective coherence in the wave field recorded by regional seismic networks or local arrays. The detection and location scheme is based on space-time reconstruction of the seismic sources through an imaging function built from the sum of station-pair time-delay likelihood functions, projected onto theoretical 3D time-delay grids. This imaging function is interpreted as the location likelihood of the seismic source. A signal pre-processing step constructs a multi-band statistical representation of the nonstationary signal (time series) by means of higher-order statistics or energy-envelope characteristic functions. This signal processing is designed to detect signal transients in time - of different scales and a priori unknown predominant frequency - potentially associated with a variety of sources (e.g., earthquakes, LFEs, tremors), and to improve the performance and robustness of the detection-and-location step. The initial detection-location, based on a single-phase analysis with the P- or S-phase only, can then be improved recursively in a station selection scheme. This scheme - exploiting the 3-component records - makes use of P- and S-phase characteristic functions, extracted after a polarization analysis of the event waveforms, and combines the single-phase imaging functions with the S-P differential imaging functions. The performance of the method is demonstrated here in different tectonic environments: (1) analysis of the one-year-long precursory phase of the 2014 Iquique earthquake in Chile; and (2) detection and location of tectonic tremor sources and low-frequency earthquakes during multiple episodes of tectonic tremor activity in southwestern Japan.
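The station-pair stacking idea above can be sketched in one dimension: build a smooth characteristic function per station, correlate each station pair at the time delay predicted for a candidate source position, and sum over pairs to form the imaging function. Everything here (wave speed, geometry, Gaussian envelope) is invented for illustration and is not the BackTrackBB implementation.

```python
import numpy as np

V = 3.0                                   # assumed wave speed (km/s)
stations = np.array([0.0, 10.0, 20.0])    # station coordinates (km)
grid = np.linspace(0.0, 20.0, 201)        # candidate source positions (km)
dt = 0.01                                 # sample interval (s)
t = np.arange(0.0, 10.0, dt)

def characteristic_function(arrival):
    """Smooth envelope peaking at the arrival time (stands in for the
    statistical characteristic function of a detected transient)."""
    return np.exp(-0.5 * ((t - arrival) / 0.1) ** 2)

true_x = 7.0   # hidden source position; origin time 2.0 s
cfs = [characteristic_function(2.0 + abs(true_x - s) / V) for s in stations]

def imaging_function(x):
    """Stack station-pair correlations at the delays predicted for source x."""
    score = 0.0
    for i in range(len(stations)):
        for j in range(i + 1, len(stations)):
            delay = (abs(x - stations[i]) - abs(x - stations[j])) / V
            shifted = np.roll(cfs[j], int(round(delay / dt)))  # align j to i
            score += float(np.dot(cfs[i], shifted))
    return score

best = grid[np.argmax([imaging_function(x) for x in grid])]
```

The imaging function peaks where all station pairs align, recovering the source position without picking individual phase arrivals.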
Ma, Yujun; Wang, Enguo; Yuan, Tian; Zhao, Guo Xiang
2016-08-01
As the reading process is inseparable from working memory, inhibition, and other higher cognitive processes, the deep cognitive processing defects that are associated with dyslexia may be due to defective distraction inhibition systems. In this study, we used event-related potential technology to explore the source of negative priming effects in children with developmental dyslexia and in a group of healthy children for comparison. We found that the changes in the average response times in the negative priming and control conditions were consistent across the two groups, while the magnitude of the negative priming effect differed significantly between the groups, being significantly higher in the control group than in the developmental dyslexia group. These results indicate that there are deficits in distraction inhibition in children with developmental dyslexia. In terms of the time course of processing, inhibition deficits in the dyslexia group appeared during early-stage cognition selection and lasted through the response selection phase. Regarding the cerebral cortex locations, early-stage cognition selection was mainly located in the parietal region, while late-stage response selection was mainly located in the frontal and central regions. The results of our study may help further our understanding of the intrinsic causes of developmental dyslexia. Copyright © 2016 Elsevier Ltd. All rights reserved.
The NASA Goddard Group's Source Monitoring Database and Program
NASA Astrophysics Data System (ADS)
Gipson, John; Le Bail, Karine; Ma, Chopo
2014-12-01
Beginning in 2003, the Goddard VLBI group developed a program to purposefully monitor when sources were observed and to increase the observations of "under-observed" sources. The heart of the program consists of a MySQL database that keeps track of, on a session-by-session basis: the number of observations that are scheduled for a source, the number of observations that are successfully correlated, and the number of observations that are used in a session. In addition, there is a table that contains the target number of successful sessions over the last twelve months. Initially this table just contained two categories. Sources in the geodetic catalog had a target of 12 sessions/year; the remaining ICRF-1 defining sources had a target of two sessions/year. All other sources did not have a specific target. As the program evolved, different kinds of sources with different observing targets were added. During the scheduling process, the scheduler has the option of automatically selecting N sources which have not met their target. We discuss the history and present some results of this successful program.
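A hypothetical miniature of the database query described above, using SQLite in place of MySQL; the table names, columns, and source targets are invented for illustration, not the Goddard schema. The query counts a source's successful sessions and compares the count against its per-year target to find under-observed sources for the scheduler.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE source_target (source TEXT PRIMARY KEY, sessions_per_year INTEGER);
CREATE TABLE session_use   (source TEXT, session TEXT, obs_used INTEGER);
""")
con.executemany("INSERT INTO source_target VALUES (?, ?)",
                [("0059+581", 12), ("3C418", 2), ("OJ287", 2)])
con.executemany("INSERT INTO session_use VALUES (?, ?, ?)",
                [("0059+581", "R1" + str(i), 40) for i in range(12)] +
                [("3C418", "R4100", 25)])

def under_observed(limit):
    """Sources whose successful-session count is below their yearly target,
    most-neglected first; limit mimics the scheduler's 'select N sources'."""
    rows = con.execute("""
        SELECT t.source
        FROM source_target t
        LEFT JOIN session_use u ON u.source = t.source AND u.obs_used > 0
        GROUP BY t.source
        HAVING COUNT(u.session) < t.sessions_per_year
        ORDER BY COUNT(u.session) ASC
        LIMIT ?""", (limit,)).fetchall()
    return [r[0] for r in rows]
```

The LEFT JOIN matters: a source with zero successful sessions still appears in the result, which is exactly the case the monitoring program was built to catch.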
Border, Shana E
2018-01-01
Natural selection has been shown to drive population differentiation and speciation. The role of sexual selection in this process is controversial; however, most of the work has centered on mate choice while the role of male–male competition in speciation is relatively understudied. Here, we outline how male–male competition can be a source of diversifying selection on male competitive phenotypes, and how this can contribute to the evolution of reproductive isolation. We highlight how negative frequency-dependent selection (advantage of rare phenotype arising from stronger male–male competition between similar male phenotypes compared with dissimilar male phenotypes) and disruptive selection (advantage of extreme phenotypes) drives the evolution of diversity in competitive traits such as weapon size, nuptial coloration, or aggressiveness. We underscore that male–male competition interacts with other life-history functions and that variable male competitive phenotypes may represent alternative adaptive options. In addition to competition for mates, aggressive interference competition for ecological resources can exert selection on competitor signals. We call for a better integration of male–male competition with ecological interference competition since both can influence the process of speciation via comparable but distinct mechanisms. Altogether, we present a more comprehensive framework for studying the role of male–male competition in speciation, and emphasize the need for better integration of insights gained from other fields studying the evolutionary, behavioral, and physiological consequences of agonistic interactions. PMID:29492042
Blind column selection protocol for two-dimensional high performance liquid chromatography.
Burns, Niki K; Andrighetto, Luke M; Conlan, Xavier A; Purcell, Stuart D; Barnett, Neil W; Denning, Jacquie; Francis, Paul S; Stevenson, Paul G
2016-07-01
The selection of two orthogonal columns for two-dimensional high performance liquid chromatography (LC×LC) separation of natural product extracts can be a labour intensive and time consuming process and in many cases is an entirely trial-and-error approach. This paper introduces a blind optimisation method for column selection of a black box of constituent components. A data processing pipeline, created in the open source application OpenMS®, was developed to map the components within the mixture of equal mass across a library of HPLC columns; LC×LC separation space utilisation was compared by measuring the fractional surface coverage, fcoverage. It was found that for a test mixture from an opium poppy (Papaver somniferum) extract, the combination of diphenyl and C18 stationary phases provided a predicted fcoverage of 0.48 and was matched with an actual usage of 0.43. OpenMS®, in conjunction with algorithms designed in house, have allowed for a significantly quicker selection of two orthogonal columns, which have been optimised for a LC×LC separation of crude extractions of plant material. Copyright © 2016 Elsevier B.V. All rights reserved.
Anchoring in Numeric Judgments of Visual Stimuli
Langeborg, Linda; Eriksson, Mårten
2016-01-01
This article investigates effects of anchoring in age estimation and estimation of quantities, two tasks which to different extents are based on visual stimuli. The results are compared to anchoring in answers to classic general knowledge questions that rely on semantic knowledge. Cognitive load was manipulated to explore possible differences between domains. Effects of source credibility, manipulated by differing instructions regarding the selection of anchor values (no information regarding anchor selection, information that the anchors are randomly generated or information that the anchors are answers from an expert) on anchoring were also investigated. Effects of anchoring were large for all types of judgments but were not affected by cognitive load or by source credibility in either one of the researched domains. A main effect of cognitive load on quantity estimations and main effects of source credibility in the two visually based domains indicate that the manipulations were efficient. Implications for theoretical explanations of anchoring are discussed. In particular, because anchoring did not interact with cognitive load, the results imply that the process behind anchoring in visual tasks is predominantly automatic and unconscious. PMID:26941684
TMS evidence for a selective role of the precuneus in source memory retrieval.
Bonnì, Sonia; Veniero, Domenica; Mastropasqua, Chiara; Ponzo, Viviana; Caltagirone, Carlo; Bozzali, Marco; Koch, Giacomo
2015-04-01
The posteromedial cortex including the precuneus (PC) is thought to be involved in episodic memory retrieval. Here we used continuous theta burst stimulation (cTBS) to disentangle the role of the precuneus in the recognition memory process in a sample of healthy subjects. During the encoding phase, subjects were presented with a series of colored pictures. Afterwards, during the retrieval phase, all previously presented items and a sample of new pictures were presented in black, and subjects were asked to indicate whether each item was new or old, and in the latter case to indicate the associated color. cTBS was delivered over PC, posterior parietal cortex (PPC) and vertex before the retrieval phase. The data were analyzed in terms of hits, false alarms, source errors and omissions. cTBS over the precuneus, but not over the PPC or the vertex, induced a selective decrease in source memory errors, indicating an improvement in context retrieval. All the other accuracy measurements were unchanged. These findings suggest a direct implication of the precuneus in successful context-dependent retrieval. Copyright © 2015 Elsevier B.V. All rights reserved.
Framework for Selecting Best Practices in Public Health: A Systematic Literature Review
de Colombani, Pierpaolo
2015-01-01
Evidence-based public health has commonly relied on findings from empirical studies, or research-based evidence. However, this paper advocates that practice-based evidence derived from programmes implemented in real-life settings is likely to be a more suitable source of evidence for inspiring and guiding public health programmes. Selection of best practices from the array of implemented programmes is one way of generating such practice-based evidence. Yet the lack of consensus on the definition and criteria for practice-based evidence and best practices has limited their application in public health so far. To address the gap in literature on practice-based evidence, this paper hence proposes measures of success for public health interventions by developing an evaluation framework for selection of best practices. The proposed framework was synthesised from a systematic literature review of peer-reviewed and grey literature on existing evaluation frameworks for public health programmes as well as processes employed by health-related organisations when selecting best practices. A best practice is firstly defined as an intervention that has shown evidence of effectiveness in a particular setting and is likely to be replicable to other situations. Regardless of the area of public health, interventions should be evaluated by their context, process and outcomes. A best practice should hence meet most, if not all, of eight identified evaluation criteria: relevance, community participation, stakeholder collaboration, ethical soundness, replicability, effectiveness, efficiency and sustainability. Ultimately, a standardised framework for selection of best practices will improve the usefulness and credibility of practice-based evidence in informing evidence-based public health interventions. Significance for public health: Best practices are a valuable source of practice-based evidence on effective public health interventions implemented in real-life settings. 
Yet, despite the frequent branding of interventions as best practices or good practices, there is no consensus on the definition and desirable characteristics of such best practices. Hence, this is likely to be the first systematic review on the topic of best practices in public health. Having a single widely accepted framework for selecting best practices will ensure that the selection processes by different agencies are fair and comparable, as well as enable public health workers to better appreciate and adopt best practices in different settings. Ultimately, standardisation will improve the credibility and usefulness of practice-based evidence to that of research-based evidence. PMID:26753159
Framework for Selecting Best Practices in Public Health: A Systematic Literature Review.
Ng, Eileen; de Colombani, Pierpaolo
2015-11-17
Evidence-based public health has commonly relied on findings from empirical studies, or research-based evidence. However, this paper advocates that practice-based evidence derived from programmes implemented in real-life settings is likely to be a more suitable source of evidence for inspiring and guiding public health programmes. Selection of best practices from the array of implemented programmes is one way of generating such practice-based evidence. Yet the lack of consensus on the definition and criteria for practice-based evidence and best practices has limited their application in public health so far. To address the gap in literature on practice-based evidence, this paper hence proposes measures of success for public health interventions by developing an evaluation framework for selection of best practices. The proposed framework was synthesised from a systematic literature review of peer-reviewed and grey literature on existing evaluation frameworks for public health programmes as well as processes employed by health-related organisations when selecting best practices. A best practice is firstly defined as an intervention that has shown evidence of effectiveness in a particular setting and is likely to be replicable to other situations. Regardless of the area of public health, interventions should be evaluated by their context, process and outcomes. A best practice should hence meet most, if not all, of eight identified evaluation criteria: relevance, community participation, stakeholder collaboration, ethical soundness, replicability, effectiveness, efficiency and sustainability. Ultimately, a standardised framework for selection of best practices will improve the usefulness and credibility of practice-based evidence in informing evidence-based public health interventions. Significance for public health: Best practices are a valuable source of practice-based evidence on effective public health interventions implemented in real-life settings. 
Yet, despite the frequent branding of interventions as best practices or good practices, there is no consensus on the definition and desirable characteristics of such best practices. Hence, this is likely to be the first systematic review on the topic of best practices in public health. Having a single widely accepted framework for selecting best practices will ensure that the selection processes by different agencies are fair and comparable, as well as enable public health workers to better appreciate and adopt best practices in different settings. Ultimately, standardisation will improve the credibility and usefulness of practice-based evidence to that of research-based evidence.
Trial Maneuver Generation and Selection in the Paladin Tactical Decision Generation System
NASA Technical Reports Server (NTRS)
Chappell, Alan R.; McManus, John W.; Goodrich, Kenneth H.
1992-01-01
To date, increased levels of maneuverability and controllability in aircraft have been postulated as tactically advantageous, but little research has studied maneuvers or tactics that make use of these capabilities. In order to help fill this void, a real time tactical decision generation system for air combat engagements, Paladin, has been developed. Paladin models an air combat engagement as a series of discrete decisions. A detailed description of Paladin's decision making process is presented. This includes the sources of data used, methods of generating reasonable maneuvers for the Paladin aircraft, and selection criteria for choosing the "best" maneuver. Simulation results are presented that show Paladin to be relatively insensitive to errors introduced into the decision process by estimation of future positional and geometric data.
Trial maneuver generation and selection in the Paladin tactical decision generation system
NASA Technical Reports Server (NTRS)
Chappell, Alan R.; Mcmanus, John W.; Goodrich, Kenneth H.
1993-01-01
To date, increased levels of maneuverability and controllability in aircraft have been postulated as tactically advantageous, but little research has studied maneuvers or tactics that make use of these capabilities. In order to help fill this void, a real-time tactical decision generation system for air combat engagements, Paladin, has been developed. Paladin models an air combat engagement as a series of discrete decisions. A detailed description of Paladin's decision making process is presented. This includes the sources of data used, methods of generating reasonable maneuvers for the Paladin aircraft, and selection criteria for choosing the 'best' maneuver. Simulation results are presented that show Paladin to be relatively insensitive to errors introduced into the decision process by estimation of future positional and geometric data.
Image intensifier gain uniformity improvements in sealed tubes by selective scrubbing
Thomas, S.W.
1995-04-18
The gain uniformity of sealed microchannel plate image intensifiers (MCPIs) is improved by selectively scrubbing the high gain sections with a controlled bright light source. Using the premise that ions returning to the cathode from the microchannel plate (MCP) damage the cathode and reduce its sensitivity, a HeNe laser beam light source is raster scanned across the cathode of a microchannel plate image intensifier (MCPI) tube. Cathode current is monitored and when it exceeds a preset threshold, the sweep rate is decreased 1000 times, giving 1000 times the exposure to cathode areas with sensitivity greater than the threshold. The threshold is set at the cathode current corresponding to the lowest sensitivity in the active cathode area so that sensitivity of the entire cathode is reduced to this level. This process reduces tube gain by between 10% and 30% in the high gain areas while gain reduction in low gain areas is negligible.
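A toy simulation of the selective-scrub control loop above: scan a spot across cathode positions, and wherever measured sensitivity exceeds the threshold, slow the sweep 1000x so the spot dwells (and desensitizes) longer. The sensitivity map, damage constant, and pass count are invented for illustration; only the threshold-gated 1000x dwell ratio comes from the abstract.

```python
FAST_DWELL = 1          # exposure units per step at the normal sweep rate
SLOW_FACTOR = 1000      # sweep slowed 1000x over high-sensitivity areas
DAMAGE_PER_UNIT = 1e-5  # fractional sensitivity loss per exposure unit (invented)

def scrub(sensitivity, threshold, passes=30):
    """Repeated raster scans: each position loses sensitivity in proportion
    to its dwell time, which is gated by the threshold comparison."""
    s = list(sensitivity)
    for _ in range(passes):
        for i, value in enumerate(s):
            dwell = FAST_DWELL * (SLOW_FACTOR if value > threshold else 1)
            s[i] = value * (1 - DAMAGE_PER_UNIT) ** dwell
    return s

cathode = [1.0, 1.3, 1.1, 1.0]   # relative sensitivity across positions
threshold = 1.0                   # lowest sensitivity in the active area
result = scrub(cathode, threshold)
```

Because dwell collapses back to the fast rate once a position falls to the threshold, high-gain areas are driven down toward the lowest-sensitivity level while low-gain areas are barely touched, which is the uniformity mechanism the patent describes.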
Image intensifier gain uniformity improvements in sealed tubes by selective scrubbing
Thomas, Stanley W.
1995-01-01
The gain uniformity of sealed microchannel plate image intensifiers (MCPIs) is improved by selectively scrubbing the high gain sections with a controlled bright light source. Using the premise that ions returning to the cathode from the microchannel plate (MCP) damage the cathode and reduce its sensitivity, a HeNe laser beam light source is raster scanned across the cathode of a microchannel plate image intensifier (MCPI) tube. Cathode current is monitored and when it exceeds a preset threshold, the sweep rate is decreased 1000 times, giving 1000 times the exposure to cathode areas with sensitivity greater than the threshold. The threshold is set at the cathode current corresponding to the lowest sensitivity in the active cathode area so that sensitivity of the entire cathode is reduced to this level. This process reduces tube gain by between 10% and 30% in the high gain areas while gain reduction in low gain areas is negligible.
Are Forensic Experts Already Biased before Adversarial Legal Parties Hire Them?
2016-01-01
This survey of 206 forensic psychologists tested the “filtering” effects of preexisting expert attitudes in adversarial proceedings. Results confirmed the hypothesis that evaluator attitudes toward capital punishment influence willingness to accept capital case referrals from particular adversarial parties. Stronger death penalty opposition was associated with higher willingness to conduct evaluations for the defense and higher likelihood of rejecting referrals from all sources. Conversely, stronger support was associated with higher willingness to be involved in capital cases generally, regardless of referral source. The findings raise the specter of skewed evaluator involvement in capital evaluations, where evaluators willing to do capital casework may have stronger capital punishment support than evaluators who opt out, and evaluators with strong opposition may work selectively for the defense. The results may provide a partial explanation for the “allegiance effect” in adversarial legal settings such that preexisting attitudes may contribute to partisan participation through a self-selection process. PMID:27124416
An automated workflow for parallel processing of large multiview SPIM recordings
Schmied, Christopher; Steinbach, Peter; Pietzsch, Tobias; Preibisch, Stephan; Tomancak, Pavel
2016-01-01
Summary: Selective Plane Illumination Microscopy (SPIM) makes it possible to image developing organisms in 3D at unprecedented temporal resolution over long periods of time. The resulting massive amounts of raw image data require extensive processing interactively via dedicated graphical user interface (GUI) applications. The consecutive processing steps can be easily automated and the individual time points can be processed independently, which lends itself to trivial parallelization on a high performance computing (HPC) cluster. Here, we introduce an automated workflow for processing large multiview, multichannel, multiillumination time-lapse SPIM data on a single workstation or in parallel on an HPC cluster. The pipeline relies on snakemake to resolve dependencies among consecutive processing steps and can be easily adapted to any cluster environment for processing SPIM data in a fraction of the time required to collect it. Availability and implementation: The code is distributed free and open source under the MIT license http://opensource.org/licenses/MIT. The source code can be downloaded from github: https://github.com/mpicbg-scicomp/snakemake-workflows. Documentation can be found here: http://fiji.sc/Automated_workflow_for_parallel_Multiview_Reconstruction. Contact: schmied@mpi-cbg.de Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26628585
An automated workflow for parallel processing of large multiview SPIM recordings.
Schmied, Christopher; Steinbach, Peter; Pietzsch, Tobias; Preibisch, Stephan; Tomancak, Pavel
2016-04-01
Selective Plane Illumination Microscopy (SPIM) makes it possible to image developing organisms in 3D at unprecedented temporal resolution over long periods of time. The resulting massive amounts of raw image data require extensive processing interactively via dedicated graphical user interface (GUI) applications. The consecutive processing steps can be easily automated and the individual time points can be processed independently, which lends itself to trivial parallelization on a high performance computing (HPC) cluster. Here, we introduce an automated workflow for processing large multiview, multichannel, multiillumination time-lapse SPIM data on a single workstation or in parallel on an HPC cluster. The pipeline relies on snakemake to resolve dependencies among consecutive processing steps and can be easily adapted to any cluster environment for processing SPIM data in a fraction of the time required to collect it. The code is distributed free and open source under the MIT license http://opensource.org/licenses/MIT. The source code can be downloaded from github: https://github.com/mpicbg-scicomp/snakemake-workflows. Documentation can be found here: http://fiji.sc/Automated_workflow_for_parallel_Multiview_Reconstruction. Contact: schmied@mpi-cbg.de. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.
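The "trivial parallelization over time points" noted in the abstract above can be sketched with Python's standard library instead of snakemake on an HPC cluster; `process_timepoint` is a hypothetical stand-in for one independent registration/fusion step of the SPIM pipeline, and the output filenames are invented.

```python
from concurrent.futures import ThreadPoolExecutor

def process_timepoint(tp):
    """Stand-in for one per-timepoint processing step. Because each time point
    is independent of the others, these calls can run concurrently."""
    return tp, f"fused_tp{tp:03d}.tif"

def run_pipeline(timepoints, workers=4):
    """Fan the independent time points out across a worker pool and collect
    the results, mirroring how a cluster scheduler dispatches one job each."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return dict(pool.map(process_timepoint, timepoints))
```

A workflow engine like snakemake adds what this sketch lacks: dependency resolution between consecutive processing steps, so only the truly independent steps are dispatched in parallel.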
NASA Astrophysics Data System (ADS)
Kaplita, George A.; Schmitz, Stefan; Ranade, Rajiv; Mathad, Gangadhara S.
1999-09-01
The planarization and recessing of polysilicon to form a plug are processes of increasing importance in silicon IC fabrication. While this technology has been developed and applied to DRAM technology using Trench Storage Capacitors, the need for such processes in other IC applications (i.e. polysilicon studs) has increased. Both planarization and recess processes usually have stringent requirements on etch rate, recess uniformity, and selectivity to underlying films. Additionally, both processes generally must be isotropic, yet must not expand any seams that might be present in the polysilicon fill. These processes should also be insensitive to changes in exposed silicon area (pattern factor) on the wafer. A SF6 plasma process in a polysilicon DPS (Decoupled Plasma Source) reactor has demonstrated the capability of achieving the above process requirements for both planarization and recess etch. The SF6 process in the decoupled plasma source reactor exhibited less sensitivity to pattern factor than in other types of reactors. Control of these planarization and recess processes requires two endpoint systems to work sequentially in the same recipe: one for monitoring the endpoint when blanket polysilicon (100% Si loading) is being planarized and one for monitoring the recess depth while the plug is being recessed (less than 10% Si loading). The planarization process employs an optical emission endpoint system (OES). An interferometric endpoint system (IEP), capable of monitoring lateral interference, is used for determining the recess depth. The ability of using either or both systems is required to make these plug processes manufacturable. Measuring the recess depth resulting from the recess process can be difficult, costly and time-consuming. An Atomic Force Microscope (AFM) can greatly alleviate these problems and can serve as a critical tool in the development of recess processes.
Solar dynamic power for Earth orbital and lunar applications
NASA Technical Reports Server (NTRS)
Calogeras, James E.; Dustin, Miles O.; Secunde, Richard R.
1991-01-01
Development of solar dynamic (SD) technologies for space over the past 25 years by NASA Lewis Research Center brought SD power to the point where it was selected in the design phase of the Space Station Freedom Program as the power source for evolutionary growth. More recent studies showed that large cost savings are possible in establishing manufacturing processes at a Lunar Base if SD is considered as a power source. Technology efforts over the past 5 years have made possible lighter, more durable SD components for these applications. A review of these efforts and respective benefits is presented.
Redox Catalysis Facilitates Lignin Depolymerization
2017-01-01
Lignin is a recalcitrant and underexploited natural feedstock for aromatic commodity chemicals, and its degradation generally requires the use of high temperatures and harsh reaction conditions. Herein we present an ambient-temperature one-pot process for the controlled oxidation and depolymerization of this potent resource. Harnessing the potential of electrocatalytic oxidation in conjunction with our photocatalytic cleavage methodology, we have developed an operationally simple procedure for selective fragmentation of β-O-4 bonds with excellent mass recovery, which provides a unique opportunity to expand the use of lignin from an energy source to a source of commodity chemicals and synthetic building blocks. PMID:28691074
Selected highlights from the Extreme Ultraviolet Explorer
NASA Technical Reports Server (NTRS)
Bowyer, S.; Malina, R. F.
1995-01-01
We present a few scientific highlights from the Extreme Ultraviolet Explorer (EUVE) all-sky and deep surveys, from the EUVE Right Angle Program, and from the EUVE Guest Observer Program. The First EUVE Source Catalog includes 410 extreme ultraviolet (EUV) sources detected in the initial processing of the EUVE all-sky data. A program of optical identification indicates that counterparts include cool star coronae, flare stars, hot white dwarfs, central stars of planetary nebulae, B star photospheres and winds, an X-ray binary, extragalactic objects (active galactic nuclei, BL Lacertae), solar system objects (Moon, Mars, Io), supernova remnants, and two novae.
Patel, Darshika; Dufour, Yvon; Domigan, Neil
2008-01-01
Purpose - This paper looks into the functional food and nutraceutical registration processes in Japan and China. The Japanese have developed the Foods for Specified Health Use (FOSHU) registration process, whereas the Chinese have put in place the Health Food (HF) registration process. The aim of this paper is to compare the regulation processes between the two countries in search of answers to three core empirical questions: (1) how have the registration processes developed and changed? (2) What are the similarities and differences between the processes of registration in the two countries investigated? (3) Why are the registration processes similar/different? Method - The study was conducted using secondary sources. The literature surveyed covered academic journals, trade journals, magazine and newspaper articles, market reports, proceedings, books, and the web pages of relevant regulatory authorities and regulatory consultants. Information from more recently published sources was used preferentially over older sources. Beyond recency, information was also weighted by provenance: official regulations and the SFDA and MHLW websites were assumed to contain accurate, up-to-date information and were treated as authoritative over other sources. Results - The two diagrams of the registration processes in Japan and China, respectively, clearly show that there are similarities and differences. These fall into six categories: (1) the scientific evidence required; (2) the application process; (3) the evaluation process; (4) the law and the categories of products; (5) the labels and the types of claims; and finally (6) the cost and the time involved. Conclusions - The data analysis suggests that the process of diffusion of innovation played a role in the development of the regulations.
Further, it was found that while Japan was at the outset a pioneer innovator in nutraceutical registration processes, there are indications that in more recent years it too has imitated other countries. NOVELTY STATEMENT: The assortment of regulatory regimes creates much uncertainty for firms, and the lack of familiarity and poor knowledge of the regulatory situation increases the risk of failure. The research presented in this paper provides highly valuable information to any biotech/pharmaceutical/nutraceutical company developing its market entry strategy in Japan and China. There are few national and international studies of drug registration application processes, but even fewer comparative studies of functional food and nutraceutical registration application processes such as this one, and none using a diffusion of innovation perspective.
Katsoyiannis, Athanasios; Sweetman, Andrew J; Jones, Kevin C
2011-10-15
Molecular diagnostic ratios (MDRs)-the ratios of defined pairs of individual compounds-have been widely used as markers of different source categories of polycyclic aromatic hydrocarbons (PAHs). However, it is well known that variations in combustion conditions and environmental degradation processes can cause substantial variability in the emission and degradation of individual compounds, potentially undermining the application of MDRs as reliable source apportionment tools. The United Kingdom produces a national inventory of atmospheric emissions of PAHs, and has an ambient air monitoring program at a range of rural, semirural, urban, and industrial sites. The inventory and the monitoring data are available over the past 20 years (1990-2010), a time frame that has seen known changes in combustion type and source. Here we assess 5 MDRs that have been used in the literature as source markers. We examine the spatial and temporal variability in the ratios and consider whether they are responsive to known differences in source strength and types between sites (on rural-urban gradients) and to underlying changes in national emissions since 1990. We conclude that the use of these 5 MDRs produces contradictory results and that they do not respond to known differences (in time and space) in atmospheric emission sources. For example, at a site near a motorway and far from other evident emission sources, the use of MDRs suggests "non-traffic" emissions. The ANT/(ANT + PHE) ratio is strongly seasonal at some sites; it is the MDR most susceptible to atmospheric processes, so these results illustrate how weathering in the environment will undermine the effectiveness of MDRs as markers of source(s). We conclude that PAH MDRs can exhibit spatial and temporal differences, but they are not valid markers of known differences in source categories and type.
Atmospheric sources of PAHs in the UK are probably not dominated by any single clear and strong source type, so the mixture of PAHs in air is quickly "blended" away from the influence of the few major point sources which exist and further weathered in the environment by atmospheric reactions and selective loss processes.
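As a concrete illustration of how such diagnostic ratios are computed, the sketch below evaluates the ANT/(ANT + PHE) form named in the abstract; the concentration values and the interpretation comments are illustrative assumptions, not data or thresholds from the study.

```python
# Illustrative sketch of a PAH molecular diagnostic ratio (MDR):
# the ratio of one compound to the sum of a defined pair, e.g. ANT/(ANT+PHE).

def diagnostic_ratio(c_a: float, c_b: float) -> float:
    """Ratio of compound A to the pair sum A + B; nan if both are zero."""
    total = c_a + c_b
    return float("nan") if total == 0 else c_a / total

# Example concentrations in ng/m^3 (made up for illustration)
ant, phe = 0.12, 1.88   # anthracene, phenanthrene
fla, pyr = 0.90, 0.70   # fluoranthene, pyrene

r1 = diagnostic_ratio(ant, phe)   # ANT/(ANT + PHE)
r2 = diagnostic_ratio(fla, pyr)   # FLA/(FLA + PYR), another commonly used pair
```

The paper's point is precisely that such ratios, however easy to compute, are confounded by weathering and mixing, so numeric cutoffs from the literature should not be read as definitive source labels.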
NASA Astrophysics Data System (ADS)
Reiser, Fabienne; Schmelzbach, Cedric; Maurer, Hansruedi; Greenhalgh, Stewart; Hellwig, Olaf
2017-04-01
A primary focus of geothermal seismic imaging is to map dipping faults and fracture zones that control rock permeability and fluid flow. Vertical seismic profiling (VSP) is therefore a most valuable means to image the immediate surroundings of an existing borehole to guide, for example, the placing of new boreholes to optimize production from known faults and fractures. We simulated 2D and 3D acoustic synthetic seismic data and processed them through to pre-stack depth migration to optimize VSP survey layouts for mapping moderately to steeply dipping fracture zones within possible basement geothermal reservoirs. Our VSP survey optimization procedure for sequentially selecting source locations to define the area where source points are best located for optimal imaging makes use of a cross-correlation statistic, by which a subset of migrated shot gathers is compared with a target or reference image from a comprehensive set of source gathers. In geothermal exploration at established sites, it is reasonable to assume that sufficient a priori information is available to construct such a target image. We generally obtained good results with a relatively small number of optimally chosen source positions distributed over an ideal source location area for different fracture zone scenarios (different dips, azimuths, and distances from the surveying borehole). Adding further sources outside the optimal source area did not necessarily improve the results, but rather resulted in image distortions. It was found that fracture zones located at borehole-receiver depths and laterally offset from the borehole by 300 m can be imaged reliably for a range of different dips, but more source positions and large offsets between sources and the borehole are required for imaging steeply dipping interfaces. When such features cross-cut the borehole, they are particularly difficult to image. For fracture zones with different azimuths, 3D effects are observed.
Far offset source positions contribute less to the image quality as fracture zone azimuth increases. Our optimization methodology is best suited for designing future field surveys with a favorable benefit-cost ratio in areas with significant a priori knowledge. Moreover, our optimization workflow is valuable for selecting useful subsets of acquired data for optimum target-oriented processing.
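The selection idea described above can be sketched with a simple greedy loop: score each candidate source by the normalized cross-correlation between the image migrated from the sources selected so far plus that candidate, and the target image built from the comprehensive source set. The array shapes and the greedy strategy below are illustrative assumptions, not the authors' exact algorithm.

```python
# Hedged sketch of cross-correlation-based VSP source selection: candidates
# are scored by how much their migrated image, stacked with already selected
# ones, resembles a reference (target) image.
import numpy as np

def image_similarity(img: np.ndarray, target: np.ndarray) -> float:
    """Zero-lag normalized cross-correlation of two (mean-removed) images."""
    a = img - img.mean()
    b = target - target.mean()
    return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b)))

def select_sources(per_source_images, target, n_select):
    """Greedily add the source whose stacked image best matches the target."""
    selected = []
    stack = np.zeros_like(target)
    remaining = dict(enumerate(per_source_images))
    for _ in range(n_select):
        best = max(remaining,
                   key=lambda i: image_similarity(stack + remaining[i], target))
        stack = stack + remaining.pop(best)
        selected.append(best)
    return selected
```

With a target image synthesized from two of three toy "shot images", the loop recovers exactly those two contributing sources, mirroring the paper's observation that extra sources outside the optimal area need not improve the image.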
2011-01-01
Background Machine learning has a vast range of applications. In particular, advanced machine learning methods are routinely and increasingly used in quantitative structure activity relationship (QSAR) modeling. QSAR data sets often encompass tens of thousands of compounds and the size of proprietary, as well as public data sets, is rapidly growing. Hence, there is a demand for computationally efficient machine learning algorithms, easily available to researchers without extensive machine learning knowledge. In granting the scientific principles of transparency and reproducibility, Open Source solutions are increasingly acknowledged by regulatory authorities. Thus, an Open Source state-of-the-art high performance machine learning platform, interfacing multiple, customized machine learning algorithms for both graphical programming and scripting, to be used for large scale development of QSAR models of regulatory quality, is of great value to the QSAR community. Results This paper describes the implementation of the Open Source machine learning package AZOrange. AZOrange is specially developed to support batch generation of QSAR models in providing the full work flow of QSAR modeling, from descriptor calculation to automated model building, validation and selection. The automated work flow relies upon the customization of the machine learning algorithms and a generalized, automated model hyper-parameter selection process. Several high performance machine learning algorithms are interfaced for efficient data set specific selection of the statistical method, promoting model accuracy. Using the high performance machine learning algorithms of AZOrange does not require programming knowledge as flexible applications can be created, not only at a scripting level, but also in a graphical programming environment. 
Conclusions AZOrange is a step towards meeting the needs for an Open Source high performance machine learning platform, supporting the efficient development of highly accurate QSAR models fulfilling regulatory requirements. PMID:21798025
Stålring, Jonna C; Carlsson, Lars A; Almeida, Pedro; Boyer, Scott
2011-07-28
Machine learning has a vast range of applications. In particular, advanced machine learning methods are routinely and increasingly used in quantitative structure activity relationship (QSAR) modeling. QSAR data sets often encompass tens of thousands of compounds and the size of proprietary, as well as public data sets, is rapidly growing. Hence, there is a demand for computationally efficient machine learning algorithms, easily available to researchers without extensive machine learning knowledge. In granting the scientific principles of transparency and reproducibility, Open Source solutions are increasingly acknowledged by regulatory authorities. Thus, an Open Source state-of-the-art high performance machine learning platform, interfacing multiple, customized machine learning algorithms for both graphical programming and scripting, to be used for large scale development of QSAR models of regulatory quality, is of great value to the QSAR community. This paper describes the implementation of the Open Source machine learning package AZOrange. AZOrange is specially developed to support batch generation of QSAR models in providing the full work flow of QSAR modeling, from descriptor calculation to automated model building, validation and selection. The automated work flow relies upon the customization of the machine learning algorithms and a generalized, automated model hyper-parameter selection process. Several high performance machine learning algorithms are interfaced for efficient data set specific selection of the statistical method, promoting model accuracy. Using the high performance machine learning algorithms of AZOrange does not require programming knowledge as flexible applications can be created, not only at a scripting level, but also in a graphical programming environment. 
AZOrange is a step towards meeting the needs for an Open Source high performance machine learning platform, supporting the efficient development of highly accurate QSAR models fulfilling regulatory requirements.
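The automated hyper-parameter selection step the abstract describes can be sketched without AZOrange itself: evaluate each candidate setting by k-fold cross-validation and keep the best. The toy ridge-regression model and the grid below are illustrative assumptions, not AZOrange code.

```python
# Library-free sketch of automated hyper-parameter selection via k-fold
# cross-validation, the pattern AZOrange generalizes across its interfaced
# machine learning algorithms.
import numpy as np

def cv_score(X, y, alpha, k=3):
    """Mean squared error of closed-form ridge regression, averaged over k folds."""
    idx = np.arange(len(y))
    folds = np.array_split(idx, k)
    errs = []
    for fold in folds:
        train = np.setdiff1d(idx, fold)
        # closed-form ridge: w = (X'X + alpha I)^-1 X'y
        A = X[train].T @ X[train] + alpha * np.eye(X.shape[1])
        w = np.linalg.solve(A, X[train].T @ y[train])
        errs.append(np.mean((X[fold] @ w - y[fold]) ** 2))
    return float(np.mean(errs))

def select_hyperparameter(X, y, grid=(0.01, 0.1, 1.0, 10.0)):
    """Return the regularization strength with the lowest cross-validated error."""
    return min(grid, key=lambda a: cv_score(X, y, a))
```

In a full QSAR workflow this inner loop would sit after descriptor calculation and before model validation, and the "grid" would span both algorithms and their hyper-parameters, as the abstract describes.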
A census of radio-selected AGNs on the COSMOS field and of their FIR properties
NASA Astrophysics Data System (ADS)
Magliocchetti, M.; Popesso, P.; Brusa, M.; Salvato, M.
2018-01-01
We use the new catalogue by Laigle et al. to provide a full census of VLA-COSMOS radio sources. We identify 90 per cent of such sources and sub-divide them into active galactic nuclei (AGNs) and star-forming galaxies on the basis of their radio luminosity. The AGN sample is complete with respect to radio selection at all z ≲ 3.5. Out of 704 AGNs, 272 have a counterpart in the Herschel maps. By exploiting the better statistics of the new sample, we confirm the results of Magliocchetti et al.: the probability for a radio-selected AGN to be detected at far-infrared (FIR) wavelengths is both a function of radio luminosity and redshift, whereby powerful sources are more likely FIR emitters at earlier epochs. Such an emission is due to star-forming processes within the host galaxy. FIR emitters and non-FIR emitters only differentiate in the z ≲ 1 universe. At higher redshifts, they are indistinguishable from each other, as there is no difference between FIR-emitting AGNs and star-forming galaxies. Lastly, we focus on radio AGNs which show AGN emission at other wavelengths. We find that mid-infrared (MIR) emission is mainly associated with ongoing star formation and with sources which are smaller, younger and more radio luminous than the average parent population. X-ray emitters instead preferentially appear in more massive and older galaxies. We can therefore envisage an evolutionary track whereby the first phase of a radio-active AGN and of its host galaxy is associated with MIR emission, while at later stages the source becomes only active at radio wavelengths and possibly also in the X-ray.
ERIC Educational Resources Information Center
Ranker, Jason
2007-01-01
This case study closely examines how John (a former student of mine, age eight, second grade) composed during an informal writing group at school. Using qualitative research methods, I found that John selectively took up conventions, characters, story grammars, themes, and motifs from video games, television, Web pages, and comics. Likening his…
ERIC Educational Resources Information Center
Stroud, Maidred Morris
This study sought to determine the stage (awareness, interest, evaluation, trial, or adoption) reached by homemakers in adopting the water blanching of vegetables for freezing; to identify information sources (mass media, agencies, experts, informal personal contacts, and organizations); and to assess the relationship of certain personal, social,…
ERIC Educational Resources Information Center
Islam, Md. Aminul; Rahim, Noor Asliza Abdul; Liang, Tan Chee; Momtaz, Hasina
2011-01-01
This research attempted to find out the effect of demographic factors on the effectiveness of the e-learning system in a higher learning institution. The students from this institution were randomly selected in order to evaluate the effectiveness of the learning system in the students' learning process. The primary data source is the questionnaires that…
ERIC Educational Resources Information Center
Crittenden, Barry D.; England, Richard
2005-01-01
The principles and practices of environmental impact assessment are best taught to chemical engineering undergraduate students by means of a role-playing case study. Many suitable examples are available from public sources. The planning appeal process has been selected so as to introduce an adversarial style involving cross-examination on…
NASA Astrophysics Data System (ADS)
Dreissigacker, O.
2005-12-01
None of the classic media sources report in greater depth about this science discipline than monthly astronomy magazines. They may not reach the widest audience, but their readers are 100% interested in astronomy, astrophysics and spaceflight, so the targeting is perfect! This article provides some insight into the production and selection process using examples from ASTRONOMIE HEUTE (AH), the German edition of Sky & Telescope (S&T).
NASA Astrophysics Data System (ADS)
Miller, Urszula; Grzelka, Agnieszka; Romanik, Elżbieta; Kuriata, Magdalena
2018-01-01
Operation of municipal management facilities is inseparable from the problem of malodorous compound emissions to the atmospheric air. In this setting, odor nuisance is related to the chemical composition of waste, sewage, and sludge, as well as to the activity of microorganisms whose metabolic products can be those odorous compounds. Significant reduction of odorant emission from many sources can be achieved by optimizing process parameters and conditions. However, it is not always possible to limit the formation of odorants. In such cases it is best to use appropriate deodorizing methods. The choice of the appropriate method is based on the physical parameters and emission intensity of the polluted gases, and on their composition, when it can be determined. Among the solutions used in the municipal economy, physico-chemical methods such as sorption and oxidation can be distinguished. In cases where the emission source is not encapsulated, odor-masking techniques are used, which consist of spraying preparations that neutralize unpleasant odors. The paper presents the characteristics of selected methods of eliminating odor nuisance and an evaluation of their applicability in municipal management facilities.
Isolation and Evaluation of Oil-Producing Microalgae from Subtropical Coastal and Brackish Waters
Lim, David K. Y.; Garg, Sourabh; Timmins, Matthew; Zhang, Eugene S. B.; Thomas-Hall, Skye R.; Schuhmann, Holger; Li, Yan; Schenk, Peer M.
2012-01-01
Microalgae have been widely reported as a promising source of biofuels, mainly based on their high areal productivity of biomass and lipids as triacylglycerides and the possibility for cultivation on non-arable land. The isolation and selection of suitable strains that are robust and display high growth and lipid accumulation rates is an important prerequisite for their successful cultivation as a bioenergy source, a process that can be compared to the initial selection and domestication of agricultural crops. We developed standard protocols for the isolation and cultivation of a range of marine and brackish microalgae. By comparing growth rates and lipid productivity, we assessed the potential of subtropical coastal and brackish microalgae for the production of biodiesel and other oil-based bioproducts. This study identified Nannochloropsis sp., Dunaliella salina and new isolates of Chlorella sp. and Tetraselmis sp. as suitable candidates for a multiple-product algae crop. We conclude that subtropical coastal microalgae display a variety of fatty acid profiles that offer a wide scope for several oil-based bioproducts, including biodiesel and omega-3 fatty acids. A biorefinery approach for microalgae would make economical production more feasible, but challenges remain for efficient harvesting and extraction processes for some species. PMID:22792403
Crump, Matthew J C
2016-03-01
Multiple lines of evidence from the attention and performance literature show that attention filtering can be controlled by higher-level voluntary processes and lower-level cue-driven processes (for recent reviews see Bugg, 2012; Bugg & Crump, 2012; Egner, 2008). The experiments were designed to test a general hypothesis that cue-driven control learns from context-specific histories of prior acts of selective attention. Several web-based flanker studies were conducted via Amazon Mechanical Turk. Attention filtering demands were induced by a secondary one-back memory task after each trial prompting recall of the last target or distractor letter. Blocking recall demands produced larger flanker effects for the distractor than target recall conditions. Mixing recall demands and associating them with particular stimulus cues (location, colour, letter, and font) sometimes showed rapid, contextual control of flanker interference, and sometimes did not. The results show that subtle methodological parameters can influence whether or not contextual control is observed. More generally, the results show that contextual control phenomena can be influenced by other sources of control, including other cue-driven sources competing for control. (c) 2016 APA, all rights reserved.
Major System Source Evaluation and Selection Procedures.
1987-04-02
Report BRMC-85-5142-1: Major System Source Evaluation and Selection Procedures (U). Prepared by Business Management Research Associates, Inc., 1911 Jefferson Davis Highway, Arlington, VA, 2 April 1987. Covers Air Force source evaluation and selection procedures.
The Role of Mother in Informing Girls About Puberty: A Meta-Analysis Study
Sooki, Zahra; Shariati, Mohammad; Chaman, Reza; Khosravi, Ahmad; Effatpanah, Mohammad; Keramat, Afsaneh
2016-01-01
Context Family, especially the mother, has the most important role in the education, transmission of information, and health behaviors of girls in order for them to have a healthy transition through the critical stage of puberty, but there are different views in this regard. Objectives Considering the various findings about the source of information about puberty, a meta-analysis study was conducted to investigate the extent of the mother’s role in informing girls about puberty. Data Sources This meta-analysis study was based on English articles published from 2000 to February 2015 in the Scopus, PubMed, and ScienceDirect databases and on Persian articles in the SID, Magiran, and IranMedex databases, with determined key words and their MeSH equivalents. Study Selection Quantitative cross-sectional articles were extracted by two independent researchers and finally 46 articles were selected based on the inclusion criteria. The STROBE checklist was used to evaluate the studies. Data Extraction The percentage of mothers as the current and preferred source of information about the process of puberty, menarche, and menstruation from the perspective of adolescent girls was extracted from the articles. The results of the studies were analyzed using meta-analysis (random effects model) and the studies’ heterogeneity was analyzed using the I2 index. Between-study variance was analyzed using tau squared (Tau2) in Review Manager 5 software. Results The results showed that, from the perspective of teenage girls in Iran and other countries, in 56% of cases the mother was the current source of information about the process of puberty, menarche, and menstruation. The preferred source of information about the process of puberty, menarche, and menstruation was the mother in all studies at 60% (Iran 57%, other countries 66%).
Conclusions According to the findings of this study, it is essential that health professionals and officials of the ministry of health train mothers about the time, trends, and factors affecting the start of puberty using a multi-dimensional approach that involves religious organizations, community groups, and peer groups. PMID:27331056
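The random-effects pooling named in the abstract can be sketched as follows, using the DerSimonian-Laird estimator together with the Tau2 and I2 heterogeneity statistics it mentions. The proportions and sample sizes in the test are illustrative, not data from the reviewed studies.

```python
# Sketch of random-effects meta-analysis pooling of study proportions
# (DerSimonian-Laird), returning the pooled estimate plus the tau^2 and I^2
# heterogeneity statistics referenced in the abstract.

def random_effects_pool(props, ns):
    """Pool study proportions; returns (pooled, tau2, i2)."""
    # within-study variance of a proportion: p(1-p)/n
    v = [p * (1 - p) / n for p, n in zip(props, ns)]
    w = [1 / vi for vi in v]
    fixed = sum(wi * pi for wi, pi in zip(w, props)) / sum(w)
    # Cochran's Q, then the DerSimonian-Laird tau^2 estimate
    q = sum(wi * (pi - fixed) ** 2 for wi, pi in zip(w, props))
    df = len(props) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)
    i2 = max(0.0, (q - df) / q) if q > 0 else 0.0
    # re-weight with between-study variance added
    w_re = [1 / (vi + tau2) for vi in v]
    pooled = sum(wi * pi for wi, pi in zip(w_re, props)) / sum(w_re)
    return pooled, tau2, i2
```

A production analysis (as in Review Manager 5) would typically pool transformed proportions (e.g. logit) rather than raw ones; raw proportions keep the sketch short.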
Quantifying the sources of atmospheric ice nuclei from carbonaceous combustion aerosol
NASA Astrophysics Data System (ADS)
Schill, G. P.; Jathar, S.; Galang, A.; Farmer, D.; Friedman, B.; Levin, E. J.; DeMott, P. J.; Kreidenweis, S. M.
2015-12-01
Ice nucleation on particles is a fundamental atmospheric process, which governs precipitation, cloud lifetimes, and climate. Despite being a basic atmospheric process, our current understanding of ice nucleation in the atmosphere is limited. One reason is that ice nuclei concentrations are low (only ~1 in 10^5 particles in the free troposphere nucleates ice), making it challenging to identify both the composition and sources of ambient ice nuclei. Carbonaceous combustion aerosol produced from biomass and fossil fuel combustion is one potential source of these ice nuclei, as it contributes over one-third of all aerosol in the North American free troposphere. Unfortunately, previous results from in-cloud field measurements, aircraft measurements, and laboratory studies are in conflict, with estimates of the impact of combustion aerosol ranging from no effect to rivaling the well-known atmospheric ice nucleus, mineral dust. It is, however, becoming clear that aerosols from combustion processes are more complex than model particles, and their ice activity depends greatly on both fuel type and combustion conditions. Given these dependencies, we propose that sampling from real-world biomass burning and fossil fuel sources would provide the most useful new information on the contribution of carbonaceous combustion aerosols to atmospheric ice nuclei particles. To determine the specific contribution of refractory black carbon (rBC) to ice nuclei concentrations, we have coupled the Single Particle Soot Photometer (SP2) to the Colorado State University Continuous Flow Diffusion Chamber (CFDC). The SP2 utilizes laser-induced incandescence to quantify rBC mass on a particle-by-particle basis; in doing so, it also selectively destroys rBC particles by heating them to their vaporization temperature. Thus, the SP2 can be used as a selective pre-filter for rBC into the CFDC.
In this work, we will present recent results examining the contribution of diesel engine exhaust to ice nuclei concentrations. Sampling was done for both diesel and biodiesel, on fresh emissions and on emissions aged up to 18 days of equivalent photochemical aging with a Potential Aerosol Mass chamber. Our results show that, for mixed-phase clouds, both fresh and aged (bio)diesel exhaust are not likely a significant source of ice nuclei.
Evolutionary trends in directional hearing
Carr, Catherine E.; Christensen-Dalsgaard, Jakob
2016-01-01
Tympanic hearing is a true evolutionary novelty that arose in parallel within early tetrapods. We propose that in these tetrapods, selection for sound localization in air acted upon pre-existing directionally sensitive brainstem circuits, similar to those in fishes. Auditory circuits in birds and lizards resemble this ancestral, directionally sensitive framework. Despite this anatomical similarity, coding of sound source location differs between birds and lizards. In birds, brainstem circuits compute sound location from interaural cues. Lizards, however, have coupled ears, and do not need to compute source location in the brain. Thus, their neural processing of sound direction differs, although all show mechanisms for enhancing sound source directionality. Comparisons with mammals reveal similarly complex interactions between coding strategies and evolutionary history. PMID:27448850
2016-06-15
…selection strategy is key to minimizing risk and ensuring best value for all stakeholders. On the basis of thorough market research, acquisition… administrative lead-time, Contractor Performance Assessment Reporting System ratings, and earned value management assessments) and source selection strategy… Postgraduate School. A. PURPOSE: This research analyzes LPTA and tradeoff source selection strategies and contract outcomes to determine if a relationship…
A systems neurophysiology approach to voluntary event coding.
Petruo, Vanessa A; Stock, Ann-Kathrin; Münchau, Alexander; Beste, Christian
2016-07-15
Mechanisms responsible for the integration of perceptual events and appropriate actions (sensorimotor processes) have been subject to intense research. Different theoretical frameworks have been put forward, with the "Theory of Event Coding" (TEC) being one of the most influential. In the current study, we focus on the concept of 'event files' within TEC and examine which sub-processes, dissociable by means of cognitive-neurophysiological methods, are involved in voluntary event coding. This was combined with EEG source localization. We also introduce reward manipulations to delineate the neurophysiological sub-processes most relevant for performance variations during event coding. The results show that the processes involved in voluntary event coding predominantly included stimulus categorization, feature unbinding, and response selection, which were reflected by distinct neurophysiological processes (the P1, N2 and P3 ERPs). On a systems-neurophysiological level, voluntary event-file coding is thus related to widely distributed parietal-medial frontal networks. Attentional selection processes (N1 ERP) turned out to be less important. Reward modulated stimulus categorization in parietal regions, likely reflecting aspects of perceptual decision making, but not the other processes. The perceptual categorization stage appears central for voluntary event-file coding. Copyright © 2016 Elsevier Inc. All rights reserved.
Crystallization and doping of amorphous silicon on low temperature plastic
Kaschmitter, James L.; Truher, Joel B.; Weiner, Kurt H.; Sigmon, Thomas W.
1994-01-01
A method or process of crystallizing and doping amorphous silicon (a-Si) on a low-temperature plastic substrate using a short pulsed high energy source in a selected environment, without heat propagation and build-up in the substrate. The pulsed energy processing of the a-Si in a selected environment, such as BF3 and PF5, will form a doped micro-crystalline or poly-crystalline silicon (pc-Si) region or junction point with improved mobilities, lifetimes and drift and diffusion lengths and with reduced resistivity. The advantage of this method or process is that it provides for high energy materials processing on low cost, low temperature, transparent plastic substrates. Using pulsed laser processing a high (>900 °C), localized processing temperature can be achieved in thin films, with little accompanying temperature rise in the substrate, since substrate temperatures do not exceed 180 °C for more than a few microseconds. This method enables use of plastics incapable of withstanding sustained processing temperatures (higher than 180 °C) but which are much lower cost, have high tolerance to ultraviolet light, have high strength and good transparency, compared to higher temperature plastics such as polyimide.
Crystallization and doping of amorphous silicon on low temperature plastic
Kaschmitter, J.L.; Truher, J.B.; Weiner, K.H.; Sigmon, T.W.
1994-09-13
A method or process of crystallizing and doping amorphous silicon (a-Si) on a low-temperature plastic substrate using a short pulsed high energy source in a selected environment, without heat propagation and build-up in the substrate is disclosed. The pulsed energy processing of the a-Si in a selected environment, such as BF3 and PF5, will form a doped micro-crystalline or poly-crystalline silicon (pc-Si) region or junction point with improved mobilities, lifetimes and drift and diffusion lengths and with reduced resistivity. The advantage of this method or process is that it provides for high energy materials processing on low cost, low temperature, transparent plastic substrates. Using pulsed laser processing a high (>900 C), localized processing temperature can be achieved in thin films, with little accompanying temperature rise in the substrate, since substrate temperatures do not exceed 180 C for more than a few microseconds. This method enables use of plastics incapable of withstanding sustained processing temperatures (higher than 180 C) but which are much lower cost, have high tolerance to ultraviolet light, have high strength and good transparency, compared to higher temperature plastics such as polyimide. 5 figs.
High Temporal Resolution Mapping of Seismic Noise Sources Using Heterogeneous Supercomputers
NASA Astrophysics Data System (ADS)
Paitz, P.; Gokhberg, A.; Ermert, L. A.; Fichtner, A.
2017-12-01
The time- and space-dependent distribution of seismic noise sources is becoming a key ingredient of modern real-time monitoring of various geo-systems like earthquake fault zones, volcanoes, geothermal and hydrocarbon reservoirs. We present results of an ongoing research project conducted in collaboration with the Swiss National Supercomputing Centre (CSCS). The project aims at building a service providing seismic noise source maps for Central Europe with high temporal resolution. We use source imaging methods based on the cross-correlation of seismic noise records from all seismic stations available in the region of interest. The service is hosted on the CSCS computing infrastructure; all computationally intensive processing is performed on the massively parallel heterogeneous supercomputer "Piz Daint". The solution architecture is based on the Application-as-a-Service concept to provide the interested researchers worldwide with regular access to the noise source maps. The solution architecture includes the following sub-systems: (1) data acquisition responsible for collecting, on a periodic basis, raw seismic records from the European seismic networks, (2) high-performance noise source mapping application responsible for the generation of source maps using cross-correlation of seismic records, (3) back-end infrastructure for the coordination of various tasks and computations, (4) front-end Web interface providing the service to the end-users and (5) data repository. The noise source mapping itself rests on the measurement of logarithmic amplitude ratios in suitably pre-processed noise correlations, and the use of simplified sensitivity kernels. 
During the implementation we addressed various challenges, in particular, selection of data sources and transfer protocols, automation and monitoring of daily data downloads, ensuring the required data processing performance, design of a general service-oriented architecture for coordination of various sub-systems, and engineering an appropriate data storage solution. The present pilot version of the service implements noise source maps for Switzerland. Extension of the solution to Central Europe is planned for the next project phase.
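The cross-correlation step at the heart of the noise source mapping can be sketched in a few lines. This is a toy illustration with synthetic records and an invented inter-station delay, not the project's production code:

```python
import numpy as np

def noise_cross_correlation(rec_a, rec_b, max_lag):
    """Cross-correlate two noise records and return lags (in samples) and
    the correlation segment between -max_lag and +max_lag."""
    full = np.correlate(rec_a, rec_b, mode="full")
    mid = len(rec_b) - 1                       # index of zero lag
    lags = np.arange(-max_lag, max_lag + 1)
    return lags, full[mid - max_lag: mid + max_lag + 1]

# Synthetic example: station A records the same noise as station B,
# delayed by 5 samples (a stand-in for the inter-station travel time).
rng = np.random.default_rng(0)
noise = rng.standard_normal(2000)
rec_a = noise[:-5]
rec_b = noise[5:]
lags, cc = noise_cross_correlation(rec_a, rec_b, max_lag=20)
best = lags[np.argmax(cc)]   # the peak lag recovers the 5-sample delay
```

In the real service this correlation is computed for every station pair over long, pre-processed records; the resulting lag/amplitude structure is what the sensitivity kernels map back to noise sources.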
Shim, Miseon; Hwang, Han-Jeong; Kim, Do-Won; Lee, Seung-Hwan; Im, Chang-Hwan
2016-10-01
Recently, an increasing number of researchers have endeavored to develop practical tools for diagnosing patients with schizophrenia using machine learning techniques applied to EEG biomarkers. Although a number of studies showed that source-level EEG features can potentially be applied to the differential diagnosis of schizophrenia, most studies have used only sensor-level EEG features such as ERP peak amplitude and power spectrum for machine learning-based diagnosis of schizophrenia. In this study, we used both sensor-level and source-level features extracted from EEG signals recorded during an auditory oddball task for the classification of patients with schizophrenia and healthy controls. EEG signals were recorded from 34 patients with schizophrenia and 34 healthy controls while each subject was asked to attend to oddball tones. Our results demonstrated higher classification accuracy when source-level features were used together with sensor-level features, compared to when only sensor-level features were used. In addition, the selected sensor-level features were mostly found in the frontal area, and the selected source-level features were mostly extracted from the temporal area, which coincide well with the well-known pathological region of cognitive processing in patients with schizophrenia. Our results suggest that our approach would be a promising tool for the computer-aided diagnosis of schizophrenia. Copyright © 2016 Elsevier B.V. All rights reserved.
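Combining sensor- and source-level features amounts to concatenating the two feature matrices before classification. A minimal sketch with simulated features and a deliberately simple nearest-class-mean classifier (the study's actual EEG features and classifier differ; all numbers below are invented):

```python
import numpy as np

def nearest_mean_classify(train_X, train_y, test_X):
    """Nearest-class-mean classifier: assign each test sample the label
    of the closest class centroid (a stand-in for the study's classifier)."""
    classes = np.unique(train_y)
    centroids = np.stack([train_X[train_y == c].mean(axis=0) for c in classes])
    d = np.linalg.norm(test_X[:, None, :] - centroids[None, :, :], axis=2)
    return classes[np.argmin(d, axis=1)]

rng = np.random.default_rng(1)
n = 34                                    # subjects per group, as in the study
sensor = np.vstack([rng.normal(0, 1, (n, 5)), rng.normal(0.8, 1, (n, 5))])
source = np.vstack([rng.normal(0, 1, (n, 5)), rng.normal(0.8, 1, (n, 5))])
y = np.array([0] * n + [1] * n)           # 0 = control, 1 = patient
fused = np.hstack([sensor, source])       # concatenate both feature levels
pred = nearest_mean_classify(fused, y, fused)
acc = (pred == y).mean()                  # (training) accuracy of the fused set
```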
Undergraduate Students' Justifications for Source Selection in a Digital Academic Context
ERIC Educational Resources Information Center
List, Alexandra; Grossnickle, Emily M.; Alexander, Patricia A.
2016-01-01
To complete any academic tasks using information from the Internet, undergraduate students first have to select the appropriate sources. However, the types of justifications that undergraduates provide for source selection and how these justifications may be impacted by task characteristics have been underexamined. This study explored…
Sources for Selecting School Library Resource Materials.
ERIC Educational Resources Information Center
Friderichsen, Blanche
A Department of Education publication on an integrated program for Alberta school libraries, this document recommends the use of specific material selection sources designed to aid schools in developing their library collections. Materials are listed in the following sections: (1) Sources for Selecting School Library Resource Materials; (2)…
THERMAL NEUTRON INTENSITIES IN SOILS IRRADIATED BY FAST NEUTRONS FROM POINT SOURCES. (R825549C054)
Thermal-neutron fluences in soil are reported for selected fast-neutron sources, selected soil types, and selected irradiation geometries. Sources include 14 MeV neutrons from accelerators, neutrons from spontaneously fissioning 252Cf, and neutrons produced from alp...
A selection of giant radio sources from NVSS
Proctor, D. D.
2016-06-01
Results of the application of pattern-recognition techniques to the problem of identifying giant radio sources (GRSs) from the data in the NVSS catalog are presented, and issues affecting the process are explored. Decision-tree pattern-recognition software was applied to training-set source pairs developed from known NVSS large-angular-size radio galaxies. The full training set consisted of 51,195 source pairs, 48 of which were known GRSs for which each lobe was primarily represented by a single catalog component. The source pairs had a maximum separation of 20′ and a minimum component area of 1.87 square arcmin at the 1.4 mJy level. The importance of comparing the resulting probability distributions of the training and application sets for cases of unknown class ratio is demonstrated. The probability of correctly ranking a randomly selected (GRS, non-GRS) pair from the best of the tested classifiers was determined to be 97.8 ± 1.5%. The best classifiers were applied to the over 870,000 candidate pairs from the entire catalog. Images of higher-ranked sources were visually screened, and a table of over 1600 candidates, including morphological annotation, is presented. These systems include doubles and triples, wide-angle tail and narrow-angle tail, S- or Z-shaped systems, and core-jets and resolved cores. In conclusion, while some resolved-lobe systems are recovered with this technique, generally it is expected that such systems would require a different approach.
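The reported 97.8% figure is the probability of correctly ranking a random (GRS, non-GRS) pair, i.e. the concordance (AUC) of the classifier scores. It can be computed directly; the scores below are toy values, not the NVSS data:

```python
import numpy as np

def ranking_probability(pos_scores, neg_scores):
    """Probability that a randomly chosen positive (GRS) outscores a
    randomly chosen negative, counting ties as half wins (the AUC)."""
    pos = np.asarray(pos_scores, float)[:, None]
    neg = np.asarray(neg_scores, float)[None, :]
    wins = (pos > neg).sum() + 0.5 * (pos == neg).sum()
    return wins / (pos.size * neg.size)

# Toy classifier scores for known GRSs and non-GRSs.
grs = [0.9, 0.8, 0.7, 0.6]
non_grs = [0.5, 0.4, 0.7, 0.2]
p = ranking_probability(grs, non_grs)   # → 0.90625
```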
NASA Technical Reports Server (NTRS)
Holmquist, R.
1978-01-01
The random evolutionary hits (REH) theory of evolutionary divergence, originally proposed in 1972, is restated with attention to certain aspects of the theory that have caused confusion. The theory assumes that natural selection and stochastic processes interact and that natural selection restricts those codon sites which may fix mutations. The predicted total number of fixed nucleotide replacements agrees with data for cytochrome c, alpha-hemoglobin, beta-hemoglobin, and myoglobin. The restatement analyzes the magnitude of possible sources of error and simplifies calculational methodology by supplying polynomial expressions to replace tables and graphs.
The national response for preventing healthcare-associated infections: data and monitoring.
Kahn, Katherine L; Weinberg, Daniel A; Leuschner, Kristin J; Gall, Elizabeth M; Siegel, Sari; Mendel, Peter
2014-02-01
Historically, the ability to accurately track healthcare-associated infections (HAIs) was hindered due to a lack of coordination among data sources and shortcomings in individual data sources. This paper presents the results of the evaluation of the HAI data and the monitoring component of the Action Plan, focusing on context (goals), inputs, and processes. We used the Context-Input-Process-Product framework, together with the HAI prevention system framework, to describe the transformative processes associated with data and monitoring efforts. Six HAI priority conditions in the 2009 Action Plan created a focus for the selection of goals and activities. Key Action Plan decisions included a phased-in data and monitoring approach, commitment to linking the selection of priority HAIs to highly visible national 5-year prevention targets, and the development of a comprehensive HAI database inventory. Remaining challenges relate to data validation, resources, and the opportunity to integrate electronic health and laboratory records with other provider data systems. The Action Plan's data and monitoring program has developed a sound infrastructure that builds upon technological advances and embodies a firm commitment to prioritization, coordination and alignment, accountability and incentives, stakeholder engagement, and an awareness of the need for predictable resources. With time, and adequate resources, it is likely that the investment in data-related infrastructure during the Action Plan's initial years will reap great rewards.
The requirements for low-temperature plasma ionization support miniaturization of the ion source.
Kiontke, Andreas; Holzer, Frank; Belder, Detlev; Birkemeyer, Claudia
2018-06-01
Ambient ionization mass spectrometry (AI-MS), the ionization of samples under ambient conditions, enables fast and simple analysis of samples without or with little sample preparation. Due to their simple construction and low resource consumption, plasma-based ionization methods in particular are considered ideal for use in mobile analytical devices. However, systematic investigations that have attempted to identify the optimal configuration of a plasma source to achieve the sensitive detection of target molecules are still rare. We therefore used a low-temperature plasma ionization (LTPI) source based on dielectric barrier discharge with helium employed as the process gas to identify the factors that most strongly influence the signal intensity in the mass spectrometry of species formed by plasma ionization. In this study, we investigated several construction-related parameters of the plasma source and found that a low wall thickness of the dielectric, a small outlet spacing, and a short distance between the plasma source and the MS inlet are needed to achieve optimal signal intensity with a process-gas flow rate of as little as 10 mL/min. In conclusion, this type of ion source is especially well suited for downscaling, which is usually required in mobile devices. Our results provide valuable insights into the LTPI mechanism; they reveal the potential to further improve its implementation and standardization for mobile mass spectrometry as well as our understanding of the requirements and selectivity of this technique. Graphical abstract: Optimized parameters of a dielectric barrier discharge plasma for ionization in mass spectrometry. The electrode size, shape, and arrangement, the thickness of the dielectric, and the distances between the plasma source, sample, and MS inlet are marked in red. The process gas (helium) flow is shown in black.
Dispersal-Based Microbial Community Assembly Decreases Biogeochemical Function
DOE Office of Scientific and Technical Information (OSTI.GOV)
Graham, Emily B.; Stegen, James C.
2017-11-01
Ecological mechanisms influence relationships among microbial communities, which in turn impact biogeochemistry. In particular, microbial communities are assembled by deterministic (e.g., selection) and stochastic (e.g., dispersal) processes, and the relative balance of these two process types is hypothesized to alter the influence of microbial communities over biogeochemical function. We used an ecological simulation model to evaluate this hypothesis, defining biogeochemical function generically to represent any biogeochemical reaction of interest. We assembled receiving communities under different levels of dispersal from a source community that was assembled purely by selection. The dispersal scenarios ranged from no dispersal (i.e., selection-only) to dispersal rates high enough to overwhelm selection (i.e., homogenizing dispersal). We used an aggregate measure of community fitness to infer a given community's biogeochemical function relative to other communities. We also used ecological null models to further link the relative influence of deterministic assembly to function. We found that increasing rates of dispersal decrease biogeochemical function by increasing the proportion of maladapted taxa in a local community. Niche breadth was also a key determinant of biogeochemical function, suggesting a tradeoff between the function of generalist and specialist species. Finally, we show that microbial assembly processes exert greater influence over biogeochemical function when there is variation in the relative contributions of dispersal and selection among communities. Taken together, our results highlight the influence of spatial processes on biogeochemical function and indicate the need to account for such effects in models that aim to predict biogeochemical function under future environmental scenarios.
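The qualitative mechanism, that immigrants from a source community selected under a different optimum dilute local adaptation, can be reproduced with a toy assembly model. All distributions and parameter values below are invented for illustration; this is not the authors' simulation:

```python
import numpy as np

def community_fitness(dispersal, n_taxa=1000, seed=2):
    """Toy assembly model: a receiving community mixes locally selected
    taxa (traits near the local optimum) with immigrants from a source
    community selected under a different optimum. Aggregate fitness is
    the mean Gaussian match between taxon traits and the local optimum."""
    rng = np.random.default_rng(seed)
    local_opt, source_opt = 0.0, 2.0
    n_imm = int(dispersal * n_taxa)               # immigrants via dispersal
    local = rng.normal(local_opt, 0.3, n_taxa - n_imm)
    imm = rng.normal(source_opt, 0.3, n_imm)      # maladapted on average
    traits = np.concatenate([local, imm])
    return np.exp(-(traits - local_opt) ** 2).mean()

f_low = community_fitness(0.05)    # little dispersal: selection dominates
f_high = community_fitness(0.60)   # dispersal overwhelms selection
# Higher dispersal raises the share of maladapted taxa, lowering function.
```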
NASA Technical Reports Server (NTRS)
Cariapa, Vikram
1993-01-01
The trend in the modern global economy towards free market policies has motivated companies to use rapid prototyping technologies to not only reduce product development cycle time but also to maintain their competitive edge. A rapid prototyping technology is one which combines computer aided design with computer controlled tracking of a focussed high energy source (e.g., lasers, heat) on modern ceramic powders, metallic powders, plastics or photosensitive liquid resins in order to produce prototypes or models. At present, except for the process of shape melting, most rapid prototyping processes generate products that are only dimensionally similar to those of the desired end product. There is an urgent need, therefore, to enhance the understanding of the characteristics of these processes in order to realize their potential for production. Currently, the commercial market is dominated by four rapid prototyping processes, namely selective laser sintering, stereolithography, fused deposition modelling and laminated object manufacturing. This phase of the research has focussed on the selective laser sintering and stereolithography rapid prototyping processes. A theoretical model for these processes is under development. Different rapid prototyping sites supplied test specimens (based on ASTM 638-84, Type I) that have been measured and tested to provide a data base on surface finish, dimensional variation and ultimate tensile strength. Further plans call for developing and verifying the theoretical models by carefully designed experiments. This will be a joint effort between NASA and other prototyping centers to generate a larger database, thus encouraging more widespread usage by product designers.
NASA Astrophysics Data System (ADS)
Criales Escobar, Luis Ernesto
One of the most rapidly evolving areas of research is the utilization of lasers for micro-manufacturing and additive manufacturing purposes. The use of the laser beam as a tool for manufacturing arises from the need for flexible and rapid manufacturing at a low-to-mid cost. Laser micro-machining provides an advantage over mechanical micro-machining due to the faster production times of large batch sizes and the high costs associated with specific tools. Laser based additive manufacturing enables processing of powder metals for direct and rapid fabrication of products. Therefore, laser processing can be viewed as a fast, flexible, and cost-effective approach compared to traditional manufacturing processes. Two types of laser processing techniques are studied: laser ablation of polymers for micro-channel fabrication and selective laser melting of metal powders. Initially, a feasibility study for laser-based micro-channel fabrication of poly(dimethylsiloxane) (PDMS) via experimentation is presented. In particular, the effectiveness of utilizing a nanosecond-pulsed laser as the energy source for laser ablation is studied. The results are analyzed statistically and a relationship between process parameters and micro-channel dimensions is established. Additionally, a process model is introduced for predicting channel depth. Model outputs are compared and analyzed to experimental results. The second part of this research focuses on a physics-based FEM approach for predicting the temperature profile and melt pool geometry in selective laser melting (SLM) of metal powders. Temperature profiles are calculated for a moving laser heat source to understand the temperature rise due to heating during SLM. Based on the predicted temperature distributions, melt pool geometry, i.e. the locations at which melting of the powder material occurs, is determined.
Simulation results are compared against data obtained from experimental Inconel 625 test coupons fabricated at the National Institute for Standards & Technology via response surface methodology techniques. The main goal of this research is to develop a comprehensive predictive model with which the effect of powder material properties and laser process parameters on the built quality and integrity of SLM-produced parts can be better understood. By optimizing process parameters, SLM as an additive manufacturing technique is not only possible, but also practical and reproducible.
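As a rough analytical stand-in for the FEM temperature prediction, the classical Rosenthal solution gives the quasi-steady temperature field around a moving point heat source on a semi-infinite body. The material values below are illustrative placeholders, not calibrated to Inconel 625 or to the authors' model:

```python
import numpy as np

def rosenthal_T(x, y, z, Q=50.0, v=0.8, T0=298.0, k=26.0, alpha=5.4e-6):
    """Quasi-steady Rosenthal solution for a moving point heat source:
    x [m] is measured from the source along the travel direction,
    Q absorbed power [W], v travel speed [m/s], k conductivity [W/(m*K)],
    alpha thermal diffusivity [m^2/s]."""
    R = np.sqrt(x**2 + y**2 + z**2)
    return T0 + Q / (2 * np.pi * k * R) * np.exp(-v * (R + x) / (2 * alpha))

# The field is asymmetric: temperature decays slowly behind the moving
# source (x < 0, the melt-pool tail) and sharply ahead of it (x > 0).
T_behind = rosenthal_T(-50e-6, 0.0, 0.0)
T_ahead = rosenthal_T(50e-6, 0.0, 0.0)
```

The melt pool boundary in such a model is simply the isotherm where the predicted temperature equals the alloy's melting temperature.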
Source counting in MEG neuroimaging
NASA Astrophysics Data System (ADS)
Lei, Tianhu; Dell, John; Magee, Ralphy; Roberts, Timothy P. L.
2009-02-01
Magnetoencephalography (MEG) is a multi-channel, functional imaging technique. It measures the magnetic field produced by the primary electric currents inside the brain via a sensor array composed of a large number of superconducting quantum interference devices. The measurements are then used to estimate the locations, strengths, and orientations of these electric currents. This magnetic source imaging technique encompasses a great variety of signal processing and modeling techniques, which include the Inverse problem, MUltiple SIgnal Classification (MUSIC), Beamforming (BF), and Independent Component Analysis (ICA) methods. A key problem with the Inverse problem, MUSIC, and ICA methods is that the number of sources must be detected a priori. Although the BF method scans the source space on a point-to-point basis, the selection of peaks as sources is ultimately made by subjective thresholding; in practice, expert data analysts often select results based on physiological plausibility. This paper presents an eigenstructure approach for source number detection in MEG neuroimaging. By sorting the eigenvalues of the estimated covariance matrix of the acquired MEG data, the measured data space is partitioned into signal and noise subspaces. The partition is implemented by utilizing information theoretic criteria. The order of the signal subspace gives an estimate of the number of sources. The approach does not refer to any model or hypothesis and is hence an entirely data-led operation. It possesses a clear physical interpretation and an efficient computation procedure. The theoretical derivation of this method and the results obtained using real MEG data are included to demonstrate their agreement and the promise of the proposed approach.
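The eigenstructure approach described here is in the spirit of the classical Wax-Kailath information-theoretic criteria. A sketch using the MDL criterion on a simulated covariance follows; this is one standard formulation, not necessarily the authors' exact one, and all simulation parameters are invented:

```python
import numpy as np

def mdl_source_count(eigvals, n_snapshots):
    """Wax-Kailath MDL estimate of the number of sources from the sorted
    eigenvalues of the sensor covariance matrix: the k minimizing MDL(k)
    separates the signal subspace from the noise subspace."""
    lam = np.sort(np.asarray(eigvals, float))[::-1]
    p = lam.size
    mdl = np.empty(p)
    for k in range(p):
        tail = lam[k:]                        # the p-k candidate noise eigenvalues
        gm = np.exp(np.mean(np.log(tail)))    # geometric mean
        am = tail.mean()                      # arithmetic mean
        mdl[k] = (-n_snapshots * (p - k) * np.log(gm / am)
                  + 0.5 * k * (2 * p - k) * np.log(n_snapshots))
    return int(np.argmin(mdl))

# Simulated data: 2 sources mixed into 8 channels, 500 snapshots, weak noise.
rng = np.random.default_rng(3)
A = rng.standard_normal((8, 2))
X = A @ rng.standard_normal((2, 500)) + 0.1 * rng.standard_normal((8, 500))
eigvals = np.linalg.eigvalsh(X @ X.T / 500)
k_hat = mdl_source_count(eigvals, 500)        # recovers the 2 sources
```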
Single-trial event-related potential extraction through one-unit ICA-with-reference
NASA Astrophysics Data System (ADS)
Lee, Wee Lih; Tan, Tele; Falkmer, Torbjörn; Leung, Yee Hong
2016-12-01
Objective. In recent years, ICA has been one of the more popular methods for extracting event-related potential (ERP) at the single-trial level. It is a blind source separation technique that allows the extraction of an ERP without making strong assumptions on the temporal and spatial characteristics of an ERP. However, the problem with traditional ICA is that the extraction is not direct and is time-consuming due to the need for source selection processing. In this paper, the application of a one-unit ICA-with-Reference (ICA-R), a constrained ICA method, is proposed. Approach. In cases where the time-region of the desired ERP is known a priori, this time information is utilized to generate a reference signal, which is then used for guiding the one-unit ICA-R to extract the source signal of the desired ERP directly. Main results. Our results showed that, as compared to traditional ICA, ICA-R is a more effective method for analysing ERP because it avoids manual source selection and it requires less computation, thus resulting in faster ERP extraction. Significance. In addition, since the method is automated, it reduces the risks of any subjective bias in the ERP analysis. It is also a potential tool for extracting the ERP in online applications.
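A heavily simplified sketch of reference-guided extraction: instead of full one-unit ICA-R, a closed-form least-squares (Wiener-style) filter is fit to a time-window reference. This already illustrates how prior time information can steer extraction toward the desired source without manual component selection; all signals are synthetic and this is a stand-in, not the ICA-R algorithm itself:

```python
import numpy as np

def extract_with_reference(X, ref):
    """Reference-guided extraction (least-squares stand-in for ICA-R):
    find the spatial filter w whose output w @ X best matches the
    reference, then return the normalized output."""
    C = X @ X.T / X.shape[1]          # channel covariance
    b = X @ ref / X.shape[1]          # channel-reference cross-covariance
    w = np.linalg.solve(C, b)         # Wiener-style filter weights
    s = w @ X
    return s / np.linalg.norm(s)

# Two mixed sources; a crude square-wave reference marks the known ERP window.
rng = np.random.default_rng(4)
t = np.arange(500)
erp = np.exp(-((t - 250) ** 2) / 800.0)       # target "ERP" bump
distractor = np.sin(0.07 * t)                 # ongoing background rhythm
X = np.array([[1.0, 0.8], [0.6, -1.0]]) @ np.vstack([erp, distractor])
X += 0.05 * rng.standard_normal(X.shape)
ref = ((t > 200) & (t < 300)).astype(float)   # a priori time-window reference
s = extract_with_reference(X, ref)
corr = abs(np.corrcoef(s, erp)[0, 1])         # extracted signal tracks the ERP
```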
2013-02-01
Purified cultures are tested for optimized production under heterotrophic conditions with several organic carbon sources like beet and sorghum juice using ...Moreover, AFRL support sponsored the Master’s in Chemical Engineering project titled “Cost Analysis Of Local Bio- Products Processing Plant Using ...unlimited. 2.5 Screening for High Lipid Production Mutants Procedure: A selection of 84 single colony cultures was analyzed in this phase using the
Highlighting impact: Do editors' selections identify influential papers?
NASA Astrophysics Data System (ADS)
Antonoyiannakis, Manolis
A recent trend in scientific publishing is that journal editors highlight each week a select set among the papers published (usually) in their respective journals. The highlighted papers are deemed of higher quality, importance, or interest than the 'average' paper and feature prominently on the publishers' websites. We perform a citation analysis of the highlighted papers for a number of physics journals from various publishers. By comparing the performance of highlighted papers relative to (a) typical papers and (b) highly cited papers in their source journals and in other journals in the field, we explore whether, and to what extent, the selection process at the time of publication identifies papers that will turn out to be influential. We discuss the broader implications for research assessment.
Impact of selected troposphere models on Precise Point Positioning convergence
NASA Astrophysics Data System (ADS)
Kalita, Jakub; Rzepecka, Zofia
2016-04-01
The Precise Point Positioning (PPP) absolute method is currently being intensively investigated in order to reach fast convergence times. Among the various sources that influence the convergence of PPP, the tropospheric delay is one of the most important. Numerous models of tropospheric delay have been developed and applied to PPP processing. However, with rare exceptions, the quality of those models does not allow fixing the zenith path delay tropospheric parameter, leaving the difference between the nominal and final values to the estimation process. Here we present a comparison of several PPP result sets, each based on a different troposphere model. The respective nominal values are adopted from the models VMF1, GPT2w, MOPS, and ZERO-WET. The PPP solution admitted as reference is based on the final troposphere product from the International GNSS Service (IGS). The VMF1 mapping function was used for all processing variants in order to make the impact of the applied nominal values comparable. The worst case initializes the zenith wet delay with a zero value (ZERO-WET). The impact of any candidate model for the tropospheric nominal values should fall between the IGS and ZERO-WET border variants. The analysis is based on data from seven IGS stations located in the mid-latitude European region from the year 2014. For the purpose of this study, several days with the most active troposphere were selected for each station. All the PPP solutions were determined using the gLAB open-source software, with the Kalman filter implemented independently by the authors of this work. The processing was performed on 1-hour slices of observation data. In addition to the analysis of the output processing files, the presented study contains a detailed analysis of the tropospheric conditions for the selected data. The overall results show that for the height component the VMF1 model outperforms GPT2w and MOPS by 35-40% and the ZERO-WET variant by 150%.
In most of the cases all solutions converge to the same values during first hour of processing. Finally, the results have been compared against results obtained during calm tropospheric conditions.
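Leaving the zenith wet delay (ZWD) to the estimation process typically means modeling it as a random-walk state in the Kalman filter. A one-state sketch of that scheme (invented noise settings and synthetic observations, not gLAB's implementation):

```python
import numpy as np

def kalman_zwd(obs, x0, p0, q, r):
    """One-state random-walk Kalman filter for the zenith wet delay:
    predict by inflating the variance with process noise q, then update
    with each noisy delay observation (measurement variance r)."""
    x, p, est = x0, p0, []
    for z in obs:
        p += q                      # predict: random-walk process noise
        k = p / (p + r)             # Kalman gain
        x += k * (z - x)            # update with the innovation
        p *= (1 - k)
        est.append(x)
    return np.array(est)

# Noisy observations around a slowly drifting "true" ZWD near 0.15 m.
rng = np.random.default_rng(5)
truth = 0.15 + 0.0001 * np.arange(120)
obs = truth + 0.02 * rng.standard_normal(120)
est = kalman_zwd(obs, x0=0.0, p0=1.0, q=1e-6, r=4e-4)
err_raw = np.abs(obs[-30:] - truth[-30:]).mean()
err_flt = np.abs(est[-30:] - truth[-30:]).mean()   # filter beats raw obs
```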
Code of Federal Regulations, 2010 CFR
2010-10-01
... marking of contractor bid or proposal information and source selection information. 1303.104-4 Section... PRACTICES AND PERSONAL CONFLICTS OF INTEREST Safeguards 1303.104-4 Disclosure, protection and marking of contractor bid or proposal information and source selection information. Contractor bid or proposal...
Cho, Young-Je; Kim, HyunHo; Park, Kyoung-Yun; Lee, Jaegab; Bobade, Santosh M; Wu, Fu-Chung; Choi, Duck-Kyun
2011-01-01
Interest in transparent oxide thin film transistors utilizing ZnO material has been on the rise for many years. Recently, however, IGZO has begun to draw more attention due to its higher stability and superior electric field mobility when compared to ZnO. In this work, we address an improved method for patterning an a-IGZO film using the self-assembled monolayer (SAM) process, which employs a cost-efficient micro-contact printing method instead of the conventional lithography process. After a-IGZO film deposition on the surface of a SiO2-layered Si wafer, the wafer was illuminated with UV light; sources and drains were then patterned using n-octadecyltrichlorosilane (OTS) molecules by a printing method. Due to the low surface energy of OTS, cobalt was selectively deposited on the OTS-free a-IGZO surface. The selective deposition of cobalt electrodes was successful, as confirmed by an optical microscope. The a-IGZO TFT fabricated using the SAM process exhibited good transistor performance: the electric field mobility (μFE), threshold voltage (Vth), subthreshold slope (SS), and on/off ratio were 2.1 cm²/V·s, 2.4 V, 0.35 V/dec, and 2.9 × 10⁶, respectively.
Peng, Jianfeng; Song, Yonghui; Yuan, Peng; Xiao, Shuhu; Han, Lu
2013-07-01
The chemical industry is a major source of various pollution accidents. Improving the management of risk sources for pollution accidents has become an urgent demand for most industrialized countries. In pollution accidents, the released chemicals harm receptors to an extent that depends on their sensitivity or susceptibility. Identifying the potential risk sources among such a large number of chemical enterprises has therefore become a pressing need. Based on the simulation of the whole accident process, a novel and expandable identification method for risk sources causing water pollution accidents is presented. The newly developed approach, by analyzing and simulating the whole process of a pollution accident between sources and receptors, can be applied to identify risk sources, especially on a nationwide scale. Three major types of losses, namely social, economic, and ecological losses, were normalized, analyzed, and used for overall consequence modeling. A specific case study area, located in a chemical industry park (CIP) along the Yangtze River in Jiangsu Province, China, was selected to test the potential of the identification method. The results showed that there were four risk sources for pollution accidents in this CIP. An aniline leak at the HS Chemical Plant would lead to the most serious impact on the surrounding water environment. This potential accident would severely damage the ecosystem up to 3.8 km downstream of the Yangtze River and lead to pollution over a distance stretching to 73.7 km downstream. The proposed method is easily extended to the nationwide identification of potential risk sources.
Improved Nitrogen Removal Effect In Continuous Flow A2/O Process Using Typical Extra Carbon Source
NASA Astrophysics Data System (ADS)
Wu, Haiyan; Gao, Junyan; Yang, Dianhai; Zhou, Qi; Cai, Bijing
2010-11-01
In order to provide a basis for the optimal selection of a carbon source, three typical external carbon sources (i.e., methanol, sodium acetate, and leachate) were applied to examine the nitrogen removal efficiency of a continuous flow A2/O system, with the influent taken from the effluent of the grit chamber in the second Kunming wastewater treatment plant. The best dosage was determined, and the specific nitrogen removal rate and carbon consumption rate were calculated for each individual external carbon source in the A2/O system. An economic and technical analysis was also conducted to select a suitable carbon source with a low operation cost. Experimental results showed that the typical external carbon sources caused a remarkable enhancement of the system's nitrate degradation ability. In comparison with the blank test, the average TN and NH3-N removal efficiency of the system with different dosing quantities of external carbon source was improved by 15.2% and 34.2%, respectively. The optimal dosage of methanol, sodium acetate, and leachate was up to 30 mg/L, 40 mg/L, and 100 mg COD/L, respectively, in terms of a high nitrogen degradation effect. The highest removal efficiencies of COD, TN, and NH3-N reached 92.3%, 73.9%, and 100%, respectively, with methanol at a dosage of 30 mg/L. The kinetic analysis and calculation revealed that the greatest denitrification rate was 0.0107 mg TN/(mg MLVSS·d) with sodium acetate at 60 mg/L. As to the carbon consumption rate, however, the highest value occurred in the blank test, with a rate of 0.1955 mg COD/(mg MLVSS·d). Further economic analysis also proved leachate to be a pragmatic external carbon source whose cost is far cheaper than that of methanol.
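The specific rate figures quoted above have the form of a concentration change normalized by biomass concentration and time. A sketch of that arithmetic; the input values below are hypothetical, chosen only to land near the reported order of magnitude, and are not taken from the study:

```python
def specific_rate(delta_conc_mg_l, mlvss_mg_l, hrt_d):
    """Specific removal rate in mg per mg MLVSS per day: the concentration
    change over the hydraulic retention time (HRT, in days), normalized by
    the biomass (MLVSS) concentration."""
    return delta_conc_mg_l / (mlvss_mg_l * hrt_d)

# Hypothetical example: 15 mg/L TN removed over a 0.5 d HRT at 2800 mg/L MLVSS.
r_tn = specific_rate(15.0, 2800.0, 0.5)   # ≈ 0.0107 mg TN/(mg MLVSS·d)
```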
Parallel Processing of Large Scale Microphone Arrays for Sound Capture
NASA Astrophysics Data System (ADS)
Jan, Ea-Ee.
1995-01-01
Performance of microphone sound pickup is degraded by deleterious properties of the acoustic environment, such as multipath distortion (reverberation) and ambient noise. The degradation becomes more prominent in a teleconferencing environment in which the microphone is positioned far away from the speaker. Moreover, the ideal teleconference should feel as easy and natural as face-to-face communication with another person. This suggests hands-free sound capture with no tether or encumbrance by hand-held or body-worn sound equipment. Microphone arrays represent an appropriate approach for this application. This research develops new microphone array and signal processing techniques for high quality hands-free sound capture in noisy, reverberant enclosures. The new techniques combine matched-filtering of individual sensors and parallel processing to provide acute spatial volume selectivity which is capable of mitigating the deleterious effects of noise interference and multipath distortion. The new method outperforms traditional delay-and-sum beamformers, which provide only directional spatial selectivity. The research additionally explores truncated matched-filtering and random distribution of transducers to reduce complexity and improve sound capture quality. All designs are first established by computer simulation of array performance in reverberant enclosures. The simulation is achieved by a room model which can efficiently calculate the acoustic multipath in a rectangular enclosure up to a prescribed order of images. It also calculates the incident angle of the arriving signal. Experimental arrays were constructed and their performance was measured in real rooms. Real room data were collected in a hard-walled laboratory and a controllable variable acoustics enclosure of similar size, approximately 6 x 6 x 3 m. An extensive speech database was also collected in these two enclosures for future research on microphone arrays.
The simulation results are shown to be consistent with the real room data. Localization of sound sources has been explored using cross-power spectrum time delay estimation and has been evaluated using real room data under slightly, moderately and highly reverberant conditions. To improve the accuracy and reliability of the source localization, an outlier detector that removes incorrect time delay estimation has been invented. To provide speaker selectivity for microphone array systems, a hands-free speaker identification system has been studied. A recently invented feature using selected spectrum information outperforms traditional recognition methods. Measured results demonstrate the capabilities of speaker selectivity from a matched-filtered array. In addition, simulation utilities, including matched-filtering processing of the array and hands-free speaker identification, have been implemented on the massively-parallel nCube supercomputer. This parallel computation highlights the requirements for real-time processing of array signals.
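For contrast with the matched-filter array, the baseline delay-and-sum beamformer mentioned above can be sketched in a few lines. Integer-sample advances and the toy two-sensor signals are simplifying assumptions; a real array would use fractional delays derived from the source-sensor geometry.

```python
# Minimal delay-and-sum beamformer: each sensor signal is time-advanced to
# align the direct path from a chosen focus point, then the channels are
# averaged. Pure-Python sketch with integer-sample advances (an assumption).

def delay_and_sum(signals, advances):
    """signals: list of equal-length sample lists; advances: per-sensor
    sample advances that time-align the source. Returns the averaged output."""
    n = len(signals[0])
    out = []
    for t in range(n):
        acc = 0.0
        for sig, a in zip(signals, advances):
            idx = t + a
            acc += sig[idx] if 0 <= idx < n else 0.0
        out.append(acc / len(signals))
    return out

# Two sensors hear the same unit pulse, the second one 2 samples later;
# advancing the second channel by 2 samples aligns the arrivals coherently.
s1 = [0, 0, 1, 0, 0, 0]
s2 = [0, 0, 0, 0, 1, 0]
aligned = delay_and_sum([s1, s2], advances=[0, 2])
```

After alignment the pulse adds coherently (output 1.0 at sample 2), while uncorrelated noise would average down, which is the directional selectivity the text contrasts with matched filtering's volume selectivity.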
NASA Technical Reports Server (NTRS)
Zaitzeff, J. B. (Editor); Cornillon, P. (Editor); Aubrey, D. A. (Editor)
1980-01-01
Presentations were grouped in the following categories: (1) a technical orientation of Earth resources remote sensing including data sources and processing; (2) a review of the present status of remote sensing technology applicable to the coastal and marine environment; (3) a description of data and information needs of selected coastal and marine activities; and (4) an outline of plans for marine monitoring systems for the east coast and a concept for an east coast remote sensing facility. Also discussed were user needs and remote sensing potentials in the areas of coastal processes and management, commercial and recreational fisheries, and marine physical processes.
Real time microcontroller implementation of an adaptive myoelectric filter.
Bagwell, P J; Chappell, P H
1995-03-01
This paper describes a real time digital adaptive filter for processing myoelectric signals. The filter time constant is automatically selected by the adaptation algorithm, giving a significant improvement over linear filters for estimating the muscle force and controlling a prosthetic device. Interference from mains sources often produces problems for myoelectric processing, and so 50 Hz and all harmonic frequencies are reduced by an averaging filter and differential process. This makes practical electrode placement and contact less critical and time consuming. An economic real time implementation is essential for a prosthetic controller, and this is achieved using an Intel 80C196KC microcontroller.
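The paper's adaptation algorithm is not reproduced here; the sketch below shows the general idea of a smoothing filter whose time constant is selected automatically: fast tracking when the rectified signal level changes, heavy smoothing when it is steady. All parameter values are illustrative assumptions.

```python
# Hedged sketch of an adaptive myoelectric envelope filter: full-wave rectify
# the signal, then apply first-order exponential smoothing whose coefficient
# switches between a fast and a slow value depending on how quickly the
# envelope is changing. alpha_slow, alpha_fast and threshold are assumptions.

def adaptive_envelope(samples, alpha_slow=0.05, alpha_fast=0.5, threshold=0.2):
    """Return a smoothed envelope estimate of the rectified signal."""
    env = 0.0
    out = []
    for s in samples:
        x = abs(s)                                   # full-wave rectification
        change = abs(x - env) / (env + 1e-9)         # relative level change
        alpha = alpha_fast if change > threshold else alpha_slow
        env += alpha * (x - env)                     # first-order smoothing
        out.append(env)
    return out

# A burst of muscle activity: the envelope rises quickly, then decays.
envelope = adaptive_envelope([0.0, 0.0, 1.0, 1.0, 1.0, 0.0, 0.0])
```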
Comparative study of resist stabilization techniques for metal etch processing
NASA Astrophysics Data System (ADS)
Becker, Gerry; Ross, Matthew F.; Wong, Selmer S.; Minter, Jason P.; Marlowe, Trey; Livesay, William R.
1999-06-01
This study investigates resist stabilization techniques as they are applied to a metal etch application. The techniques compared are conventional deep-UV/thermal stabilization (UV bake) and electron beam stabilization. The electron beam tool used in this study, an ElectronCure system from AlliedSignal Inc., Electron Vision Group, utilizes a flood electron source and a non-thermal process. These stabilization techniques are compared with respect to a metal etch process. Two types of resist are considered for stabilization and etch: a g/i-line resist, Shipley SPR-3012, and an advanced i-line resist, Shipley SPR 955-Cm. For each of these resists, the effects of stabilization on resist features are evaluated by post-stabilization SEM analysis. Etch selectivity in all cases is evaluated by using a timed metal etch and measuring the resist remaining relative to the total metal thickness etched. Etch selectivity is presented as a function of stabilization condition, and analyses of the effects of the type of stabilization on this method of selectivity measurement are also presented. SEM analysis was also performed on the features after a complete etch process and is detailed as a function of stabilization condition. Post-etch cleaning is also an important factor impacted by pre-etch resist stabilization. Results of post-etch cleaning are presented for both stabilization methods, and SEM inspection of the metal features after resist removal processing is also detailed.
NASA Astrophysics Data System (ADS)
Laher, Russ
2012-08-01
Aperture Photometry Tool (APT) is software for astronomers and students interested in manually exploring the photometric qualities of astronomical images. It has a graphical user interface (GUI) which allows the image data associated with aperture photometry calculations for point and extended sources to be visualized and, therefore, more effectively analyzed. Mouse-clicking on a source in the displayed image draws a circular or elliptical aperture and sky annulus around the source and computes the source intensity and its uncertainty, along with several commonly used measures of the local sky background and its variability. The results are displayed and can be optionally saved to an aperture-photometry-table file and plotted on graphs in various ways using functions available in the software. APT is geared toward processing sources in a small number of images and is not suitable for bulk processing a large number of images, unlike other aperture photometry packages (e.g., SExtractor). However, APT does have a convenient source-list tool that enables calculations for a large number of detections in a given image. The source-list tool can be run either in automatic mode to generate an aperture photometry table quickly or in manual mode to permit inspection and adjustment of the calculation for each individual detection. APT displays a variety of useful graphs, including an image histogram, aperture slices, source scatter plot, sky scatter plot, sky histogram, radial profile, curve of growth, and aperture-photometry-table scatter plots and histograms. APT has functions for customizing calculations, including outlier rejection, pixel “picking” and “zapping,” and a selection of source and sky models. The radial-profile-interpolation source model, accessed via the radial-profile-plot panel, allows recovery of source intensity from pixels with missing data and can be especially beneficial in crowded fields.
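The core aperture calculation that APT automates can be sketched as follows: sum the pixels inside a circular aperture around the source and subtract the local sky background, estimated here as the per-pixel median of an annulus. This is a minimal reconstruction of the standard technique, not APT's actual code.

```python
# Minimal circular-aperture photometry: aperture sum minus (median sky per
# pixel) x (number of aperture pixels). Pixel-center geometry is a
# simplification; real tools handle partial-pixel overlap and uncertainties.
import math

def aperture_photometry(image, cx, cy, r_ap, r_in, r_out):
    """image: 2D list of pixel values. Returns background-subtracted
    source intensity using a circular aperture and a sky annulus."""
    ap_sum, ap_n, sky = 0.0, 0, []
    for y, row in enumerate(image):
        for x, val in enumerate(row):
            d = math.hypot(x - cx, y - cy)
            if d <= r_ap:
                ap_sum += val
                ap_n += 1
            elif r_in <= d <= r_out:
                sky.append(val)
    sky_sorted = sorted(sky)
    m = len(sky_sorted)
    sky_med = (sky_sorted[(m - 1) // 2] + sky_sorted[m // 2]) / 2  # median
    return ap_sum - sky_med * ap_n

# 7x7 frame: flat sky of 10 with a single bright pixel of 110 at the center.
frame = [[10.0] * 7 for _ in range(7)]
frame[3][3] = 110.0
flux = aperture_photometry(frame, cx=3, cy=3, r_ap=1.0, r_in=2.0, r_out=3.0)
```

Here the aperture covers 5 pixels (center plus 4 neighbors), so the sky contribution of 50 is removed and the recovered source flux is 100.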
Sound reduction of air compressors using a systematic approach
NASA Astrophysics Data System (ADS)
Moylan, Justin Tharp
The noise emitted by portable electric air compressors can often be a nuisance or potentially hazardous to the operator or others nearby. Therefore, reducing the noise of these air compressors is desired. This research focuses on compressors with a reciprocating piston design as this is the most common type of pump design for portable compressors. An experimental setup was developed to measure the sound and vibration of the air compressors, including testing inside a semi-anechoic chamber. The design of a quiet air compressor was performed in four stages: 1) Teardown and benchmarking of air compressors, 2) Identification and isolation of noise sources, 3) Development of individual means to quiet noise sources, 4) Selection and testing of integrated solutions. The systematic approach and results for each of these stages will be discussed. Two redesigned solutions were developed and measured to be approximately 65% quieter than the previous unmodified compressor. An additional analysis was performed on the solutions selected by the participants involved in the selection process. This analysis involved determining which of the design criteria each participant considered most important when selecting solutions. The results from each participant were then compared to their educational background and experience and correlations were identified. The correlations discovered suggest that educational background and experience may be key determinants for the preference models developed.
NASA Astrophysics Data System (ADS)
dell'Erba, M.; Galantucci, L. M.; Miglietta, S.
This paper reports on the results of research which investigated the potential for the application of an excimer laser in the field of composite material drilling and cutting, by comparing this technology with that using CO2 sources. In particular, the scope of the work was to check whether the interaction between excimer lasers and composite materials, whose characteristic feature is the absence of thermal transfer, could yield better results than those obtainable with CO2 sources once heat transfer-induced difficulties had been eliminated. The materials selected for the experiments were multilayer composites having an epoxy resin matrix (65 percent in volume), with aramid fiber (Kevlar), carbon fiber and glass fiber as reinforcing materials, all of considerable interest for the aerospace industry. Optimal operational parameters were identified in relation to each source with a view to obtaining undersized holes or through cuts exhibiting severed areas of good quality. A comparison between the two types of processing shows that rims processed by excimer lasers are of better quality - particularly so with Kevlar - whereas the ablation rate is undoubtedly rather low compared with the CO2 technology.
NASA Astrophysics Data System (ADS)
Wang, Chao; An, Xingqin; Zhai, Shixian; Hou, Qing; Sun, Zhaobin
2018-02-01
In this study, sustained pollution processes, during which the daily PM2.5 concentration exceeded 75 μg/m3 for three consecutive days, were selected based on hourly data from Beijing observation sites from July 2012 to December 2015. Using the China Meteorological Administration (CMA) MICAPS meteorological processing system, the synoptic situations during the PM2.5 pollution processes were classified into five weather types: low pressure and weak high pressure alternating control, weak high pressure, low pressure control, high rear, and uniform pressure field. We then chose representative pollution cases for each type, adopted the GRAPES-CUACE adjoint model to track the sensitive source areas of the five types, and analyzed the critical discharge periods of Beijing and neighboring provinces as well as their contributions to the PM2.5 peak concentration in Beijing. The results showed that the local source plays the dominant role in the 30 h before the objective time; over the 72 h before the objective time, the contributions of local sources for the five pollution types are 37.5%, 25.0%, 39.4%, 31.2%, and 42.4%, respectively. The Hebei source contributes steadily in the 57 h before the objective time, with a contribution proportion ranging from 37% to 64%, while the contribution periods and rates of the Tianjin and Shanxi sources are shorter and smaller. Based on the adjoint sensitivity analysis, we further discussed the effect of emission-reduction control measures for the different types, finding that local source reduction is most effective in the 20 h before the objective time: if the local source is reduced by 50% within 72 h before the objective time, the decline rates of PM2.5 for the five types are 11.6%, 9.4%, 13.8%, 9.9% and 15.2%, respectively. The reduction effect of the neighboring sources is better within 3-57 h before the objective time.
Vale, S S; Fuller, I C; Procter, J N; Basher, L R; Smith, I E
2016-02-01
Knowledge of sediment movement throughout a catchment environment is essential due to its influence on the character and form of our landscape, with implications for agricultural productivity and ecological health. Sediment fingerprinting is a well-used tool for evaluating sediment sources within a fluvial catchment but still faces areas of uncertainty for applications to large catchments that have a complex arrangement of sources. Sediment fingerprinting was applied to the Manawatu River Catchment to differentiate 8 geological and geomorphological sources. The source categories were Mudstone, Hill Subsurface, Hill Surface, Channel Bank, Mountain Range, Gravel Terrace, Loess and Limestone. Geochemical analysis was conducted using XRF and LA-ICP-MS. Geochemical concentrations were analysed using Discriminant Function Analysis and sediment un-mixing models. Two mixing models were used in conjunction with GRG non-linear and Evolutionary optimization methods for comparison. Discriminant Function Analysis required 16 variables to correctly classify 92.6% of sediment sources. Geological explanations were achieved for some of the variables selected, although there is a need for mineralogical information to confirm causes for the geochemical signatures. Consistent source estimates were achieved between models, with optimization techniques providing globally optimal solutions for sediment quantification. Sediment was attributed primarily to Mudstone, ≈38-46%; followed by the Mountain Range, ≈15-18%; Hill Surface, ≈12-16%; Hill Subsurface, ≈9-11%; Loess, ≈9-15%; Gravel Terrace, ≈0-4%; Channel Bank, ≈0-5%; and Limestone, ≈0%. Sediment source apportionment fits with the conceptual understanding of the catchment, which recognizes soft sedimentary mudstone as highly susceptible to erosion.
Inference about the processes responsible for sediment generation can be made where there is a clear relationship between process and geomorphology, but is problematic for processes that occur across multiple terrains. Copyright © 2015 Elsevier B.V. All rights reserved.
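A stripped-down version of the un-mixing step can be sketched as a brute-force search over source proportions on the simplex (non-negative, summing to one). The tracer signatures and three-source setup below are invented for illustration; the study itself used many tracers, eight sources, and GRG non-linear and Evolutionary optimizers.

```python
# Toy sediment un-mixing model: the mixture's tracer concentrations are a
# proportion-weighted blend of source signatures; proportions are found by
# exhaustive grid search over the 3-source simplex. All values are invented.

def unmix(sources, mixture, step=0.01):
    """sources: {name: [tracer concentrations]}; mixture: [concentrations].
    Returns best-fitting proportions (least squared error) for 3 sources."""
    names = list(sources)
    assert len(names) == 3
    best, best_err = None, float("inf")
    n = int(round(1 / step))
    for i in range(n + 1):
        for j in range(n + 1 - i):
            p = (i * step, j * step, 1 - (i + j) * step)   # sums to 1
            err = sum((sum(p[k] * sources[names[k]][t] for k in range(3))
                       - mixture[t]) ** 2 for t in range(len(mixture)))
            if err < best_err:
                best, best_err = p, err
    return dict(zip(names, best))

# Invented two-tracer signatures; the mixture is built from known proportions
# (0.5, 0.3, 0.2) so the search should recover them.
sigs = {"mudstone": [50.0, 5.0], "loess": [10.0, 20.0], "limestone": [2.0, 40.0]}
target = [0.5 * 50 + 0.3 * 10 + 0.2 * 2, 0.5 * 5 + 0.3 * 20 + 0.2 * 40]
props = unmix(sigs, target)
```

Grid search is only practical for a handful of sources; with eight sources and sum-to-one constraints, constrained optimizers of the kind the study used become necessary.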
NASA Astrophysics Data System (ADS)
Yetman, G.; Downs, R. R.
2011-12-01
Software deployment is needed to process and distribute scientific data throughout the data lifecycle. Developing software in-house can take software development teams away from other software development projects and can require efforts to maintain the software over time. Adopting and reusing software and system modules that have been previously developed by others can reduce in-house software development and maintenance costs and can contribute to the quality of the system being developed. A variety of models are available for reusing and deploying software and systems that have been developed by others. These deployment models include open source software, vendor-supported open source software, commercial software, and combinations of these approaches. Deployment in Earth science data processing and distribution has demonstrated the advantages and drawbacks of each model. Deploying open source software offers advantages for developing and maintaining scientific data processing systems and applications. By joining an open source community that is developing a particular system module or application, a scientific data processing team can contribute to aspects of the software development without having to commit to developing the software alone. Communities of interested developers can share the work while focusing on activities that utilize in-house expertise and address internal requirements. Maintenance is also shared by members of the community. Deploying vendor-supported open source software offers similar advantages to open source software. However, by procuring the services of a vendor, the in-house team can rely on the vendor to provide, install, and maintain the software over time. Vendor-supported open source software may be ideal for teams that recognize the value of an open source software component or application and would like to contribute to the effort, but do not have the time or expertise to contribute extensively.
Vendor-supported software may also have the additional benefits of guaranteed up-time, bug fixes, and vendor-added enhancements. Deploying commercial software can be advantageous for obtaining system or software components offered by a vendor that meet in-house requirements. The vendor can be contracted to provide installation, support and maintenance services as needed. Combining these options offers a menu of choices, enabling selection of system components or software modules that meet the evolving requirements encountered throughout the scientific data lifecycle.
NASA Astrophysics Data System (ADS)
Huang, Yi-Chih; Wang, Pao K.
2017-01-01
Numerical modeling is conducted to study the hydrometeor partitioning and microphysical source and sink processes during a quasi-steady state of thunderstorms over the Pacific Warm Pool by utilizing the microphysical model WISCDYMM to simulate selected storm cases. The results show that liquid-phase hydrometeors dominate thunderstorm evolution over the Pacific Warm Pool. The ratio of ice-phase to liquid-phase mass is about 41:59, indicating that ice-phase water is less significant over the Pacific Warm Pool than the more than 50% found in the subtropics and the 80% found in the US High Plains in a previous study. Sensitivity tests support the dominance of liquid-phase hydrometeors over the Pacific Warm Pool. The major rain sources are the key hail sinks: melting of hail and shedding from hail; whereas the crucial rain sinks are evaporation and accretion by hail. The major snow sources are the Bergeron-Findeisen process, transfer of cloud ice to snow and accretion of cloud water; whereas the foremost sink of snow is accretion by hail. The essential hail sources are accretions of rain, cloud water, and snow; whereas the critical hail sinks are melting of hail and shedding from hail. The contribution and ranking of sources and sinks of these precipitates are compared with the previous study. Each hydrometeor has its own characteristic microphysical processes of development and depletion over the Pacific Warm Pool. Microphysical budgets depend on atmospheric dynamical and thermodynamical conditions, which determine the partitioning of hydrometeors. This knowledge would benefit the microphysics parameterization in cloud models and cumulus parameterization in global circulation models.
ERIC Educational Resources Information Center
Murphy, Jeremy W.; Foxe, John J.; Molholm, Sophie
2016-01-01
The ability to attend to one among multiple sources of information is central to everyday functioning. Just as central is the ability to switch attention among competing inputs as the task at hand changes. Such processes develop surprisingly slowly, such that even into adolescence, we remain slower and more error prone at switching among tasks…
ERIC Educational Resources Information Center
Bissels, Gerhard
2008-01-01
Purpose: The purpose of this paper is to describe the selection process and criteria that led to the implementation of the Koha 3.0 library management system (LMS) at the Complementary and Alternative Medicine Library and Information Service (CAMLIS), Royal London Homoeopathic Hospital. Design/methodology/approach: The paper is a report based on…
2015-05-21
Source Assessment and Feedback; OER, Officer Evaluation Report; PME, Professional Military Education; TRADOC, Training and Doctrine Command...Toxic leadership is a combination of self-centered attitudes, motivations, and behaviors that have adverse effects on subordinates, the...process of influencing people by providing purpose, direction and motivation to accomplish the mission and improve the organization.”28 The ideal
Procurement specifications report. IMBLS phase B-4
NASA Technical Reports Server (NTRS)
1970-01-01
Procurement specifications to provide vendors of space systems with supporting information to accurately price the selected major buy items are illustrated. In performing this task, rigid constraints on specifications and drawing details are avoided beyond those necessary to define basic requirements. Described are digital processing equipment, mass spectrometer, body mass measuring device, sensors, bio-belt power source, vision tester and instrumentation for a biochemical station.
2013-10-01
Threats: Tools and Techniques; 2.1 The Man-in-The-Middle (MiTM) Proxy; 2.2 The Inspection Process; 3 Installing WebDLPIndexer; 3.1 Install JDK SE...selected open source and public-domain tools since they are freely available to the public. 2.1 The Man-in-The-Middle (MiTM) Proxy: This report builds
2012-12-14
Each pair of rollers is designed to capture the shafts mounted to both ends of the tool lid. Additionally, a safety pin can be put in place to...ITRB for the AH-64D. The scope of the program included structural design, materials selection, manufacturing producibility analysis, tooling design...responsible for tooling design and fabrication, fabrication process development and fabrication of spars and test samples; G3, who designed the RTM
ERIC Educational Resources Information Center
Allgaier, Joachim
2011-01-01
Media accounts of reality have the potential to influence public opinion and decision making processes. Therefore who has and who does not have access to the media and can make their voice heard is a crucial question with serious political consequences. In this article it is investigated whether the specialty of journalists influences their source…
NASA Astrophysics Data System (ADS)
Fukuda, Kenjiro; Takeda, Yasunori; Kobayashi, Yu; Shimizu, Masahiro; Sekine, Tomohito; Kumaki, Daisuke; Kurihara, Masato; Sakamoto, Masatomi; Tokito, Shizuo
2013-05-01
Fully solution-processed organic thin-film transistor (OTFT) devices have been fabricated with a simple patterning process at a relatively low process temperature of 100 °C. In the patterning process, a hydrophobic amorphous fluoropolymer material, which was used as the gate dielectric layer and the underlying base layer, was treated with an oxygen plasma to selectively change its surface wetting properties from hydrophobic to hydrophilic. Silver source and drain electrodes were successfully formed in the treated areas with highly uniform line widths and without residues between the electrodes. Nonuniformities in the thickness of the silver electrodes originating from the “coffee-ring” effect were suppressed by optimizing the blend of solvents used with the silver nanoparticles, such that the printed electrodes are appropriate for bottom-gate OTFT devices. A fully solution-processed OTFT device using a polymer semiconductor material (PB16TTT) exhibited good electrical performance with no hysteresis in its transfer characteristics and with good linearity in its output characteristics. A relatively high carrier mobility of 0.14 cm² V⁻¹ s⁻¹ and an on/off ratio of 1×10⁵ were obtained with the fabricated TFT device.
NASA Astrophysics Data System (ADS)
Brosda, Maximilian; Olowinsky, Alexander; Pelzer, Alexander
Flexible organic electronics such as OLPV and OLED modules are highly sensitive to water and oxygen. To protect them from the environment and to ensure a long lifetime, visually transparent ultra-high-barrier films are used for the encapsulation process. These multilayer films usually consist of a polymer substrate on which, depending on the requirements, various functional layers are applied. The organic device is then fully packed in these films. Instead of conventionally joining these films with adhesive, a flexible laser-based process can be an interesting alternative, especially for roll2roll applications. Based on a precise spectral analysis and a consideration of the interaction between the laser radiation and the individual layers of the film, a suitable laser beam source is selected. With this laser beam source the weldability of the films is investigated. For analysis of the weld seam and the melted volume, cross sections and scanning electron microscopy images are prepared. The strength of the weld is determined by T-peel tensile tests.
Evaluation of Chemical Coating Processes for AXAF
NASA Technical Reports Server (NTRS)
Engelhaupt, Darell; Ramsey, Brian; Mendrek, Mitchell
1998-01-01
The need existed at MSFC for the development and fabrication of radioisotope calibration sources of cadmium 109 and iron 55 isotopes. This was in urgent response to the AXAF program. Several issues persisted in creating manufacturing difficulties for the supplier. In order to meet the MSFC requirements, very stringent control needed to be maintained over the coating quality, specific activity and thickness. Due to the difficulties in providing the precisely controlled devices for testing, the delivery of the sources was seriously delayed. It became imperative that these fabrication issues be resolved to avoid further delays in this AXAF observatory key component. The objectives are: 1) Research and provide expert advice on coating materials and procedures. 2) Research and recommend solutions to problems that have been experienced with the coating process. 3) Provide recommendations on the selection and preparation of substrates. 4) Provide consultation on the actual coating process including the results of the qualification and acceptance test programs. 5) Perform independent tests at UAH or MSFC as necessary.
Seismic Window Selection and Misfit Measurements for Global Adjoint Tomography
NASA Astrophysics Data System (ADS)
Lei, W.; Bozdag, E.; Lefebvre, M.; Podhorszki, N.; Smith, J. A.; Tromp, J.
2013-12-01
Global Adjoint Tomography requires fast parallel processing of large datasets. After obtaining the preprocessed observed and synthetic seismograms, we use the open source software packages FLEXWIN (Maggi et al. 2007) to select time windows and MEASURE_ADJ to make measurements. These measurements define adjoint sources for data assimilation. Previous versions of these tools work on a pair of SAC files (observed and synthetic seismic data for the same component and station) and loop over all seismic records associated with one earthquake. Given the large number of stations and earthquakes, the frequent read and write operations create severe I/O bottlenecks on modern computing platforms. We present new versions of these tools utilizing a new seismic data format, namely the Adaptive Seismic Data Format (ASDF). This new format shows superior scalability for applications on high-performance computers and accommodates various types of data, including earthquake, industry and seismic interferometry datasets. ASDF also provides user-friendly APIs, which can be easily integrated into the adjoint tomography workflow and combined with other data processing tools. In addition to solving the I/O bottleneck, we are making several improvements to these tools. For example, FLEXWIN is tuned to select windows for different types of earthquakes. To capture their distinct features, we categorize earthquakes by their depths and frequency bands. Moreover, instead of only picking phases between the first P arrival and the surface-wave arrivals, our aim is to select and assimilate many other later prominent phases in adjoint tomography. For example, in the body-wave band (17 s - 60 s), we include SKS, sSKS and their multiples, while in the surface-wave band (60 s - 120 s) we incorporate major-arc surface waves.
Poszytek, Krzysztof; Pyzik, Adam; Sobczak, Adam; Lipinski, Leszek; Sklodowska, Aleksandra; Drewniak, Lukasz
2017-08-01
The main aim of this study was to evaluate the effect of the source of microorganisms on the selection of hydrolytic consortia dedicated to anaerobic digestion of maize silage. The selection process was investigated based on the analysis of changes in the hydrolytic activity and the diversity of microbial communities derived from (i) a hydrolyzer of a commercial agricultural biogas plant, (ii) cattle slurry and (iii) raw sewage sludge, during a series of 10 passages. Following the selection process, the adapted consortia were thoroughly analyzed for their ability to utilize maize silage and to augment anaerobic digestion communities. The results showed that each subsequent passage led to further adaptation of the consortium to degradation of maize silage, which was manifested by the increased hydrolytic activity of the adapted consortia. Biodiversity analysis (based on 16S rDNA amplicon sequencing) confirmed the changes in the microbial community of each consortium, and showed that after the last (10th) passage all microbial communities were dominated by representatives of Lactobacillaceae, Prevotellaceae and Veillonellaceae. The results of the functional analyses showed that the adapted consortia improved the efficiency of maize silage degradation, as indicated by the increase in the concentration of glucose and volatile fatty acids (VFAs), as well as the soluble chemical oxygen demand (sCOD). Moreover, bioaugmentation of anaerobic digestion communities by the adapted hydrolytic consortia increased biogas yield by 10-29%, depending on the origin of the community. The obtained results also indicate that substrate input (not community origin) was the driving force responsible for the changes in the community structure of hydrolytic consortia dedicated to anaerobic digestion. Copyright © 2017 Elsevier Ltd. All rights reserved.
Robotic vision. [process control applications
NASA Technical Reports Server (NTRS)
Williams, D. S.; Wilf, J. M.; Cunningham, R. T.; Eskenazi, R.
1979-01-01
Robotic vision, involving the use of a vision system to control a process, is discussed. Design and selection of active sensors, which employ radio waves, sound waves, or laser light to illuminate otherwise unobservable features in the scene, are considered, as are design and selection of passive sensors, which rely on external sources of illumination. The segmentation technique, by which an image is separated into collections of contiguous picture elements sharing such common characteristics as color, brightness, or texture, is examined, with emphasis on the edge detection technique. The IMFEX (image feature extractor) system, which performs edge detection and thresholding at 30 frames/sec television frame rates, is described. Template matching and discrimination approaches to object recognition are noted. Applications of robotic vision in industry, for tasks too monotonous or too dangerous for human workers, are mentioned.
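The edge-detection-plus-thresholding stage emphasized above can be sketched as a Sobel gradient followed by a threshold, a software stand-in for the kind of operation IMFEX performs at frame rate. The kernel choice and threshold value are conventional assumptions, not details from the source.

```python
# Sobel edge detection with thresholding: convolve 3x3 horizontal and
# vertical gradient kernels over the image and mark pixels whose gradient
# magnitude exceeds a threshold. Borders are left unmarked for simplicity.

def sobel_edges(img, threshold):
    """img: 2D list of intensities. Returns a same-sized binary edge map."""
    gx_k = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # horizontal gradient kernel
    gy_k = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # vertical gradient kernel
    h, w = len(img), len(img[0])
    edges = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = sum(gx_k[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            gy = sum(gy_k[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            if (gx * gx + gy * gy) ** 0.5 > threshold:
                edges[y][x] = 1
    return edges

# A vertical step edge between dark (0) and bright (100) halves:
img = [[0, 0, 100, 100]] * 4
edge_map = sobel_edges(img, threshold=50)
```

On the step image, only the interior pixels straddling the dark/bright boundary are flagged, which is the segmentation-by-edges behavior described above.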
The application of ERTS-1 data to the land use planning process. [Wisconsin]
NASA Technical Reports Server (NTRS)
Clapp, J. L.; Kiefer, R. W.; Kuhlmey, E. L.; Niemann, B. J., Jr.
1974-01-01
Land resource data have been extracted on a percent-of-cell basis from ERTS imagery, RB-57 color infrared imagery, and the best available conventional sources for a 10,000 square kilometer test area in eastern Wisconsin. First, the data from the three sources are compared on a spatial basis for a 300 square kilometer portion of the test area. For those land resource variables associated with cover, ERTS-derived resource data compared favorably with both the RB-57 and conventional data. Second, the effect of the data source on land use decisions is examined. Three interstate highway corridors are located through the same region based upon data extracted from each of the three sources. A policy of preserving natural environmental systems was used as a basis for corridor selection in each case. The resulting three corridors compare favorably.
New laser system for highly sensitive clinical pulse oximetry
NASA Astrophysics Data System (ADS)
Hamza, Mostafa; Hamza, Mohammad
1996-04-01
This paper describes the theory and design of a new pulse oximeter in which laser diodes and other compact laser sources are used for the measurement of oxygen saturation in patients who are at risk of developing hypoxemia. The technique depends upon illuminating special sites of the skin of the patient with radiation from modulated laser sources at selected wavelengths. The specific laser wavelengths are chosen based on the absorption characteristics of oxyhemoglobin, reduced hemoglobin and other interfering sources for obtaining more accurate measurements. The laser radiation transmitted through the tissue is detected and signal processing based on differential absorption laser spectroscopy is done in such a way to overcome the primary performance limitations of the conventionally used pulse oximetry. The new laser pulse oximeter can detect weak signals and is not affected by other light sources such as surgical lamps, phototherapy units, etc. The detailed description and operating characteristics of this system are presented.
Electron beam pumped semiconductor laser
NASA Technical Reports Server (NTRS)
Hug, William F. (Inventor); Reid, Ray D. (Inventor)
2009-01-01
Electron-beam-pumped semiconductor ultra-violet optical sources (ESUVOSs) are disclosed that use ballistic electron pumped wide bandgap semiconductor materials. The sources may produce incoherent radiation and take the form of electron-beam-pumped light emitting triodes (ELETs). The sources may produce coherent radiation and take the form of electron-beam-pumped laser triodes (ELTs). The ELTs may take the form of electron-beam-pumped vertical cavity surface emitting lasers (EVCSEL) or edge emitting electron-beam-pumped lasers (EEELs). The semiconductor medium may take the form of an aluminum gallium nitride alloy that has a mole fraction of aluminum selected to give a desired emission wavelength, diamond, or diamond-like carbon (DLC). The sources may be produced from discrete components that are assembled after their individual formation or they may be produced using batch MEMS-type or semiconductor-type processing techniques to build them up in a whole or partial monolithic manner, or combination thereof.
23 CFR 636.512 - What is the basis for the source selection decision?
Code of Federal Regulations, 2013 CFR
2013-04-01
... AND TRAFFIC OPERATIONS DESIGN-BUILD CONTRACTING Discussions, Proposal Revisions and Source Selection... decision on a comparative assessment of proposals against all selection criteria in the solicitation. While...
23 CFR 636.512 - What is the basis for the source selection decision?
Code of Federal Regulations, 2011 CFR
2011-04-01
... AND TRAFFIC OPERATIONS DESIGN-BUILD CONTRACTING Discussions, Proposal Revisions and Source Selection... decision on a comparative assessment of proposals against all selection criteria in the solicitation. While...
23 CFR 636.512 - What is the basis for the source selection decision?
Code of Federal Regulations, 2014 CFR
2014-04-01
... AND TRAFFIC OPERATIONS DESIGN-BUILD CONTRACTING Discussions, Proposal Revisions and Source Selection... decision on a comparative assessment of proposals against all selection criteria in the solicitation. While...
23 CFR 636.512 - What is the basis for the source selection decision?
Code of Federal Regulations, 2010 CFR
2010-04-01
... AND TRAFFIC OPERATIONS DESIGN-BUILD CONTRACTING Discussions, Proposal Revisions and Source Selection... decision on a comparative assessment of proposals against all selection criteria in the solicitation. While...
23 CFR 636.512 - What is the basis for the source selection decision?
Code of Federal Regulations, 2012 CFR
2012-04-01
... AND TRAFFIC OPERATIONS DESIGN-BUILD CONTRACTING Discussions, Proposal Revisions and Source Selection... decision on a comparative assessment of proposals against all selection criteria in the solicitation. While...
Sun, Zhi; Xiao, Y; Sietsma, J; Agterhuis, H; Yang, Y
2015-07-07
In recent years, recovery of metals from electronic waste within the European Union has become increasingly important due to the potential supply risk of strategic raw materials and environmental concerns. Electronic waste, especially a mixture of end-of-life electronic products from a variety of sources, is of inherently high complexity in composition, phase, and physicochemical properties. In this research, a closed-loop hydrometallurgical process was developed to recover valuable metals, i.e., copper and precious metals, from an industrially processed information and communication technology waste. A two-stage leaching design was adopted in order to selectively extract copper and enrich precious metals. It was found that the recovery efficiency and extraction selectivity of copper both reached more than 95% by using ammonia-based leaching solutions. A new electrodeposition process has been proven feasible with 90% current efficiency during copper recovery, and the copper purity can reach 99.8 wt %. The residue from the first-stage leaching was screened into coarse and fine fractions. The coarse fraction was returned for re-leaching to recover further copper. The fine fraction was treated in the second-stage leaching using sulfuric acid to further concentrate precious metals, which could achieve a 100% increase in their concentrations in the residue with negligible loss into the leaching solution. By a combination of different leaching steps and proper physical separation of light materials, this process can achieve closed-loop recycling of the waste with significant efficiency.
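The electrodeposition figures in the abstract (90% current efficiency) can be made concrete with Faraday's law. The sketch below is illustrative only: the 90% efficiency comes from the abstract, while the current and deposition time are hypothetical example values.

```python
# Faraday's-law sketch for the copper electrodeposition step described above.
# Only the 90% current efficiency is taken from the abstract; the current and
# time in the example are hypothetical illustration values.

F = 96485.0      # Faraday constant, C/mol
M_CU = 63.546    # molar mass of copper, g/mol
N_ELECTRONS = 2  # Cu2+ + 2e- -> Cu

def deposited_copper_grams(current_a, time_s, current_efficiency=0.90):
    """Mass of copper deposited for a given current, time, and efficiency."""
    charge = current_a * time_s * current_efficiency  # effective charge, C
    return charge * M_CU / (N_ELECTRONS * F)

# Example: 10 A for one hour at 90% current efficiency -> about 10.7 g Cu
mass = deposited_copper_grams(10.0, 3600.0)
```

At constant current the deposited mass scales linearly with time and efficiency, which is why current efficiency is the figure of merit reported for such processes.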
VHDL implementation of feature-extraction algorithm for the PANDA electromagnetic calorimeter
NASA Astrophysics Data System (ADS)
Guliyev, E.; Kavatsyuk, M.; Lemmens, P. J. J.; Tambave, G.; Löhner, H.; Panda Collaboration
2012-02-01
A simple, efficient, and robust feature-extraction algorithm, developed for the digital front-end electronics of the electromagnetic calorimeter of the PANDA spectrometer at FAIR, Darmstadt, is implemented in VHDL for a commercial 16 bit 100 MHz sampling ADC. The source code is available as an open-source project and is adaptable to other projects and sampling ADCs. Best performance with different types of signal sources can be achieved through flexible parameter selection. On-line data processing in the FPGA enables construction of an almost dead-time-free data acquisition system, which was successfully evaluated as a first step towards building a complete trigger-less readout chain. Prototype setups are studied to determine the dead time of the implemented algorithm, the rate of false triggering, timing performance, and event correlations.
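The kind of feature extraction such a front-end performs can be sketched in a few lines: estimate the baseline, detect a leading-edge threshold crossing, and extract the baseline-subtracted pulse amplitude. This is a minimal illustration, not the PANDA VHDL algorithm; the window sizes and threshold are hypothetical.

```python
# Illustrative sketch of sampling-ADC feature extraction: baseline estimation,
# leading-edge trigger, and pulse-amplitude extraction. The actual PANDA
# algorithm is implemented in VHDL; parameters here are hypothetical.

def extract_features(samples, baseline_window=8, threshold=50):
    """Return (trigger_index, amplitude) for the first pulse, or None."""
    baseline = sum(samples[:baseline_window]) / baseline_window
    for i, s in enumerate(samples[baseline_window:], start=baseline_window):
        if s - baseline > threshold:              # leading-edge trigger fires
            # amplitude = local maximum after the crossing, baseline-subtracted
            peak = max(samples[i:i + baseline_window])
            return i, peak - baseline
    return None

# Example: flat baseline of 100 ADC counts with a pulse peaking at 300 counts
trace = [100] * 8 + [120, 200, 300, 250, 150, 100]
```

A hardware version would replace the running sum with a shift-register accumulator so the whole chain keeps up with the 100 MHz sample clock.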
CARS molecular fingerprinting using a sub-nanosecond supercontinuum light source
NASA Astrophysics Data System (ADS)
Kano, Hideaki; Akiyama, Toshihiro; Inoko, Akihito; Kobayashi, Tsubasa; Leproux, Philippe; Couderc, Vincent; Kaji, Yuichi; Oshika, Tetsuro
2018-02-01
We have visualized living cells and tissues with an ultrabroadband multiplex coherent anti-Stokes Raman scattering (CARS) microspectroscopic system based on a sub-nanosecond supercontinuum (SC) light source. Owing to the ultrabroadband spectral profile of the SC, we can generate multiplex CARS signals in the spectral range of 500-3800 cm-1, which covers the whole molecular fingerprint region as well as the C-H and O-H stretching regions. By combining the ultrabroadband multiplex CARS method with second harmonic generation (SHG) and third harmonic generation (THG) processes, we have successfully performed selective imaging of the Rootletin filaments that compose ciliary rootlets in rat retina.
Development of Agave as a dedicated biomass source: production of biofuels from whole plants
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mielenz, Jonathan R.; Rodriguez, Jr, Miguel; Thompson, Olivia A
Background: Agave species can grow well in semi-arid marginal agricultural lands around the world. Selected Agave species are used largely for alcoholic beverage production in Mexico. There are expanding research efforts to use the plentiful residues (bagasse) for ethanol production, as the beverage manufacturing process uses only the juice from the central core of mature plants. Here we investigate the potential of over a dozen Agave species, including three from cold semi-arid regions of the United States, to produce biofuels using the whole plant. Results: Ethanol was readily produced by Saccharomyces cerevisiae from hydrolysate of ten whole Agaves with the use of a proper blend of biomass-degrading enzymes that overcomes the toxicity of most of the species tested. Unlike the yeast fermentations, Clostridium beijerinckii produced butanol plus acetone from nine species tested. Butyric acid, a precursor of butanol, was also present due to incomplete conversion during the screening process. Since Agave contains high levels of free and poly-fructose, which are readily destroyed by acidic pretreatment, a two-step process was developed to depolymerize poly-fructose while maintaining its fermentability. The hydrolysate from before and after dilute acid processing was used in C. beijerinckii acetone and butanol fermentations with selected Agave species. Conclusions: The results show Agave's potential to be a source of fermentable sugars beyond the existing beverage species, now including species previously unfermentable by yeast, as well as cold-tolerant lines. This development may stimulate development of Agave as a dedicated feedstock for biofuels in semi-arid regions throughout the globe.
Development of Agave as a dedicated biomass source: production of biofuels from whole plants
Mielenz, Jonathan R.; Rodriguez, Jr, Miguel; Thompson, Olivia A; ...
2015-01-01
Background: Agave species can grow well in semi-arid marginal agricultural lands around the world. Selected Agave species are used largely for alcoholic beverage production in Mexico. There are expanding research efforts to use the plentiful residues (bagasse) for ethanol production, as the beverage manufacturing process uses only the juice from the central core of mature plants. Here we investigate the potential of over a dozen Agave species, including three from cold semi-arid regions of the United States, to produce biofuels using the whole plant. Results: Ethanol was readily produced by Saccharomyces cerevisiae from hydrolysate of ten whole Agaves with the use of a proper blend of biomass-degrading enzymes that overcomes the toxicity of most of the species tested. Unlike the yeast fermentations, Clostridium beijerinckii produced butanol plus acetone from nine species tested. Butyric acid, a precursor of butanol, was also present due to incomplete conversion during the screening process. Since Agave contains high levels of free and poly-fructose, which are readily destroyed by acidic pretreatment, a two-step process was developed to depolymerize poly-fructose while maintaining its fermentability. The hydrolysate from before and after dilute acid processing was used in C. beijerinckii acetone and butanol fermentations with selected Agave species. Conclusions: The results show Agave's potential to be a source of fermentable sugars beyond the existing beverage species, now including species previously unfermentable by yeast, as well as cold-tolerant lines. This development may stimulate development of Agave as a dedicated feedstock for biofuels in semi-arid regions throughout the globe.
Software Selection: A Primer on Source and Evaluation.
ERIC Educational Resources Information Center
Burston, Jack
2003-01-01
Provides guidance on making decisions regarding the selection of foreign language instructional software. Identifies sources of foreign language software, indicates sources of foreign language software reviews, and outlines essential procedures of software evaluation. (Author/VWL)
Analysis of age as a factor in NASA astronaut selection and career landmarks.
Kovacs, Gregory T A; Shadden, Mark
2017-01-01
NASA's periodic selection of astronauts is a highly selective process accepting applications from the general population, wherein the mechanics of selection are not made public. This research was an effort to determine if biases (specifically age) exist in the process and, if so, at which points they might manifest. Two sets of analyses were conducted. The first utilized data requested via the Freedom of Information Act (FOIA) on NASA astronaut applicants for the 2009 and 2013 selection years. Using a series of multinomial and logistic regressions, the data were analyzed to uncover whether age of the applicants linearly or nonlinearly affected their likelihood of receiving an invitation, as well as their likelihood of being selected into the astronaut program. The second used public data on age at selection and age at other career milestones for every astronaut selected from 1959 to 2013 to analyze trends in age over time using ordinary least-squares (OLS) regression and Pearson's correlation. The results for the FOIA data revealed a nonlinear relationship between age and receiving an interview, as well as age and selection into the astronaut program, but the most striking observation was the loss of age diversity at each stage of selection. Applicants younger or older than approximately 40 years were significantly less likely to receive invitations for interviews and were significantly less likely to be selected as an astronaut. Analysis of the public-source data for all selections since the beginning of the astronaut program revealed significant age trends over time including a gradual increase in selectee age and decreased tenure at NASA after last flight, with average age at retirement steady over the entire history of the astronaut program at approximately 48 years.
Analysis of age as a factor in NASA astronaut selection and career landmarks
Shadden, Mark
2017-01-01
NASA’s periodic selection of astronauts is a highly selective process accepting applications from the general population, wherein the mechanics of selection are not made public. This research was an effort to determine if biases (specifically age) exist in the process and, if so, at which points they might manifest. Two sets of analyses were conducted. The first utilized data requested via the Freedom of Information Act (FOIA) on NASA astronaut applicants for the 2009 and 2013 selection years. Using a series of multinomial and logistic regressions, the data were analyzed to uncover whether age of the applicants linearly or nonlinearly affected their likelihood of receiving an invitation, as well as their likelihood of being selected into the astronaut program. The second used public data on age at selection and age at other career milestones for every astronaut selected from 1959 to 2013 to analyze trends in age over time using ordinary least-squares (OLS) regression and Pearson’s correlation. The results for the FOIA data revealed a nonlinear relationship between age and receiving an interview, as well as age and selection into the astronaut program, but the most striking observation was the loss of age diversity at each stage of selection. Applicants younger or older than approximately 40 years were significantly less likely to receive invitations for interviews and were significantly less likely to be selected as an astronaut. Analysis of the public-source data for all selections since the beginning of the astronaut program revealed significant age trends over time including a gradual increase in selectee age and decreased tenure at NASA after last flight, with average age at retirement steady over the entire history of the astronaut program at approximately 48 years. PMID:28749968
Advantages offered by high average power picosecond lasers
NASA Astrophysics Data System (ADS)
Moorhouse, C.
2011-03-01
As electronic devices shrink to reduce material costs, size, and weight, thinner materials are also utilized. Feature sizes are also decreasing, which is pushing manufacturers towards single-step laser direct-write processes as an attractive alternative to conventional multiple-step photolithography, eliminating process steps and the cost of chemicals. The fragile nature of these thin materials makes them difficult to machine either mechanically or with conventional nanosecond-pulsewidth diode-pumped solid state (DPSS) lasers. Picosecond laser pulses can cut materials with reduced damage regions and selectively remove thin films due to the reduced thermal effects of the shorter pulsewidth. Also, the high repetition rate allows high-speed processing for industrial applications. Selective removal of thin films for OLED patterning, silicon solar cells, and flat panel displays is discussed, as well as laser cutting of transparent materials with low melting points such as polyethylene terephthalate (PET). For many of these thin-film applications, where low pulse energy and high repetition rate are required, a novel technique that increases throughput by using multiple beams from a single laser source is outlined.
Public Administration: A Bibliography of Selected Reference Sources.
ERIC Educational Resources Information Center
Brustman, Mary Jane
This guide presents an annotated list of selected reference sources in public administration. All of the sources listed are found at the Graduate Library for Public Affairs and Policy (GLPP) located at the State University of New York, Albany. Detailed, exhaustive guides in literature, research, indexes, abstracts, statistical sources, government…
PRISM Software: Processing and Review Interface for Strong‐Motion Data
Jones, Jeanne M.; Kalkan, Erol; Stephens, Christopher D.; Ng, Peter
2017-01-01
A continually increasing number of high‐quality digital strong‐motion records from stations of the National Strong Motion Project (NSMP) of the U.S. Geological Survey, as well as data from regional seismic networks within the United States, calls for automated processing of strong‐motion records with human review limited to selected significant or flagged records. The NSMP has developed the Processing and Review Interface for Strong Motion data (PRISM) software to meet this need. In combination with the Advanced National Seismic System Quake Monitoring System (AQMS), PRISM automates the processing of strong‐motion records. When used without AQMS, PRISM provides batch‐processing capabilities. The PRISM software is platform independent (coded in Java), open source, and does not depend on any closed‐source or proprietary software. The software consists of two major components: a record processing engine composed of modules for each processing step, and a review tool, which is a graphical user interface for manual review, edit, and processing. To facilitate use by non‐NSMP earthquake engineers and scientists, PRISM (both its processing engine and review tool) is easy to install and run as a stand‐alone system on common operating systems such as Linux, OS X, and Windows. PRISM was designed to be flexible and extensible to accommodate implementation of new processing techniques. All the computing features have been thoroughly tested.
NASA Astrophysics Data System (ADS)
Nowak, W.; Koch, J.
2014-12-01
Predicting DNAPL fate and transport in heterogeneous aquifers is challenging and subject to an uncertainty that needs to be quantified. Models for this task need to be equipped with an accurate source zone description, i.e., the distribution of mass of all partitioning phases (DNAPL, water, and soil) in all possible states ((im)mobile, dissolved, and sorbed), mass-transfer algorithms, and the simulation of transport processes in the groundwater. Such detailed models tend to be computationally cumbersome when used for uncertainty quantification. Therefore, a selective choice of the relevant model states, processes, and scales is both sensitive and indispensable. We investigate the questions: what is a meaningful level of model complexity, and how can an efficient model framework be obtained that is still physically and statistically consistent? In our proposed model, aquifer parameters and the contaminant source architecture are conceptualized jointly as random space functions. The governing processes are simulated in a three-dimensional, highly resolved, stochastic, and coupled model that can predict probability density functions of mass discharge and source depletion times. We apply a stochastic percolation approach as an emulator to simulate the contaminant source formation, a random walk particle tracking method to simulate DNAPL dissolution and solute transport within the aqueous phase, and a quasi-steady-state approach to solve for DNAPL depletion times. Using this novel model framework, we test whether and to what degree the desired model predictions are sensitive to simplifications often found in the literature. With this we identify that aquifer heterogeneity, groundwater flow irregularity, uncertain and physically based contaminant source zones, and their mutual interlinkages are indispensable components of a sound model framework.
Quantifying chemical reactions by using mixing analysis.
Jurado, Anna; Vázquez-Suñé, Enric; Carrera, Jesús; Tubau, Isabel; Pujades, Estanislao
2015-01-01
This work is motivated by the need for a sound understanding of the chemical processes that affect organic pollutants in an urban aquifer. We propose an approach to quantify such processes using mixing calculations. The methodology consists of the following steps: (1) identification of the recharge sources (end-members) and selection of the species (conservative and non-conservative) to be used, (2) identification of the chemical processes, and (3) evaluation of mixing ratios including the chemical processes. This methodology has been applied in the Besòs River Delta (NE Barcelona, Spain), where the River Besòs is the main aquifer recharge source. A total of 51 groundwater samples were collected from July 2007 to May 2010 during four field campaigns. Three river end-members were necessary to explain the temporal variability of the River Besòs: one from wet periods (W1) and two from dry periods (D1 and D2). The methodology proved useful not only for computing the mixing ratios but also for quantifying processes such as calcite and magnesite dissolution, aerobic respiration, and denitrification occurring at each observation point. Copyright © 2014 Elsevier B.V. All rights reserved.
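The core of such a mixing calculation is a small linear system: with three end-members and two conservative species, the mixing ratios satisfy a mass-balance row plus one row per species. The sketch below illustrates the idea; all concentration values are hypothetical, not data from the Besòs study.

```python
# Sketch of an end-member mixing calculation: three recharge sources and two
# conservative species give a 3x3 linear system for the mixing ratios.
# All concentrations are hypothetical illustration values.

def solve3(A, b):
    """Solve a 3x3 linear system by Gaussian elimination with partial pivoting."""
    m = [row[:] + [bi] for row, bi in zip(A, b)]   # augmented matrix
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, 3):
            f = m[r][col] / m[col][col]
            for c in range(col, 4):
                m[r][c] -= f * m[col][c]
    x = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):                             # back-substitution
        x[r] = (m[r][3] - sum(m[r][c] * x[c] for c in range(r + 1, 3))) / m[r][r]
    return x

# Rows: mass balance, chloride (mg/L), sulfate (mg/L); columns: end-members
A = [[1.0, 1.0, 1.0],       # f1 + f2 + f3 = 1
     [100.0, 300.0, 50.0],  # chloride in each end-member
     [40.0, 80.0, 20.0]]    # sulfate in each end-member
sample = [1.0, 180.0, 52.0] # observed sample: [1, chloride, sulfate]
f = solve3(A, sample)       # mixing ratios, here [0.1, 0.5, 0.4]
```

In practice more species than strictly necessary are used and the ratios are obtained by constrained least squares, which makes the estimate robust to analytical error in any single species.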
Software for MR image overlay guided needle insertions: the clinical translation process
NASA Astrophysics Data System (ADS)
Ungi, Tamas; U-Thainual, Paweena; Fritz, Jan; Iordachita, Iulian I.; Flammang, Aaron J.; Carrino, John A.; Fichtinger, Gabor
2013-03-01
PURPOSE: Needle guidance software using augmented reality image overlay was translated from the experimental phase to support preclinical and clinical studies. Major functional and structural changes were needed to meet clinical requirements. We present the process applied to fulfill these requirements, and selected features that may be applied in the translational phase of other image-guided surgical navigation systems. METHODS: We used an agile software development process for rapid adaptation to unforeseen clinical requests. The process is based on iterations of operating room test sessions, feedback discussions, and software development sprints. The open-source application framework of 3D Slicer and the NA-MIC kit provided sufficient flexibility and stable software foundations for this work. RESULTS: All requirements were addressed in a process with 19 operating room test iterations. Most features developed in this phase were related to workflow simplification and operator feedback. CONCLUSION: Efficient and affordable modifications were facilitated by an open-source application framework and frequent clinical feedback sessions. Results of cadaver experiments show that software requirements were successfully met after a limited number of operating room tests.
Individual Alpha Peak Frequency Predicts 10 Hz Flicker Effects on Selective Attention.
Gulbinaite, Rasa; van Viegen, Tara; Wieling, Martijn; Cohen, Michael X; VanRullen, Rufin
2017-10-18
Rhythmic visual stimulation ("flicker") is primarily used to "tag" processing of low-level visual and high-level cognitive phenomena. However, preliminary evidence suggests that flicker may also entrain endogenous brain oscillations, thereby modulating cognitive processes supported by those brain rhythms. Here we tested the interaction between 10 Hz flicker and endogenous alpha-band (∼10 Hz) oscillations during a selective visuospatial attention task. We recorded EEG from human participants (both genders) while they performed a modified Eriksen flanker task in which distractors and targets flickered within (10 Hz) or outside (7.5 or 15 Hz) the alpha band. By using a combination of EEG source separation, time-frequency, and single-trial linear mixed-effects modeling, we demonstrate that 10 Hz flicker interfered with stimulus processing more on incongruent than congruent trials (high vs low selective attention demands). Crucially, the effect of 10 Hz flicker on task performance was predicted by the distance between 10 Hz and individual alpha peak frequency (estimated during the task). Finally, the flicker effect on task performance was more strongly predicted by EEG flicker responses during stimulus processing than during preparation for the upcoming stimulus, suggesting that 10 Hz flicker interfered more with reactive than proactive selective attention. These findings are consistent with our hypothesis that visual flicker entrained endogenous alpha-band networks, which in turn impaired task performance. Our findings also provide novel evidence for frequency-dependent exogenous modulation of cognition that is determined by the correspondence between the exogenous flicker frequency and the endogenous brain rhythms. SIGNIFICANCE STATEMENT Here we provide novel evidence that the interaction between exogenous rhythmic visual stimulation and endogenous brain rhythms can have frequency-specific behavioral effects. 
We show that alpha-band (10 Hz) flicker impairs stimulus processing in a selective attention task when the stimulus flicker rate matches individual alpha peak frequency. The effect of sensory flicker on task performance was stronger when selective attention demands were high, and was stronger during stimulus processing and response selection compared with the prestimulus anticipatory period. These findings provide novel evidence that frequency-specific sensory flicker affects online attentional processing, and also demonstrate that the correspondence between exogenous and endogenous rhythms is an overlooked prerequisite when testing for frequency-specific cognitive effects of flicker. Copyright © 2017 the authors 0270-6474/17/3710173-12$15.00/0.
Hydrogen and Oxygen Gas Monitoring System Design and Operation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee C. Cadwallader; Kevin G. DeWall; J. Stephen Herring
2007-06-01
This paper describes pertinent design practices for selecting types of monitors, monitor unit placement, setpoint selection, and maintenance considerations for gas monitors. While hydrogen gas monitors and enriched-oxygen atmosphere monitors, as they would be needed for hydrogen production experiments, are the primary focus of this paper, monitors for carbon monoxide and carbon dioxide are also discussed. The experiences of designing, installing, and calibrating gas monitors for a laboratory performing experiments in support of the DOE Nuclear Hydrogen Initiative (NHI) are described, along with codes, standards, and regulations for these monitors. Information from the literature about best operating practices is also presented. The NHI program has two types of activities. The first, near-term activity is laboratory and pilot-plant experimentation with different processes at the kilogram-per-day scale to select the most promising types of processes for future applications of hydrogen production. Prudent design calls for indoor gas monitors to sense any hydrogen leaks within these laboratory rooms. The second, longer-term activity is the prototype, or large-scale, plants to produce tons of hydrogen per day. These large, outdoor production plants will require area (or “fencepost”) monitoring of hydrogen gas leaks. Some processes will produce oxygen along with hydrogen, and any oxygen releases are also safety concerns since oxygen gas is a strong oxidizer. Monitoring of these gases is important for personnel safety in both indoor and outdoor experiments. There is some guidance available about proper placement of monitors. The fixed-point, stationary monitor can only function if the intruding gas contacts the monitor. Therefore, monitor placement is vital to proper monitoring of the room or area.
Factors in sensor location selection include: indoor or outdoor site; the location and nature of potential vapor/gas sources; chemical and physical data of the gases or vapors (volatile liquids need sensors near the potential sources of release); the nature and concentration of gas releases; natural and mechanical ventilation; detector installation locations not vulnerable to mechanical or water damage from normal operations; and locations that lend themselves to convenient maintenance and calibration. The guidance also states that sensors should be located in all areas where hazardous accumulations of gas may occur. Such areas might not be close to release points but might be areas with restricted air movement. Heavier-than-air gases are likely to accumulate in pits, trenches, drains, and other low areas. Lighter-than-air gases are more likely to accumulate in overhead spaces, above drop ceilings, etc. In general, sensors should be located close to any potential sources of major gas release. The paper gives data on monitor sensitivity and expected lifetimes to support the monitor selection process. Proper selection of indoor and outdoor locations for monitors is described, accounting for the vapor densities of hydrogen and oxygen. The latest information on monitor alarm setpoint selection is presented. Typically, monitors require recalibration at least every six months, or more frequently in inhospitable locations, so ready access to the monitors is an important issue to consider in monitor siting. Gas monitors, depending on their type, can be susceptible to blockages of the detector element (i.e., dust).
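The vapor-density placement rule above (light gases collect overhead, heavy gases in low spots) can be expressed as a small lookup. The relative densities are standard values; the three mounting categories and their thresholds are an illustrative simplification of the cited guidance, not prescriptive limits.

```python
# Sketch of the placement rule described above: mount sensors high for
# lighter-than-air gases, low for heavier-than-air gases. The mounting
# categories and thresholds are illustrative simplifications of the guidance.

RELATIVE_DENSITY = {        # gas density relative to air (air = 1.0)
    "hydrogen": 0.07,
    "oxygen": 1.10,
    "carbon monoxide": 0.97,
    "carbon dioxide": 1.53,
}

def suggested_mounting(gas):
    """Suggest a sensor mounting zone from the gas's density relative to air."""
    d = RELATIVE_DENSITY[gas]
    if d < 0.9:
        return "high: overhead spaces / above drop ceilings, where light gas accumulates"
    if d > 1.2:
        return "low: pits, trenches, drains, and other low areas"
    return "mid: near the release point; density too close to air to stratify reliably"
```

Note that density is only a starting point: ventilation patterns and restricted-air-movement pockets, as the guidance stresses, can override the simple high/low rule.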
A global earthquake discrimination scheme to optimize ground-motion prediction equation selection
Garcia, Daniel; Wald, David J.; Hearne, Michael
2012-01-01
We present a new automatic earthquake discrimination procedure to determine in near-real time the tectonic regime and seismotectonic domain of an earthquake, its most likely source type, and the corresponding ground-motion prediction equation (GMPE) class to be used in the U.S. Geological Survey (USGS) Global ShakeMap system. This method makes use of the Flinn–Engdahl regionalization scheme, seismotectonic information (plate boundaries, global geology, seismicity catalogs, and regional and local studies), and the source parameters available from the USGS National Earthquake Information Center in the minutes following an earthquake to give the best estimation of the setting and mechanism of the event. Depending on the tectonic setting, additional criteria based on hypocentral depth, style of faulting, and regional seismicity may be applied. For subduction zones, these criteria include the use of focal mechanism information and detailed interface models to discriminate among outer-rise, upper-plate, interface, and intraslab seismicity. The scheme is validated against a large database of recent historical earthquakes. Though developed to assess GMPE selection in Global ShakeMap operations, we anticipate a variety of uses for this strategy, from real-time processing systems to any analysis involving tectonic classification of sources from seismic catalogs.
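The discrimination logic described in the abstract can be sketched as a rule-based classifier mapping tectonic setting, depth, and interface proximity to a GMPE class. This is a minimal illustration of the approach, not the USGS implementation; the depth threshold and category names are assumptions, though the subduction sub-classes mirror those named above.

```python
# Minimal sketch of rule-based GMPE-class discrimination: tectonic setting,
# hypocentral depth, and interface proximity select a coarse GMPE class.
# Thresholds are hypothetical placeholders, not the USGS operational values.

def gmpe_class(setting, depth_km, near_interface=False):
    """Return a coarse GMPE class for an event."""
    if setting == "stable_continental":
        return "stable-continental GMPE"
    if setting == "active_crustal":
        return "active-crustal GMPE"
    if setting == "subduction":
        if depth_km >= 50:                 # deep events: within the slab
            return "subduction intraslab GMPE"
        if near_interface:                 # shallow, on the plate interface
            return "subduction interface GMPE"
        return "upper-plate / outer-rise GMPE (crustal-style)"
    raise ValueError(f"unknown tectonic setting: {setting}")
```

The operational scheme layers regionalization (Flinn-Engdahl zones, interface models, focal mechanisms) on top of rules of this shape, which is what lets it run automatically in the minutes after an event.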
Parametric investigations of plasma characteristics in a remote inductively coupled plasma system
NASA Astrophysics Data System (ADS)
Shukla, Prasoon; Roy, Abhra; Jain, Kunal; Bhoj, Ananth
2016-09-01
Designing a remote plasma system involves source chamber sizing, selection of coils and/or electrodes to power the plasma, design of the downstream tubes, selection of materials used in the source and downstream regions, locations of inlets and outlets, and finally optimization of the process parameter space of pressure, gas flow rates, and power delivery. Simulations can aid in spatial and temporal plasma characterization in what are often inaccessible locations for experimental probes in the source chamber. In this paper, we report on simulations of a remote inductively coupled argon plasma system using the modeling platform CFD-ACE+. The coupled multiphysics model successfully addresses flow, chemistry, electromagnetics, heat transfer, and plasma transport in the remote plasma system. The SimManager tool enables easy setup of parametric simulations to investigate the effect of varying the pressure, power, frequency, flow rates, and downstream tube lengths. It can also enable the automatic solution of the varied parameters to optimize a user-defined objective function, which may be the integral ion and radical fluxes at the wafer. The fast run time coupled with the parametric and optimization capabilities can add significant insight and value in design and optimization.
Enhance the Value of a Research Paper: Choosing the Right References and Writing them Accurately.
Bavdekar, Sandeep B
2016-03-01
References help readers identify and locate sources used for justifying the need for conducting the research study, verify methods employed in the study, and discuss the interpretation of results and implications of the study. It is essential that references be accurate and complete. This article provides suggestions regarding choosing references and writing the reference list. References are a list of sources selected by authors to represent the best documents concerning the research study.1 They constitute the foundation of any research paper. Although generally written towards the end of the article-writing process, they are nevertheless extremely important. They provide the context for the hypothesis and help justify the need for conducting the research study. Authors use references to inform readers about the techniques used for conducting the study and convince them of the appropriateness of the methodology used. References help provide the appropriate perspective in which the research findings should be seen and interpreted. This communication will discuss the purpose of citations, how to select quality sources for citing, and the importance of accuracy when writing the reference list. © Journal of the Association of Physicians of India 2011.
Production of gluconic acid using Micrococcus sp.: optimisation of carbon and nitrogen sources.
Joshi, V D; Sreekantiah, K R; Manjrekar, S P
1996-01-01
A process for the production of gluconic acid from glucose by a Micrococcus sp. is described. More than 400 bacterial cultures isolated from local soil were tested for gluconic acid production. Three isolates were selected on the basis of their ability to produce gluconic acid and high titratable acidity. These were identified as Micrococcus sp. and were named M 27, M 54 and M 81. Nutritional and other parameters for maximum production of gluconic acid by the selected isolates were optimised. Micrococcus sp. isolate M 27 gave the highest yield, 8.19 g of gluconic acid from 9 g of glucose utilised, a 91% conversion efficiency.
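As a quick check of the reported figure, the 91% conversion corresponds to a simple mass ratio of product to substrate consumed (not a molar yield):

```python
# Mass-based conversion efficiency reported above: 8.19 g gluconic acid
# produced per 9 g of glucose utilised.
acid_g, glucose_g = 8.19, 9.0
efficiency_pct = 100.0 * acid_g / glucose_g
print(round(efficiency_pct))  # 91
```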
NASA Astrophysics Data System (ADS)
Attygalle, Athula B.; Xia, Hanxue; Pavlov, Julius
2017-08-01
The gas-phase-ion generation technique and the specific ion-source settings of a mass spectrometer heavily influence the protonation processes of molecules and the abundance ratio of the generated protomers; hitherto, these had been attributed primarily to the nature of the solvent and the pH. By utilizing electrospray ionization and ion-mobility mass spectrometry (IM-MS), we demonstrate, even in the seemingly trivial case of protonated aniline, that the protomer ratio strongly depends on the source conditions. Under low in-source ion activation, nearly 100% of the N-protomer of aniline is produced, and it can be subsequently converted to the C-protomer by collisional activation effected by increasing the electrical potential difference between the entrance and exit orifices of the first vacuum region. This activation and transformation process takes place even before the ion is mass-selected and subjected to IM separation. Despite the apparent simplicity of the problem, the preferred protonation site of aniline in the gas phase—the amino group or the aromatic ring—has been a topic of controversy. Our results not only provide unambiguous evidence that ring- and nitrogen-protonated aniline can coexist and be interconverted in the gas phase, but also that the ratio of the protomers depends on the internal energy of the original ion. There are many dynamic ion-transformation and fragmentation processes that take place in the different physical compartments of a Synapt G2 HDMS instrument. Such processes can dramatically change the very identity even of small ions, and therefore should be taken into account when interpreting product-ion mass spectra.
Feature-selective attention enhances color signals in early visual areas of the human brain.
Müller, M M; Andersen, S; Trujillo, N J; Valdés-Sosa, P; Malinowski, P; Hillyard, S A
2006-09-19
We used an electrophysiological measure of selective stimulus processing (the steady-state visual evoked potential, SSVEP) to investigate feature-specific attention to color cues. Subjects viewed a display consisting of spatially intermingled red and blue dots that continually shifted their positions at random. The red and blue dots flickered at different frequencies and thereby elicited distinguishable SSVEP signals in the visual cortex. Paying attention selectively to either the red or blue dot population produced an enhanced amplitude of its frequency-tagged SSVEP, which was localized by source modeling to early levels of the visual cortex. A control experiment showed that this selection was based on color rather than flicker frequency cues. This signal amplification of attended color items provides an empirical basis for the rapid identification of feature conjunctions during visual search, as proposed by "guided search" models.
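The frequency-tagging readout described above can be sketched numerically: each stimulus population flickers at its own rate, and attention to one population appears as a larger spectral amplitude at its tag frequency. The sample rate, tag frequencies, amplitudes, and noise level below are illustrative assumptions, not the study's actual parameters:

```python
import numpy as np

# Synthetic "EEG": two frequency-tagged SSVEP components plus noise.
# Attending the "red" population is modeled as a larger amplitude at f_red.
fs, dur = 250.0, 4.0                 # sample rate (Hz), duration (s) -- assumed
t = np.arange(0, dur, 1 / fs)
f_red, f_blue = 10.0, 12.0           # tag frequencies (Hz) -- assumed
rng = np.random.default_rng(0)
eeg = (2.0 * np.sin(2 * np.pi * f_red * t)
       + 0.8 * np.sin(2 * np.pi * f_blue * t)
       + 0.3 * rng.standard_normal(t.size))

# Read out the amplitude at each tag frequency from the spectrum.
spec = np.abs(np.fft.rfft(eeg)) / t.size
freqs = np.fft.rfftfreq(t.size, 1 / fs)
amp_red = spec[np.argmin(np.abs(freqs - f_red))]
amp_blue = spec[np.argmin(np.abs(freqs - f_blue))]
print(amp_red > amp_blue)  # True: the attended tag frequency dominates
```

Because the 4 s window makes both tag frequencies fall exactly on FFT bins, no windowing is needed in this sketch; real SSVEP analyses average over many trials.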
Sex ratio dynamics and fluctuating selection on personality.
Del Giudice, Marco
2012-03-21
Fluctuating selection has often been proposed as an explanation for the maintenance of genetic variation in personality. Here I argue that the temporal dynamics of the sex ratio can be a powerful source of fluctuating selection on personality traits, and develop this hypothesis with respect to humans. First, I review evidence that sex ratios modulate a wide range of social processes related to mating and parenting. Since most personality traits affect mating and parenting behavior, changes in the sex ratio can be expected to result in variable selection on personality. I then show that the temporal dynamics of the sex ratio are intrinsically characterized by fluctuations at various timescales. Finally, I address a number of evolutionary genetic challenges to the hypothesis. I conclude that the sex ratio hypothesis is a plausible explanation of genetic variation in human personality, and may be fruitfully applied to other species as well. Copyright © 2011 Elsevier Ltd. All rights reserved.
The New Method of Tsunami Source Reconstruction With r-Solution Inversion Method
NASA Astrophysics Data System (ADS)
Voronina, T. A.; Romanenko, A. A.
2016-12-01
Application of the r-solution method to reconstructing the initial tsunami waveform is discussed. This methodology is based on the inversion of remote measurements of water-level data. The wave propagation is considered within the scope of linear shallow-water theory. The ill-posed inverse problem in question is regularized by means of a least-squares inversion using the truncated Singular Value Decomposition method. As a result of the numerical process, an r-solution is obtained. The proposed method allows one to control the instability of the numerical solution and to obtain an acceptable result in spite of the ill-posedness of the problem. Applying this methodology to reconstruct the initial waveform of the 2013 Solomon Islands tsunami validates the theoretical conclusions for synthetic data and a model tsunami source: the inversion result strongly depends on data noisiness and on the azimuthal and temporal coverage of recording stations with respect to the source area. Furthermore, it is possible to make a preliminary selection of the most informative set of the available recording stations used in the inversion process.
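A minimal numerical sketch of the truncated-SVD ("r-solution") regularization: keep only the r dominant singular components when inverting an ill-conditioned operator, so that noise in the data is not amplified by the tiny singular values. The toy operator, spectrum, and noise level below are assumptions standing in for the shallow-water propagation matrix:

```python
import numpy as np

# Build an ill-conditioned forward operator A with a rapidly decaying
# singular spectrum (illustrative stand-in for tsunami propagation).
rng = np.random.default_rng(1)
m, n, r = 40, 20, 5
U0, _ = np.linalg.qr(rng.standard_normal((m, m)))
V0, _ = np.linalg.qr(rng.standard_normal((n, n)))
s = np.logspace(0, -10, n)                       # condition number ~1e10
A = U0[:, :n] @ np.diag(s) @ V0.T

x_true = rng.standard_normal(n)                  # "initial waveform"
d = A @ x_true + 1e-6 * rng.standard_normal(m)   # noisy "water-level" data

# r-solution: invert only the r largest singular components.
U, sv, Vt = np.linalg.svd(A, full_matrices=False)
x_r = Vt[:r].T @ ((U[:, :r].T @ d) / sv[:r])

# Naive least squares keeps all components and amplifies the noise.
x_naive = np.linalg.lstsq(A, d, rcond=None)[0]
print(np.linalg.norm(x_r) < np.linalg.norm(x_naive))  # truncation controls instability
```

The choice of r trades resolution against stability, which is exactly why the station coverage and noise level matter for the quality of the reconstruction.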
The MYStIX Infrared-Excess Source Catalog
NASA Astrophysics Data System (ADS)
Povich, Matthew S.; Kuhn, Michael A.; Getman, Konstantin V.; Busk, Heather A.; Feigelson, Eric D.; Broos, Patrick S.; Townsley, Leisa K.; King, Robert R.; Naylor, Tim
2013-12-01
The Massive Young Star-Forming Complex Study in Infrared and X-rays (MYStIX) project provides a comparative study of 20 Galactic massive star-forming complexes (d = 0.4-3.6 kpc). Probable stellar members in each target complex are identified using X-ray and/or infrared data via two pathways: (1) X-ray detections of young/massive stars with coronal activity/strong winds or (2) infrared excess (IRE) selection of young stellar objects (YSOs) with circumstellar disks and/or protostellar envelopes. We present the methodology for the second pathway using Spitzer/IRAC, 2MASS, and UKIRT imaging and photometry. Although IRE selection of YSOs is well-trodden territory, MYStIX presents unique challenges. The target complexes range from relatively nearby clouds in uncrowded fields located toward the outer Galaxy (e.g., NGC 2264, the Flame Nebula) to more distant, massive complexes situated along complicated, inner Galaxy sightlines (e.g., NGC 6357, M17). We combine IR spectral energy distribution (SED) fitting with IR color cuts and spatial clustering analysis to identify IRE sources and isolate probable YSO members in each MYStIX target field from the myriad types of contaminating sources that can resemble YSOs: extragalactic sources, evolved stars, nebular knots, and even unassociated foreground/background YSOs. Applying our methodology consistently across 18 of the target complexes, we produce the MYStIX IRE Source (MIRES) Catalog comprising 20,719 sources, including 8686 probable stellar members of the MYStIX target complexes. We also classify the SEDs of 9365 IR counterparts to MYStIX X-ray sources to assist the first pathway, the identification of X-ray-detected stellar members. The MIRES Catalog provides a foundation for follow-up studies of diverse phenomena related to massive star cluster formation, including protostellar outflows, circumstellar disks, and sequential star formation triggered by massive star feedback processes.
Measuring competitive fitness in dynamic environments.
Razinkov, Ivan A; Baumgartner, Bridget L; Bennett, Matthew R; Tsimring, Lev S; Hasty, Jeff
2013-10-24
Most yeast genes are dispensable for optimal growth in laboratory cultures. However, this apparent lack of fitness contribution is difficult to reconcile with the theory of natural selection. Here we use stochastic modeling to show that environmental fluctuations can select for a genetic mechanism that does not affect growth in static laboratory environments. We then present a novel experimental platform for measuring the fitness levels of specific genotypes in fluctuating environments. We test this platform by monitoring a mixed culture of two yeast strains that differ in their ability to respond to changes in carbon source yet exhibit the same fitness level in static conditions. When the sugar in the growth medium was switched between galactose and glucose, the wild-type strain gained a growth advantage over the mutant strain. Interestingly, both our computational and experimental results show that the strength of the adaptive advantage conveyed by the wild-type genotype depends on the total number of carbon source switches, not on the frequency of these fluctuations. Our results illustrate the selective power of environmental fluctuations on seemingly slight phenotypic differences in cellular response dynamics and underscore the importance of dynamic processes in the evolution of species.
Single-Frequency GPS Relative Navigation in a High Ionosphere Orbital Environment
NASA Technical Reports Server (NTRS)
Conrad, Patrick R.; Naasz, Bo J.
2007-01-01
The Global Positioning System (GPS) provides a convenient source for space vehicle relative navigation measurements, especially for low Earth orbit formation flying and autonomous rendezvous mission concepts. For single-frequency GPS receivers, ionospheric path delay can be a significant error source if not properly mitigated. In particular, ionospheric effects are known to cause significant radial position error bias and add dramatically to relative state estimation error if the onboard navigation software does not force the use of measurements from common or shared GPS space vehicles. Results from GPS navigation simulations are presented for a pair of space vehicles flying in formation and using GPS pseudorange measurements to perform absolute and relative orbit determination. With careful measurement selection techniques, relative state estimation accuracy of better than 20 cm with standard GPS pseudorange processing, and better than 10 cm with single-differenced pseudorange processing, is shown.
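The benefit of forcing measurements from common GPS satellites can be illustrated with a single-difference sketch: subtracting the two receivers' pseudoranges to the same satellite cancels the satellite clock error and, for a short baseline, most of the shared ionospheric delay. All numbers below are illustrative assumptions:

```python
# Errors common to both receivers' measurements of one shared satellite.
c_dt_sat = 7.5      # satellite clock error (m) -- assumed
iono = 4.2          # ionospheric delay, nearly common over a short baseline (m)

# True geometric ranges from the satellite to each vehicle (m) -- assumed.
range_a, range_b = 20_183_421.0, 20_183_409.0

pr_a = range_a + c_dt_sat + iono   # receiver A pseudorange
pr_b = range_b + c_dt_sat + iono   # receiver B pseudorange

single_diff = pr_a - pr_b          # common-mode errors cancel
print(round(single_diff, 3))       # ~12.0 m relative range difference
```

This is why differencing only works when both receivers track the same satellite: an unshared satellite contributes its clock and ionospheric errors at full strength.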
An Integrated Gate Driver in 4H-SiC for Power Converter Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ericson, Milton Nance; Frank, Steven Shane; Britton, Charles
2014-01-01
A gate driver fabricated in a 2-um 4H silicon carbide (SiC) process is presented. This process was optimized for vertical power MOSFET fabrication but accommodated integration of a few low-voltage device types including N-channel MOSFETs, resistors, and capacitors. The gate driver topology employed incorporates an input level translator, variable power connections, and separate power supply connectivity allowing selection of the output signal drive amplitude. The output stage utilizes a source follower pull-up device that is both overdriven and body source connected to improve rise time behavior. Full characterization of this design driving a SiC power MOSFET is presented, including rise and fall times, propagation delays, and power consumption. All parameters were measured to elevated temperatures exceeding 300 C. Details of the custom test system hardware and software utilized for gate driver testing are also provided.
The investigation of fast neutron Threshold Activation Detectors (TAD)
NASA Astrophysics Data System (ADS)
Gozani, T.; King, M. J.; Stevenson, J.
2012-02-01
The detection of fast neutrons is usually done by liquid hydrogenous organic scintillators, where the separation between the ever-present gamma rays and neutrons is achieved by pulse shape discrimination (PSD). In many practical situations the detection of fast neutrons has to be carried out while the intense source (be it neutrons, gamma rays or x-rays) that creates these neutrons, for example by the fission process, is present. This source, or ``flash'', usually blinds the neutron detectors and temporarily incapacitates them. By the time the detectors recover, the prompt neutron signature no longer exists. Thus, to overcome the blinding background, one needs to search for processes whereby the desired signature, such as fission neutrons, could in some way be measured long after the fission occurred, when the neutron detector has fully recovered from the overload. A new approach was proposed and demonstrated good sensitivity for the detection of fast neutrons in adverse overload situations where normally it could not be done. A temporal separation of the fission event from the prompt neutron detection is achieved via the activation process. The main idea, called Threshold Activation Detection (or detector), TAD, is to find appropriate substances that can be selectively activated by the fission neutrons and not by the source radiation, and then to measure the radioactively decaying activation products (typically beta and γ-rays) well after the source pulse has ended. The activation material should possess certain properties: a suitable half-life; an energy threshold below which the numerous source neutrons will not activate it (e.g., about 3 MeV); easily detectable activation products; and a usable cross section for the selected reaction. Ideally the substance would be part of the scintillator. There are several good candidates for TAD. The first one we have selected is based on fluorine.
One of the major advantages of this element is the fact that it is a major constituent of available scintillators (e.g., BaF2, CaF2, hydrogen-free liquid fluorocarbon). Thus the activation products of the fast prompt neutrons, in particular the beta particles, can be measured with a very high efficiency in the detector. Other detectors and substances were investigated, such as 6Li and even common detectors such as NaI. The principles and experimental results obtained with F, NaI and 6Li based TADs are shown. The various contributing activation products are identified. The insensitivity of the fluorine-based TAD to (d,D) neutrons is demonstrated. Ways and means to reduce or subtract the various neutron-induced activations of the NaI detector are elucidated, along with its fast neutron detection capabilities. 6Li could also be a useful TAD.
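The temporal-separation idea can be sketched with simple decay arithmetic: activation products are counted only after the blinding source pulse and the detector recovery time have passed. The half-life, wait time, and counting window below are illustrative assumptions (fluorine activation products have half-lives of roughly seconds to tens of seconds, but exact values are not taken from the paper):

```python
import math

# Decay of activation products after the source "flash" ends.
half_life_s = 26.9        # assumed product half-life (s)
lam = math.log(2) / half_life_s
n0 = 1.0e5                # activated nuclei at the end of the pulse -- assumed
t_wait = 5.0              # detector recovery (blind) time (s) -- assumed
t_count = 60.0            # counting window after recovery (s) -- assumed

# Decays recorded in the window [t_wait, t_wait + t_count]:
counts = n0 * (math.exp(-lam * t_wait) - math.exp(-lam * (t_wait + t_count)))
print(counts > 0.5 * n0)  # most of the signal survives the blind period
```

The key constraint is visible in the numbers: the half-life must be long compared with the detector recovery time but short enough that the counting window captures most of the decays.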
2000-11-22
This is a final rule amending the NASA FAR Supplement (NFS) to emphasize considerations of risk management, including safety, security (including information technology security), health, export control, and damage to the environment, within the acquisition process. This final rule addresses risk management within the context of acquisition planning, selecting sources, choosing contract type, structuring award fee incentives, administering contracts, and conducting contractor surveillance.
Sources of Career Dissatisfaction among Mid-Level Coast Guard Officers.
1983-06-01
process that leads to the determination of career satisfaction. No attempt is made to study performance reports or the personalities of individuals...job satisfaction literature performed by Brayfield and Crockett revealed that "there is little evidence in the available literature that employee...were low."* Lawlor, Edward E. and Lyman W. Porter, "The Effect of Performance in Job Satisfaction," Fundamentals of Management Selected Readings
NASA Astrophysics Data System (ADS)
Hirsch, Piotr; Duzinkiewicz, Kazimierz; Grochowski, Michał
2017-11-01
District Heating (DH) systems are commonly supplied from local heat sources. Nowadays, modern insulation materials allow effective and economically viable heat transportation over long distances (over 20 km). In the paper a method for optimized selection of design and operating parameters of a long-distance Heat Transportation System (HTS) is proposed. The method allows evaluation of the feasibility and effectiveness of heat transportation from the considered heat sources. The optimized selection is formulated as a multicriteria decision-making problem. The constraints for this problem include a static HTS model, allowing consideration of the system life cycle, time variability, and spatial topology. Thereby, variation of heat demand and ground temperature within the DH area, insulation and pipe aging, and/or the terrain elevation profile are taken into account in the decision-making process. The HTS construction costs, pumping power, and heat losses are considered as objective functions. Inner pipe diameter, insulation thickness, temperatures, and pumping station locations are optimized during the decision-making process. Moreover, variants of pipe-laying, e.g. one pipeline with a larger diameter or two with a smaller one, may be considered during the optimization. The analyzed optimization problem is multicriteria, hybrid, and nonlinear. Because of these problem properties, a genetic solver was applied.
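The trade-off at the heart of this optimization can be sketched with a toy scalarized objective: pipe cost and heat loss grow with diameter while pumping power falls steeply with it. The cost models, coefficients, and the weighted sum below are illustrative assumptions, not the paper's HTS model or its genetic formulation:

```python
# Toy single-objective stand-in for the multicriteria HTS problem:
# choose an inner pipe diameter d (m) minimizing a weighted sum of
# construction cost, pumping power, and heat loss. All coefficients
# are assumed for illustration.
def total_cost(d_m, w=(1.0, 1.0, 1.0)):
    capex = 800.0 * d_m            # pipe cost grows roughly with diameter
    pumping = 0.002 / d_m ** 5     # friction loss ~ d^-5 at fixed flow rate
    heat_loss = 40.0 * d_m         # loss ~ pipe surface area ~ d
    return w[0] * capex + w[1] * pumping + w[2] * heat_loss

diameters = [0.1 + 0.01 * i for i in range(91)]   # 0.10 to 1.00 m
best_d = min(diameters, key=total_cost)
print(round(best_d, 2))  # 0.15
```

A real multicriteria treatment would keep the three objectives separate (e.g. a Pareto front from the genetic solver) rather than collapsing them into one weighted sum.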
Dick, Anthony Steven
2012-01-01
Two experiments examined processes underlying cognitive inflexibility in set-shifting tasks typically used to assess the development of executive function in children. Adult participants performed a Flexible Item Selection Task (FIST) that requires shifting from categorizing by one dimension (e.g., color) to categorizing by a second orthogonal dimension (e.g., shape). The experiments showed performance of the FIST involves suppression of the representation of the ignored dimension; response times for selecting a target object in an immediately-following oddity task were slower when the oddity target was the previously-ignored stimulus of the FIST. However, proactive interference from the previously relevant stimulus dimension also impaired responding. The results are discussed with respect to two prominent theories of the source of difficulty for children and adults on dimensional shifting tasks: attentional inertia and negative priming. In contrast to prior work emphasizing one over the other process, the findings indicate that difficulty in the FIST, and by extension other set-shifting tasks, can be attributed to both the need to shift away from the previously attended representation (attentional inertia), and the need to shift to the previously ignored representation (negative priming). Results are discussed in relation to theoretical explanations for cognitive inflexibility in adults and children. PMID:23539267
Roylance, John J.; Kim, Tae Woo; Choi, Kyoung-Shin
2016-02-17
Reductive biomass conversion has been conventionally conducted using H2 gas under high-temperature and high-pressure conditions. Here, efficient electrochemical reduction of 5-hydroxymethylfurfural (HMF), a key intermediate for biomass conversion, to 2,5-bis(hydroxymethyl)furan (BHMF), an important monomer for industrial processes, was demonstrated using Ag catalytic electrodes. This process uses water as the hydrogen source under ambient conditions and eliminates the need to generate and consume H2 for hydrogenation, providing a practical and efficient route for BHMF production. By systematic investigation of HMF reduction on the Ag electrode surface, BHMF production was achieved with Faradaic efficiency and selectivity nearing 100%, and plausible reduction mechanisms were also elucidated. Furthermore, construction of a photoelectrochemical cell (PEC) composed of an n-type BiVO4 semiconductor anode, which uses photogenerated holes for water oxidation, and a catalytic Ag cathode, which uses photoexcited electrons from BiVO4 for the reduction of HMF to BHMF, was demonstrated to utilize solar energy to significantly decrease the external voltage necessary for HMF reduction. This shows the possibility of coupling electrochemical HMF reduction and solar energy conversion, which can provide more efficient and environmentally benign routes for reductive biomass conversion.
Minero, Claudio; Maurino, Valter; Bono, Francesca; Pelizzetti, Ezio; Marinoni, Angela; Mailhot, Gilles; Carlotti, Maria Eugenia; Vione, Davide
2007-08-01
The effect of selected organic and inorganic compounds present in snow and cloudwater was studied. Photolysis of solutions of nitrate to nitrite was carried out in the laboratory using a UVB light source. The photolysis and other reactions were then modelled. It is shown that formate, formaldehyde, methanesulphonate, and, to a lesser extent, chloride can increase the initial formation rate of nitrite. The effect, particularly significant for formate and formaldehyde, is unlikely to be caused by scavenging of hydroxyl radicals. The experimental data obtained in this work suggest that possible causes are the reduction of nitrogen dioxide and nitrate by radical species formed on photooxidation of the organic compounds. Hydroxyl scavenging by organic and inorganic compounds would not affect the initial formation rate of nitrite, but would protect it from oxidation, therefore increasing the concentration values reached at long irradiation times. The described processes can be relevant to cloudwater and the quasi-liquid layer on the surface of ice and snow, considering that in the polar regions irradiated snow layers are important sources of nitrous acid to the atmosphere. Formate and, to a lesser extent, formaldehyde are the compounds that play the major role in the described processes of nitrite/nitrous acid photoformation by initial rate enhancement and hydroxyl scavenging.
Ronald C. Schmidtling
2001-01-01
The selection of an appropriate seed source is critical for successful southern pine plantations. Guidelines for selection of seed sources are presented for loblolly (Pinus taeda L.), slash (P. elliottii Engelm.), longleaf (P. palustris Mill.), Virginia (P. virginiana Mill.), shortleaf (P. echinata...
The algorithm of verification of welding process for plastic pipes
NASA Astrophysics Data System (ADS)
Rzasinski, R.
2017-08-01
The study analyzes the process of butt welding of PE pipes in terms of proper selection of the connection parameters. The process was oriented to elements produced as a series of pipe types. Polymeric materials, commonly referred to as polymers or plastics, are synthetic materials produced from oil products by the polyreaction of low-molecular-weight compounds called monomers. During the polyreaction, monomers combine to build a macromolecular material named with the prefix poly- (polypropylene, polyethylene or polyurethane), creating solid-state particles on the order of 0.2 to 0.4 mm. Finished products from polymers of virtually any shape and size are obtained by compression molding, injection molding, extrusion, laminating, centrifugal casting, etc. Only a thermoplastic can be welded, since it softens at elevated temperature and thus can be joined via a clamp. Depending on the source and method of supplying heat, the welding processes include: contact welding, radiant welding, friction welding, dielectric welding, and ultrasonic welding. The analysis concerns contact welding. In connection with the development of a new generation of polyethylene, and the production of pipes of increasing dimensions (diameter, wall thickness), it is important to select the correct process parameters.
Trevisan, Francesco; Calignano, Flaviana; Lorusso, Massimo; Pakkanen, Jukka; Aversa, Alberta; Ambrosio, Elisa Paola; Lombardi, Mariangela; Fino, Paolo; Manfredi, Diego
2017-01-01
The aim of this review is to analyze and to summarize the state of the art of the processing of aluminum alloys, and in particular of the AlSi10Mg alloy, obtained by means of the Additive Manufacturing (AM) technique known as Selective Laser Melting (SLM). This process is gaining interest worldwide, thanks to the possibility of obtaining a freeform fabrication coupled with high mechanical properties related to a very fine microstructure. However, SLM is very complex, from a physical point of view, due to the interaction between a concentrated laser source and metallic powders, and to the extremely rapid melting and the subsequent fast solidification. The effects of the main process variables on the properties of the final parts are analyzed in this review: from the starting powder properties, such as shape and powder size distribution, to the main process parameters, such as laser power and speed, layer thickness, and scanning strategy. Furthermore, a detailed overview on the microstructure of the AlSi10Mg material, with the related tensile and fatigue properties of the final SLM parts, in some cases after different heat treatments, is presented. PMID:28772436
Jayasena, Dinesh D; Jung, Samooel; Kim, Sun Hyo; Kim, Hyun Joo; Alahakoon, Amali U; Lee, Jun Heon; Jo, Cheorun
2015-03-15
In this study the effects of sex, meat cut and thermal processing on the carnosine, anserine, creatine, betaine and carnitine contents of Korean native chicken (KNC) meat were determined. Forty 1-day-old chicks (20 chicks of each sex) from a commercial KNC strain (Woorimatdag™) were reared under similar standard commercial conditions with similar diets, and ten birds of each sex were randomly selected and slaughtered at 14 weeks of age. Raw and cooked meat samples were prepared from both breast and leg meats and analyzed for the aforementioned functional compounds. Female KNCs had significantly higher betaine and creatine contents. The breast meat showed significantly higher carnosine and anserine contents, whereas the leg meat had a higher betaine and carnitine content. The content of all functional compounds was significantly depleted by thermal processing. This study confirms that KNC meat is a good source of the above-mentioned functional compounds, which can be considered attractive nutritional quality factors. However, their concentrations were significantly affected by thermal processing conditions, meat cut and sex. Further experiments are needed to select the best thermal processing method to preserve these functional compounds. © 2014 Society of Chemical Industry.
JIP: Java image processing on the Internet
NASA Astrophysics Data System (ADS)
Wang, Dongyan; Lin, Bo; Zhang, Jun
1998-12-01
In this paper, we present JIP - Java Image Processing on the Internet, a new Internet-based application for remote education and software presentation. JIP offers an integrated learning environment on the Internet where remote users not only can share static HTML documents and lecture notes, but also can run and reuse dynamic distributed software components, without having the source code or any extra work of software compilation, installation and configuration. By implementing a platform-independent distributed computational model, local computational resources are consumed instead of the resources on a central server. As an extended Java applet, JIP allows users to select local image files on their computers or specify any image on the Internet using a URL as input. Multimedia lectures such as streaming video/audio and digital images are integrated into JIP and intelligently associated with specific image processing functions. Watching demonstrations and practicing the functions with user-selected input data dramatically encourages learning interest, while promoting the understanding of image processing theory. The JIP framework can be easily applied to other subjects in education or software presentation, such as digital signal processing, business, mathematics, physics, or other areas such as employee training and charged software consumption.
ERIC Educational Resources Information Center
Brosseau-Liard, Patricia E.
2014-01-01
The present research examines the effect of the costliness of an information source on children's selective learning. In three experiments (total N = 112), 4- to 7-year-olds were given the opportunity to acquire and endorse information from one of two sources. One source, a computer, was described as always accurate; the other source, a…
A study on the seismic source parameters for earthquakes occurring in the southern Korean Peninsula
NASA Astrophysics Data System (ADS)
Rhee, H. M.; Sheen, D. H.
2015-12-01
We investigated the characteristics of the seismic source parameters of the southern part of the Korean Peninsula for 599 events with ML ≥ 1.7 from 2001 to 2014. A large number of data were carefully selected by visual inspection in the time and frequency domains. The data set consists of 5,093 S-wave trains on three-component seismograms recorded at broadband seismograph stations operated by the Korea Meteorological Administration and the Korea Institute of Geoscience and Mineral Resources. The corner frequency, stress drop, and moment magnitude of each event were measured using the modified method of Jo and Baag (2001), based on the methods of Snoke (1987) and Andrews (1986). We found that this method could improve the stability of the estimation of source parameters from the S-wave displacement spectrum through an iterative process. We then compared the source parameters with those obtained in previous studies and investigated the source scaling relationship and the regional variations of source parameters in the southern Korean Peninsula.
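The spectral measurement described above, fitting an omega-square source model to the S-wave displacement spectrum to obtain the corner frequency, can be sketched as follows. The spectrum, velocity, and starting values here are hypothetical, and the generic least-squares fit is a simplified stand-in for the iterative Jo and Baag procedure:

```python
import numpy as np
from scipy.optimize import curve_fit

# Brune omega-square model, fitted in the log10 domain as is common
# for spectral fitting: log10 Omega(f) = log10 Omega0 - log10(1 + (f/fc)^2)
def log_brune(f, log_omega0, fc):
    return log_omega0 - np.log10(1.0 + (f / fc) ** 2)

f = np.logspace(-1, 1.5, 200)             # 0.1 to ~31.6 Hz
true_log_omega0, true_fc = -6.0, 5.0      # hypothetical event
obs = log_brune(f, true_log_omega0, true_fc)

# recover the low-frequency plateau and corner frequency by least squares
(log_omega0, fc), _ = curve_fit(log_brune, f, obs, p0=[-5.0, 1.0])

# Brune (1970) source radius from the corner frequency;
# beta is an assumed S-wave speed
beta = 3500.0                             # m/s
r = 0.37 * beta / fc                      # source radius, m
```

Given the seismic moment, the stress drop then follows from the source radius via the standard Brune relation.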
Cassava as an energy source: a selected bibliography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sherman, C.
1980-01-01
This selected bibliography includes 250 articles on cassava as a potential energy source. Topics covered include factors that influence cassava growth, such as weeding, fertilizer, diseases, and genetic selection, as well as the conversion of cassava to ethanol. (DP)
Fabrics for fire resistant passenger seats in aircraft
NASA Technical Reports Server (NTRS)
Tesoro, G. C.
1978-01-01
The essential elements of the problem and of approaches to improved fire resistance in aircraft seats are reviewed, along with the performance requirements and availability of materials; delaying ignition of the upholstery fabric by a small source is considered a realistic objective. Results of experimental studies on the thermal response of fabrics and fabric/foam combinations suggest significant conclusions regarding: (1) the ignition behavior of a commercial 90/10 wool/nylon upholstery fabric relative to fabrics made from thermally stable polymers; (2) the role of the foam backing; (3) the behavior of seams. These results, coupled with data from other sources, also confirm the importance of materials' interactions in multicomponent assemblies, and the need for system testing prior to materials' selection. The use of an interliner or thermal barrier between upholstery fabric and foam is a promising and viable approach to improved fire resistance of the seat assembly, but experimental evaluation of specific combinations of materials or systems is an essential part of the selection process.
EFL/ESL Textbook Selection in Korea and East Asia - Relevant Issues and Literature Review
NASA Astrophysics Data System (ADS)
Meurant, Robert C.
EFL/ESL departments periodically face the problem of textbook selection. Cogent issues are that non-native speakers will use L2 English mainly to communicate with other non-native English speakers, so an American accent is becoming less important. L2 English will mainly be used in computer-mediated communication, hence the importance of L2 Digital Literacy. The convergence of Information Communication Technologies is radically impacting Second Language Acquisition, which is integrating web-hosted Assessment and Learning Management Systems. EFL/ESL textbooks need to be compatible with blended learning, prepare students for a globalized world, and foster autonomous learning. I summarize five papers on EFL/ESL textbook evaluation and selection, and include relevant material for adaptation. Textbooks are major sources of contact with the target language, so selection is an important decision. Educators need to be systematic and objective in their approach, adopting a selection process that is open, transparent, accountable, participatory, informed and rigorous.
Dai, Tao; Li, Changzhi; Li, Lin; Zhao, Zongbao Kent; Zhang, Bo; Cong, Yu; Wang, Aiqin
2018-02-12
Tungsten carbide was employed as the catalyst in an atom-economic and renewable synthesis of para-xylene with excellent selectivity and yield from 4-methyl-3-cyclohexene-1-carbonylaldehyde (4-MCHCA). This intermediate is the product of the Diels-Alder reaction between the two readily available bio-based building blocks acrolein and isoprene. Our results suggest that 4-MCHCA undergoes a novel dehydroaromatization-hydrodeoxygenation cascade process by intramolecular hydrogen transfer that does not involve an external hydrogen source, and that the hydrodeoxygenation occurs through the direct dissociation of the C=O bond on the W2C surface. Notably, this process is readily applicable to the synthesis of various (multi)methylated arenes from bio-based building blocks, thus potentially providing a petroleum-independent solution to valuable aromatic compounds. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Emotional facilitation of sensory processing in the visual cortex.
Schupp, Harald T; Junghöfer, Markus; Weike, Almut I; Hamm, Alfons O
2003-01-01
A key function of emotion is the preparation for action. However, organization of successful behavioral strategies depends on efficient stimulus encoding. The present study tested the hypothesis that perceptual encoding in the visual cortex is modulated by the emotional significance of visual stimuli. Event-related brain potentials were measured while subjects viewed pleasant, neutral, and unpleasant pictures. Early selective encoding of pleasant and unpleasant images was associated with a posterior negativity, indicating primary sources of activation in the visual cortex. The study also replicated previous findings in that affective cues also elicited enlarged late positive potentials, indexing increased stimulus relevance at higher-order stages of stimulus processing. These results support the hypothesis that sensory encoding of affective stimuli is facilitated implicitly by natural selective attention. Thus, the affect system not only modulates motor output (i.e., favoring approach or avoidance dispositions), but already operates at an early level of sensory encoding.
Nutritional controls of food reward.
Fernandes, Maria F; Sharma, Sandeep; Hryhorczuk, Cecile; Auguste, Stephanie; Fulton, Stephanie
2013-08-01
The propensity to select and consume palatable nutrients is strongly influenced by the rewarding effects of food. Neural processes integrating reward, emotional states and decision-making can supersede satiety signals to promote excessive caloric intake and weight gain. While nutritional habits are influenced by reward-based neural mechanisms, nutrition and its impact on energy metabolism, in turn, plays an important role in the control of food reward. Feeding modulates the release of metabolic hormones that have an important influence on central controls of appetite. Nutrients themselves are also an essential source of energy fuel, while serving as key metabolites and acting as signalling molecules in the neural pathways that control feeding and food reward. Along these lines, this review discusses the impact of nutritionally regulated hormones and select macronutrients on the behavioural and neural processes underlying the rewarding effects of food. Copyright © 2013 Canadian Diabetes Association. Published by Elsevier Inc. All rights reserved.
Song, Min-Ho; Choi, Jung-Woo; Kim, Yang-Hann
2012-02-01
A focused source can provide an auditory illusion of a virtual source placed between the loudspeaker array and the listener. When a focused source is generated by a time-reversed acoustic focusing solution, its use as a virtual source is limited by artifacts caused by convergent waves traveling towards the focusing point. This paper proposes an array activation method to reduce these artifacts for a selected listening point inside an array of arbitrary shape. Results show that the energy of the convergent waves can be reduced by up to 60 dB over a large region including the selected listening point. © 2012 Acoustical Society of America
Process for gamma ray induced degradation of polychlorinated biphenyls
Meikrantz, David H.; Mincher, Bruce J.; Arbon, Rodney E.
1998-01-01
The invention is a process for the in-situ destruction of polychlorinated biphenyl (PCB) compounds in transformer oils and transformers. These compounds are broken down selectively by irradiation of the object or mixture using spent nuclear fuel or any isotopic source of high energy gamma radiation. For example, the level of applied dose required to decompose 400 ppm of polychlorinated biphenyl in transformer oil to less than 50 ppm is 500 kilogray. Destruction of polychlorinated biphenyls to levels of less than 50 ppm renders the transformer oil or transformer non-PCB contaminated under current regulations. Therefore, this process can be used to treat PCB contaminated oil and equipment to minimize or eliminate the generation of PCB hazardous waste.
THE PREVALENCE AND IMPACT OF WOLF–RAYET STARS IN EMERGING MASSIVE STAR CLUSTERS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sokal, Kimberly R.; Johnson, Kelsey E.; Indebetouw, Rémy
We investigate Wolf–Rayet (WR) stars as a source of feedback contributing to the removal of natal material in the early evolution of massive star clusters. Despite previous work suggesting that massive star clusters clear out their natal material before the massive stars evolve into the WR phase, WR stars have been detected in several emerging massive star clusters. These detections suggest that the timescale for clusters to emerge can be at least as long as the time required to produce WR stars (a few million years), and could also indicate that WR stars may be providing the tipping point in the combined feedback processes that drive a massive star cluster to emerge. We explore the potential overlap between the emerging phase and the WR phase with an observational survey to search for WR stars in emerging massive star clusters. We select candidate emerging massive star clusters from known radio continuum sources with thermal emission and obtain optical spectra with the 4 m Mayall Telescope at Kitt Peak National Observatory and the 6.5 m MMT. We identify 21 sources with significantly detected WR signatures, which we term "emerging WR clusters." WR features are detected in ∼50% of the radio-selected sample, and thus we find that WR stars are commonly present in currently emerging massive star clusters. The observed extinctions and ages suggest that clusters without WR detections remain embedded for longer periods of time, and may indicate that WR stars can aid, and therefore accelerate, the emergence process.
Bloodworth, J W; Holman, I P; Burgess, P J; Gillman, S; Frogbrook, Z; Brown, P
2015-09-15
In recent years water companies have started to adopt catchment management to reduce diffuse pollution in drinking water supply areas. The heterogeneity of catchments and the range of pollutants that must be removed to meet the EU Drinking Water Directive (98/83/EC) limits make it difficult to prioritise areas of a catchment for intervention. Thus conceptual frameworks are required that can disaggregate the components of pollutant risk and help water companies make decisions about where to target interventions in their catchments to maximum effect. This paper demonstrates the concept of generalising pollutants in the same framework by reviewing key pollutant processes within a source-mobilisation-delivery context. From this, criteria are developed (with input from water industry professionals involved in catchment management) which highlight the need for a new water-industry-specific conceptual framework. The new CaRPoW (Catchment Risk to Potable Water) framework uses the Source-Mobilisation-Delivery concept as modular components of risk that work at two scales: source and mobilisation at the field scale, and delivery at the catchment scale. Disaggregating pollutant processes permits the main components of risk to be ascertained so that appropriate interventions can be selected. The generic structure also allows the outputs for different pollutants to be compared so that potential multiple benefits can be identified. CaRPoW provides a transferable framework that can be used by water companies to cost-effectively target interventions under current conditions or under scenarios of land use or climate change. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
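The modular Source-Mobilisation-Delivery idea can be illustrated with a minimal sketch. The field names, component scores, and the simple multiplicative combination rule below are illustrative assumptions, not the published CaRPoW scoring scheme:

```python
# Toy risk screening: each field gets a score in [0, 1] for pollutant source
# strength, mobilisation potential (both field scale) and delivery to the
# water body (catchment scale); overall risk combines the three components.
fields = {
    "field_A": {"source": 0.8, "mobilisation": 0.6, "delivery": 0.9},
    "field_B": {"source": 0.9, "mobilisation": 0.2, "delivery": 0.7},
}

# multiplicative combination: risk vanishes if any component is absent
risk = {name: s["source"] * s["mobilisation"] * s["delivery"]
        for name, s in fields.items()}

# the highest-risk field is the first candidate for an intervention
priority = max(risk, key=risk.get)
```

Because the three components are kept separate, the same structure also shows *which* component an intervention should target (e.g. a low mobilisation score suggests source control matters less than delivery pathways).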
Major, John E; Barsi, Debby C; Mosseler, Alex; Campbell, Moira; Rajora, Om P
2003-07-01
Red spruce (Picea rubens Sarg.) and black spruce (Picea mariana (Mill.) B.S.P.) are genetically and morphologically similar but ecologically distinct species. We determined intraspecific seed-source and interspecific variation of red spruce and black spruce, from across the near-northern margins of their ranges, for several light-energy processing and freezing-tolerance adaptive traits. Before exposure to low temperature, red spruce had variable fluorescence (Fv) similar to black spruce, but higher photochemical efficiency (Fv/Fm), lower quantum yield, lower chlorophyll fluorescence (%), and higher thermal dissipation efficiency (qN), although the seed-source effect and the seed-source x species interaction were significant only for Fv/Fm. After low-temperature exposure (-40 degrees C), red spruce had significantly lower Fv/Fm, quantum yield and qN than black spruce, but higher chlorophyll fluorescence and relative fluorescence. Species, seed-source effect, and seed-source x species interaction were consistent with predictions based on genetic (e.g., geographic) origins. Multi-temperature exposures (5, -20 and -40 degrees C) often produced significant species and temperature effects, and species x temperature interactions as a result of species-specific responses to temperature exposures. The inherent physiological species-specific adaptations of red spruce and black spruce were largely consistent with a shade-tolerant, late-successional species and an early successional species, respectively. Species differences in physiological adaptations conform to a biological trade-off, probably as a result of natural selection pressure in response to light availability and prevailing temperature gradients.
Ockerman, Darwin J.; Heitmuller, Franklin T.; Wehmeyer, Loren L.
2013-01-01
During 2010, additional suspended-sediment data were collected during selected runoff events to provide new data for model testing and to help better understand the sources of suspended-sediment loads. The model was updated and used to estimate and compare sediment yields from each of 64 subwatersheds comprising the lower Nueces River watershed study area for three selected runoff events: November 20-21, 2009, September 7-8, 2010, and September 20-21, 2010. These three runoff events were characterized by heavy rainfall centered near the study area and during which minimal streamflow and suspended-sediment load entered the lower Nueces River upstream from Wesley E. Seale Dam. During all three runoff events, model simulations showed that the greatest sediment yields originated from the subwatersheds, which were largely cropland. In particular, the Bayou Creek subwatersheds were major contributors of suspended-sediment load to the lower Nueces River during the selected runoff events. During the November 2009 runoff event, high suspended-sediment concentrations in the Nueces River water withdrawn for the City of Corpus Christi public-water supply caused problems during the water-treatment process, resulting in failure to meet State water-treatment standards for turbidity in drinking water. Model simulations of the November 2009 runoff event showed that the Bayou Creek subwatersheds were the primary source of suspended-sediment loads during that runoff event.
2016-06-01
...thorough market research, acquisition professionals must decide at an early stage which source selection strategy (lowest price technically...minimizing risk and ensuring best value for all stakeholders. On the basis of thorough market research, acquisition professionals must decide at an early...price-based, market-driven environment from requirements development through proper disposal. Source selection must be made on a 'best value
ERIC Educational Resources Information Center
Kim, Kyung-Sun; Sin, Sei-Ching Joanna
2007-01-01
A survey of undergraduate students examined how students' beliefs about their problem-solving styles and abilities (including avoidant style, confidence, and personal control in problem-solving) influenced their perception and selection of sources, as reflected in (1) perceived characteristics of sources, (2) source characteristics considered…
NASA Astrophysics Data System (ADS)
Götz, Joachim; Buckel, Johannes; Heckmann, Tobias
2013-04-01
The analysis of alpine sediment cascades requires the identification, differentiation and quantification of sediment sources, storages, and transport processes. This study deals with the origin of alpine sediment transfer and relates primary talus deposits to corresponding rockwall source areas within the Gradenbach catchment (Schober Mountains, Austrian Alps). Sediment storage landforms are based on a detailed geomorphological map of the catchment which was generated to analyse the sediment transfer system. Mapping was mainly performed in the field and supplemented by post-mapping analysis using LIDAR data and digital orthophotos. A fundamental part of the mapping procedure was to capture additional landform-based information with respect to morphometry, activity and connectivity. The applied procedure provides a detailed inventory of sediment storage landforms including additional information on surface characteristics, dominant and secondary erosion and deposition processes, process activity and sediment storage coupling. We develop the working hypothesis that the present-day surface area ratio between rockfall talus (area as a proxy for volume, backed by geophysical analysis of selected talus cones) and corresponding rockwall source area is a measure of rockfall activity since deglaciation; large talus cones derived from small rockwall catchments indicate high activity, while low activity can be inferred where rockfall from large rock faces has created only small deposits. The surface area ratio of talus and corresponding rockwalls is analysed using a landform-based and a process-based approach. For the landform-based approach, we designed a GIS procedure which derives the (hydrological) catchment area of the contact lines of talus and rockwall landforms in the geomorphological map. The process-based approach simulates rockfall trajectories from steep (>45°) portions of a DEM generated by a random-walk rockfall model. 
By back-tracing those trajectories that end on a selected talus landform, the 'rockfall contributing area' is delineated; this approach takes account of the stochastic nature of rockfall trajectories and is able to identify, for example, rockfall delivery from one rockwall segment to multiple talus landforms (or from multiple rockfall segments to the same deposit, respectively). Using both approaches, a total of 290 rockwall-talus-subsystems are statistically analysed indicating a constant relationship between rockfall source areas and corresponding areas of talus deposits of almost 1:1. However, certain rockwall-talus-subsystems deviate from this correlation since sediment storage landforms of similar size originate from varying rockwall source areas and vice versa. This varying relationship is assumed to be strongly controlled by morphometric parameters, such as rockwall slope, altitudinal interval, and aspect. The impact of these parameters on the surface area ratio will be finally discussed.
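The core surface-area-ratio calculation underlying both approaches can be sketched as follows; the subsystem areas and the 50% outlier threshold are hypothetical:

```python
import numpy as np

# hypothetical surface areas (m^2) for four rockwall-talus subsystems
rockwall = np.array([12000.0, 45000.0, 8000.0, 30000.0])  # source areas
talus    = np.array([11500.0, 47000.0, 9000.0, 28000.0])  # deposit areas

# talus area per unit rockwall area: a proxy for post-glacial rockfall
# activity (large talus from a small wall implies high activity)
ratio = talus / rockwall
median_ratio = float(np.median(ratio))   # the study reports roughly 1:1

# flag subsystems deviating strongly from the 1:1 trend, whose behaviour
# may be controlled by slope, altitudinal interval, or aspect
outliers = np.where(np.abs(ratio - 1.0) > 0.5)[0]
```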
NASA Astrophysics Data System (ADS)
Teller, Amit; Lange, Manfred; Ioannou, Stelios; Keleshis, Christos
2010-05-01
The Autonomous Flying Platforms for Atmospheric and Earth Surface Observations project (APAESO) of the Energy, Environment and Water Research Center (EEWRC) at the Cyprus Institute is aimed at the dual purpose of carrying out atmospheric and earth-surface observations in the Mediterranean. The APAESO platforms will offer the unique potential to determine physical, chemical and radiative atmospheric properties, aerosol and dust concentrations, atmospheric dynamics, surface morphology, vegetation and land use patterns as well as ocean surface properties (biology, waves, currents) and to carry out archaeological site reconnaissance and contaminant detection at high spatial resolution. The first phase of APAESO was dedicated to the preliminary design and the selection of an Unmanned Aerial Vehicle (UAV) as the backbone of the APAESO infrastructure. Selection of a UAV suitable for the many research objectives as outlined above is challenging because the UAV technology is new and rapidly evolving. This notwithstanding, a very large number of systems, mostly utilized for defense purposes, are currently available. The major challenge in the selection process lies in considering the trade-off between different platform characteristics (e.g. payload weight, endurance, max. altitude for operation and price) and in optimizing the potential performance of the UAV. Based on the required characteristics for the UAV platform, a survey of possible UAVs and suitable sensors was prepared based on various data sources. We used an elimination process in order to consider only a few models for the final selection process out of about 1000 commercially available UAV models that were initially investigated. The presentation will discuss the main scientific objectives that determine the specification of the UAV platform, major considerations in selecting best available technology for our needs and will briefly describe the next phases of the project.
Normal aging delays and compromises early multifocal visual attention during object tracking.
Störmer, Viola S; Li, Shu-Chen; Heekeren, Hauke R; Lindenberger, Ulman
2013-02-01
Declines in selective attention are one of the sources contributing to age-related impairments in a broad range of cognitive functions. Most previous research on mechanisms underlying older adults' selection deficits has studied the deployment of visual attention to static objects and features. Here we investigate neural correlates of age-related differences in spatial attention to multiple objects as they move. We used a multiple object tracking task, in which younger and older adults were asked to keep track of moving target objects that moved randomly in the visual field among irrelevant distractor objects. By recording the brain's electrophysiological responses during the tracking period, we were able to delineate neural processing for targets and distractors at early stages of visual processing (~100-300 msec). Older adults showed less selective attentional modulation in the early phase of the visual P1 component (100-125 msec) than younger adults, indicating that early selection is compromised in old age. However, with a 25-msec delay relative to younger adults, older adults showed distinct processing of targets (125-150 msec), that is, a delayed yet intact attentional modulation. The magnitude of this delayed attentional modulation was related to tracking performance in older adults. The amplitude of the N1 component (175-210 msec) was smaller in older adults than in younger adults, and the target amplification effect of this component was also smaller in older relative to younger adults. Overall, these results indicate that normal aging affects the efficiency and timing of early visual processing during multiple object tracking.
Development of FWIGPR, an open-source package for full-waveform inversion of common-offset GPR data
NASA Astrophysics Data System (ADS)
Jazayeri, S.; Kruse, S.
2017-12-01
We introduce a package for full-waveform inversion (FWI) of Ground Penetrating Radar (GPR) data based on a combination of open-source programs. The FWI requires a good starting model, based on direct knowledge of field conditions or on traditional ray-based inversion methods. With a good starting model, the FWI can improve resolution of selected subsurface features. The package will be made available for general use in educational and research activities. The FWIGPR package consists of four main components: 3D to 2D data conversion, source wavelet estimation, forward modeling, and inversion. (These four components additionally require the development, by the user, of a good starting model.) A major challenge with GPR data is the unknown form of the waveform emitted by the transmitter held close to the ground surface. We apply a blind deconvolution method to estimate the source wavelet, based on a sparsity assumption about the reflectivity series of the subsurface model (Gholami and Sacchi 2012). The estimated wavelet is deconvolved from the data to recover the sparsest reflectivity series with the fewest reflectors. The gprMax code (www.gprmax.com) is used as the forward modeling tool and the PEST parameter estimation package (www.pesthomepage.com) for the inversion. To reduce computation time, the field data are converted to an effective 2D equivalent, and the gprMax code can be run in 2D mode. In the first step, the user must create a good starting model of the data, presumably using ray-based methods. This estimated model is introduced to the FWI process as an initial model. Next, the 3D data are converted to 2D, and the user then estimates the source wavelet that best fits the observed data under the sparsity assumption on the earth's response.
Last, PEST runs gprMax with the initial model and calculates the misfit between the synthetic and observed data; using an iterative algorithm that calls gprMax several times in each iteration, it finds successive models that better fit the data. To gauge whether the iterative process has arrived at a local or global minimum, the process can be repeated with a range of starting models. Tests have shown that this package can successfully improve estimates of selected subsurface model parameters for simple synthetic and real data. Ongoing research will focus on FWI of more complex scenarios.
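The iterative misfit-reduction loop that PEST performs can be illustrated with a toy one-parameter example. The forward model here (two-way travel time through a layer of relative permittivity `eps`) and the damped Gauss-Newton update are simplified stand-ins for the actual gprMax/PEST machinery:

```python
import numpy as np

# forward model: two-way travel time (ns) through a 1 m layer of
# relative permittivity eps; c = 0.3 m/ns is the speed of light
def forward(eps, depth=1.0):
    c = 0.3
    return 2.0 * depth * np.sqrt(eps) / c

observed = forward(9.0)   # "field data" generated by a true eps of 9
eps = 4.0                 # ray-based starting model (deliberately off)

for _ in range(50):
    misfit = forward(eps) - observed
    # finite-difference sensitivity, as PEST computes numerically
    grad = (forward(eps + 1e-4) - forward(eps)) / 1e-4
    eps -= 0.5 * misfit / grad   # damped Gauss-Newton style update
```

Repeating the loop from several starting values of `eps` mimics the multi-start strategy the abstract suggests for distinguishing local from global minima.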
A Guide to Southern Pine Seed Sources
Clark W. Lantz; John F. Kraus
1987-01-01
The selection of an appropriate seed source is critical for successful southern pine plantations. Guides for selection of seed sources are presented for loblolly, slash, longleaf, Virginia, shortleaf, and sand pines. Separate recommendations are given for areas where fusiform-rust hazard is high.
Time Reversal Mirrors and Cross Correlation Functions in Acoustic Wave Propagation
NASA Astrophysics Data System (ADS)
Fishman, Louis; Jonsson, B. Lars G.; de Hoop, Maarten V.
2009-03-01
In time reversal acoustics (TRA), a signal is recorded by an array of transducers, time reversed, and then retransmitted into the configuration. The retransmitted signal propagates back through the same medium and retrofocuses on the source that generated the signal. If the transducer array is a single, planar (flat) surface, then this configuration is referred to as a planar, one-sided, time reversal mirror (TRM). In signal processing, for example, in active-source seismic interferometry, the measurement of the wave field at two distinct receivers, generated by a common source, is considered. Cross correlating these two observations and integrating the result over the sources yield the cross correlation function (CCF). Adopting the TRM experiments as the basic starting point and identifying the kinematically correct correspondences, it is established that the associated CCF signal processing constructions follow in a specific, infinite recording time limit. This perspective also provides for a natural rationale for selecting the Green's function components in the TRM and CCF expressions. For a planar, one-sided, TRM experiment and the corresponding CCF signal processing construction, in a three-dimensional homogeneous medium, the exact expressions are explicitly calculated, and the connecting limiting relationship verified. Finally, the TRM and CCF results are understood in terms of the underlying, governing, two-way wave equation, its corresponding time reversal invariance (TRI) symmetry, and the absence of TRI symmetry in the associated one-way wave equations, highlighting the role played by the evanescent modal contributions.
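The cross correlation construction described above can be sketched for two receivers recording a common source; the sampling rate, wavelet, and delay below are hypothetical:

```python
import numpy as np

# two receivers record the same source wavelet; the peak lag of their
# cross correlation recovers the inter-receiver travel-time difference
fs = 1000.0                                   # sampling rate, Hz
t = np.arange(0, 1.0, 1.0 / fs)
pulse = np.exp(-((t - 0.2) / 0.01) ** 2)      # wavelet arriving at receiver A
shift = 57                                    # extra delay to receiver B, samples
sig_a = pulse
sig_b = np.roll(pulse, shift)                 # same wavelet, delayed

ccf = np.correlate(sig_b, sig_a, mode="full")
lag = int(np.argmax(ccf)) - (len(sig_a) - 1)  # peak lag in samples
delay_s = lag / fs                            # recovered delay in seconds
```

In interferometry this correlation would additionally be integrated over sources; with a single source the peak lag already encodes the kinematic delay.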
Quantifying selection in evolving populations using time-resolved genetic data
NASA Astrophysics Data System (ADS)
Illingworth, Christopher J. R.; Mustonen, Ville
2013-01-01
Methods which uncover the molecular basis of the adaptive evolution of a population address some important biological questions. For example, the problem of identifying genetic variants which underlie drug resistance, a question of importance for the treatment of pathogens, and of cancer, can be understood as a matter of inferring selection. One difficulty in the inference of variants under positive selection is the potential complexity of the underlying evolutionary dynamics, which may involve an interplay between several contributing processes, including mutation, recombination and genetic drift. A source of progress may be found in modern sequencing technologies, which confer an increasing ability to gather information about evolving populations, granting a window into these complex processes. One particularly interesting development is the ability to follow evolution as it happens, by whole-genome sequencing of an evolving population at multiple time points. We here discuss how to use time-resolved sequence data to draw inferences about the evolutionary dynamics of a population under study. We begin by reviewing our earlier analysis of a yeast selection experiment, in which we used a deterministic evolutionary framework to identify alleles under selection for heat tolerance, and to quantify the selection acting upon them. Considering further the use of advanced intercross lines to measure selection, we here extend this framework to cover scenarios of simultaneous recombination and selection, and of two driver alleles with multiple linked neutral, or passenger, alleles, where the driver pair evolves under an epistatic fitness landscape. We conclude by discussing the limitations of the approach presented and outlining future challenges for such methodologies.
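A minimal version of quantifying selection from time-resolved frequency data: under the deterministic framework mentioned above, constant selection makes the log-odds of a beneficial allele's frequency grow linearly in time, so a least-squares slope recovers the selection coefficient. The trajectory below is synthetic and noise-free:

```python
import numpy as np

# logit(x_t) = logit(x_0) + s * t for a deterministic logistic sweep
def logit(x):
    return np.log(x / (1.0 - x))

s_true, x0 = 0.1, 0.05
t = np.arange(0, 60, 5, dtype=float)          # sampling generations

# allele-frequency trajectory under constant selection coefficient s_true
x = 1.0 / (1.0 + (1.0 / x0 - 1.0) * np.exp(-s_true * t))

# linear regression of log-odds against time estimates s
s_hat, intercept = np.polyfit(t, logit(x), 1)
```

Real time-series data add sampling noise and genetic drift, so inference there typically replaces this regression with a likelihood over trajectories, but the linear log-odds signal is the core signature of selection.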
NASA Technical Reports Server (NTRS)
Debaille, V.; Yin, Q.-Z.; Brandon, A. D.; Jacobsen, B.; Treiman, A. H.
2007-01-01
We present a new Lu-Hf and Sm-Nd isotope systematics study of four enriched shergottites (Zagami, Shergotty, NWA856 and Los Angeles) and three nakhlites (Nakhla, MIL03346 and Yamato 000593) in order to further understand processes occurring during the early differentiation of Mars and the crystallization of its magma ocean. Two fractions of the terrestrial petrological analogue of the nakhlites, the Archaean Theo's flow (Ontario, Canada), were also measured. The coupling of Nd and Hf isotopes provides direct insight into the mineralogy of the melt sources. In contrast to Sm/Nd, Lu/Hf ratios can be very large in minerals such as garnet. Selective partial melting of garnet-bearing mantle sources can therefore lead to characteristic Lu/Hf signatures that can be recognized in 176Hf/177Hf ratios.
X-Ray Spectral Properties of Seven Heavily Obscured Seyfert 2 Galaxies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marchesi, S.; Ajello, M.; Comastri, A.
2017-02-10
We present the combined Chandra and Swift-BAT spectral analysis of seven Seyfert 2 galaxies selected from the Swift-BAT 100 month catalog. We selected nearby (z ≤ 0.03) sources lacking a ROSAT counterpart that had never previously been observed with Chandra in the 0.3–10 keV energy range, and targeted these objects with 10 ks Chandra ACIS-S observations. The X-ray spectral fitting over the 0.3–150 keV energy range allows us to determine that all the objects are significantly obscured, with N_H ≥ 10^23 cm^−2 at a >99% confidence level. Moreover, one to three sources are candidate Compton-thick active galactic nuclei (CT-AGNs; i.e., N_H ≥ 10^24 cm^−2). We also test the recent spectral curvature method developed by Koss et al. to find candidate CT-AGNs, finding good agreement between our results and their predictions. Because the selection criteria we adopted were effective in detecting highly obscured AGNs, further observations of these and other Seyfert 2 galaxies selected from the Swift-BAT 100 month catalog will allow us to create a statistically significant sample of highly obscured AGNs, thereby providing a better understanding of the physics of the obscuration processes.
Relay selection in energy harvesting cooperative networks with rateless codes
NASA Astrophysics Data System (ADS)
Zhu, Kaiyan; Wang, Fei
2018-04-01
This paper investigates relay selection in energy harvesting cooperative networks, where the relays harvest energy from the radio frequency (RF) signals transmitted by a source, and the optimal relay is selected and uses the harvested energy to assist the information transmission from the source to its destination. Both the source and the selected relay transmit information using rateless codes, which allow the destination to recover the original information after collecting a number of coded bits marginally surpassing the entropy of the original information. To improve transmission performance and efficiently utilize the harvested power, the optimal relay is selected, and the optimization problem is formulated to maximize the achievable information rate of the system. Simulation results demonstrate that our proposed relay selection scheme outperforms other strategies.
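As a rough illustration of the selection criterion described, and not the paper's actual system model, the following Python sketch picks the relay that maximizes a decode-and-forward end-to-end rate when each relay's transmit power comes from harvested RF energy. The channel gains, harvesting efficiency, noise level, and rate expression are all illustrative assumptions.

```python
import math

def select_relay(channels_sr, channels_rd, p_source=1.0, eta=0.5, noise=1e-3):
    """Pick the relay maximizing the end-to-end achievable rate (sketch).

    Each relay harvests energy from the source signal (power eta*p_source*g_sr)
    and retransmits; the decode-and-forward rate is limited by the weaker hop,
    with a factor 1/2 for the two-phase (half-duplex) protocol.
    """
    best_idx, best_rate = None, -1.0
    for i, (g_sr, g_rd) in enumerate(zip(channels_sr, channels_rd)):
        p_relay = eta * p_source * g_sr            # harvested transmit power
        rate_sr = math.log2(1 + p_source * g_sr / noise)
        rate_rd = math.log2(1 + p_relay * g_rd / noise)
        rate = 0.5 * min(rate_sr, rate_rd)
        if rate > best_rate:
            best_idx, best_rate = i, rate
    return best_idx, best_rate

# Relay 1 has the stronger source-relay link, so it harvests more energy
# and supports the higher end-to-end rate.
best_idx, best_rate = select_relay([0.1, 0.9], [0.9, 0.9])
```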
Jones, Jeanne; Kalkan, Erol; Stephens, Christopher
2017-02-23
A continually increasing number of high-quality digital strong-motion records from stations of the National Strong-Motion Project (NSMP) of the U.S. Geological Survey (USGS), as well as data from regional seismic networks within the United States, call for automated processing of strong-motion records with human review limited to selected significant or flagged records. The NSMP has developed the Processing and Review Interface for Strong Motion data (PRISM) software to meet this need. In combination with the Advanced National Seismic System Quake Monitoring System (AQMS), PRISM automates the processing of strong-motion records. When used without AQMS, PRISM provides batch-processing capabilities. The PRISM version 1.0.0 is platform independent (coded in Java), open source, and does not depend on any closed-source or proprietary software. The software consists of two major components: a record processing engine and a review tool that has a graphical user interface (GUI) to manually review, edit, and process records. To facilitate use by non-NSMP earthquake engineers and scientists, PRISM (both its processing engine and review tool) is easy to install and run as a stand-alone system on common operating systems such as Linux, OS X, and Windows. PRISM was designed to be flexible and extensible in order to accommodate new processing techniques. This report provides a thorough description and examples of the record processing features supported by PRISM. All the computing features of PRISM have been thoroughly tested.
NASA Technical Reports Server (NTRS)
Dhooge, P. M.; Nimitz, J. S.
2001-01-01
Process analysis can identify opportunities for efficiency improvement including cost reduction, increased safety, improved quality, and decreased environmental impact. A thorough, systematic approach to materials and process selection is valuable in any analysis. New operations and facilities design offer the best opportunities for proactive cost reduction and environmental improvement, but existing operations and facilities can also benefit greatly. Materials and processes that have been used for many years may be sources of excessive resource use, waste generation, pollution, and cost burden that should be replaced. Operational and purchasing personnel may not recognize some materials and processes as problems. Reasons for materials or process replacement may include quality and efficiency improvements, excessive resource use and waste generation, materials and operational costs, safety (flammability or toxicity), pollution prevention, compatibility with new processes or materials, and new or anticipated regulations.
TkPl_SU: An Open-source Perl Script Builder for Seismic Unix
NASA Astrophysics Data System (ADS)
Lorenzo, J. M.
2017-12-01
TkPl_SU (beta) is a graphical user interface (GUI) for selecting parameters for Seismic Unix (SU) modules. Seismic Unix (Stockwell, 1999) is a widely distributed free software package for seismic reflection processing and signal processing. Perl/Tk is a mature, well-documented and free object-oriented graphical user interface toolkit for Perl. In a classroom environment, shell scripting of SU modules engages students and helps focus on the theoretical limitations and strengths of signal processing. However, complex interactive processing stages, e.g., selection of optimal stacking velocities, killing bad data traces, or spectral analysis, require advanced flows beyond the scope of introductory classes. In a research setting, special functionality from other free seismic processing software such as SioSeis (UCSD-NSF) can be incorporated readily via an object-oriented style of programming. An object-oriented approach is a first step toward efficient, extensible programming of multi-step processes, and a simple GUI simplifies parameter selection and decision making. Currently, in TkPl_SU, Perl 5 packages wrap 19 of the most common SU modules used in teaching undergraduate and first-year graduate classes (e.g., filtering, display, velocity analysis and stacking). Perl packages (classes) can advantageously add new functionality around each module and clarify parameter names for easier usage. For example, through the use of methods, packages can isolate the user from repetitive control structures, as well as replace abbreviated parameter names with self-describing ones. Moose, an extension of the Perl 5 object system, greatly facilitates an object-oriented style. Perl wrappers are self-documenting via Perl's documentation markup language (POD).
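The wrapper idea described, mapping self-describing names onto a module's terse shell parameters, can be sketched briefly. TkPl_SU is written in Perl; for consistency with the other examples here the sketch below transliterates the pattern to Python, and the class and method names are illustrative. The `f=` and `amps=` parameters follow SU's `sufilter` (polygonal band-pass filter) convention.

```python
class SUModule:
    """Sketch of the TkPl_SU wrapper pattern: map self-describing attribute
    names onto a Seismic Unix module's terse parameters and assemble the
    shell command string (no process is actually launched here)."""

    def __init__(self, name):
        self.name = name
        self.params = {}          # insertion-ordered: su_name -> value string

    def set(self, su_name, values):
        self.params[su_name] = ",".join(str(v) for v in values)
        return self               # allow chaining

    def command(self):
        return " ".join([self.name] + [f"{k}={v}" for k, v in self.params.items()])

class SUfilter(SUModule):
    """Wrapper for sufilter with clearer parameter names."""

    def __init__(self):
        super().__init__("sufilter")

    def corner_frequencies(self, hz):     # self-describing name for 'f='
        return self.set("f", hz)

    def amplitudes(self, amps):           # self-describing name for 'amps='
        return self.set("amps", amps)

cmd = SUfilter().corner_frequencies([10, 20, 40, 50]).amplitudes([0, 1, 1, 0]).command()
```

The assembled string could then be handed to a shell or pipeline builder, which is essentially what the Perl classes automate.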
SHIELD: FITGALAXY -- A Software Package for Automatic Aperture Photometry of Extended Sources
NASA Astrophysics Data System (ADS)
Marshall, Melissa
2013-01-01
Determining the parameters of extended sources, such as galaxies, is a common but time-consuming task. Finding a photometric aperture that encompasses the majority of the flux of a source and identifying and excluding contaminating objects is often done by hand, a lengthy and difficult-to-reproduce process. To make extracting information from large data sets both quick and repeatable, I have developed a program called FITGALAXY, written in IDL. This program uses minimal user input to automatically fit an aperture to, and perform aperture and surface photometry on, an extended source. FITGALAXY also automatically traces the outlines of surface brightness thresholds and creates surface brightness profiles, which can then be used to determine the radial properties of a source. Finally, the program performs automatic masking of contaminating sources. Masks and apertures can be applied to multiple images (regardless of the WCS solution or plate scale) in order to accurately measure the same source at different wavelengths. I present the fluxes, as measured by the program, of a selection of galaxies from the Local Volume Legacy Survey. I then compare these results with the fluxes given by Dale et al. (2009) in order to assess the accuracy of FITGALAXY.
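FITGALAXY is written in IDL; as a minimal Python illustration of the core measurement it automates (not its actual algorithm), the aperture photometry step reduces to summing pixels inside a circular mask:

```python
import numpy as np

def aperture_flux(image, x0, y0, radius):
    """Sum pixel values inside a circular aperture centered at (x0, y0).

    A deliberately simple sketch: no background subtraction, contaminant
    masking, or partial-pixel weighting, all of which a real tool adds.
    """
    yy, xx = np.indices(image.shape)
    mask = (xx - x0) ** 2 + (yy - y0) ** 2 <= radius ** 2
    return image[mask].sum()

# Toy image: zero background with a bright 5x5 'source' block.
img = np.zeros((50, 50))
img[20:25, 20:25] = 2.0          # 25 pixels of value 2 -> total flux 50
flux = aperture_flux(img, 22, 22, 10)
```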
Sources of funding for Nobel Prize-winning work: public or private?
Tatsioni, Athina; Vavva, Effie; Ioannidis, John P A
2010-05-01
Funding is important for scientists' work and may contribute to exceptional research outcomes. We analyzed the funding sources reported in the landmark scientific papers of Nobel Prize winners. Between 2000 and 2008, 70 Nobel laureates won recognition in medicine, physics, and chemistry. Sixty-five (70%) of the 93 selected papers related to the Nobel-awarded work reported some funding source, including U.S. government sources in 53 (82%), non-U.S. government sources in 19 (29%), and nongovernment sources in 33 (51%). A substantial portion of this exceptional work was unfunded. We contacted Nobel laureates whose landmark papers reported no funding. Thirteen Nobel laureates responded and offered their insights about the funding process and difficulties inherent in funding. Overall, very diverse sources amounting to a total of 64 different listed sponsors supported Nobel-related work. A few public institutions, in particular the U.S. National Institutes of Health (with n=26 funded papers) and the National Science Foundation (with n=17 papers), stood out for their successful record of funding exceptional research. However, Nobel-level work arose even from completely unfunded research, especially when institutions offered a protected environment for dedicated scientists.
Clinical Natural Language Processing in 2015: Leveraging the Variety of Texts of Clinical Interest.
Névéol, A; Zweigenbaum, P
2016-11-10
To summarize recent research and present a selection of the best papers published in 2015 in the field of clinical Natural Language Processing (NLP). A systematic review of the literature was performed by the two section editors of the IMIA Yearbook NLP section by searching bibliographic databases with a focus on NLP efforts applied to clinical texts or aimed at a clinical outcome. Section editors first selected a shortlist of candidate best papers that were then peer-reviewed by independent external reviewers. The clinical NLP best paper selection shows that clinical NLP is making use of a variety of texts of clinical interest to contribute to the analysis of clinical information and the building of a body of clinical knowledge. The full review process highlighted five papers analyzing patient-authored texts or seeking to connect and aggregate multiple sources of information. They provide a contribution to the development of methods, resources, applications, and sometimes a combination of these aspects. The field of clinical NLP continues to thrive through the contributions of both NLP researchers and healthcare professionals interested in applying NLP techniques to impact clinical practice. Foundational progress in the field makes it possible to leverage a larger variety of texts of clinical interest for healthcare purposes.
Viscous remanent magnetization model for the Broken Ridge satellite magnetic anomaly
NASA Technical Reports Server (NTRS)
Johnson, B. D.
1985-01-01
An equivalent source model solution of the satellite magnetic field over Australia obtained by Mayhew et al. (1980) showed that the satellite anomalies could be related to geological features in Australia. When the processing and selection of the Magsat data over the Australian region had progressed to the point where interpretation procedures could be initiated, it was decided to start by attempting to model the Broken Ridge satellite anomaly, which represents one of the very few relatively isolated anomalies in the Magsat maps, with an unambiguous source region. Attention is given to details concerning the Broken Ridge satellite magnetic anomaly, the modeling method used, the Broken Ridge models, modeling results, and characteristics of magnetization.
Miniaci, M; Gliozzi, A S; Morvan, B; Krushynska, A; Bosia, F; Scalerandi, M; Pugno, N M
2017-05-26
The appearance of nonlinear effects in elastic wave propagation is one of the most reliable and sensitive indicators of the onset of material damage. However, these effects are usually very small and can be detected only using cumbersome digital signal processing techniques. Here, we propose and experimentally validate an alternative approach, using the filtering and focusing properties of phononic crystals to naturally select and reflect the higher harmonics generated by nonlinear effects, enabling the realization of time-reversal procedures for nonlinear elastic source detection. The proposed device demonstrates its potential as an efficient, compact, portable, passive apparatus for nonlinear elastic wave sensing and damage detection.
Scientific Data Purchase Project Overview Presentation
NASA Technical Reports Server (NTRS)
Holekamp, Kara; Fletcher, Rose
2001-01-01
The Scientific Data Purchase (SDP) project acquires science data from commercial sources. It is a demonstration project to test a new way of doing business, tap new sources of data, support Earth science research, and support the commercial remote sensing industry. Phase I of the project reviews simulated/prototypical data sets from 10 companies. Phase II of the project is a 3 year purchase/distribution of select data from 5 companies. The status of several SDP projects is reviewed in this viewgraph presentation, as is the SDP process of tasking, verification, validation, and data archiving. The presentation also lists SDP results for turnaround time, metrics, customers, data use, science research, applications research, and user feedback.
Podsakoff, Philip M; MacKenzie, Scott B; Lee, Jeong-Yeon; Podsakoff, Nathan P
2003-10-01
Interest in the problem of method biases has a long history in the behavioral sciences. Despite this, a comprehensive summary of the potential sources of method biases and how to control for them does not exist. Therefore, the purpose of this article is to examine the extent to which method biases influence behavioral research results, identify potential sources of method biases, discuss the cognitive processes through which method biases influence responses to measures, evaluate the many different procedural and statistical techniques that can be used to control method biases, and provide recommendations for how to select appropriate procedural and statistical remedies for different types of research settings.
NASA Astrophysics Data System (ADS)
Hellman, S. B.; Lisowski, S.; Baker, B.; Hagerty, M.; Lomax, A.; Leifer, J. M.; Thies, D. A.; Schnackenberg, A.; Barrows, J.
2015-12-01
Tsunami Information Technology Modernization (TIM) is a National Oceanic and Atmospheric Administration (NOAA) project to update and standardize the earthquake and tsunami monitoring systems currently employed at the U.S. Tsunami Warning Centers in Ewa Beach, Hawaii (PTWC) and Palmer, Alaska (NTWC). While this project was funded by NOAA to solve a specific problem, the requirements that the delivered system be both open source and easily maintainable have resulted in the creation of a variety of open source (OS) software packages. The open source software is now complete, and this is a presentation of the OS software that has been funded by NOAA for the benefit of the entire seismic community. The design architecture comprises three distinct components: (1) the user interface, (2) the real-time data acquisition and processing system, and (3) the scientific algorithm library. The system follows a modular design with loose coupling between components. We now identify the major project constituents. The user interface, CAVE, is written in Java and is compatible with the existing National Weather Service (NWS) open source graphical system AWIPS. The selected real-time seismic acquisition and processing system is the open source SeisComp3 (sc3). The seismic library (libseismic) contains numerous custom-written and wrapped open source seismic algorithms (e.g., ML/mb/Ms/Mwp, mantle magnitude (Mm), w-phase moment tensor, body-wave moment tensor, finite-fault inversion, array processing). The seismic library is organized in a way (function naming and usage) that will be familiar to users of Matlab. The seismic library extends sc3 so that it can be called by the real-time system, but it can also be driven and tested outside of sc3, for example, by ObsPy or Earthworm. To unify the three principal components we have developed a flexible and lightweight communication layer called SeismoEdex.
Application of hierarchical Bayesian unmixing models in river sediment source apportionment
NASA Astrophysics Data System (ADS)
Blake, Will; Smith, Hugh; Navas, Ana; Bodé, Samuel; Goddard, Rupert; Zou Kuzyk, Zou; Lennard, Amy; Lobb, David; Owens, Phil; Palazon, Leticia; Petticrew, Ellen; Gaspar, Leticia; Stock, Brian; Boeckx, Pacsal; Semmens, Brice
2016-04-01
Fingerprinting and unmixing concepts are used widely across environmental disciplines for forensic evaluation of pollutant sources. In aquatic and marine systems, this includes tracking the source of organic and inorganic pollutants in water and linking problem sediment to soil erosion and land use sources. It is, however, the particular complexity of ecological systems that has driven creation of the most sophisticated mixing models, primarily to (i) evaluate diet composition in complex ecological food webs, (ii) inform population structure and (iii) explore animal movement. In the context of the new hierarchical Bayesian unmixing model, MIXSIAR, developed to characterise intra-population niche variation in ecological systems, we evaluate the linkage between ecological 'prey' and 'consumer' concepts and river basin sediment 'source' and sediment 'mixtures' to exemplify the value of ecological modelling tools to river basin science. Recent studies have outlined advantages presented by Bayesian unmixing approaches in handling complex source and mixture datasets while dealing appropriately with uncertainty in parameter probability distributions. MixSIAR is unique in that it allows individual fixed and random effects associated with mixture hierarchy, i.e. factors that might exert an influence on model outcome for mixture groups, to be explored within the source-receptor framework. This offers new and powerful ways of interpreting river basin apportionment data. In this contribution, key components of the model are evaluated in the context of common experimental designs for sediment fingerprinting studies namely simple, nested and distributed catchment sampling programmes. 
Illustrative examples using geochemical and compound specific stable isotope datasets are presented and used to discuss best practice with specific attention to (1) the tracer selection process, (2) incorporation of fixed effects relating to sample timeframe and sediment type in the modelling process, (3) deriving and using informative priors in sediment fingerprinting context and (4) transparency of the process and replication of model results by other users.
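MixSIAR's hierarchical Bayesian machinery is far richer than any short example can convey, but the mass-balance idea underneath sediment unmixing can be sketched simply: a mixture's tracer signature is a proportion-weighted average of the source signatures, and the proportions are found by minimizing the misfit on the simplex. The brute-force search below is an illustration of that linear model only; it ignores the uncertainty handling and fixed/random effects that motivate the Bayesian approach, and the tracer values are invented.

```python
import numpy as np

def unmix(sources, mixture, step=0.01):
    """Estimate three source proportions by brute-force search on the simplex.

    sources: (3, n_tracers) mean tracer signatures, one row per source
    mixture: (n_tracers,) tracer signature of the sediment mixture
    Minimizes || p @ sources - mixture ||^2 subject to p >= 0, sum(p) = 1.
    """
    best_p, best_err = None, np.inf
    fracs = np.arange(0.0, 1.0 + step, step)
    for p1 in fracs:
        for p2 in fracs:
            p3 = 1.0 - p1 - p2
            if p3 < -1e-9:
                continue
            p = np.array([p1, p2, max(p3, 0.0)])
            err = np.sum((p @ sources - mixture) ** 2)
            if err < best_err:
                best_p, best_err = p, err
    return best_p

# Three sources, two tracers; the mixture is built as a 0.5/0.3/0.2 blend,
# so the search should recover those proportions.
S = np.array([[10.0, 1.0], [2.0, 8.0], [5.0, 5.0]])
m = 0.5 * S[0] + 0.3 * S[1] + 0.2 * S[2]
p_hat = unmix(S, m)
```

With real data, tracer measurements carry error and sources overlap, which is exactly where posterior distributions over p (rather than a single point estimate) become necessary.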
1986-04-01
in Nystrom, P. C. and Starbuck, W. H. (eds.), Handbook...Organizations and Society, 3 (1978), pp. 47-64. Hickson, D. J., C. R. Hinings, C. A. Lee, R. E. Schneck, and J. M. Pennings, "A Strategic Contingencies...Concepts, St. Paul: West Publishing, 1978. Holland, W. E., Stead, B. A., and Leibrock, R. C., "Information Channel/Source Selection as a Correlate
Special Advanced Studies for Pollution Prevention. Delivery Order 0065: The Monitor - Spring 2001
2001-06-01
coating) baths by removing trace contaminant metals as well as restoring and maintaining the hexavalent chromium or ferric species. The oxidizing...power for the process acid is restored by oxidation (trivalent chromium to hexavalent chromium or ferrous to ferric) at the anode. Other sources of...selection to the application. UF membranes are suitable for particles in the molecular range of 0.1-0.01 microns. Microfiltration membranes are similar
Employment of adaptive learning techniques for the discrimination of acoustic emissions
NASA Astrophysics Data System (ADS)
Erkes, J. W.; McDonald, J. F.; Scarton, H. A.; Tam, K. C.; Kraft, R. P.
1983-11-01
The following aspects of this study on the discrimination of acoustic emissions (AE) were examined: (1) The analytical development and assessment of digital signal processing techniques for AE signal dereverberation, noise reduction, and source characterization; (2) The modeling and verification of some aspects of key selected techniques through a computer-based simulation; and (3) The study of signal propagation physics and their effect on received signal characteristics for relevant physical situations.
NASA Astrophysics Data System (ADS)
Hecht-Nielsen, Robert
1997-04-01
A new universal one-chart smooth manifold model for vector information sources is introduced. Natural coordinates (a particular type of chart) for such data manifolds are then defined. Uniformly quantized natural coordinates form an optimal vector quantization code for a general vector source. Replicator neural networks (a specialized type of multilayer perceptron with three hidden layers) are then introduced. As properly configured examples of replicator networks approach minimum mean squared error (e.g., via training and architecture adjustment using randomly chosen vectors from the source), these networks automatically develop a mapping which, in the limit, produces natural coordinates for arbitrary source vectors. The new concept of removable noise (a noise model applicable to a wide variety of real-world noise processes) is then discussed. Replicator neural networks, when configured to approach minimum mean squared reconstruction error (e.g., via training and architecture adjustment on randomly chosen examples from a vector source, each with randomly chosen additive removable noise contamination), in the limit eliminate removable noise and produce natural coordinates for the data vector portions of the noise-corrupted source vectors. Considerations regarding selection of the dimension of a data manifold source model and the training/configuration of replicator neural networks are discussed.
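Replicator networks are nonlinear, but their linear special case is instructive: a minimum-MSE linear bottleneck reduces to principal component analysis, and the bottleneck activation then plays the role the abstract assigns to a natural coordinate. The sketch below (an illustration of that reduction, not Hecht-Nielsen's construction) builds a toy vector source lying on a 1-D manifold in 3-D and recovers an exact reconstruction through a one-dimensional coordinate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy vector source: 200 samples on a 1-D manifold (a line) embedded in 3-D.
t = rng.uniform(-1, 1, size=200)
direction = np.array([1.0, 2.0, -1.0]) / np.sqrt(6.0)
X = np.outer(t, direction)

# A linear 'replicator' with a 1-unit bottleneck at minimum MSE is rank-1 PCA:
# encode onto the top principal direction, decode by projecting back.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
coord = Xc @ Vt[0]                       # the 1-D 'natural coordinate'
X_rec = np.outer(coord, Vt[0]) + X.mean(axis=0)
mse = np.mean((X - X_rec) ** 2)          # zero up to floating-point error
```

The nonlinear networks in the paper generalize this to curved manifolds, where PCA no longer suffices and the three hidden layers do the work.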
48 CFR 2415.308 - Source selection decision.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 48 Federal Acquisition Regulations System 6 2010-10-01 2010-10-01 true Source selection decision. 2415.308 Section 2415.308 Federal Acquisition Regulations System DEPARTMENT OF HOUSING AND URBAN... document its selection recommendation(s) in a final written report. The final report shall include...
NASA Astrophysics Data System (ADS)
Brady, M. P.; Keiser, J. R.; Leonard, D. N.; Whitmer, L.; Thomson, J. K.
2014-12-01
Thermochemical liquefaction processing of biomass to produce bio-derived fuels (e.g., gasoline, jet fuel, diesel, home heating oil, etc.) is of great recent interest as a renewable energy source. Approaches under investigation include direct liquefaction, hydrothermal liquefaction, hydropyrolysis, fast pyrolysis, etc., to produce energy dense liquids that can be utilized as produced or further processed to provide products of higher value. An issue with bio-oils is that they tend to contain significant concentrations of organic oxygenates, including acids, which make the bio-oil a potential source of corrosion issues in transport, storage, and use. Efforts devoted to modified/further processing of bio-oils to make them less corrosive are currently being widely pursued. Another issue that must also be addressed in bio-oil liquefaction is potential corrosion issues in the process equipment. Depending on the specific process, bio-oil liquefaction production temperatures are typically in the 300-600°C range, and the process environment can contain aggressive sulfur and halide species from both the biomass used and/or process additives. Detailed knowledge of the corrosion resistance of candidate process equipment alloys in these bio-oil production environments is currently lacking. This paper summarizes recent, ongoing efforts to assess the extent of corrosion of bio-oil process equipment, with the ultimate goal of providing a basis for the selection of the lowest cost alloy grades capable of providing the long-term corrosion resistance needed for future bio-oil production plants.
The Timing and Effort of Lexical Access in Natural and Degraded Speech
Wagner, Anita E.; Toffanin, Paolo; Başkent, Deniz
2016-01-01
Understanding speech is effortless in ideal situations, and although adverse conditions, such as caused by hearing impairment, often render it an effortful task, they do not necessarily suspend speech comprehension. A prime example of this is speech perception by cochlear implant users, whose hearing prostheses transmit speech as a significantly degraded signal. It is yet unknown how mechanisms of speech processing deal with such degraded signals, and whether they are affected by effortful processing of speech. This paper compares the automatic process of lexical competition between natural and degraded speech, and combines gaze fixations, which capture the course of lexical disambiguation, with pupillometry, which quantifies the mental effort involved in processing speech. Listeners’ ocular responses were recorded during disambiguation of lexical embeddings with matching and mismatching durational cues. Durational cues were selected due to their substantial role in listeners’ quick limitation of the number of lexical candidates for lexical access in natural speech. Results showed that lexical competition increased mental effort in processing natural stimuli in particular in presence of mismatching cues. Signal degradation reduced listeners’ ability to quickly integrate durational cues in lexical selection, and delayed and prolonged lexical competition. The effort of processing degraded speech was increased overall, and because it had its sources at the pre-lexical level this effect can be attributed to listening to degraded speech rather than to lexical disambiguation. In sum, the course of lexical competition was largely comparable for natural and degraded speech, but showed crucial shifts in timing, and different sources of increased mental effort. 
We argue that well-timed progress of information from sensory to pre-lexical and lexical stages of processing, which is the result of perceptual adaptation during speech development, is the reason why in ideal situations speech is perceived as an undemanding task. Degradation of the signal or the receiver channel can quickly bring this well-adjusted timing out of balance and lead to increase in mental effort. Incomplete and effortful processing at the early pre-lexical stages has its consequences on lexical processing as it adds uncertainty to the forming and revising of lexical hypotheses. PMID:27065901
Storm water runoff for the Y-12 Plant and selected parking lots
DOE Office of Scientific and Technical Information (OSTI.GOV)
Collins, E.T.
1996-01-01
A comparison of storm water runoff from the Y-12 Plant and selected employee vehicle parking lots to various industry data is provided in this document. This work is an outgrowth of and part of the continuing Non-Point Source Pollution Elimination Project that was initiated in the late 1980s. This project seeks to identify area pollution sources and remediate these areas through the Resource Conservation and Recovery Act/Comprehensive Environmental Response, Compensation, and Liability Act (RCRA/CERCLA) process as managed by the Environmental Restoration Organization staff. This work is also driven by the Clean Water Act Section 402(p) which, in part, deals with establishing a National Pollutant Discharge Elimination System (NPDES) permit for storm water discharges. Storm water data from events occurring in 1988 through 1991 were analyzed in two reports: Feasibility Study for the Best Management Practices to Control Area Source Pollution Derived from Parking Lots at the DOE Y-12 Plant, September 1992, and Feasibility Study of Best Management Practices for Non-Point Source Pollution Control at the Oak Ridge Y-12 Plant, February 1993. These data consisted of analyses of outfalls discharging to upper East Fork Poplar Creek (EFPC) within the confines of the Y-12 Plant (see Appendixes D and E). These reports identified the major characteristics of concern as copper, iron, lead, manganese, mercury, nitrate (as nitrogen), zinc, biological oxygen demand (BOD), chemical oxygen demand (COD), total suspended solids (TSS), fecal coliform, and aluminum. Specific sources of these contaminants were not identifiable because flows upstream of outfalls were not sampled. In general, many of these contaminants were a concern in many outfalls. Therefore, separate sampling exercises were executed to assist in identifying (or eliminating) specific suspected sources as areas of concern.
First Results From A Multi-Ion Beam Lithography And Processing System At The University Of Florida
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gila, Brent; Appleton, Bill R.; Fridmann, Joel
2011-06-01
The University of Florida (UF) has collaborated with Raith to develop a version of the Raith ionLiNE IBL system with the capability to deliver multiple ion species in addition to the Ga ions normally available. The UF system is currently equipped with an AuSi liquid metal alloy ion source (LMAIS) and ExB filter, making it capable of delivering Au and Si ions and ion clusters for ion beam processing. Other LMAIS systems could be developed in the future to deliver other ion species. This system is capable of high performance ion beam lithography, sputter profiling, maskless ion implantation, ion beam mixing, and spatial and temporal ion beam assisted writing and processing over large areas (100 mm^2), all with selected ion species at voltages from 15-40 kV and nanometer precision. We discuss the performance of the system with the AuSi LMAIS source and ExB mass separator. We report on initial results from the basic system characterization, ion beam lithography, as well as basic ion-solid interactions.
Radiation dosimetry for quality control of food preservation and disinfestation
NASA Astrophysics Data System (ADS)
McLaughlin, W. L.; Miller, A.; Uribe, R. M.
In the use of x and gamma rays and scanned electron beams to extend the shelf life of food by delay of sprouting and ripening, killing of microbes, and control of insect populations, quality assurance is provided by standardized radiation dosimetry. By strategic placement of calibrated dosimeters that are sufficiently stable and reproducible, it is possible to monitor minimum and maximum radiation absorbed dose levels and dose uniformity for a given processed foodstuff. The dosimetry procedure is especially important in the commissioning of a process and in making adjustments of process parameters (e.g., conveyor speed) to meet changes that occur in product and source parameters (e.g., bulk density and radiation spectrum). Routine dosimetry methods and certain corrections of dosimetry data may be selected for the radiations used in typical food processes.
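The minimum/maximum monitoring described reduces to a simple check: the dose uniformity ratio DUR = D_max / D_min, compared against process limits. The sketch below illustrates that bookkeeping; the readings and limit values are placeholders, not values from the paper.

```python
def dose_uniformity(doses, d_min_req, d_max_lim):
    """Check dosimeter readings against process limits (sketch).

    Returns (d_min, d_max, dur, acceptable), where dur = D_max / D_min is
    the dose uniformity ratio, a standard figure of merit in radiation
    processing. A run is acceptable when every point meets the minimum
    required dose and none exceeds the maximum allowed dose.
    """
    d_min, d_max = min(doses), max(doses)
    dur = d_max / d_min
    ok = d_min >= d_min_req and d_max <= d_max_lim
    return d_min, d_max, dur, ok

# Hypothetical readings in kGy from strategically placed dosimeters.
readings = [1.1, 1.3, 1.6, 1.4]
result = dose_uniformity(readings, d_min_req=1.0, d_max_lim=2.0)
```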
DOE Office of Scientific and Technical Information (OSTI.GOV)
Khabibullin, R. A., E-mail: khabibullin@isvch.ru; Shchavruk, N. V.; Pavlov, A. Yu.
2016-10-15
Post-growth processing of GaAs/AlGaAs multilayer heterostructures for terahertz quantum-cascade lasers (QCLs) is studied. The procedure includes In–Au thermocompression bonding of the multilayer heterostructure to a doped n+-GaAs substrate; mechanical grinding and selective wet etching of the substrate; and dry etching of QCL ridge mesastripes through Ti/Au metallization masks 50 and 100 μm wide. Reactive-ion-etching modes with an inductively coupled plasma source in a BCl3/Ar gas mixture are selected to obtain vertical walls of the QCL ridge mesastripes with minimum Ti/Au mask sputtering.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Conrad, Mark; Bill, Markus
2008-08-01
The nitrogen (δ¹⁵N) and oxygen (δ¹⁸O) isotopic compositions of nitrate in the environment are primarily a function of the source of the nitrate. The ranges of isotopic compositions for nitrate resulting from common sources are outlined in Figure 1 from Kendall (1998). As noted on Figure 1, processes such as microbial metabolism can modify the isotopic compositions of the nitrate, but the effects of these processes are generally predictable. At Hanford, nitrate and other nitrogenous compounds were significant components of most of the chemical processes used at the site. Most of the oxygen in nitrate chemicals (e.g., nitric acid) is derived from atmospheric oxygen, giving it a significantly higher δ¹⁸O value (+23.5‰) than naturally occurring nitrate, which obtains most of its oxygen from water (the δ¹⁸O of Hanford groundwater ranges from -14‰ to -18‰). This makes it possible to differentiate nitrate from Hanford site activities from background nitrate at the site (including most fertilizers that might have been used prior to the Department of Energy plutonium production activities at the site). In addition, the extreme thermal and chemical conditions that occurred during some of the waste processing procedures and subsequent waste storage in select single-shell tanks resulted in unique nitrate isotopic compositions that can be used to identify those waste streams in soil and groundwater at the site (Singleton et al., 2005; Christensen et al., 2007). This report presents nitrate isotope data for soil and groundwater samples from the Hanford 200 Areas and discusses the implications of those data for potential sources of groundwater contamination.
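The source discrimination described above amounts to a two-endmember isotope mixing calculation. A minimal sketch, where the +23.5‰ process-nitrate value comes from the abstract but the background endmember and sample value are illustrative assumptions:

```python
# Two-endmember isotope mixing sketch. The +23.5 per-mil process-nitrate
# endmember is from the abstract; the background endmember (-10 per mil)
# and the sample value (+5 per mil) are illustrative assumptions.

def mixing_fraction(delta_mix, delta_a, delta_b):
    """Fraction of endmember A in a two-endmember mixture (per-mil values)."""
    return (delta_mix - delta_b) / (delta_a - delta_b)

f_process = mixing_fraction(delta_mix=5.0, delta_a=23.5, delta_b=-10.0)
```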
NASA Astrophysics Data System (ADS)
Zhao, Junsan; Chen, Guoping; Yuan, Lei
2017-04-01
New technologies such as 3D laser scanning, InSAR, GNSS, unmanned aerial vehicles and the Internet of Things provide rich data resources for surveying and monitoring, as well as for the development of Early Warning Systems (EWS). This paper presents the design and implementation of a geological disaster monitoring and early warning system (GDMEWS) covering landslides and debris flows, based on multi-source data acquired with the technologies mentioned above. The complex and changeable characteristics of the GDMEWS are described. The architecture of the system, the composition of the multi-source database, the development mode and service logic, and the methods and key technologies of system development are also analyzed. To illustrate the implementation process of the GDMEWS, Deqin Tibetan County, with its unique terrain and diverse types of typical landslides and debris flows, is selected as the case study area. First, the functional requirements and the monitoring and forecasting models of the system are discussed. Second, the logical relationships across the whole disaster process, including pre-disaster preparation, disaster rescue and post-disaster reconstruction, are studied, and support tools for disaster prevention, disaster reduction and geological disaster management are developed. Third, the methods for integrating multi-source monitoring data and for generating and simulating mechanism models of geological hazards are presented. Finally, the construction of the GDMEWS is described; the system will be applied to the management, monitoring and forecasting of the whole disaster process in real time and dynamically in Deqin Tibetan County. Keywords: multi-source spatial data; geological disaster; monitoring and warning system; Deqin Tibetan County
NASA Astrophysics Data System (ADS)
Cai, Z.; Wilson, R. D.
2009-05-01
Techniques for optimizing the removal of NAPL mass in source zones have advanced at a more rapid rate than strategies to assess treatment performance. Informed selection of remediation approaches would be easier if measurements of performance were more directly transferable. We developed a number of methods based on data generated from multilevel sampler (MLS) transects to assess the effectiveness of a bioaugmentation/biostimulation trial in a TCE source residing in a terrace gravel aquifer in the East Midlands, UK. In this spatially complex aquifer, treatment inferred from long screen monitoring well data was not as reliable as that from consideration of mass flux changes across transects installed in and downgradient of the source. Falling head tests were conducted in the MLS ports to generate the necessary hydraulic conductivity (K) data. Combining K with concentration provides a mass flux map that allows calculation of mass turnover and an assessment of where in the complex geology the greatest turnover occurred. Five snapshots over a 600-day period indicate a marked reduction in TCE flux, suggesting a significant reduction in DNAPL mass over that expected due to natural processes. However, persistence of daughter products suggested that complete dechlorination did not occur. The MLS fence data also revealed that delivery of both carbon source and pH buffer was not uniform across the test zone. This may have led to the generation of niches of iron(III) and sulphate reduction as well as methanogenesis, which impacted on dechlorination processes. In the absence of this spatial data, it is difficult to reconcile apparent treatment as indicated in monitoring well data with on-going processes.
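The mass-flux mapping step above (combining K with concentration) can be sketched as Darcy flux times concentration. A minimal illustration with hypothetical port values and symbolic units; the hydraulic gradient and all numbers are assumptions, not data from the trial:

```python
# Mass-flux map sketch: specific discharge q = K * i (Darcy), mass flux
# J = q * C. Port values and the hydraulic gradient are assumptions;
# units are left symbolic for illustration.

def darcy_flux(K, gradient):
    """Specific discharge q = K * i."""
    return K * gradient

def mass_flux(K, gradient, conc):
    """Mass flux J = q * C through a multilevel-sampler port."""
    return darcy_flux(K, gradient) * conc

# (K, gradient, TCE concentration) per hypothetical MLS port.
ports = [(5.0, 0.01, 120.0), (1.0, 0.01, 800.0)]
fluxes = [mass_flux(K, i, c) for K, i, c in ports]
```

Summing port fluxes over a transect area gives the mass turnover used to compare snapshots; note that a low-K port with high concentration can still carry less mass than a high-K port with moderate concentration.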
47 CFR 11.42 - Participation by communications common carriers.
Code of Federal Regulations, 2010 CFR
2010-10-01
... may, without charge, connect: (1) An originating source from the nearest service area to a selected... Emergency Action Termination, the common carriers shall disconnect the originating source and the... charge, connect an originating source from the nearest exchange to a selected Test Center and then to any...
Laser ion source for multi-nucleon transfer reaction products
NASA Astrophysics Data System (ADS)
Hirayama, Y.; Watanabe, Y. X.; Imai, N.; Ishiyama, H.; Jeong, S. C.; Miyatake, H.; Oyaizu, M.; Kimura, S.; Mukai, M.; Kim, Y. H.; Sonoda, T.; Wada, M.; Huyse, M.; Kudryavtsev, Yu.; Van Duppen, P.
2015-06-01
We have developed a laser ion source for the target-like fragments (TLFs) produced in multi-nucleon transfer (MNT) reactions. The operation principle of the source is based on the in-gas laser ionization and spectroscopy (IGLIS) approach. In the source, TLFs are thermalized and neutralized in high-pressure, high-purity argon gas, and are extracted after being selectively re-ionized in a multi-step laser resonance ionization process. The laser ion source has been implemented at the KEK Isotope Separation System (KISS) for β-decay spectroscopy of neutron-rich isotopes with N = 126 of nuclear astrophysical interest. Simulations of gas flow and ion-beam optics have been performed to optimize the gas cell for efficient thermalization and fast transport of the TLFs, and the mass separator for efficient transport with high mass-resolving power. To confirm the performance expected at the design stage, off-line experiments were performed using 56Fe atoms evaporated from a filament in the gas cell. The gas-transport time of 230 ms in the argon cell and the measured KISS mass-resolving power of 900 are consistent with the designed values. The high purity of the gas-cell system, which is extremely important for efficient and highly selective production of laser ions, was achieved and confirmed from the mass distribution of the extracted ions. After the off-line tests, on-line experiments were conducted by directly injecting an energetic 56Fe beam into the gas cell. After thermalization of the injected 56Fe beam, laser-produced singly charged 56Fe+ ions were extracted. The extraction efficiency and selectivity of the gas cell in the presence of plasma induced by 56Fe beam injection, as well as the time profile of the extracted ions, were investigated: an extraction efficiency of 0.25%, a beam purity of >99% and an extraction time of 270 ms were obtained.
It has been confirmed that the performance of the KISS laser ion source is satisfactory to start the measurements of lifetimes of the β-decaying nuclei with N = 126.
NASA Astrophysics Data System (ADS)
Harmon, T. C.; Rat'ko, A.; Dietrich, H.; Park, Y.; Wijsboom, Y. H.; Bendikov, M.
2008-12-01
Inorganic nitrogen (nitrate (NO3-) and ammonium (NH4+)) from chemical fertilizer and livestock waste is a major source of pollution in groundwater, surface water and the air. While some sources of these chemicals, such as waste lagoons, are well-defined, their application as fertilizer has the potential to create distributed or non-point source pollution problems. Scalable nitrate sensors (small and inexpensive) would enable us to better assess non-point source pollution processes in agronomic soils, groundwater and rivers subject to non-point source inputs. This work describes the fabrication and testing of inexpensive PVC-membrane-based ion selective electrodes (ISEs) for monitoring nitrate levels in soil water environments. ISE-based sensors have the advantages of being easy to fabricate and use, but suffer several shortcomings, including limited sensitivity, poor precision, and calibration drift. However, modern materials have begun to yield more robust ISE types in laboratory settings. This work emphasizes the in situ behavior of commercial and fabricated sensors in soils subject to irrigation with dairy manure water. Results are presented in the context of deployment techniques (in situ versus soil lysimeters), temperature compensation, and uncertainty analysis. Observed temporal responses of the nitrate sensors exhibited diurnal cycling with elevated nitrate levels at night and depressed levels during the day. Conventional samples collected via lysimeters validated this response. It is concluded that while modern ISEs are not yet ready for long-term, unattended deployment, short-term installations (on the order of 2 to 4 days) are viable and may provide valuable insights into nitrogen dynamics in complex soil systems.
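ISE readings are typically converted to concentration by inverting a Nernstian calibration. A minimal sketch, assuming an illustrative intercept and slope; the abstract does not give calibration values, and real deployments would also apply the temperature compensation and drift corrections discussed above:

```python
# Inverting a Nernstian ISE calibration E = E0 + S*log10(C) to recover
# nitrate concentration. E0 and the slope are illustrative assumptions,
# not values measured in the study.

def ise_concentration(E_mV, E0_mV=100.0, slope_mV=-59.2):
    """Concentration (mol/L) from electrode potential in millivolts."""
    return 10 ** ((E_mV - E0_mV) / slope_mV)
```

The negative slope reflects that nitrate is an anion: the potential drops by roughly 59 mV for each tenfold increase in activity at 25°C.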
Visual attention: Linking prefrontal sources to neuronal and behavioral correlates.
Clark, Kelsey; Squire, Ryan Fox; Merrikhi, Yaser; Noudoost, Behrad
2015-09-01
Attention is a means of flexibly selecting and enhancing a subset of sensory input based on the current behavioral goals. Numerous signatures of attention have been identified throughout the brain, and now experimenters are seeking to determine which of these signatures are causally related to the behavioral benefits of attention, and the source of these modulations within the brain. Here, we review the neural signatures of attention throughout the brain, their theoretical benefits for visual processing, and their experimental correlations with behavioral performance. We discuss the importance of measuring cue benefits as a way to distinguish between impairments on an attention task, which may instead be visual or motor impairments, and true attentional deficits. We examine evidence for various areas proposed as sources of attentional modulation within the brain, with a focus on the prefrontal cortex. Lastly, we look at studies that aim to link sources of attention to its neuronal signatures elsewhere in the brain. Copyright © 2015. Published by Elsevier Ltd.
NASA Astrophysics Data System (ADS)
Chen, Huaiyu; Cao, Li
2017-06-01
To investigate multiple sound source localization under room reverberation and background noise, we analyze the shortcomings of the traditional broadband MUSIC method and of ordinary auditory-filtering-based broadband MUSIC, and then propose a new broadband MUSIC algorithm with gammatone auditory filtering, frequency-component selection control, and detection of the ascending segment of the direct-sound component. The proposed algorithm restricts the frequency components to the frequency band of interest in the multichannel bandpass-filtering stage. Detecting the direct-sound component of each source to suppress room reverberation interference is also proposed; its merits are fast computation and avoiding a more complex de-reverberation processing algorithm. In addition, the pseudo-spectra of the different frequency channels are weighted by their maximum amplitudes for every speech frame. Simulations and experiments in a real reverberant room show that the proposed method performs well. Dynamic multiple sound source localization experiments indicate that the azimuth estimated by the proposed algorithm has a smaller average absolute error and that the histogram result has higher angular resolution.
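The per-channel weighting step (pseudo-spectra weighted by each channel's maximum frame amplitude) can be sketched as a weighted average. The function name and toy inputs below are assumptions; the per-channel MUSIC spectra themselves would be computed upstream:

```python
import numpy as np

def weighted_pseudospectrum(pseudospectra, frame_max_amps):
    """Combine per-channel MUSIC pseudo-spectra into one spatial spectrum,
    weighting each gammatone channel by the maximum amplitude of the
    current speech frame in that channel. Only the weighting step is
    sketched; the per-channel spectra are assumed computed elsewhere."""
    w = np.asarray(frame_max_amps, dtype=float)
    w = w / w.sum()  # normalize channel weights
    return np.average(np.asarray(pseudospectra), axis=0, weights=w)

# Two hypothetical channels, pseudo-spectra sampled at three azimuths.
combined = weighted_pseudospectrum([[1.0, 2.0, 1.0],
                                    [3.0, 6.0, 3.0]], [1.0, 3.0])
```

Weighting by frame amplitude lets channels dominated by the direct sound contribute more than channels dominated by reverberant energy.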
Influence of Different Aluminum Sources on the NH3 Gas-Sensing Properties of ZnO Thin Films
NASA Astrophysics Data System (ADS)
Ozutok, Fatma; Karaduman, Irmak; Demiri, Sani; Acar, Selim
2018-02-01
Herein we report Al-doped ZnO (AZO) films deposited on a ZnO seed layer by the chemical bath deposition method. Al powder, Al oxide and Al chloride were used as sources for the deposition process and investigated for their different effects on the NH3 gas-sensing performance. The morphological and microstructural properties were investigated by x-ray powder diffraction, scanning electron microscopy and energy-dispersive x-ray spectroscopy. The characterization studies showed that the AZO thin films are crystalline and exhibit a hexagonal wurtzite structure. Ammonia (NH3) gas-sensing measurements of the AZO films were performed at different concentration levels and at operation temperatures from 50°C to 210°C. The sample based on the Al-powder source showed higher response, better selectivity and shorter response/recovery times than the remaining samples. The Al-powder sample exhibited a 33% response to 10 ppm ammonia gas at 190°C, confirming a strong dependence on the dopant source type.
NASA Astrophysics Data System (ADS)
Leukhin, R. I.; Shaykhutdinov, D. V.; Shirokov, K. M.; Narakidze, N. D.; Vlasov, A. S.
2017-02-01
Developing experimental designs for new types of electromagnetic constructions in engineering-industry enterprises requires solving two major problems: setting up the regulator's parameters and comprehensively testing the electromagnets. The weber-ampere characteristic was selected as the data source for identifying the electromagnet's condition. The present article focuses on the development and implementation of software for an electromagnetic drive control system based on weber-ampere characteristic measurement. Software for processing the weber-ampere characteristic data with an artificial neural network was developed; the design was implemented as program code in the LabVIEW environment, using the licensed LabVIEW graphical programming package. The hardware was chosen and its suitability for implementing the control system was demonstrated. The trained artificial neural network determines the electromagnetic drive effector position with minimal error. The developed system allows control of an electromagnetic drive powered by a voltage source, a current source or hybrid sources.
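The mapping from a measured weber-ampere characteristic to effector position can be sketched as a small feed-forward network. This is a structural illustration only, with random untrained weights; the layer sizes and names are assumptions, and the article's LabVIEW implementation is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_forward(x, W1, b1, W2, b2):
    """One-hidden-layer network: weber-ampere curve samples -> position."""
    h = np.tanh(x @ W1 + b1)   # hidden layer
    return h @ W2 + b2         # scalar position estimate

# Hypothetical setup: 16 samples of the flux-linkage vs. current curve
# serve as the input feature vector.
n_feat, n_hidden = 16, 8
W1 = rng.normal(size=(n_feat, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(size=(n_hidden, 1));      b2 = np.zeros(1)

curve = rng.normal(size=(1, n_feat))            # one measured characteristic
position = mlp_forward(curve, W1, b1, W2, b2)   # untrained, structure only
```

In practice the weights would be trained on characteristics recorded at known armature positions, so that the network interpolates position from a new measurement.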
Exploring a Model of Symbolic Social Communication: The Case of ‘Magic’ Johnson
FLORA, JUNE A.; SCHOOLER, CAROLINE; MAYS, VICKIE M.; COCHRAN, SUSAN D.
2009-01-01
We propose a model of symbolic social communication to explain the process whereby sociocultural identity mediates relationships among receivers, sources and messages to shape message effects. This exploratory study examines how two at-risk groups of African American men responded to various HIV prevention messages delivered by celebrity and professional sources. We interviewed 47 men from a homeless shelter and 50 male college students. Members of both groups were likely to select Johnson as the best person to deliver HIV prevention messages among a list of African American celebrity and professional sources. Results suggest the symbolic meanings embedded in celebrities and message topics are important and enduring influences on message effects. The images and ideas that a source represents are transferred to the advocated behavior, attitude or knowledge change and thus shape how messages are interpreted and received. Further understanding of how culture influences the effects of persuasive messages is critical for the improvement of health-communication campaigns. PMID:22011997
The extreme ultraviolet explorer
NASA Technical Reports Server (NTRS)
Bowyer, Stuart; Malina, Roger F.
1990-01-01
The Extreme Ultraviolet Explorer (EUVE) mission, currently scheduled for launch in September 1991, is described. The primary purpose of the mission is to survey the celestial sphere for astronomical sources of Extreme Ultraviolet (EUV) radiation. The survey will be accomplished with the use of three EUV telescopes, each sensitive to a different segment of the EUV band. A fourth telescope will perform a high-sensitivity search of a limited sample of the sky in the shortest wavelength bands. The all-sky survey will be carried out in the first six months of the mission and will be made in four bands, or colors. The second phase of the mission, conducted entirely by guest observers selected by NASA, will be devoted to spectroscopic observations of EUV sources. The performance of the instrument components is described. An end-to-end model of the mission, from a stellar source to the resulting scientific data, was constructed. Hypothetical data from astronomical sources processed through this model are shown.
High brilliance negative ion and neutral beam source
Compton, Robert N.
1991-01-01
A high brilliance mass selected (Z-selected) negative ion and neutral beam source having good energy resolution. The source is based upon laser resonance ionization of atoms or molecules in a small gaseous medium followed by charge exchange through an alkali oven. The source is capable of producing microampere beams of an extremely wide variety of negative ions, and milliampere beams when operated in the pulsed mode.
High temporal resolution mapping of seismic noise sources using heterogeneous supercomputers
NASA Astrophysics Data System (ADS)
Gokhberg, Alexey; Ermert, Laura; Paitz, Patrick; Fichtner, Andreas
2017-04-01
Time- and space-dependent distribution of seismic noise sources is becoming a key ingredient of modern real-time monitoring of various geo-systems. Significant interest in seismic noise source maps with high temporal resolution (days) is expected to come from a number of domains, including natural resources exploration, analysis of active earthquake fault zones and volcanoes, as well as geothermal and hydrocarbon reservoir monitoring. Currently, knowledge of noise sources is insufficient for high-resolution subsurface monitoring applications. Near-real-time seismic data, as well as advanced imaging methods to constrain seismic noise sources, have recently become available. These methods are based on the massive cross-correlation of seismic noise records from all available seismic stations in the region of interest and are therefore very computationally intensive. Heterogeneous massively parallel supercomputing systems introduced in recent years combine conventional multi-core CPUs with GPU accelerators and provide an opportunity for a manifold increase in computing performance. Therefore, these systems represent an efficient platform for implementing a noise source mapping solution. We present the first results of an ongoing research project conducted in collaboration with the Swiss National Supercomputing Centre (CSCS). The project aims at building a service that provides seismic noise source maps for Central Europe with high temporal resolution (days to a few weeks depending on frequency and data availability). The service is hosted on the CSCS computing infrastructure; all computationally intensive processing is performed on the massively parallel heterogeneous supercomputer "Piz Daint". The solution architecture is based on the Application-as-a-Service concept in order to provide interested external researchers with regular access to the noise source maps.
The solution architecture includes the following sub-systems: (1) data acquisition responsible for collecting, on a periodic basis, raw seismic records from the European seismic networks, (2) high-performance noise source mapping application responsible for generation of source maps using cross-correlation of seismic records, (3) back-end infrastructure for the coordination of various tasks and computations, (4) front-end Web interface providing the service to the end-users and (5) data repository. The noise mapping application is composed of four principal modules: (1) pre-processing of raw data, (2) massive cross-correlation, (3) post-processing of correlation data based on computation of logarithmic energy ratio and (4) generation of source maps from post-processed data. Implementation of the solution posed various challenges, in particular, selection of data sources and transfer protocols, automation and monitoring of daily data downloads, ensuring the required data processing performance, design of a general service oriented architecture for coordination of various sub-systems, and engineering an appropriate data storage solution. The present pilot version of the service implements noise source maps for Switzerland. Extension of the solution to Central Europe is planned for the next project phase.
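The core of modules (2) and (3) above, cross-correlating noise records and computing a logarithmic energy ratio, can be sketched as follows. The exact ratio definition is an assumption for illustration, not the project's formula:

```python
import numpy as np

def cross_correlate(a, b):
    """Full time-domain cross-correlation of two noise records."""
    return np.correlate(a, b, mode="full")

def log_energy_ratio(cc):
    """Log ratio of causal vs. acausal lag energy in a correlation; a
    simple proxy for source directionality between a station pair (the
    project's exact post-processing formula is not given here)."""
    n = len(cc) // 2
    causal = np.sum(cc[n + 1:] ** 2)
    acausal = np.sum(cc[:n] ** 2)
    return np.log(causal / acausal)
```

A ratio near zero indicates symmetric energy arrival (sources on both sides of the station pair); a strongly positive or negative value indicates a dominant source direction, which is the quantity mapped over many pairs.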
NASA Astrophysics Data System (ADS)
Misurelli, Sara M.
The ability to analyze an "auditory scene"---that is, to selectively attend to a target source while simultaneously segregating and ignoring distracting information---is one of the most important and complex skills utilized by normal hearing (NH) adults. The NH adult auditory system and brain work rather well to segregate auditory sources in adverse environments. However, for some children and individuals with hearing loss, selectively attending to one source in noisy environments can be extremely challenging. In a normal auditory system, information arriving at each ear is integrated, and thus these binaural cues aid in speech understanding in noise. A growing number of individuals who are deaf now receive cochlear implants (CIs), which supply hearing through electrical stimulation to the auditory nerve. In particular, bilateral cochlear implants (BiCIs) are now becoming more prevalent, especially in children. However, because CI sound processing lacks both fine structure cues and coordination between stimulation at the two ears, binaural cues may either be absent or inconsistent. For children with NH and with BiCIs, this difficulty in segregating sources is of particular concern because their learning and development commonly occur within the context of complex auditory environments. This dissertation intends to explore and understand the ability of children with NH and with BiCIs to function in everyday noisy environments. The goals of this work are to (1) Investigate source segregation abilities in children with NH and with BiCIs; (2) Examine the effect of target-interferer similarity and the benefits of source segregation for children with NH and with BiCIs; (3) Investigate measures of executive function that may predict performance in complex and realistic auditory tasks of source segregation for listeners with NH; and (4) Examine source segregation abilities in NH listeners, from school-age to adults.
swga: a primer design toolkit for selective whole genome amplification.
Clarke, Erik L; Sundararaman, Sesh A; Seifert, Stephanie N; Bushman, Frederic D; Hahn, Beatrice H; Brisson, Dustin
2017-07-15
Population genomic analyses are often hindered by difficulties in obtaining sufficient numbers of genomes for analysis by DNA sequencing. Selective whole-genome amplification (SWGA) provides an efficient approach to amplify microbial genomes from complex backgrounds for sequence acquisition. However, the process of designing sets of primers for this method has many degrees of freedom and would benefit from an automated process to evaluate the vast number of potential primer sets. Here, we present swga, a program that identifies primer sets for SWGA and evaluates them for efficiency and selectivity. We used swga to design and test primer sets for the selective amplification of Wolbachia pipientis genomic DNA from infected Drosophila melanogaster and Mycobacterium tuberculosis from human blood. We identify primer sets that successfully amplify each against their backgrounds and describe a general method for using swga for arbitrary targets. In addition, we describe characteristics of primer sets that correlate with successful amplification, and present guidelines for implementation of SWGA to detect new targets. Source code and documentation are freely available at https://www.github.com/eclarke/swga. The program is implemented in Python and C and licensed under the GNU Public License. ecl@mail.med.upenn.edu. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.
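The selectivity idea behind SWGA primer scoring, that binding sites should be dense in the target genome and rare in the background, can be sketched with exact-match counting. This is a simplified stand-in for illustration, not swga's actual algorithm:

```python
# Simplified stand-in for primer selectivity scoring: count exact binding
# sites (primer and reverse complement) and compare per-base densities in
# target vs. background genomes. Not swga's actual scoring method.

def count_sites(primer, genome):
    """Exact occurrences of a primer and its reverse complement (overlaps
    counted; a set avoids double-counting palindromic primers)."""
    comp = str.maketrans("ACGT", "TGCA")
    rc = primer.translate(comp)[::-1]
    count = 0
    for p in {primer, rc}:
        i = genome.find(p)
        while i != -1:
            count += 1
            i = genome.find(p, i + 1)
    return count

def selectivity(primer, target, background):
    """Ratio of binding-site density in target vs. background."""
    t = count_sites(primer, target) / len(target)
    b = count_sites(primer, background) / len(background)
    return t / b if b else float("inf")
```

A high ratio means the primer preferentially anneals within the target genome, which is the property SWGA exploits to amplify a microbial genome out of a host-dominated DNA sample.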