Ferraro, Jeffrey P; Ye, Ye; Gesteland, Per H; Haug, Peter J; Tsui, Fuchiang Rich; Cooper, Gregory F; Van Bree, Rudy; Ginter, Thomas; Nowalk, Andrew J; Wagner, Michael
2017-05-31
This study evaluates the accuracy and portability of a natural language processing (NLP) tool for extracting clinical findings of influenza from clinical notes across two large healthcare systems. Effectiveness is evaluated by how well NLP supports downstream influenza case-detection for disease surveillance. We independently developed two NLP parsers, one at Intermountain Healthcare (IH) in Utah and the other at the University of Pittsburgh Medical Center (UPMC), using local clinical notes from emergency department (ED) encounters for influenza. We measured NLP parser performance for the presence and absence of 70 clinical findings indicative of influenza. We then developed Bayesian network models from NLP-processed reports and tested their ability to discriminate among cases of (1) influenza, (2) non-influenza influenza-like illness (NI-ILI), and (3) 'other' diagnosis. On Intermountain Healthcare reports, recall and precision of the IH NLP parser were 0.71 and 0.75, respectively, and of the UPMC NLP parser, 0.67 and 0.79. On University of Pittsburgh Medical Center reports, recall and precision of the UPMC NLP parser were 0.73 and 0.80, respectively, and of the IH NLP parser, 0.53 and 0.80. Bayesian case-detection performance, measured by AUROC for influenza versus non-influenza on Intermountain Healthcare cases, was 0.93 (using the IH NLP parser) and 0.93 (using the UPMC NLP parser). Case-detection on University of Pittsburgh Medical Center cases was 0.95 (using the UPMC NLP parser) and 0.83 (using the IH NLP parser). For influenza versus NI-ILI on Intermountain Healthcare cases, performance was 0.70 (using the IH NLP parser) and 0.76 (using the UPMC NLP parser). On University of Pittsburgh Medical Center cases, it was 0.76 (using the UPMC NLP parser) and 0.65 (using the IH NLP parser). In all but one instance (influenza versus NI-ILI using IH cases), local parsers were more effective at supporting case-detection, although the performance of non-local parsers was reasonable.
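The recall and precision figures reported above follow the standard definitions over true-positive, false-positive, and false-negative counts of extracted findings. A minimal sketch (the counts below are illustrative only, not taken from the study):

```python
def precision_recall(tp, fp, fn):
    """Compute precision and recall from true-positive, false-positive,
    and false-negative counts of extracted clinical findings."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return precision, recall

# Illustrative counts only (not the study's data):
p, r = precision_recall(tp=75, fp=25, fn=25)
print(p, r)  # 0.75 0.75
```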
Software Development Of XML Parser Based On Algebraic Tools
NASA Astrophysics Data System (ADS)
Georgiev, Bozhidar; Georgieva, Adriana
2011-12-01
This paper presents the development and implementation of an algebraic method for XML data processing that accelerates XML parsing. The proposed nontraditional approach to fast XML navigation with algebraic tools contributes to ongoing efforts toward an easier, user-friendly API for XML transformations. The proposed XML parser is easy to use and can manage files with a strictly defined data structure. The purpose of the presented algorithm is to offer a new approach for searching and restructuring hierarchical XML data. This approach permits fast processing of XML documents, using an algebraic model developed in detail in previous works by the same authors. The proposed parsing mechanism is easily accessible to web consumers, who can control XML file processing, search for different elements (tags) in a document, and delete or add new XML content. Various tests show higher speed and lower resource consumption in comparison with some existing commercial parsers.
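The authors' algebraic method is not reproduced in the abstract; for contrast, the basic operations it describes (searching for tags, deleting and adding XML content) look like this with Python's standard xml.etree module. The document contents here are hypothetical:

```python
import xml.etree.ElementTree as ET

# A small, hypothetical document with a strictly defined structure.
doc = "<catalog><item id='1'><name>alpha</name></item><item id='2'><name>beta</name></item></catalog>"
root = ET.fromstring(doc)

# Search elements by tag and read their text content.
names = [item.find("name").text for item in root.iter("item")]

# Delete the first <item>, then add a new one.
root.remove(root.find("item"))
new = ET.SubElement(root, "item", id="3")
ET.SubElement(new, "name").text = "gamma"

print(names, len(root.findall("item")))  # ['alpha', 'beta'] 2
```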
Evolution of the Generic Lock System at Jefferson Lab
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brian Bevins; Yves Roblin
2003-10-13
The Generic Lock system is a software framework that allows highly flexible feedback control of large distributed systems. It allows system operators to implement new feedback loops between arbitrary process variables quickly and with no disturbance to the underlying control system. Several different types of feedback loops are provided and more are being added. This paper describes the further evolution of the system since it was first presented at ICALEPCS 2001 and reports on two years of successful use in accelerator operations. The framework has been enhanced in several key ways. Multiple-input, multiple-output (MIMO) lock types have been added for accelerator orbit and energy stabilization. The general purpose Proportional-Integral-Derivative (PID) locks can now be tuned automatically. The generic lock server now makes use of the Proxy IOC (PIOC) developed at Jefferson Lab to allow the locks to be monitored from any EPICS Channel Access aware client. (Previously clients had to be Cdev aware.) The dependency on the Qt XML parser has been replaced with the freely available Xerces DOM parser from the Apache project.
Syntactic dependency parsers for biomedical-NLP.
Cohen, Raphael; Elhadad, Michael
2012-01-01
Syntactic parsers have made a leap in accuracy and speed in recent years. The higher-order structural information provided by dependency parsers is useful for a variety of NLP applications. We present a biomedical model for the EasyFirst parser, a fast and accurate parser for creating Stanford Dependencies. We evaluate the models trained in the biomedical domain for EasyFirst and Clear-Parser on a number of task-oriented metrics. Both parsers provide state-of-the-art speed and accuracy on the GENIA corpus, with over 89% accuracy. We show that Clear-Parser excels at tasks relating to negation identification while EasyFirst excels at tasks relating to named entities and is more robust to changes in domain.
Semantic Role Labeling of Clinical Text: Comparing Syntactic Parsers and Features
Zhang, Yaoyun; Jiang, Min; Wang, Jingqi; Xu, Hua
2016-01-01
Semantic role labeling (SRL), which extracts shallow semantic relation representation from different surface textual forms of free text sentences, is important for understanding clinical narratives. Since semantic roles are formed by syntactic constituents in the sentence, an effective parser, as well as an effective syntactic feature set are essential to build a practical SRL system. Our study initiates a formal evaluation and comparison of SRL performance on a clinical text corpus MiPACQ, using three state-of-the-art parsers, the Stanford parser, the Berkeley parser, and the Charniak parser. First, the original parsers trained on the open domain syntactic corpus Penn Treebank were employed. Next, those parsers were retrained on the clinical Treebank of MiPACQ for further comparison. Additionally, state-of-the-art syntactic features from open domain SRL were also examined for clinical text. Experimental results showed that retraining the parsers on clinical Treebank improved the performance significantly, with an optimal F1 measure of 71.41% achieved by the Berkeley parser. PMID:28269926
Parsing clinical text: how good are the state-of-the-art parsers?
Jiang, Min; Huang, Yang; Fan, Jung-wei; Tang, Buzhou; Denny, Josh; Xu, Hua
2015-01-01
Parsing, which generates a syntactic structure of a sentence (a parse tree), is a critical component of natural language processing (NLP) research in any domain, including medicine. Although parsers developed in the general English domain, such as the Stanford parser, have been applied to clinical text, there are no formal evaluations and comparisons of their performance in the medical domain. In this study, we investigated the performance of three state-of-the-art parsers: the Stanford parser, the Bikel parser, and the Charniak parser, using the following two datasets: (1) a Treebank containing 1,100 sentences that were randomly selected from progress notes used in the 2010 i2b2 NLP challenge and manually annotated according to a Penn Treebank based guideline; and (2) the MiPACQ Treebank, which was developed from pathology notes and clinical notes and contains 13,091 sentences. We conducted three experiments on both datasets. First, we measured the performance of the three state-of-the-art parsers on the clinical Treebanks with their default settings. Then we re-trained the parsers using the clinical Treebanks and evaluated their performance using 10-fold cross validation. Finally, we re-trained the parsers by combining the clinical Treebanks with the Penn Treebank. Our results showed that the original parsers achieved lower performance on clinical text (Bracketing F-measure in the range of 66.6%-70.3%) compared to general English text. After retraining on the clinical Treebanks, all parsers achieved better performance, with the best performance from the Stanford parser, which reached the highest Bracketing F-measure of 73.68% on progress notes and 83.72% on the MiPACQ corpus using 10-fold cross validation.
When the combined clinical Treebanks and Penn Treebank were used, the Charniak parser achieved the highest Bracketing F-measure of 73.53% on progress notes and the Stanford parser reached the highest F-measure of 84.15% on the MiPACQ corpus. Our study demonstrates that re-training using clinical Treebanks is critical for improving general English parsers' performance on clinical text, and that combining clinical and open domain corpora might achieve optimal performance for parsing clinical text.
Wagner, Michael M.; Cooper, Gregory F.; Ferraro, Jeffrey P.; Su, Howard; Gesteland, Per H.; Haug, Peter J.; Millett, Nicholas E.; Aronis, John M.; Nowalk, Andrew J.; Ruiz, Victor M.; López Pineda, Arturo; Shi, Lingyun; Van Bree, Rudy; Ginter, Thomas; Tsui, Fuchiang
2017-01-01
Objectives This study evaluates the accuracy and transferability of Bayesian case detection systems (BCD) that use clinical notes from the emergency department (ED) to detect influenza cases. Methods A BCD uses natural language processing (NLP) to infer the presence or absence of clinical findings from ED notes, which are fed into a Bayesian network classifier (BN) to infer patients' diagnoses. We developed BCDs at the University of Pittsburgh Medical Center (BCDUPMC) and Intermountain Healthcare in Utah (BCDIH). At each site, we manually built a rule-based NLP parser and trained a Bayesian network classifier on over 40,000 ED encounters between January 2008 and May 2010 using feature selection, machine learning, and an expert debiasing approach. Transferability of a BCD in this study may be impacted by seven factors: development (source) institution, development parser, application (target) institution, application parser, NLP transfer, BN transfer, and classification task. We employed an ANOVA analysis to study their impacts on BCD performance. Results Both BCDs discriminated well between influenza and non-influenza on local test cases (AUCs > 0.92). When tested for transferability using the other institution's cases, BCDUPMC discrimination declined minimally (AUC decreased from 0.95 to 0.94, p<0.01), and BCDIH discrimination declined more (from 0.93 to 0.87, p<0.0001). We attributed the BCDIH decline to the lower recall of the IH parser on UPMC notes. The ANOVA analysis showed five significant factors: development parser, application institution, application parser, BN transfer, and classification task. Conclusion We demonstrated high influenza case detection performance in two large healthcare systems in two geographically separated regions, providing evidentiary support for the use of automated case detection from routinely collected electronic clinical notes in national influenza surveillance.
Transferability could be improved by training the Bayesian network classifier locally and increasing the accuracy of the NLP parser. PMID:28380048
Ye, Ye; Wagner, Michael M; Cooper, Gregory F; Ferraro, Jeffrey P; Su, Howard; Gesteland, Per H; Haug, Peter J; Millett, Nicholas E; Aronis, John M; Nowalk, Andrew J; Ruiz, Victor M; López Pineda, Arturo; Shi, Lingyun; Van Bree, Rudy; Ginter, Thomas; Tsui, Fuchiang
2017-01-01
This study evaluates the accuracy and transferability of Bayesian case detection systems (BCD) that use clinical notes from the emergency department (ED) to detect influenza cases. A BCD uses natural language processing (NLP) to infer the presence or absence of clinical findings from ED notes, which are fed into a Bayesian network classifier (BN) to infer patients' diagnoses. We developed BCDs at the University of Pittsburgh Medical Center (BCDUPMC) and Intermountain Healthcare in Utah (BCDIH). At each site, we manually built a rule-based NLP parser and trained a Bayesian network classifier on over 40,000 ED encounters between January 2008 and May 2010 using feature selection, machine learning, and an expert debiasing approach. Transferability of a BCD in this study may be impacted by seven factors: development (source) institution, development parser, application (target) institution, application parser, NLP transfer, BN transfer, and classification task. We employed an ANOVA analysis to study their impacts on BCD performance. Both BCDs discriminated well between influenza and non-influenza on local test cases (AUCs > 0.92). When tested for transferability using the other institution's cases, BCDUPMC discrimination declined minimally (AUC decreased from 0.95 to 0.94, p<0.01), and BCDIH discrimination declined more (from 0.93 to 0.87, p<0.0001). We attributed the BCDIH decline to the lower recall of the IH parser on UPMC notes. The ANOVA analysis showed five significant factors: development parser, application institution, application parser, BN transfer, and classification task. We demonstrated high influenza case detection performance in two large healthcare systems in two geographically separated regions, providing evidentiary support for the use of automated case detection from routinely collected electronic clinical notes in national influenza surveillance.
Transferability could be improved by training the Bayesian network classifier locally and increasing the accuracy of the NLP parser.
Parsing clinical text: how good are the state-of-the-art parsers?
2015-01-01
Background Parsing, which generates a syntactic structure of a sentence (a parse tree), is a critical component of natural language processing (NLP) research in any domain, including medicine. Although parsers developed in the general English domain, such as the Stanford parser, have been applied to clinical text, there are no formal evaluations and comparisons of their performance in the medical domain. Methods In this study, we investigated the performance of three state-of-the-art parsers: the Stanford parser, the Bikel parser, and the Charniak parser, using the following two datasets: (1) a Treebank containing 1,100 sentences that were randomly selected from progress notes used in the 2010 i2b2 NLP challenge and manually annotated according to a Penn Treebank based guideline; and (2) the MiPACQ Treebank, which was developed from pathology notes and clinical notes and contains 13,091 sentences. We conducted three experiments on both datasets. First, we measured the performance of the three state-of-the-art parsers on the clinical Treebanks with their default settings. Then we re-trained the parsers using the clinical Treebanks and evaluated their performance using 10-fold cross validation. Finally, we re-trained the parsers by combining the clinical Treebanks with the Penn Treebank. Results Our results showed that the original parsers achieved lower performance on clinical text (Bracketing F-measure in the range of 66.6%-70.3%) compared to general English text. After retraining on the clinical Treebanks, all parsers achieved better performance, with the best performance from the Stanford parser, which reached the highest Bracketing F-measure of 73.68% on progress notes and 83.72% on the MiPACQ corpus using 10-fold cross validation.
When the combined clinical Treebanks and Penn Treebank were used, the Charniak parser achieved the highest Bracketing F-measure of 73.53% on progress notes and the Stanford parser reached the highest F-measure of 84.15% on the MiPACQ corpus. Conclusions Our study demonstrates that re-training using clinical Treebanks is critical for improving general English parsers' performance on clinical text, and that combining clinical and open domain corpora might achieve optimal performance for parsing clinical text. PMID:26045009
COD::CIF::Parser: an error-correcting CIF parser for the Perl language.
Merkys, Andrius; Vaitkus, Antanas; Butkus, Justas; Okulič-Kazarinas, Mykolas; Kairys, Visvaldas; Gražulis, Saulius
2016-02-01
A syntax-correcting CIF parser, COD::CIF::Parser, is presented that can parse CIF 1.1 files and accurately report the position and the nature of the discovered syntactic problems. In addition, the parser is able to automatically fix the most common and the most obvious syntactic deficiencies of the input files. Bindings for the Perl, C and Python programming environments are available. Based on COD::CIF::Parser, the cod-tools package for manipulating the CIFs in the Crystallography Open Database (COD) has been developed. The cod-tools package has been successfully used for continuous updates of the data in the automated COD data deposition pipeline, and to check the validity of COD data against the IUCr data validation guidelines. The performance, capabilities and applications of different parsers are compared.
Storing files in a parallel computing system based on user-specified parser function
Faibish, Sorin; Bent, John M; Tzelnic, Percy; Grider, Gary; Manzanares, Adam; Torres, Aaron
2014-10-21
Techniques are provided for storing files in a parallel computing system based on a user-specified parser function. A plurality of files generated by a distributed application in a parallel computing system are stored by obtaining a parser from the distributed application for processing the plurality of files prior to storage; and storing one or more of the plurality of files in one or more storage nodes of the parallel computing system based on the processing by the parser. The plurality of files comprise one or more of a plurality of complete files and a plurality of sub-files. The parser can optionally store only those files that satisfy one or more semantic requirements of the parser. The parser can also extract metadata from one or more of the files and the extracted metadata can be stored with one or more of the plurality of files and used for searching for files.
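The patent's general idea, storing only those files that a user-specified parser accepts and keeping extracted metadata alongside them, can be sketched as follows. All names and the toy parser here are hypothetical, not the patented implementation:

```python
# Toy user-specified parser: accept files whose first line is a numeric
# header, and extract that header as metadata (hypothetical rule).
def parse_record(content):
    first = content.splitlines()[0] if content else ""
    if first.isdigit():
        return {"header": int(first)}
    return None  # file fails the parser's semantic requirement

store = {}  # stands in for the storage nodes of the parallel system
files = {"a.dat": "42\npayload", "b.dat": "no header"}

for name, content in files.items():
    meta = parse_record(content)
    if meta is not None:  # store only files the parser accepts
        store[name] = (content, meta)  # metadata kept for later search

print(sorted(store))  # ['a.dat']
```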
ERIC Educational Resources Information Center
Dekydtspotter, Laurent
2001-01-01
From the perspective of Fodor's (1983) theory of mental organization and Chomsky's (1995) Minimalist theory of grammar, considers constraints on the interpretation of French-type and English-type cardinality interrogatives in the task of sentence comprehension, as a function of a universal parsing algorithm and hypotheses embodied in a French-type…
Benchmarking natural-language parsers for biological applications using dependency graphs.
Clegg, Andrew B; Shepherd, Adrian J
2007-01-25
Interest is growing in the application of syntactic parsers to natural language processing problems in biology, but assessing their performance is difficult because differences in linguistic convention can falsely appear to be errors. We present a method for evaluating their accuracy using an intermediate representation based on dependency graphs, in which the semantic relationships important in most information extraction tasks are closer to the surface. We also demonstrate how this method can be easily tailored to various application-driven criteria. Using the GENIA corpus as a gold standard, we tested four open-source parsers which have been used in bioinformatics projects. We first present overall performance measures, and test the two leading tools, the Charniak-Lease and Bikel parsers, on subtasks tailored to reflect the requirements of a system for extracting gene expression relationships. These two tools clearly outperform the other parsers in the evaluation, and achieve accuracy levels comparable to or exceeding native dependency parsers on similar tasks in previous biological evaluations. Evaluating using dependency graphs allows parsers to be tested easily on criteria chosen according to the semantics of particular biological applications, drawing attention to important mistakes and soaking up many insignificant differences that would otherwise be reported as errors. Generating high-accuracy dependency graphs from the output of phrase-structure parsers also provides access to the more detailed syntax trees that are used in several natural-language processing techniques.
Benchmarking natural-language parsers for biological applications using dependency graphs
Clegg, Andrew B; Shepherd, Adrian J
2007-01-01
Background Interest is growing in the application of syntactic parsers to natural language processing problems in biology, but assessing their performance is difficult because differences in linguistic convention can falsely appear to be errors. We present a method for evaluating their accuracy using an intermediate representation based on dependency graphs, in which the semantic relationships important in most information extraction tasks are closer to the surface. We also demonstrate how this method can be easily tailored to various application-driven criteria. Results Using the GENIA corpus as a gold standard, we tested four open-source parsers which have been used in bioinformatics projects. We first present overall performance measures, and test the two leading tools, the Charniak-Lease and Bikel parsers, on subtasks tailored to reflect the requirements of a system for extracting gene expression relationships. These two tools clearly outperform the other parsers in the evaluation, and achieve accuracy levels comparable to or exceeding native dependency parsers on similar tasks in previous biological evaluations. Conclusion Evaluating using dependency graphs allows parsers to be tested easily on criteria chosen according to the semantics of particular biological applications, drawing attention to important mistakes and soaking up many insignificant differences that would otherwise be reported as errors. Generating high-accuracy dependency graphs from the output of phrase-structure parsers also provides access to the more detailed syntax trees that are used in several natural-language processing techniques. PMID:17254351
Extracting noun phrases for all of MEDLINE.
Bennett, N. A.; He, Q.; Powell, K.; Schatz, B. R.
1999-01-01
A natural language parser that could extract noun phrases for all medical texts would be of great utility in analyzing content for information retrieval. We discuss the extraction of noun phrases from MEDLINE, using a general parser not tuned specifically for any medical domain. The noun phrase extractor is made up of three modules: tokenization; part-of-speech tagging; noun phrase identification. Using our program, we extracted noun phrases from the entire MEDLINE collection, encompassing 9.3 million abstracts. Over 270 million noun phrases were generated, of which 45 million were unique. The quality of these phrases was evaluated by examining all phrases from a sample collection of abstracts. The precision and recall of the phrases from our general parser compared favorably with those from three other parsers we had previously evaluated. We are continuing to improve our parser and evaluate our claim that a generic parser can effectively extract all the different phrases across the entire medical literature. PMID:10566444
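The three-module pipeline described above (tokenization, part-of-speech tagging, noun phrase identification) can be sketched as below. The tiny hard-coded lexicon stands in for a trained tagger and is purely illustrative, not the authors' system:

```python
import re

# Toy lexicon standing in for a trained part-of-speech tagger (assumption:
# a real system uses statistical tagging; this only shows the pipeline).
LEXICON = {"the": "DT", "an": "DT", "acute": "JJ", "viral": "JJ",
           "infection": "NN", "patient": "NN", "has": "VB"}

def tokenize(text):
    return re.findall(r"[A-Za-z]+", text.lower())

def tag(tokens):
    return [(t, LEXICON.get(t, "NN")) for t in tokens]

def noun_phrases(tagged):
    """Chunk runs of determiner/adjective/noun tags that contain a noun."""
    phrases, cur = [], []
    for word, pos in tagged:
        if pos in ("DT", "JJ", "NN"):
            cur.append((word, pos))
        else:
            if any(p == "NN" for _, p in cur):
                phrases.append(" ".join(w for w, _ in cur))
            cur = []
    if any(p == "NN" for _, p in cur):
        phrases.append(" ".join(w for w, _ in cur))
    return phrases

print(noun_phrases(tag(tokenize("The patient has an acute viral infection"))))
# ['the patient', 'an acute viral infection']
```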
"gnparser": a powerful parser for scientific names based on Parsing Expression Grammar.
Mozzherin, Dmitry Y; Myltsev, Alexander A; Patterson, David J
2017-05-26
Scientific names in biology act as universal links. They allow us to cross-reference information about organisms globally. However, variations in the spelling of scientific names greatly diminish their ability to interconnect data. Such variations may include abbreviations, annotations, misspellings, etc. Authorship is a part of a scientific name and may also differ significantly. To match all possible variations of a name we need to divide them into their elements and classify each element according to its role. We refer to this as 'parsing' the name. Parsing categorizes a name's elements into those that are stable and those that are prone to change. Names are matched first by combining them according to their stable elements. Matches are then refined by examining their varying elements. This two-stage process dramatically improves the number and quality of matches. It is especially useful for automatic data exchange within the context of "Big Data" in biology. We introduce the Global Names Parser (gnparser), a tool for parsing scientific names written in Scala, a language for the Java Virtual Machine. It is based on a Parsing Expression Grammar. The parser can be applied to scientific names of any complexity. It assigns a semantic meaning (such as genus name, species epithet, rank, year of publication, authorship, annotations, etc.) to all elements of a name. It is able to work with nested structures such as the names of hybrids. gnparser performs with ≈99% accuracy and processes 30 million name-strings/hour per CPU thread. The gnparser library is compatible with Scala, Java, R, Jython, and JRuby. The parser can be used as a command-line application, a socket server, a web app, or a RESTful HTTP service. It is released under an open-source MIT license. Global Names Parser (gnparser) is a fast, high-precision tool for biodiversity informaticians and biologists working with large numbers of scientific names.
It can replace expensive and error-prone manual parsing and standardization of scientific names in many situations, and can quickly enhance the interoperability of distributed biological information.
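gnparser itself uses a full Parsing Expression Grammar; as an illustration of the stable/varying split the abstract describes, here is a hedged regex sketch that separates genus and species epithet (stable elements) from authorship (a varying element). Real names are far messier than this pattern handles:

```python
import re

# Illustrative pattern only: one capitalized genus, one lowercase epithet,
# optional trailing authorship. Not a substitute for a real PEG grammar.
NAME = re.compile(
    r"^(?P<genus>[A-Z][a-z]+)\s+(?P<epithet>[a-z]+)(?:\s+(?P<authorship>.+))?$"
)

def parse_name(name):
    m = NAME.match(name.strip())
    return m.groupdict() if m else None

print(parse_name("Homo sapiens Linnaeus, 1758"))
# {'genus': 'Homo', 'epithet': 'sapiens', 'authorship': 'Linnaeus, 1758'}
```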
A Protocol for Annotating Parser Differences. Research Report. ETS RR-16-02
ERIC Educational Resources Information Center
Bruno, James V.; Cahill, Aoife; Gyawali, Binod
2016-01-01
We present an annotation scheme for classifying differences in the outputs of syntactic constituency parsers when a gold standard is unavailable or undesired, as in the case of texts written by nonnative speakers of English. We discuss its automated implementation and the results of a case study that uses the scheme to choose a parser best suited…
Processing of ICARTT Data Files Using Fuzzy Matching and Parser Combinators
NASA Technical Reports Server (NTRS)
Rutherford, Matthew T.; Typanski, Nathan D.; Wang, Dali; Chen, Gao
2014-01-01
In this paper, the task of parsing and matching inconsistent, poorly formed text data through the use of parser combinators and fuzzy matching is discussed. An object-oriented implementation of the parser combinator technique is used to provide a relatively simple interface for adapting base parsers. For matching tasks, a fuzzy matching algorithm with Levenshtein distance calculations is implemented to match string pairs that are otherwise difficult to match due to the aforementioned irregularities and errors in one or both pair members. Used in concert, the two techniques allow parsing and matching operations to be performed that had previously only been done manually.
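The Levenshtein distance the matching step relies on is the classic dynamic-programming edit distance (minimum number of insertions, deletions, and substitutions). A compact sketch:

```python
def levenshtein(a, b):
    """Classic dynamic-programming edit distance, row by row."""
    prev = list(range(len(b) + 1))  # distance from "" to each prefix of b
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

print(levenshtein("kitten", "sitting"))  # 3
```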
The parser generator as a general purpose tool
NASA Technical Reports Server (NTRS)
Noonan, R. E.; Collins, W. R.
1985-01-01
The parser generator has proven to be an extremely useful, general-purpose tool. It can be used effectively by programmers having only a knowledge of grammars and no training at all in the theory of formal parsing. Application areas for which a table-driven parser can be used include interactive query languages, menu systems, translators, and programming support tools. Each of these is illustrated by an example grammar.
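A parser generated from a grammar behaves like the hand-written recursive-descent sketch below, one function per grammar rule. The toy grammar (`expr := term (('+' | '-') term)*`, `term := NUMBER`) is an illustration only, not from the report:

```python
import re

def parse_expr(text):
    """Recursive-descent evaluator for: expr := term (('+'|'-') term)*"""
    tokens = re.findall(r"\d+|[+-]", text)
    pos = 0

    def term():  # term := NUMBER
        nonlocal pos
        value = int(tokens[pos])
        pos += 1
        return value

    value = term()
    while pos < len(tokens):       # (('+' | '-') term)*
        op = tokens[pos]
        pos += 1
        value = value + term() if op == "+" else value - term()
    return value

print(parse_expr("12 + 4 - 3"))  # 13
```

A generated table-driven parser encodes the same rules as state-transition tables rather than call structure, which is why users need only the grammar.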
Domain Adaptation of Parsing for Operative Notes
Wang, Yan; Pakhomov, Serguei; Ryan, James O.; Melton, Genevieve B.
2016-01-01
Background Full syntactic parsing of clinical text as a part of clinical natural language processing (NLP) is critical for a wide range of applications, such as identification of adverse drug reactions, patient cohort identification, and gene interaction extraction. Several robust syntactic parsers are publicly available to produce linguistic representations for sentences. However, these existing parsers are mostly trained on general English text and often require adaptation for optimal performance on clinical text. Our objective was to adapt an existing general English parser for the clinical text of operative reports via lexicon augmentation, statistics adjustment, and grammar rule modification based on a set of biomedical text. Method The Stanford unlexicalized probabilistic context-free grammar (PCFG) parser lexicon was expanded with the SPECIALIST lexicon, along with statistics collected from a limited set of operative notes tagged with two POS taggers (the GENIA tagger and MedPost). The most frequently occurring verb entries of the SPECIALIST lexicon were adjusted based on manual review of verb usage in operative notes. Stanford parser grammar production rules were also modified based on linguistic features of operative reports. An analogous approach was then applied to the GENIA corpus to test the generalizability of this approach to biomedical text. Results The new unlexicalized PCFG parser, extended with the extra lexicon from SPECIALIST along with accurate statistics collected from an operative note corpus tagged with the GENIA POS tagger, improved parser performance by 2.26%, from 87.64% to 89.90%. There was a progressive improvement with the addition of multiple approaches. Most of the improvement occurred with lexicon augmentation combined with statistics from the operative notes corpus. Application of this approach to the GENIA corpus showed that parsing performance was boosted by 3.81% with a simple new grammar and the addition of the GENIA corpus lexicon.
Conclusion Using statistics collected from clinical text tagged with POS taggers along with proper modification of grammars and lexicons of an unlexicalized PCFG parser can improve parsing performance. PMID:25661593
PDB file parser and structure class implemented in Python.
Hamelryck, Thomas; Manderick, Bernard
2003-11-22
The biopython project provides a set of bioinformatics tools implemented in Python. Recently, biopython was extended with a set of modules that deal with macromolecular structure. Biopython now contains a parser for PDB files that makes the atomic information available in an easy-to-use but powerful data structure. The parser and data structure deal with features that are often left out or handled inadequately by other packages, e.g. atom and residue disorder (if point mutants are present in the crystal), anisotropic B factors, multiple models and insertion codes. In addition, the parser performs some sanity checking to detect obvious errors. The Biopython distribution (including source code and documentation) is freely available (under the Biopython license) from http://www.biopython.org
GazeParser: an open-source and multiplatform library for low-cost eye tracking and analysis.
Sogo, Hiroyuki
2013-09-01
Eye movement analysis is an effective method for research on visual perception and cognition. However, recordings of eye movements present practical difficulties related to the cost of the recording devices and the programming of device controls for use in experiments. GazeParser is an open-source library for low-cost eye tracking and data analysis; it consists of a video-based eyetracker and libraries for data recording and analysis. The libraries are written in Python and can be used in conjunction with PsychoPy and VisionEgg experimental control libraries. Three eye movement experiments are reported on performance tests of GazeParser. These showed that the means and standard deviations for errors in sampling intervals were less than 1 ms. Spatial accuracy ranged from 0.7° to 1.2°, depending on participant. In gap/overlap tasks and antisaccade tasks, the latency and amplitude of the saccades detected by GazeParser agreed with those detected by a commercial eyetracker. These results showed that the GazeParser demonstrates adequate performance for use in psychological experiments.
Is human sentence parsing serial or parallel? Evidence from event-related brain potentials.
Hopf, Jens-Max; Bader, Markus; Meng, Michael; Bayer, Josef
2003-01-01
In this ERP study we investigate the processes that occur in syntactically ambiguous German sentences at the point of disambiguation. Whereas most psycholinguistic theories agree on the view that processing difficulties arise when parsing preferences are disconfirmed (so-called garden-path effects), important differences exist with respect to theoretical assumptions about the parser's recovery from a misparse. A key distinction can be made between parsers that compute all alternative syntactic structures in parallel (parallel parsers) and parsers that compute only a single preferred analysis (serial parsers). To distinguish empirically between parallel and serial parsing models, we compare ERP responses to garden-path sentences with ERP responses to truly ungrammatical sentences. Garden-path sentences contain a temporary and ultimately curable ungrammaticality, whereas truly ungrammatical sentences remain so permanently--a difference which gives rise to different predictions in the two classes of parsing architectures. At the disambiguating word, ERPs in both sentence types show negative shifts of similar onset latency, amplitude, and scalp distribution in an initial time window between 300 and 500 ms. In a following time window (500-700 ms), the negative shift to garden-path sentences disappears at right central parietal sites, while it continues in permanently ungrammatical sentences. These data are taken as evidence for a strictly serial parser. The absence of a difference in the early time window indicates that temporary and permanent ungrammaticalities trigger the same kind of parsing responses. Later differences can be related to successful reanalysis in garden-path but not in ungrammatical sentences. Copyright 2003 Elsevier Science B.V.
Yang, Chunguang G; Granite, Stephen J; Van Eyk, Jennifer E; Winslow, Raimond L
2006-11-01
Protein identification using MS is an important technique in proteomics as well as a major generator of proteomics data. We have designed the protein identification data object model (PDOM) and developed a parser based on this model to facilitate the analysis and storage of these data. The parser works with HTML or XML files saved or exported from MASCOT MS/MS ions search in peptide summary report or MASCOT PMF search in protein summary report. The program creates PDOM objects, eliminates redundancy in the input file, and has the capability to output any PDOM object to a relational database. This program facilitates additional analysis of MASCOT search results and aids the storage of protein identification information. The implementation is extensible and can serve as a template to develop parsers for other search engines. The parser can be used as a stand-alone application or can be driven by other Java programs. It is currently being used as the front end for a system that loads HTML and XML result files of MASCOT searches into a relational database. The source code is freely available at http://www.ccbm.jhu.edu and the program uses only free and open-source Java libraries.
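The parse-then-load pattern the abstract describes can be sketched with Python's standard library. The XML structure, table schema, and element names below are invented stand-ins for the actual MASCOT export format, which is far richer:

```python
import sqlite3
import xml.etree.ElementTree as ET

# Hypothetical miniature of a search-engine XML export; the real MASCOT
# result files contain much more detail than this sketch.
XML_EXPORT = """
<mascot_search>
  <protein accession="P12345" score="87.5">
    <peptide sequence="LVNEVTEFAK" score="42.1"/>
    <peptide sequence="LVNEVTEFAK" score="42.1"/>
    <peptide sequence="DLGEEHFK" score="35.9"/>
  </protein>
</mascot_search>
"""

def load_into_db(xml_text, conn):
    """Parse the export into objects, drop duplicates, store relationally."""
    conn.execute("CREATE TABLE IF NOT EXISTS peptide "
                 "(accession TEXT, sequence TEXT, score REAL, "
                 "UNIQUE(accession, sequence))")
    root = ET.fromstring(xml_text)
    for protein in root.iter("protein"):
        acc = protein.get("accession")
        for pep in protein.iter("peptide"):
            # INSERT OR IGNORE discards redundant peptide rows, mirroring
            # the redundancy-elimination step described in the abstract.
            conn.execute("INSERT OR IGNORE INTO peptide VALUES (?, ?, ?)",
                         (acc, pep.get("sequence"), float(pep.get("score"))))
    conn.commit()

conn = sqlite3.connect(":memory:")
load_into_db(XML_EXPORT, conn)
rows = conn.execute("SELECT sequence FROM peptide ORDER BY score DESC").fetchall()
```

The duplicate peptide entry is silently dropped by the UNIQUE constraint, leaving one row per distinct identification.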
Investigating AI with BASIC and Logo: Helping the Computer to Understand INPUTS.
ERIC Educational Resources Information Center
Mandell, Alan; Lucking, Robert
1988-01-01
Investigates using the microcomputer to develop a sentence parser to simulate intelligent conversation used in artificial intelligence applications. Compares the ability of LOGO and BASIC for this use. Lists and critiques several LOGO and BASIC parser programs. (MVL)
Chen, Hung-Ming; Liou, Yong-Zan
2014-10-01
In a mobile health management system, mobile devices act as the application hosting devices for personal health records (PHRs), and healthcare servers are constructed to exchange and analyze PHRs. One of the most popular PHR standards is the continuity of care record (CCR). The CCR is expressed in XML format. However, parsing is an expensive operation that can degrade XML processing performance. Hence, the objective of this study was to identify the different operational and performance characteristics of several CCR parsing models: the XML DOM parser, the SAX parser, the PULL parser, and the JSON parser applied to JSON data converted from XML-based CCRs. Thus, developers can make sensible choices for their target PHR applications when parsing CCRs on mobile devices or on servers with different system resources. Furthermore, simulation experiments covering four case studies were conducted to compare parsing performance on Android mobile devices and on a server with large quantities of CCR data.
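The core trade-off between the tree-based and event-based models can be illustrated with Python's standard library. The CCR fragment below is a toy stand-in for a real Continuity of Care Record:

```python
import xml.dom.minidom
import xml.sax

# A toy CCR-like fragment; real Continuity of Care Records are far larger.
CCR = ("<ContinuityOfCareRecord><Body><Result>7.2</Result>"
       "<Result>5.4</Result></Body></ContinuityOfCareRecord>")

# DOM: the whole tree is materialized in memory before any data is read.
dom = xml.dom.minidom.parseString(CCR)
dom_values = [n.firstChild.data for n in dom.getElementsByTagName("Result")]

# SAX: values are collected from streaming events; memory stays bounded
# even when the document is large.
class ResultHandler(xml.sax.ContentHandler):
    def __init__(self):
        super().__init__()
        self.values = []
        self._buf = None          # accumulates text while inside <Result>
    def startElement(self, name, attrs):
        if name == "Result":
            self._buf = []
    def characters(self, content):
        if self._buf is not None:
            self._buf.append(content)
    def endElement(self, name):
        if name == "Result":
            self.values.append("".join(self._buf))
            self._buf = None

handler = ResultHandler()
xml.sax.parseString(CCR.encode("utf-8"), handler)
```

Both parsers recover the same values; the difference the study measures is the cost profile, not the result.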
Open Source Software Projects Needing Security Investments
2015-06-19
modtls, BouncyCastle, gpg, otr, axolotl. 7. Static analyzers: Clang, Frama-C. 8. Nginx. 9. OpenVPN . It was noted that the funding model may be similar...to OpenSSL, where consulting funds the company. It was also noted that OpenVPN needs to correctly use OpenSSL in order to be secure, so focusing on...Dovecot 4. Other high-impact network services: OpenSSH, OpenVPN , BIND, ISC DHCP, University of Delaware NTPD 5. Core infrastructure data parsers
Progress in The Semantic Analysis of Scientific Code
NASA Technical Reports Server (NTRS)
Stewart, Mark
2000-01-01
This paper concerns a procedure that analyzes aspects of the meaning or semantics of scientific and engineering code. This procedure involves taking a user's existing code, adding semantic declarations for some primitive variables, and parsing this annotated code using multiple, independent expert parsers. These semantic parsers encode domain knowledge and recognize formulae in different disciplines including physics, numerical methods, mathematics, and geometry. The parsers will automatically recognize and document some static, semantic concepts and help locate some program semantic errors. These techniques may apply to a wider range of scientific codes. If so, the techniques could reduce the time, risk, and effort required to develop and modify scientific codes.
Colaert, Niklaas; Barsnes, Harald; Vaudel, Marc; Helsens, Kenny; Timmerman, Evy; Sickmann, Albert; Gevaert, Kris; Martens, Lennart
2011-08-05
The Thermo Proteome Discoverer program integrates both peptide identification and quantification into a single workflow for peptide-centric proteomics. Furthermore, its close integration with Thermo mass spectrometers has made it increasingly popular in the field. Here, we present a Java library to parse the msf files that constitute the output of Proteome Discoverer. The parser is also implemented in a graphical user interface, allowing convenient access to the information found in the msf files, and in Rover, a program to analyze and validate quantitative proteomics information. All code, binaries, and documentation are freely available at http://thermo-msf-parser.googlecode.com.
Chen, Mingyang; Stott, Amanda C; Li, Shenggang; Dixon, David A
2012-04-01
A robust metadata database called the Collaborative Chemistry Database Tool (CCDBT) for massive amounts of computational chemistry raw data has been designed and implemented. It performs data synchronization and simultaneously extracts the metadata. Computational chemistry data in various formats from different computing sources, software packages, and users can be parsed into uniform metadata for storage in a MySQL database. Parsing is performed by a parsing pyramid: after loading parser engines and configurations, the parser loader creates parsers for the different levels of data types and data sets. Copyright © 2011 Elsevier Inc. All rights reserved.
Memory Retrieval in Parsing and Interpretation
ERIC Educational Resources Information Center
Schlueter, Ananda Lila Zoe
2017-01-01
This dissertation explores the relationship between the parser and the grammar in error-driven retrieval by examining the mechanism underlying the illusory licensing of subject-verb agreement violations ("agreement attraction"). Previous work motivates a two-stage model of agreement attraction in which the parser predicts the verb's…
Looking forwards and backwards: The real-time processing of Strong and Weak Crossover
Lidz, Jeffrey; Phillips, Colin
2017-01-01
We investigated the processing of pronouns in Strong and Weak Crossover constructions as a means of probing the extent to which the incremental parser can use syntactic information to guide antecedent retrieval. In Experiment 1 we show that the parser accesses a displaced wh-phrase as an antecedent for a pronoun when no grammatical constraints prohibit binding, but the parser ignores the same wh-phrase when it stands in a Strong Crossover relation to the pronoun. These results are consistent with two possibilities. First, the parser could apply Principle C at antecedent retrieval to exclude the wh-phrase on the basis of the c-command relation between its gap and the pronoun. Alternatively, retrieval might ignore any phrases that do not occupy an Argument position. Experiment 2 distinguished between these two possibilities by testing antecedent retrieval under Weak Crossover. In Weak Crossover binding of the pronoun is ruled out by the argument condition, but not Principle C. The results of Experiment 2 indicate that antecedent retrieval accesses matching wh-phrases in Weak Crossover configurations. On the basis of these findings we conclude that the parser can make rapid use of Principle C and c-command information to constrain retrieval. We discuss how our results support a view of antecedent retrieval that integrates inferences made over unseen syntactic structure into constraints on backward-looking processes like memory retrieval. PMID:28936483
Towards automated processing of clinical Finnish: sublanguage analysis and a rule-based parser.
Laippala, Veronika; Ginter, Filip; Pyysalo, Sampo; Salakoski, Tapio
2009-12-01
In this paper, we present steps taken towards more efficient automated processing of clinical Finnish, focusing on daily nursing notes in a Finnish Intensive Care Unit (ICU). First, we analyze ICU Finnish as a sublanguage, identifying its specific features facilitating, for example, the development of a specialized syntactic analyzer. The identified features include frequent omission of finite verbs, limitations in allowed syntactic structures, and domain-specific vocabulary. Second, we develop a formal grammar and a parser for ICU Finnish, thus providing better tools for the development of further applications in the clinical domain. The grammar is implemented in the LKB system in a typed feature structure formalism. The lexicon is automatically generated based on the output of the FinTWOL morphological analyzer adapted to the clinical domain. As an additional experiment, we study the effect of using Finnish constraint grammar to reduce the size of the lexicon. The parser construction thus makes efficient use of existing resources for Finnish. The grammar currently covers 76.6% of ICU Finnish sentences, producing highly accurate best-parse analyses with an F-score of 91.1%. We find that building a parser for the highly specialized domain sublanguage is not only feasible, but also surprisingly efficient, given an existing morphological analyzer with broad vocabulary coverage. The resulting parser enables a deeper analysis of the text than was previously possible.
Linking Parser Development to Acquisition of Syntactic Knowledge
ERIC Educational Resources Information Center
Omaki, Akira; Lidz, Jeffrey
2015-01-01
Traditionally, acquisition of syntactic knowledge and the development of sentence comprehension behaviors have been treated as separate disciplines. This article reviews a growing body of work on the development of incremental sentence comprehension mechanisms and discusses how a better understanding of the developing parser can shed light on two…
The value of parsing as feature generation for gene mention recognition
Smith, Larry H; Wilbur, W John
2009-01-01
We measured the extent to which information surrounding a base noun phrase reflects the presence of a gene name, and evaluated seven different parsers in their ability to provide information for that purpose. Using the GENETAG corpus as a gold standard, we performed machine learning to recognize from its context when a base noun phrase contained a gene name. Starting with the best lexical features, we assessed the gain of adding dependency or dependency-like relations from a full sentence parse. Features derived from parsers improved performance in this partial gene mention recognition task by a small but statistically significant amount. There were virtually no differences between parsers in these experiments. PMID:19345281
A python tool for the implementation of domain-specific languages
NASA Astrophysics Data System (ADS)
Dejanović, Igor; Vaderna, Renata; Milosavljević, Gordana; Simić, Miloš; Vuković, Željko
2017-07-01
In this paper we describe textX, a meta-language and a tool for building Domain-Specific Languages. It is implemented in Python using the Arpeggio PEG (Parsing Expression Grammar) parser library. From a single language description (grammar), textX builds a parser and a meta-model (a.k.a. abstract syntax) of the language. The parser is used to parse textual representations of models conforming to the meta-model. As a result of parsing, a Python object graph is automatically created, and its structure conforms to the meta-model defined by the grammar. This approach frees a developer from the need to manually analyse a parse tree and transform it into another, more suitable representation. The textX library is independent of any integrated development environment and can be easily integrated into any Python project. The textX tool works as a grammar interpreter: the parser is configured at run time using the grammar. textX is a free and open-source project available on GitHub.
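The grammar-interpreter idea, configuring a parser at run time from a declarative description and getting an object graph back instead of a parse tree, can be sketched in plain Python. This is an illustration of the approach, not textX's actual API or grammar syntax:

```python
import re
from types import SimpleNamespace

# A toy run-time grammar: one rule, given as a sequence of named regex
# parts. The rule name and token patterns are invented for illustration.
GRAMMAR = {
    "Entry": [("name", r"[A-Za-z_]\w*"), ("eq", r"="), ("value", r"\d+")],
}

def parse_model(text, grammar):
    """Interpret the grammar against the text, yielding an object graph."""
    model = SimpleNamespace(entries=[])
    for line in text.strip().splitlines():
        obj = SimpleNamespace()
        rest = line.strip()
        for attr, pattern in grammar["Entry"]:
            m = re.match(r"\s*(" + pattern + r")", rest)
            if not m:
                raise SyntaxError(f"expected {attr!r} at: {rest!r}")
            if attr != "eq":                 # punctuation is not stored
                setattr(obj, attr, m.group(1))
            rest = rest[m.end():]
        model.entries.append(obj)
    return model

model = parse_model("width = 80\nheight = 24", GRAMMAR)
```

The caller never sees a parse tree, only Python objects whose attributes mirror the grammar's named parts.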
Structure before Meaning: Sentence Processing, Plausibility, and Subcategorization
Kizach, Johannes; Nyvad, Anne Mette; Christensen, Ken Ramshøj
2013-01-01
Natural language processing is a fast and automatized process. A crucial part of this process is parsing, the online incremental construction of a syntactic structure. The aim of this study was to test whether a wh-filler extracted from an embedded clause is initially attached as the object of the matrix verb with subsequent reanalysis, and if so, whether the plausibility of such an attachment has an effect on reaction time. Finally, we wanted to examine whether subcategorization plays a role. We used a method called G-Maze to measure response time in a self-paced reading design. The experiments confirmed that there is early attachment of fillers to the matrix verb. When this attachment is implausible, the off-line acceptability of the whole sentence is significantly reduced. The on-line results showed that G-Maze was highly suited for this type of experiment. In accordance with our predictions, the results suggest that the parser ignores (or has no access to information about) implausibility and attaches fillers as soon as possible to the matrix verb. However, the results also show that the parser uses the subcategorization frame of the matrix verb. In short, the parser ignores semantic information and allows implausible attachments but adheres to information about which type of object a verb can take, ensuring that the parser does not make impossible attachments. We argue that the evidence supports a syntactic parser informed by syntactic cues, rather than one guided by semantic cues or one that is blind, or completely autonomous. PMID:24116101
An Experiment in Scientific Code Semantic Analysis
NASA Technical Reports Server (NTRS)
Stewart, Mark E. M.
1998-01-01
This paper concerns a procedure that analyzes aspects of the meaning or semantics of scientific and engineering code. This procedure involves taking a user's existing code, adding semantic declarations for some primitive variables, and parsing this annotated code using multiple, distributed expert parsers. These semantic parsers are designed to recognize formulae in different disciplines including physical and mathematical formulae and geometrical position in a numerical scheme. The parsers will automatically recognize and document some static, semantic concepts and locate some program semantic errors. Results are shown for a subroutine test case and a collection of combustion code routines. This ability to locate some semantic errors and document semantic concepts in scientific and engineering code should reduce the time, risk, and effort of developing and using these codes.
Parser Combinators: a Practical Application for Generating Parsers for NMR Data
Fenwick, Matthew; Weatherby, Gerard; Ellis, Heidi JC; Gryk, Michael R.
2013-01-01
Nuclear Magnetic Resonance (NMR) spectroscopy is a technique for acquiring protein data at atomic resolution and determining the three-dimensional structure of large protein molecules. A typical structure determination process results in the deposition of a large data set to the BMRB (Bio-Magnetic Resonance Data Bank). This data is stored and shared in a file format called NMR-Star. This format is syntactically and semantically complex, making it challenging to parse. Nevertheless, parsing these files is crucial to applying the vast amounts of biological information stored in NMR-Star files, allowing researchers to harness the results of previous studies to direct and validate future work. One powerful approach for parsing files is to apply a Backus-Naur Form (BNF) grammar, which is a high-level model of a file format. Translation of the grammatical model to an executable parser may be automatically accomplished. This paper will show how we applied a model BNF grammar of the NMR-Star format to create a free, open-source parser, using a method that originated in the functional programming world known as “parser combinators”. This paper demonstrates the effectiveness of a principled approach to file specification and parsing. This paper also builds upon our previous work [1], in that 1) it applies concepts from Functional Programming (which is relevant even though the implementation language, Java, is more mainstream than Functional Programming), and 2) all work and accomplishments from this project will be made available under standard open source licenses to provide the community with the opportunity to learn from our techniques and methods. PMID:24352525
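The combinator technique itself is easy to demonstrate. The following minimal Python sketch (the paper's parser is written in Java against the NMR-Star grammar) shows how small parsing functions compose much like BNF rules:

```python
# Minimal parser combinators. Each parser takes (text, pos) and returns
# (value, new_pos) on success or None on failure.
def literal(s):
    """Match a fixed string."""
    def parse(text, pos):
        if text.startswith(s, pos):
            return s, pos + len(s)
        return None
    return parse

def many(p):
    """Match p zero or more times, collecting the results."""
    def parse(text, pos):
        results = []
        while True:
            r = p(text, pos)
            if r is None:
                return results, pos
            value, pos = r
            results.append(value)
    return parse

def seq(*parsers):
    """Match each parser in order; fail if any fails."""
    def parse(text, pos):
        values = []
        for p in parsers:
            r = p(text, pos)
            if r is None:
                return None
            value, pos = r
            values.append(value)
        return values, pos
    return parse

# Combinators compose directly, mirroring a BNF rule like: ab ::= "a" "b"*
ab = seq(literal("a"), many(literal("b")))
value, end = ab("abbb", 0)
```

Because each grammar rule becomes an ordinary function, the grammar and the parser are the same artifact, which is the appeal of the approach for a format as intricate as NMR-Star.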
ImageParser: a tool for finite element generation from three-dimensional medical images
Yin, HM; Sun, LZ; Wang, G; Yamada, T; Wang, J; Vannier, MW
2004-01-01
Background The finite element method (FEM) is a powerful mathematical tool to simulate and visualize the mechanical deformation of tissues and organs during medical examinations or interventions. It is yet a challenge to build up an FEM mesh directly from a volumetric image, partially because the regions (or structures) of interest (ROIs) may be irregular and fuzzy. Methods A software package, ImageParser, is developed to generate an FEM mesh from 3-D tomographic medical images. This software uses a semi-automatic method to detect ROIs from the context of the image, including neighboring tissues and organs, completes segmentation of different tissues, and meshes the organ into elements. Results The ImageParser is shown to build up an FEM model for simulating the mechanical responses of the breast based on 3-D CT images. The breast is compressed by two plate paddles under an overall displacement as large as 20% of the initial distance between the paddles. The strain and tangential Young's modulus distributions are specified for the biomechanical analysis of breast tissues. Conclusion The ImageParser can successfully extract the geometry of ROIs from a complex medical image and generate the FEM mesh with customer-defined segmentation information. PMID:15461787
An Experiment in Scientific Program Understanding
NASA Technical Reports Server (NTRS)
Stewart, Mark E. M.; Owen, Karl (Technical Monitor)
2000-01-01
This paper concerns a procedure that analyzes aspects of the meaning or semantics of scientific and engineering code. This procedure involves taking a user's existing code, adding semantic declarations for some primitive variables, and parsing this annotated code using multiple, independent expert parsers. These semantic parsers encode domain knowledge and recognize formulae in different disciplines including physics, numerical methods, mathematics, and geometry. The parsers will automatically recognize and document some static, semantic concepts and help locate some program semantic errors. Results are shown for three intensively studied codes and seven blind test cases; all test cases are state of the art scientific codes. These techniques may apply to a wider range of scientific codes. If so, the techniques could reduce the time, risk, and effort required to develop and modify scientific codes.
ULTRA: Universal Grammar as a Universal Parser
Medeiros, David P.
2018-01-01
A central concern of generative grammar is the relationship between hierarchy and word order, traditionally understood as two dimensions of a single syntactic representation. A related concern is directionality in the grammar. Traditional approaches posit process-neutral grammars, embodying knowledge of language, put to use with infinite facility both for production and comprehension. This has crystallized in the view of Merge as the central property of syntax, perhaps its only novel feature. A growing number of approaches explore grammars with different directionalities, often with more direct connections to performance mechanisms. This paper describes a novel model of universal grammar as a one-directional, universal parser. Mismatch between word order and interpretation order is pervasive in comprehension; in the present model, word order is language-particular and interpretation order (i.e., hierarchy) is universal. These orders are not two dimensions of a unified abstract object (e.g., precedence and dominance in a single tree); rather, both are temporal sequences, and UG is an invariant real-time procedure (based on Knuth's stack-sorting algorithm) transforming word order into hierarchical order. This shift in perspective has several desirable consequences. It collapses linearization, displacement, and composition into a single performance process. The architecture provides a novel source of brackets (labeled unambiguously and without search), which are understood not as part-whole constituency relations, but as storage and retrieval routines in parsing. It also explains why neutral word order within single syntactic cycles avoids 213-like permutations. The model identifies cycles as extended projections of lexical heads, grounding the notion of phase. This is achieved with a universal processor, dispensing with parameters. The empirical focus is word order in noun phrases. 
This domain provides some of the clearest evidence for 213-avoidance as a cross-linguistic word order generalization. Importantly, recursive phrase structure “bottoms out” in noun phrases, which are typically a single cycle (though further cycles may be embedded, e.g., relative clauses). By contrast, a simple transitive clause plausibly involves two cycles (vP and CP), embedding further nominal cycles. In the present theory, recursion is fundamentally distinct from structure-building within a single cycle, and different word order restrictions might emerge in larger domains like clauses. PMID:29497394
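Knuth's stack-sorting procedure, which the model adapts as its real-time transformation from word order to hierarchical order, is a single pass with one stack. In Knuth's classic formulation, one pass sorts exactly the permutations avoiding a 231-type pattern; the paper's 213 pattern corresponds to this under a mirrored convention. A minimal sketch:

```python
def stack_sort(perm):
    """One pass of Knuth's stack-sorting procedure."""
    stack, output = [], []
    for x in perm:
        # Emit stack elements smaller than the incoming item first.
        while stack and stack[-1] < x:
            output.append(stack.pop())
        stack.append(x)
    while stack:                 # flush whatever remains
        output.append(stack.pop())
    return output
```

A permutation like [3, 1, 2] comes out sorted in one pass, whereas one containing the forbidden pattern, such as [2, 3, 1], does not; this asymmetry is what grounds the model's word-order predictions.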
De Vincenzi, M
1996-01-01
This paper presents three experiments on the parsing of Italian wh-questions that manipulate the wh-type (who vs. which-N) and the wh extraction site (main clause, dependent clause with or without complementizer). The aim of these manipulations is to see whether the parser is sensitive to the type of dependencies being processed and whether the processing effects can be explained by a unique processing principle, the minimal chain principle (MCP; De Vincenzi, 1991). The results show that the parser, following the MCP, prefers structures with fewer and less complex chains. In particular: (1) There is a processing advantage for the wh-subject extractions, the structures with less complex chains; (2) there is a processing dissociation between the who and which questions; (3) the parser respects the principle that governs the well-formedness of the empty categories (ECP).
Morphosyntactic annotation of CHILDES transcripts*
SAGAE, KENJI; DAVIS, ERIC; LAVIE, ALON; MACWHINNEY, BRIAN; WINTNER, SHULY
2014-01-01
Corpora of child language are essential for research in child language acquisition and psycholinguistics. Linguistic annotation of the corpora provides researchers with better means for exploring the development of grammatical constructions and their usage. We describe a project whose goal is to annotate the English section of the CHILDES database with grammatical relations in the form of labeled dependency structures. We have produced a corpus of over 18,800 utterances (approximately 65,000 words) with manually curated gold-standard grammatical relation annotations. Using this corpus, we have developed a highly accurate data-driven parser for the English CHILDES data, which we used to automatically annotate the remainder of the English section of CHILDES. We have also extended the parser to Spanish, and are currently working on supporting more languages. The parser and the manually and automatically annotated data are freely available for research purposes. PMID:20334720
Designing a Constraint Based Parser for Sanskrit
NASA Astrophysics Data System (ADS)
Kulkarni, Amba; Pokar, Sheetal; Shukl, Devanand
Verbal understanding (śābdabodha) of any utterance requires the knowledge of how words in that utterance are related to each other. Such knowledge is usually available in the form of cognition of grammatical relations. Generative grammars describe how a language codes these relations. Thus the knowledge of what information various grammatical relations convey is available from the generation point of view and not the analysis point of view. In order to develop a parser based on any grammar, one should then know precisely the semantic content of the grammatical relations expressed in a language string, the clues for extracting these relations, and finally whether these relations are expressed explicitly or implicitly. Based on the design principles that emerge from this knowledge, we model the parser as finding a directed tree, given a graph with nodes representing the words and edges representing the possible relations between them. Further, we also use the Mīmāṃsā constraint of ākāṅkṣā (expectancy) to rule out non-solutions and sannidhi (proximity) to prioritize the solutions. We have implemented a parser based on these principles, and its performance was found to be satisfactory, giving us confidence to extend its functionality to handle complex sentences.
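The graph-to-tree formulation can be sketched in Python: enumerate candidate head assignments over the word graph and let an expectancy-style constraint rule out non-solutions. The words, relation labels, and the particular constraint below are invented for illustration, not drawn from the paper's grammar:

```python
from itertools import product

# Toy word graph: each non-root word lists its candidate (head, relation)
# edges. The lexical items and labels here are illustrative only.
CANDIDATES = {
    "Rama":  [("eats", "agent")],
    "fruit": [("eats", "object"), ("Rama", "possessor")],
}

def parse_trees():
    """Enumerate head assignments; each word takes exactly one head."""
    solutions = []
    for choice in product(*(CANDIDATES[w] for w in CANDIDATES)):
        edges = dict(zip(CANDIDATES, choice))
        # Expectancy-style (akanksa) filter: assume the verb expects
        # exactly one object, ruling out assignments that leave the
        # expectancy unfilled or doubly filled.
        objects = [d for d, (h, rel) in edges.items() if rel == "object"]
        if len(objects) == 1:
            solutions.append(edges)
    return solutions

trees = parse_trees()
```

Of the two possible head assignments for "fruit", only the one satisfying the verb's expectancy survives, which is the pruning role the abstract assigns to ākāṅkṣā; sannidhi would then rank any remaining candidates.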
An Improved Tarpit for Network Deception
2016-03-25
World” program was, to one who is ready to join the cyber security workforce. Thirdly, I thank my mom and dad for their constant love, support, and...arrow in a part-whole relationship. In the diagram GreaseMonkey contains the three packet handler classes. The numbers next to the PriorityQueue and...arrow from Greasy to the config_parser module represents a usage relationship, where Greasy uses functions from config_parser to parse the configuration
Extracting BI-RADS Features from Portuguese Clinical Texts.
Nassif, Houssam; Cunha, Filipe; Moreira, Inês C; Cruz-Correia, Ricardo; Sousa, Eliana; Page, David; Burnside, Elizabeth; Dutra, Inês
2012-01-01
In this work we build the first BI-RADS parser for Portuguese free texts, modeled after existing approaches to extract BI-RADS features from English medical records. Our concept finder uses a semantic grammar based on the BI-RADS lexicon and on iteratively transferred expert knowledge. We compare the performance of our algorithm to manual annotation by a specialist in mammography. Our results show that our parser's performance is comparable to the manual method.
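A lexicon-driven concept finder of the kind described can be sketched in a few lines of Python. The lexicon entries, categories, and negation handling below are illustrative assumptions, not the parser's actual semantic grammar:

```python
import re

# A toy lexicon in the spirit of a BI-RADS semantic grammar; the terms
# and categories are invented stand-ins for the real lexicon.
LEXICON = {
    "mass": "Mass",
    "calcification": "Calcification",
    "asymmetry": "Asymmetry",
}
NEGATORS = {"no", "without", "sem"}   # 'sem' = Portuguese "without"

def find_concepts(sentence):
    """Return (concept, status) pairs found in one sentence."""
    words = re.findall(r"[a-z]+", sentence.lower())
    concepts = []
    for i, w in enumerate(words):
        if w in LEXICON:
            # A negating word directly before the finding marks absence;
            # real grammars use a much wider negation scope than this.
            status = "absent" if i and words[i - 1] in NEGATORS else "present"
            concepts.append((LEXICON[w], status))
    return concepts

found = find_concepts("Irregular mass without calcification.")
```

Even this crude version shows why a lexicon plus a small grammar goes a long way on the terse, formulaic sentences typical of mammography reports.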
A Semantic Analysis Method for Scientific and Engineering Code
NASA Technical Reports Server (NTRS)
Stewart, Mark E. M.
1998-01-01
This paper develops a procedure to statically analyze aspects of the meaning or semantics of scientific and engineering code. The analysis involves adding semantic declarations to a user's code and parsing this semantic knowledge with the original code using multiple expert parsers. These semantic parsers are designed to recognize formulae in different disciplines including physical and mathematical formulae and geometrical position in a numerical scheme. In practice, a user would submit code with semantic declarations of primitive variables to the analysis procedure, and its semantic parsers would automatically recognize and document some static, semantic concepts and locate some program semantic errors. A prototype implementation of this analysis procedure is demonstrated. Further, the relationship between the fundamental algebraic manipulations of equations and the parsing of expressions is explained. This ability to locate some semantic errors and document semantic concepts in scientific and engineering code should reduce the time, risk, and effort of developing and using these codes.
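The core idea, checking formulae against semantic declarations attached to primitive variables, can be illustrated with a toy dimensional analyzer. The variables, units, and formulae below are assumptions made for the sketch, not examples from the paper:

```python
import re

# Semantic declarations for primitive variables: physical units encoded
# as maps from base unit to exponent. All names here are illustrative.
UNITS = {
    "rho": {"kg": 1, "m": -3},           # density
    "v":   {"m": 1, "s": -1},            # velocity
    "q":   {"kg": 1, "m": -1, "s": -2},  # dynamic pressure
}

def units_of_product(expr):
    """Units of a pure product expression such as 'rho * v * v'."""
    total = {}
    for name in re.findall(r"[A-Za-z_]\w*", expr):
        for unit, power in UNITS[name].items():
            total[unit] = total.get(unit, 0) + power
    return {u: p for u, p in total.items() if p}   # drop cancelled units

def check_assignment(lhs, rhs):
    """True if 'lhs = rhs' is dimensionally consistent."""
    return units_of_product(rhs) == UNITS[lhs]

ok = check_assignment("q", "rho * v * v")   # consistent with pressure
bad = check_assignment("q", "rho * v")      # a locatable semantic error
```

A real semantic parser recognizes whole formulae and numerical-scheme geometry rather than just unit products, but the declare-then-check flow is the same.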
Policy-Based Management Natural Language Parser
NASA Technical Reports Server (NTRS)
James, Mark
2009-01-01
The Policy-Based Management Natural Language Parser (PBEM) is a rules-based approach to enterprise management that can be used to automate certain management tasks. This parser simplifies the management of a given endeavor by establishing policies to deal with situations that are likely to occur. Policies are operating rules that can be referred to as a means of maintaining order, security, consistency, or other ways of successfully furthering a goal or mission. PBEM provides a way of managing the configuration of network elements, applications, and processes via a set of high-level rules or business policies rather than managing individual elements, thus switching the control to a higher level. This software allows unique management rules (or commands) to be specified and applied to a cross-section of the Global Information Grid (GIG). This software embodies a parser that is capable of recognizing and understanding conversational English. Because all possible dialect variants cannot be anticipated, a unique capability was developed that parses based on conversational intent rather than the exact way the words are used. This software can increase productivity by enabling a user to converse with the system in conversational English to define network policies. PBEM can be used in both manned and unmanned science-gathering programs. Because policy statements can be domain-independent, this software can be applied equally to a wide variety of applications.
Adding a Medical Lexicon to an English Parser
Szolovits, Peter
2003-01-01
We present a heuristic method to map lexical (syntactic) information from one lexicon to another, and apply the technique to augment the lexicon of the Link Grammar Parser with an enormous medical vocabulary drawn from the Specialist lexicon developed by the National Library of Medicine. This paper presents and justifies the mapping method and addresses technical problems that have to be overcome. It illustrates the utility of the method with respect to a large corpus of emergency department notes. PMID:14728251
Semantic based man-machine interface for real-time communication
NASA Technical Reports Server (NTRS)
Ali, M.; Ai, C.-S.
1988-01-01
A flight expert system (FLES) was developed to assist pilots in monitoring, diagnosing and recovering from in-flight faults. To provide a communications interface between the flight crew and FLES, a natural language interface (NALI) was implemented. Input to NALI is processed by three processors: (1) the semantic parser; (2) the knowledge retriever; and (3) the response generator. First, the semantic parser extracts meaningful words and phrases to generate an internal representation of the query. At this point, the semantic parser has the ability to map different input forms related to the same concept into the same internal representation. Then the knowledge retriever analyzes and stores the context of the query to aid in resolving ellipses and pronoun references. At the end of this process, a sequence of retrieval functions is created as a first step in generating the proper response. Finally, the response generator generates the natural language response to the query. The architecture of NALI was designed to process both temporal and nontemporal queries. The architecture and implementation of NALI are described.
Overview of the ArbiTER edge plasma eigenvalue code
NASA Astrophysics Data System (ADS)
Baver, Derek; Myra, James; Umansky, Maxim
2011-10-01
The Arbitrary Topology Equation Reader, or ArbiTER, is a flexible eigenvalue solver that is currently under development for plasma physics applications. The ArbiTER code builds on the equation parser framework of the existing 2DX code, extending it to include a topology parser. This will give the code the capability to model problems with complicated geometries (such as multiple X-points and scrape-off layers) or model equations with arbitrary numbers of dimensions (e.g. for kinetic analysis). In the equation parser framework, model equations are not included in the program's source code. Instead, an input file contains instructions for building a matrix from profile functions and elementary differential operators. The program then executes these instructions in a sequential manner. These instructions may also be translated into analytic form, thus giving the code transparency as well as flexibility. We will present an overview of how the ArbiTER code is to work, as well as preliminary results from early versions of this code. Work supported by the U.S. DOE.
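The equation-parser framework described here, in which an input file's instructions are executed sequentially to assemble a matrix from profile functions and elementary operators, can be sketched in miniature. Everything below is illustrative (ArbiTER's actual instruction set and file format are not reproduced); it builds a 1-D operator matrix from an instruction list using a centred finite difference:

```python
def build_matrix(n, instructions):
    """Interpret a small instruction list into an n-by-n operator matrix,
    in the spirit of ArbiTER's equation parser. The instruction names and
    the centred-difference discretisation are invented for this sketch."""
    A = [[0.0] * n for _ in range(n)]
    for op, coeff in instructions:
        for i in range(n):
            if op == "identity":
                A[i][i] += coeff
            elif op == "d_dx":  # centred first derivative, unit grid spacing
                if 0 < i < n - 1:
                    A[i][i - 1] += -0.5 * coeff
                    A[i][i + 1] += 0.5 * coeff
    return A

# Model "u + 2 du/dx" assembled from two elementary instructions.
A = build_matrix(4, [("identity", 1.0), ("d_dx", 2.0)])
assert A[1][0] == -1.0 and A[1][1] == 1.0 and A[1][2] == 1.0
```

Because the model equations live in the instruction list rather than in the source code, the same interpreter can assemble matrices for quite different physics problems, which is the transparency-plus-flexibility point the abstract makes.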
Detecting modification of biomedical events using a deep parsing approach.
Mackinlay, Andrew; Martinez, David; Baldwin, Timothy
2012-04-30
This work describes a system for identifying event mentions in bio-molecular research abstracts that are either speculative (e.g. analysis of IkappaBalpha phosphorylation, where it is not specified whether phosphorylation did or did not occur) or negated (e.g. inhibition of IkappaBalpha phosphorylation, where phosphorylation did not occur). The data comes from a standard dataset created for the BioNLP 2009 Shared Task. The system uses a machine-learning approach, where the features used for classification are a combination of shallow features derived from the words of the sentences and more complex features based on the semantic outputs produced by a deep parser. To detect event modification, we use a Maximum Entropy learner with features extracted from the data relative to the trigger words of the events. The shallow features are bag-of-words features based on a small sliding context window of 3-4 tokens on either side of the trigger word. The deep parser features are derived from parses produced by the English Resource Grammar and the RASP parser. The outputs of these parsers are converted into the Minimal Recursion Semantics formalism, and from this, we extract features motivated by linguistics and the data itself. All of these features are combined to create training or test data for the machine learning algorithm. Over the test data, our methods produce approximately a 4% absolute increase in F-score for detection of event modification compared to a baseline based only on the shallow bag-of-words features. Our results indicate that grammar-based techniques can enhance the accuracy of methods for detecting event modification. PMID:22595089
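The shallow baseline features are straightforward to reproduce. A minimal sketch (the function name and feature encoding are hypothetical, not the authors' code) of bag-of-words features from a 3-token window around an event trigger:

```python
def window_features(tokens, trigger_index, width=3):
    """Bag-of-words features from a sliding window of `width` tokens on
    either side of the event trigger, as in the shallow baseline."""
    lo = max(0, trigger_index - width)
    hi = min(len(tokens), trigger_index + width + 1)
    feats = {}
    for i in range(lo, hi):
        if i == trigger_index:  # the trigger itself is not a window feature
            continue
        feats[f"bow={tokens[i].lower()}"] = 1
    return feats

sent = "analysis of IkappaBalpha phosphorylation was not performed".split()
feats = window_features(sent, sent.index("phosphorylation"))
# The negation cue "not" lands inside the window and becomes a feature.
assert "bow=not" in feats
```

Feature dictionaries of this shape would then be fed to a Maximum Entropy (logistic regression) learner, alongside the deep-parser features derived from Minimal Recursion Semantics.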
Vosse, Theo; Kempen, Gerard
2009-12-01
We introduce a novel computer implementation of the Unification-Space parser (Vosse and Kempen in Cognition 75:105-143, 2000) in the form of a localist neural network whose dynamics is based on interactive activation and inhibition. The wiring of the network is determined by Performance Grammar (Kempen and Harbusch in Verb constructions in German and Dutch. Benjamins, Amsterdam, 2003), a lexicalist formalism with feature unification as binding operation. While the network is processing input word strings incrementally, the evolving shape of parse trees is represented in the form of changing patterns of activation in nodes that code for syntactic properties of words and phrases, and for the grammatical functions they fulfill. The system is capable, at least qualitatively and rudimentarily, of simulating several important dynamic aspects of human syntactic parsing, including garden-path phenomena and reanalysis, effects of complexity (various types of clause embeddings), fault-tolerance in case of unification failures and unknown words, and predictive parsing (expectation-based analysis, surprisal effects). English is the target language of the parser described.
Soares, Ana Paula; Fraga, Isabel; Comesaña, Montserrat; Piñeiro, Ana
2010-11-01
This work presents an analysis of the role of animacy in attachment preferences of relative clauses to complex noun phrases in European Portuguese (EP). The study of how the human parser resolves this kind of syntactic ambiguity has been the focus of extensive research. However, what is known about EP is both limited and puzzling. Additionally, as recent studies have stressed the importance of extra-syntactic variables in this process, two experiments were carried out to assess EP attachment preferences under four animacy conditions: Study 1 used a sentence-completion task, and Study 2 a self-paced reading task. Both studies indicate a significant preference for high attachment in EP. Furthermore, they showed that this preference was modulated by the animacy of the host NP: if the first host was inanimate and the second one animate, the parser's preference shifted to low attachment. These findings shed light on previous results regarding EP and strengthen the idea that, even in early stages of processing, the parser is sensitive to extra-syntactic information.
Facilitating Analysis of Multiple Partial Data Streams
NASA Technical Reports Server (NTRS)
Maimone, Mark W.; Liebersbach, Robert R.
2008-01-01
Robotic Operations Automation: Mechanisms, Imaging, Navigation report Generation (ROAMING) is a set of computer programs that facilitates and accelerates both tactical and strategic analysis of time-sampled data, especially the disparate and often incomplete streams of Mars Exploration Rover (MER) telemetry data described in the immediately preceding article. As used here, tactical refers to activities over a relatively short time (one Martian day in the original MER application) and strategic refers to a longer time (the entire multi-year MER missions in the original application). Prior to installation, ROAMING must be configured with the types of data of interest, and parsers must be modified to understand the format of the input data (many example parsers are provided, including for general CSV files). Thereafter, new data from multiple disparate sources are automatically resampled into a single common annotated spreadsheet stored in a readable space-separated format, and these data can be processed or plotted at any time scale. Such processing or plotting makes it possible to study not only the details of a particular activity spanning only a few seconds, but also longer-term trends. ROAMING makes it possible to generate mission-wide plots of multiple engineering quantities [e.g., vehicle tilt as in Figure 1(a), motor current, numbers of images] that heretofore could be found only in thousands of separate files. ROAMING also supports automatic annotation of both images and graphs. In the MER application, labels given to terrain features by rover scientists and engineers are automatically plotted in all received images based on their associated camera models (see Figure 2), times measured in seconds are mapped to Mars local time, and command names or arbitrary time-labeled events can be used to label engineering plots, as in Figure 1(b).
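The core resampling step, merging disparate partial streams onto one common annotated table, can be sketched as follows. This is an illustrative toy, not ROAMING itself; the stream names and the carry-forward policy for missing samples are assumptions:

```python
def resample(streams, step):
    """Merge several partial (timestamp, value) streams onto one common
    time grid, carrying the last seen value forward; cells with no data
    yet stay None. A toy version of a common annotated spreadsheet."""
    names = sorted(streams)
    t0 = min(t for s in streams.values() for t, _ in s)
    t1 = max(t for s in streams.values() for t, _ in s)
    pending = {n: sorted(streams[n]) for n in names}
    last = {n: None for n in names}
    rows, t = [], t0
    while t <= t1:
        for n in names:
            # Consume every sample at or before the current grid time.
            while pending[n] and pending[n][0][0] <= t:
                last[n] = pending[n].pop(0)[1]
        rows.append([t] + [last[n] for n in names])
        t += step
    return ["time"] + names, rows

header, rows = resample(
    {"tilt": [(0, 1.5), (20, 2.0)], "motor_current": [(10, 0.3)]}, step=10)
```

The resulting rows (`[0, None, 1.5]`, `[10, 0.3, 1.5]`, `[20, 0.3, 2.0]`) line up both streams on one timeline even though neither stream sampled every instant, which is what makes single mission-wide plots possible.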
SAGA: A project to automate the management of software production systems
NASA Technical Reports Server (NTRS)
Campbell, R. H.
1983-01-01
The current work in progress for the SAGA project is described. The highlights of this research are: a parser-independent SAGA editor, a design for the screen editing facilities of the editor, delivery to NASA of release 1 of Olorin, the SAGA parser generator, personal workstation environment research, release 1 of the SAGA symbol table manager, delta generation in SAGA, requirements for a proof management system, documentation for and testing of the Cyber Pascal make prototype, a prototype Cyber-based slicing facility, a June 1984 demonstration plan, SAGA utility programs, a summary of UNIX software engineering support, and a theorem prover review.
Huang, Yang; Lowe, Henry J; Klein, Dan; Cucina, Russell J
2005-01-01
The aim of this study was to develop and evaluate a method of extracting noun phrases with full phrase structures from a set of clinical radiology reports using natural language processing (NLP) and to investigate the effects of using the UMLS(R) Specialist Lexicon to improve noun phrase identification within clinical radiology documents. The noun phrase identification (NPI) module is composed of a sentence boundary detector, a statistical natural language parser trained on a nonmedical domain, and a noun phrase (NP) tagger. The NPI module processed a set of 100 XML-represented clinical radiology reports in Health Level 7 (HL7)(R) Clinical Document Architecture (CDA)-compatible format. Computed output was compared with manual markups made by four physicians and one author for maximal (longest) NP and those made by one author for base (simple) NP, respectively. An extended lexicon of biomedical terms was created from the UMLS Specialist Lexicon and used to improve NPI performance. The test set was 50 randomly selected reports. The sentence boundary detector achieved 99.0% precision and 98.6% recall. The overall maximal NPI precision and recall were 78.9% and 81.5% before using the UMLS Specialist Lexicon and 82.1% and 84.6% after. The overall base NPI precision and recall were 88.2% and 86.8% before using the UMLS Specialist Lexicon and 93.1% and 92.6% after, reducing false-positives by 31.1% and false-negatives by 34.3%. The sentence boundary detector performs excellently. After the adaptation using the UMLS Specialist Lexicon, the statistical parser's NPI performance on radiology reports increased to levels comparable to the parser's native performance in its newswire training domain and to that reported by other researchers in the general nonmedical domain. PMID:15684131
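The NP tagger's job of marking base (simple) noun phrases can be approximated with a part-of-speech pattern. The sketch below is a toy stand-in (the paper uses a statistical parser, not a pattern matcher) using the common DT? JJ* NN+ shape over pre-tagged tokens:

```python
def base_nps(tagged):
    """Extract base (simple) noun phrases from POS-tagged tokens using
    the pattern: optional determiner, any adjectives, one or more nouns."""
    nps, i = [], 0
    while i < len(tagged):
        j = i
        if j < len(tagged) and tagged[j][1] == "DT":
            j += 1
        while j < len(tagged) and tagged[j][1] == "JJ":
            j += 1
        k = j
        while k < len(tagged) and tagged[k][1].startswith("NN"):
            k += 1
        if k > j:  # at least one noun: emit the phrase and skip past it
            nps.append(" ".join(w for w, _ in tagged[i:k]))
            i = k
        else:
            i += 1
    return nps

sent = [("the", "DT"), ("right", "JJ"), ("lung", "NN"),
        ("shows", "VBZ"), ("a", "DT"), ("small", "JJ"), ("effusion", "NN")]
assert base_nps(sent) == ["the right lung", "a small effusion"]
```

A pattern matcher like this has no access to lexical knowledge, which is precisely why augmenting the parser's lexicon with the UMLS Specialist Lexicon improved precision and recall in the study.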
ANTLR Tree Grammar Generator and Extensions
NASA Technical Reports Server (NTRS)
Craymer, Loring
2005-01-01
A computer program implements two extensions of ANTLR (Another Tool for Language Recognition), which is a set of software tools for translating source codes between different computing languages. ANTLR supports predicated-LL(k) lexer and parser grammars, a notation for annotating parser grammars to direct tree construction, and predicated tree grammars. [LL(k) signifies left-to-right scanning with leftmost derivation and k tokens of look-ahead, referring to certain characteristics of a grammar.] One of the extensions is a syntax for tree transformations. The other extension is the generation of tree grammars from annotated parser or input tree grammars. These extensions can simplify the process of generating source-to-source language translators, and they make possible an approach, called "polyphase parsing," to translation between computing languages. The typical approach to translator development is to identify high-level semantic constructs such as "expressions," "declarations," and "definitions" as fundamental building blocks in the grammar specification used for language recognition. The polyphase approach is to lump ambiguous syntactic constructs during parsing and then disambiguate the alternatives in subsequent tree-transformation passes. Polyphase parsing is believed to be useful for generating efficient recognizers for C++ and other languages that, like C++, have significant ambiguities.
The parser doesn't ignore intransitivity, after all
Staub, Adrian
2015-01-01
Several previous studies (Adams, Clifton, & Mitchell, 1998; Mitchell, 1987; van Gompel & Pickering, 2001) have explored the question of whether the parser initially analyzes a noun phrase that follows an intransitive verb as the verb's direct object. Three eyetracking experiments examined this issue in more detail. Experiment 1 strongly replicated the finding (van Gompel & Pickering, 2001) that readers experience difficulty on this noun phrase in normal reading, and found that this difficulty occurs even with a class of intransitive verbs for which a direct object is categorically prohibited. Experiment 2, however, demonstrated that this effect is not due to syntactic misanalysis, but is instead due to disruption that occurs when a comma is absent at a subordinate clause/main clause boundary. Exploring a different construction, Experiment 3 replicated the finding (Pickering & Traxler, 2003; Traxler & Pickering, 1996) that when a noun phrase “filler” is an implausible direct object for an optionally transitive relative clause verb, processing difficulty results; however, there was no evidence for such difficulty when the relative clause verb was strictly intransitive. Taken together, the three experiments undermine the support for the claim that the parser initially ignores a verb's subcategorization restrictions. PMID:17470005
Xu, Hua; AbdelRahman, Samir; Lu, Yanxin; Denny, Joshua C.; Doan, Son
2011-01-01
Semantic-based sublanguage grammars have been shown to be an efficient method for medical language processing. However, given the complexity of the medical domain, parsers using such grammars inevitably encounter ambiguous sentences, which could be interpreted by different groups of production rules and consequently result in two or more parse trees. One possible solution, which has not been extensively explored previously, is to augment productions in medical sublanguage grammars with probabilities to resolve the ambiguity. In this study, we associated probabilities with production rules in a semantic-based grammar for medication findings and evaluated its performance on reducing parsing ambiguity. Using the existing data set from 2009 i2b2 NLP (Natural Language Processing) challenge for medication extraction, we developed a semantic-based CFG (Context Free Grammar) for parsing medication sentences and manually created a Treebank of 4,564 medication sentences from discharge summaries. Using the Treebank, we derived a semantic-based PCFG (probabilistic Context Free Grammar) for parsing medication sentences. Our evaluation using a 10-fold cross validation showed that the PCFG parser dramatically improved parsing performance when compared to the CFG parser. PMID:21856440
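The key idea, attaching probabilities to sublanguage productions so that the most probable parse tree wins, can be shown with a toy. The grammar, rules, and probabilities below are invented for illustration (they are not the i2b2 medication grammar):

```python
# Illustrative rule probabilities for a tiny medication sublanguage.
RULE_P = {
    ("MED", ("DRUG", "DOSE")): 0.7,
    ("MED", ("DRUG", "FREQ")): 0.3,
    ("DOSE", ("NUM", "UNIT")): 0.9,
    ("FREQ", ("NUM", "UNIT")): 0.1,
}

def tree_prob(tree):
    """Probability of a parse tree = product of its production probabilities."""
    label, children = tree
    if isinstance(children, str):  # lexical leaf
        return 1.0
    p = RULE_P[(label, tuple(c[0] for c in children))]
    for c in children:
        p *= tree_prob(c)
    return p

# Two parses of the ambiguous span "aspirin 25 mg": dose vs. frequency reading.
dose_tree = ("MED", [("DRUG", "aspirin"),
                     ("DOSE", [("NUM", "25"), ("UNIT", "mg")])])
freq_tree = ("MED", [("DRUG", "aspirin"),
                     ("FREQ", [("NUM", "25"), ("UNIT", "mg")])])

best = max([dose_tree, freq_tree], key=tree_prob)
assert best is dose_tree  # the PCFG resolves the ambiguity
```

A plain CFG would return both trees and leave the choice to the application; the Treebank-derived probabilities let the PCFG parser rank them, which is the improvement the evaluation measures.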
Locating Anomalies in Complex Data Sets Using Visualization and Simulation
NASA Technical Reports Server (NTRS)
Panetta, Karen
2001-01-01
The research goals are to create a simulation framework that can accept any combination of models written at the gate or behavioral level. The framework provides the ability to fault simulate and to create scenarios of experiments using concurrent simulation. In order to meet these goals we have had to fulfill the following requirements: the ability to accept models written in the VHDL, Verilog, or C languages; the ability to propagate faults through any model type; the ability to create experiment scenarios efficiently without generating every possible combination of variables; and the ability to accept a diversity of fault models beyond the single stuck-at model. Major development effort has gone into a parser that can accept models written in various languages. This work has generated considerable attention from other universities and industry for its flexibility and usefulness. The parser uses Lex and Yacc to parse Verilog and C. We have also utilized our industrial partnership with Alternative Systems Inc. to import VHDL into our simulator. For multilevel simulation, we needed to modify the simulator architecture to accept models that contained multiple outputs. This enabled us to accept behavioral components. The next major accomplishment was the addition of "functional fault models". Functional fault models change the behavior of a gate or model. For example, a bridging fault can make an OR gate behave like an AND gate. This has applications beyond fault simulation. This modeling flexibility will make the simulator more useful for doing verification and model comparison. For instance, two or more versions of an ALU can be comparatively simulated in a single execution. The results will show where and how the models differed so that the performance and correctness of the models may be evaluated. A considerable amount of time has been dedicated to validating the simulator performance on larger models provided by industry and other universities.
The Mystro system: A comprehensive translator toolkit
NASA Technical Reports Server (NTRS)
Collins, W. R.; Noonan, R. E.
1985-01-01
Mystro is a system that facilitates the construction of compilers, assemblers, code generators, query interpreters, and similar programs. It provides features to encourage the use of iterative enhancement. Mystro was developed in response to the needs of NASA Langley Research Center (LaRC) and enjoys a number of advantages over similar systems. There are other programs available that can be used in building translators. These typically build parser tables, usually supply the source of a parser and parts of a lexical analyzer, but provide little or no aid for code generation. In general, only the front end of the compiler is addressed. Mystro, on the other hand, emphasizes tools for both ends of a compiler.
Building pathway graphs from BioPAX data in R.
Benis, Nirupama; Schokker, Dirkjan; Kramer, Frank; Smits, Mari A; Suarez-Diez, Maria
2016-01-01
Biological pathways are increasingly available in the BioPAX format, which uses an RDF model for data storage. One can retrieve the information in this data model in the scripting language R using the package rBiopaxParser, which converts the BioPAX format to one readable in R. It also has a function to build a regulatory network from the pathway information. Here we describe an extension of this function. The new function allows the user to build graphs of entire pathways, including regulated as well as non-regulated elements, and therefore provides a maximum of information. This function is available as part of the rBiopaxParser distribution from Bioconductor.
Parsley: a Command-Line Parser for Astronomical Applications
NASA Astrophysics Data System (ADS)
Deich, William
Parsley is a sophisticated keyword + value parser, packaged as a library of routines that offers an easy method for providing command-line arguments to programs. It makes it easy for the user to enter values, and it makes it easy for the programmer to collect and validate the user's entries. Parsley is tuned for astronomical applications: for example, dates entered in Julian, Modified Julian, calendar, or several other formats are all recognized without special effort by the user or by the programmer; angles can be entered using decimal degrees or dd:mm:ss; time-like intervals as decimal hours, hh:mm:ss, or a variety of other units. Vectors of data are accepted as readily as scalars.
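Parsley itself is not reproduced here, but the kind of multi-format value parsing it offers can be sketched. The function below is a hypothetical Python analogue that accepts an angle either as decimal degrees or as sexagesimal dd:mm:ss, so the user need not announce which format they are using:

```python
def parse_angle(text):
    """Accept an angle as decimal degrees ('12.5') or sexagesimal
    dd:mm:ss ('12:30:00'), in the spirit of Parsley's flexible inputs.
    Illustrative only; not Parsley's actual API."""
    parts = text.split(":")
    if len(parts) == 1:
        return float(parts[0])
    # Pad a short dd:mm form to dd:mm:ss, then combine with the sign.
    dd, mm, ss = (float(p) for p in (parts + ["0", "0"])[:3])
    sign = -1.0 if text.lstrip().startswith("-") else 1.0
    return sign * (abs(dd) + mm / 60.0 + ss / 3600.0)

assert parse_angle("12.5") == 12.5
assert parse_angle("12:30:00") == 12.5
assert parse_angle("-12:30") == -12.5
```

The same dispatch-on-shape idea extends to Julian versus calendar dates and decimal hours versus hh:mm:ss, which is what makes such a parser convenient for astronomical command lines.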
Solving LR Conflicts Through Context Aware Scanning
NASA Astrophysics Data System (ADS)
Leon, C. Rodriguez; Forte, L. Garcia
2011-09-01
This paper presents a new algorithm to compute the exact list of tokens expected by any LR syntax analyzer at any point of the scanning process. The lexer can, at any time, compute the exact list of valid tokens so as to return only tokens in this set. In the case that more than one matching token is in the valid set, the lexer can resort to a nested LR parser to disambiguate. Allowing nested LR parsing requires some slight modifications when building the LR parsing tables. We also show how LR parsers can parse grammars with conflicts and inherently ambiguous languages using a combination of nested parsing and context-aware scanning. These expanded lexical analyzers can be generated from high-level specifications.
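Context-aware scanning is easy to illustrate: the lexer receives the set of token kinds the LR parser currently expects and only returns matches from that set. The token definitions below are invented for the sketch:

```python
import re

# Illustrative token definitions; note "if" matches both ID and KW_IF.
TOKEN_RES = [("NUM", r"\d+"), ("ID", r"[a-z]+"), ("KW_IF", r"if")]

def next_token(text, pos, valid):
    """Return the longest match at `pos` whose kind is in `valid`, the
    set of tokens the LR parser currently expects (illustrative sketch)."""
    best = None
    for kind, pattern in TOKEN_RES:
        if kind not in valid:
            continue  # context-aware: never return an unexpected token kind
        m = re.match(pattern, text[pos:])
        if m and (best is None or len(m.group()) > len(best[1])):
            best = (kind, m.group())
    return best

# The parser's expected-token set disambiguates the overlapping matches.
assert next_token("if x", 0, {"ID"}) == ("ID", "if")
assert next_token("if x", 0, {"KW_IF", "NUM"}) == ("KW_IF", "if")
```

When the valid set still admits more than one match, the paper's approach falls back on a nested LR parser; this sketch only shows the filtering step.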
Modular implementation of a digital hardware design automation system
NASA Astrophysics Data System (ADS)
Masud, M.
An automation system based on AHPL (A Hardware Programming Language) was developed. The project may be divided into three distinct phases: (1) upgrading of AHPL to make it more universally applicable; (2) implementation of a compiler for the language; and (3) illustration of how the compiler may be used to support several phases of design activities. Several new features were added to AHPL. These include: application-dependent parameters, multiple clocks, asynchronous results, functional registers and primitive functions. The new language, called Universal AHPL, has been defined rigorously. The compiler design is modular. The parsing is done by an automatic parser generated from the SLR(1) BNF grammar of the language. The compiler produces two data bases from the AHPL description of a circuit. The first one is a tabular representation of the circuit, and the second one is a detailed interconnection linked list. The two data bases provide a means to interface the compiler to application-dependent CAD systems.
Activity Scratchpad Prototype: Simplifying the Rover Activity Planning Cycle
NASA Technical Reports Server (NTRS)
Abramyan, Lucy
2005-01-01
The Mars Exploration Rover mission depends on the Science Activity Planner as its primary interface to the Spirit and Opportunity rovers. Scientists alternate between a series of mouse clicks and keyboard inputs to create a set of instructions for the rovers. To accelerate planning by minimizing mouse usage, a rover planning editor should receive the majority of input commands from the keyboard. Thorough investigation of the Eclipse platform's Java editor has provided an understanding of the base model for the Activity Scratchpad. Desirable Eclipse features can be mapped to specific rover planning commands, such as auto-completion for activity titles and content assist for target names. A custom editor imitating the Java editor's features was created with an XML parser for experimentation. The prototype editor minimized effort for redundant tasks and significantly improved the visual representation of XML syntax by highlighting keywords, coloring rules, folding projections, and providing hover assist, templates and an outline view of the code.
BIOSPIDA: A Relational Database Translator for NCBI.
Hagen, Matthew S; Lee, Eva K
2010-11-13
As the volume and availability of biological databases continue their widespread growth, it has become increasingly difficult for research scientists to identify all relevant information for biological entities of interest. Details of nucleotide sequences, gene expression, molecular interactions, and three-dimensional structures are maintained across many different databases. Retrieving all necessary information requires an integrated system that can query multiple databases with minimized overhead. This paper introduces a universal parser and relational schema translator that can be utilized for all NCBI databases in Abstract Syntax Notation (ASN.1). The data models for OMIM, Entrez-Gene, PubMed, MMDB and GenBank have been successfully converted into relational databases, and all are easily linkable, helping to answer complex biological questions. These tools enable research scientists to locally integrate databases from NCBI without significant workload or development time.
NASA Astrophysics Data System (ADS)
Zhang, Min; Pavlicek, William; Panda, Anshuman; Langer, Steve G.; Morin, Richard; Fetterly, Kenneth A.; Paden, Robert; Hanson, James; Wu, Lin-Wei; Wu, Teresa
2015-03-01
DICOM Index Tracker (DIT) is an integrated platform that harvests the rich information available from Digital Imaging and Communications in Medicine (DICOM) to improve quality assurance in radiology practices. It is designed to capture and maintain longitudinal patient-specific exam indices of interest for all diagnostic and procedural uses of imaging modalities, and thus effectively serves as a quality assurance and patient safety monitoring tool. The foundation of DIT is an intelligent database system which stores the information accepted and parsed via a DICOM receiver and parser; this database system enables basic dosimetry analysis. The success of the DIT implementation at Mayo Clinic Arizona calls for deployment at the enterprise level, which requires significant improvements. First, for a geographically distributed multi-site implementation, one bottleneck is communication (network) delay; another is the scalability of the DICOM parser to handle the large volume of exams from different sites. To address this issue, the DICOM receiver and parser are separated and decentralized by site. Second, a notable challenge for enterprise-wide quality assurance (QA) is the great diversity of manufacturers, modalities, and software versions; as a solution, DIT Enterprise provides standardization tools for device naming, protocol naming, and physician naming across sites. Third, advanced analytic engines are implemented online to support proactive QA in DIT Enterprise.
Toward a theory of distributed word expert natural language parsing
NASA Technical Reports Server (NTRS)
Rieger, C.; Small, S.
1981-01-01
An approach to natural language meaning-based parsing in which the unit of linguistic knowledge is the word rather than the rewrite rule is described. In the word expert parser, knowledge about language is distributed across a population of procedural experts, each representing a word of the language, and each an expert at diagnosing that word's intended usage in context. The parser is structured around a coroutine control environment in which the generator-like word experts ask questions and exchange information in coming to collective agreement on sentence meaning. The word expert theory is advanced as a better cognitive model of human language expertise than the traditional rule-based approach. The technical discussion is organized around examples taken from the prototype LISP system which implements parts of the theory.
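The word-expert idea, one small procedural expert per word diagnosing that word's intended usage in context, can be rendered as a toy. The original was a LISP coroutine system; in this simplification each expert runs once over a shared context, and the words and senses are invented:

```python
def expert_for(word):
    """Each word gets a small procedural expert that inspects its
    context and reports a sense. A toy rendering of the word-expert
    idea; the experts and senses here are invented."""
    def deep(context):
        return "submarine" if "water" in context else "profound"
    def throw(context):
        return "verb"
    return {"deep": deep, "throw": throw}.get(word, lambda ctx: "unknown")

def parse(sentence):
    words = sentence.split()
    # Run every word's expert against the full sentence as shared context.
    return {w: expert_for(w)(words) for w in words}

assert parse("the deep water")["deep"] == "submarine"
assert parse("a deep thought")["deep"] == "profound"
```

In the full theory the experts are coroutines that suspend, ask questions of one another, and resume, converging on a collective reading rather than each deciding in one shot as here.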
The power and limits of a rule-based morpho-semantic parser.
Baud, R. H.; Rassinoux, A. M.; Ruch, P.; Lovis, C.; Scherrer, J. R.
1999-01-01
The advent of the Electronic Patient Record (EPR) implies an increasing amount of medical text readily available for processing, as soon as convenient tools are made available. The chief application is text analysis, from which one can drive other disciplines such as indexing for retrieval, knowledge representation, translation and inferencing for medical intelligent systems. Prerequisites for a convenient analyzer of medical texts are: building the lexicon, developing a semantic representation of the domain, having a large corpus of texts available for statistical analysis, and finally mastering robust and powerful parsing techniques in order to satisfy the constraints of the medical domain. This article aims at presenting an easy-to-use parser ready to be adapted in different settings. It describes its power together with its practical limitations as experienced by the authors. PMID:10566313
Retrieval Interference in Syntactic Processing: The Case of Reflexive Binding in English.
Patil, Umesh; Vasishth, Shravan; Lewis, Richard L
2016-01-01
It has been proposed that in online sentence comprehension the dependency between a reflexive pronoun such as himself/herself and its antecedent is resolved using exclusively syntactic constraints. Under this strictly syntactic search account, Principle A of the binding theory-which requires that the antecedent c-command the reflexive within the same clause that the reflexive occurs in-constrains the parser's search for an antecedent. The parser thus ignores candidate antecedents that might match agreement features of the reflexive (e.g., gender) but are ineligible as potential antecedents because they are in structurally illicit positions. An alternative possibility accords no special status to structural constraints: in addition to using Principle A, the parser also uses non-structural cues such as gender to access the antecedent. According to cue-based retrieval theories of memory (e.g., Lewis and Vasishth, 2005), the use of non-structural cues should result in increased retrieval times and occasional errors when candidates partially match the cues, even if the candidates are in structurally illicit positions. In this paper, we first show how the retrieval processes that underlie the reflexive binding are naturally realized in the Lewis and Vasishth (2005) model. We present the predictions of the model under the assumption that both structural and non-structural cues are used during retrieval, and provide a critical analysis of previous empirical studies that failed to find evidence for the use of non-structural cues, suggesting that these failures may be Type II errors. We use this analysis and the results of further modeling to motivate a new empirical design that we use in an eye tracking study. The results of this study confirm the key predictions of the model concerning the use of non-structural cues, and are inconsistent with the strictly syntactic search account. 
These results present a challenge for theories advocating the infallibility of the human parser in the case of reflexive resolution, and provide support for the inclusion of agreement features such as gender in the set of retrieval cues. PMID:27303315
BIOSPIDA: A Relational Database Translator for NCBI
Hagen, Matthew S.; Lee, Eva K.
2010-01-01
As the volume and availability of biological databases continue to grow, it has become increasingly difficult for research scientists to identify all relevant information for biological entities of interest. Details of nucleotide sequences, gene expression, molecular interactions, and three-dimensional structures are maintained across many different databases. Retrieving all necessary information requires an integrated system that can query multiple databases with minimized overhead. This paper introduces a universal parser and relational schema translator that can be utilized for all NCBI databases in Abstract Syntax Notation (ASN.1). The data models for OMIM, Entrez-Gene, Pubmed, MMDB and GenBank have been successfully converted into relational databases, and all are easily linkable, helping to answer complex biological questions. These tools enable research scientists to locally integrate databases from NCBI without significant workload or development time. PMID:21347013
Automatic Parsing of Parental Verbal Input
Sagae, Kenji; MacWhinney, Brian; Lavie, Alon
2006-01-01
To evaluate theoretical proposals regarding the course of child language acquisition, researchers often need to rely on the processing of large numbers of syntactically parsed utterances, both from children and their parents. Because it is so difficult to do this by hand, there are currently no parsed corpora of child language input data. To automate this process, we developed a system that combined the MOR tagger, a rule-based parser, and statistical disambiguation techniques. The resultant system obtained nearly 80% correct parses for the sentences spoken to children. To achieve this level, we had to construct a particular processing sequence that minimizes problems caused by the coverage/ambiguity trade-off in parser design. These procedures are particularly appropriate for use with the CHILDES database, an international corpus of transcripts. The data and programs are now freely available over the Internet. PMID:15190707
Incremental Refinement of Façade Models with Attribute Grammar from 3D Point Clouds
NASA Astrophysics Data System (ADS)
Dehbi, Y.; Staat, C.; Mandtler, L.; Plümer, L.
2016-06-01
Data acquisition using unmanned aerial vehicles (UAVs) has received increasing attention in recent years. Especially in the field of building reconstruction, the incremental interpretation of such data is a demanding task. In this context formal grammars play an important role for the top-down identification and reconstruction of building objects. Up to now, the available approaches expect offline data in order to parse an a priori known grammar. For mapping on demand, an on-the-fly reconstruction based on UAV data is required, and an incremental interpretation of the data stream is inevitable. This paper presents an incremental parser of grammar rules for automatic 3D building reconstruction. The parser enables model refinement based on new observations with respect to a weighted attribute context-free grammar (WACFG). The falsification or rejection of hypotheses is supported as well. The parser can deal with and adapt available parse trees acquired from previous interpretations or predictions. Parse trees derived so far are updated in an iterative way using transformation rules. A diagnostic step searches for mismatches between current and new nodes. Prior knowledge on façades is incorporated, given by probability densities as well as architectural patterns. Since we cannot always assume normal distributions, the derivation of location and shape parameters of building objects is based on a kernel density estimation (KDE). While the level of detail is continuously improved, geometrical, semantic and topological consistency is ensured.
Neuroanatomical term generation and comparison between two terminologies.
Srinivas, Prashanti R; Gusfield, Daniel; Mason, Oliver; Gertz, Michael; Hogarth, Michael; Stone, James; Jones, Edward G; Gorin, Fredric A
2003-01-01
An approach and software tools are described for identifying and extracting compound terms (CTs), acronyms and their associated contexts from textual material associated with neuroanatomical atlases. A set of simple syntactic rules was appended to the output of a commercially available part-of-speech (POS) tagger (Qtag v 3.01) that extracts CTs and their associated context from the texts of neuroanatomical atlases. This "hybrid" parser appears to be highly sensitive and recognized 96% of the potentially germane neuroanatomical CTs and acronyms present in the cat and primate thalamic atlases. A comparison of neuroanatomical CTs and acronyms between the cat and primate atlas texts was initially performed using exact-term matching. The implementation of string-matching algorithms significantly improved the identification of relevant terms and acronyms between the two domains. The End Gap Free string matcher identified 98% of CTs and the Needleman-Wunsch (NW) string matcher matched 36% of acronyms between the two atlases. Combining several simple grammatical and lexical rules with the POS tagger (the "hybrid parser") (1) extracted complex neuroanatomical terms and acronyms from selected cat and primate thalamic atlases and (2) facilitated the semi-automated generation of a highly granular thalamic terminology. The implementation of string-matching algorithms (1) reconciled terminological errors generated by the optical character recognition (OCR) software used to generate the neuroanatomical text information and (2) increased the sensitivity of matching neuroanatomical terms and acronyms between the two neuroanatomical domains generated by the "hybrid" parser.
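The Needleman-Wunsch matcher mentioned above is a standard dynamic-programming algorithm for global string alignment; a minimal Python sketch follows, with illustrative scoring parameters and acronyms (not the values or terms used in the study):

```python
# Hedged sketch of Needleman-Wunsch global alignment; scores are illustrative.
def needleman_wunsch(a, b, match=1, mismatch=-1, gap=-1):
    # dp[i][j] = best score aligning a[:i] with b[:j]
    dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i in range(1, len(a) + 1):
        dp[i][0] = dp[i - 1][0] + gap
    for j in range(1, len(b) + 1):
        dp[0][j] = dp[0][j - 1] + gap
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            diag = dp[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            dp[i][j] = max(diag, dp[i - 1][j] + gap, dp[i][j - 1] + gap)
    return dp[len(a)][len(b)]

# Near-identical acronyms score higher than unrelated ones:
print(needleman_wunsch("VPL", "VPLc"))  # 2 (three matches, one gap)
print(needleman_wunsch("VPL", "MGN"))   # -3 (three mismatches)
```

Thresholding such scores is one plausible way an acronym from one atlas could be matched to a near-variant (or an OCR-garbled form) in the other.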
GBParsy: a GenBank flatfile parser library with high speed.
Lee, Tae-Ho; Kim, Yeon-Ki; Nahm, Baek Hie
2008-07-25
GenBank flatfile (GBF) format is one of the most popular sequence file formats because of its detailed sequence features and ease of readability. To use the data in the file by computer, a parsing process is required, performed according to a given grammar for the sequence and the description in a GBF. Several parser libraries for the GBF have already been developed. However, with the accumulation of DNA sequence information from eukaryotic chromosomes, parsing a eukaryotic genome sequence with these libraries inevitably takes a long time, due to the large GBF file and its correspondingly large genomic nucleotide sequence and related feature information. Thus, there is a significant need for a parsing program with high speed and efficient use of system memory. We developed GBParsy, a C-based library that parses GBF files. Parsing speed was maximized by using content-specified functions in place of regular expressions, which are flexible but slow. In addition, we optimized an algorithm related to memory usage, which also increased parsing performance and memory efficiency. GBParsy is at least 5-100x faster than current parsers in benchmark tests and is estimated to extract annotated information from almost 100 Mb of a GenBank flatfile of chromosomal sequence information within a second. Thus, it should be useful for a variety of applications such as on-time visualization of a genome at a web site.
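The design choice behind GBParsy's speed, replacing regular expressions with content-specified handling of the flatfile's fixed-column layout, can be sketched in a few lines. This toy Python parser and its sample record are illustrative only, not GBParsy's C implementation:

```python
# GenBank flatfiles put the keyword in columns 1-12 and the value after,
# so plain string slicing can stand in for regex matching. Toy record below.
SAMPLE = """LOCUS       AB000001     120 bp    DNA     linear   PLN 01-JAN-2008
DEFINITION  Example record for parser demonstration.
ORIGIN
        1 atgcatgcat gcatgcatgc
       21 atgcatgcat
//
"""

def parse_gbf(text):
    record, seq_parts, in_origin = {}, [], False
    for line in text.splitlines():
        if line.startswith("//"):              # end-of-record marker
            break
        if in_origin:
            # sequence lines: position number, then space-separated 10-mers
            seq_parts.extend(line.split()[1:])
        elif line.startswith("ORIGIN"):
            in_origin = True
        elif line[:12].strip():                # keyword present in columns 1-12
            record[line[:12].strip()] = line[12:].strip()
    record["sequence"] = "".join(seq_parts)
    return record

rec = parse_gbf(SAMPLE)
print(rec["LOCUS"].split()[0])   # AB000001
print(len(rec["sequence"]))      # 30
```

A real GBF parser must also handle continuation lines and the nested FEATURES table, but the slicing idiom above is the essence of the "content-specified functions" approach.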
Disambiguating the species of biomedical named entities using natural language parsers
Wang, Xinglong; Tsujii, Jun'ichi; Ananiadou, Sophia
2010-01-01
Motivation: Text mining technologies have been shown to reduce the laborious work involved in organizing the vast amount of information hidden in the literature. One challenge in text mining is linking ambiguous word forms to unambiguous biological concepts. This article reports on a comprehensive study on resolving the ambiguity in mentions of biomedical named entities with respect to model organisms and presents an array of approaches, with focus on methods utilizing natural language parsers. Results: We build a corpus for organism disambiguation where every occurrence of a protein/gene entity is manually tagged with a species ID, and evaluate a number of methods on it. Promising results are obtained by training a machine learning model on syntactic parse trees, which is then used to decide whether an entity belongs to the model organism denoted by a neighbouring species-indicating word (e.g. yeast). The parser-based approaches are also compared with a supervised classification method, and results indicate that the former are a more favorable choice when domain portability is of concern. The best overall performance is obtained by combining the strengths of syntactic features and supervised classification. Availability: The corpus and demo are available at http://www.nactem.ac.uk/deca_details/start.cgi, and the software is freely available as U-Compare components (Kano et al., 2009): NaCTeM Species Word Detector and NaCTeM Species Disambiguator. U-Compare is available at http://u-compare.org/ Contact: xinglong.wang@manchester.ac.uk PMID:20053840
FLIP for FLAG model visualization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wooten, Hasani Omar
A graphical user interface has been developed for FLAG users. FLIP (FLAG Input deck Parser) provides users with an organized view of FLAG models and a means for efficiently and easily navigating and editing nodes, parameters, and variables.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Busby, L.
This is an adaptation of the pre-existing Scimark benchmark code to a variety of Python and Lua implementations. It also measures performance of the Fparser expression parser and C and C++ code on a variety of simple scientific expressions.
A translator writing system for microcomputer high-level languages and assemblers
NASA Technical Reports Server (NTRS)
Collins, W. R.; Knight, J. C.; Noonan, R. E.
1980-01-01
In order to implement high level languages whenever possible, a translator writing system of advanced design was developed. It is intended for routine production use by many programmers working on different projects. As well as a fairly conventional parser generator, it includes a system for the rapid generation of table driven code generators. The parser generator was developed from a prototype version. The translator writing system includes various tools for the management of the source text of a compiler under construction. In addition, it supplies various default source code sections so that its output is always compilable and executable. The system thereby encourages iterative enhancement as a development methodology by ensuring an executable program from the earliest stages of a compiler development project. The translator writing system includes PASCAL/48 compiler, three assemblers, and two compilers for a subset of HAL/S.
Vulnerabilities in Bytecode Removed by Analysis, Nuanced Confinement and Diversification (VIBRANCE)
2015-06-01
The VIBRANCE tool starts with a vulnerable Java application and automatically hardens it against SQL injection, OS command injection, and file path traversal vulnerabilities. Components include a Java front end and a Java byte code parser.
USPAS | U.S. Particle Accelerator School
U.S. Particle Accelerator School: Education in Beam Physics and Accelerator Technology. The school grants university credits through university-style and symposium-style programs, including a Joint International Accelerator School.
Chen, W; Kowatch, R; Lin, S; Splaingard, M; Huang, Y
2015-01-01
Nationwide Children's Hospital established an i2b2 (Informatics for Integrating Biology & the Bedside) application for sleep disorder cohort identification. Discrete data were gleaned from semistructured sleep study reports. The system was shown to work more efficiently than the traditional manual chart review method, and it also enabled searching capabilities that were previously not possible. We report on the development and implementation of the sleep disorder i2b2 cohort identification system using natural language processing of semi-structured documents. We developed a natural language processing approach to automatically parse concepts and their values from semi-structured sleep study documents. Two parsers were developed: a regular expression parser for extracting numeric concepts and an NLP-based tree parser for extracting textual concepts. Concepts were further organized into i2b2 ontologies based on document structures and in-domain knowledge. 26,550 concepts were extracted, 99% of them textual, along with 1.01 million facts from sleep study documents, such as demographic information, sleep study lab results, medications, procedures, and diagnoses. The average accuracy of terminology parsing was over 83% when compared against annotations by experts. The system is capable of capturing both standard and non-standard terminologies, and the time for cohort identification has been reduced from a few weeks to a few seconds. Natural language processing was shown to be powerful for quickly converting large amounts of semi-structured or unstructured clinical data into discrete concepts, which, in combination with intuitive domain-specific ontologies, allows fast and effective interactive cohort identification through the i2b2 platform for research and clinical use.
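The regular-expression side of such a two-parser pipeline is easy to illustrate. A minimal Python sketch follows; the concept names and report lines are invented for demonstration, not the hospital's actual sleep study format:

```python
# Pull "name: numeric value" concepts out of semi-structured report text.
# Report content below is fabricated for illustration.
import re

REPORT = """Total sleep time: 412 min
Sleep efficiency: 88.5 %
Apnea-hypopnea index: 4.2
"""

# "name : number [unit]" with the number captured as a float
PATTERN = re.compile(r"^(?P<name>[\w -]+?)\s*:\s*(?P<value>\d+(?:\.\d+)?)", re.M)

facts = {m.group("name"): float(m.group("value")) for m in PATTERN.finditer(REPORT)}
print(facts)
# {'Total sleep time': 412.0, 'Sleep efficiency': 88.5, 'Apnea-hypopnea index': 4.2}
```

Textual concepts (e.g. free-text impressions) resist this treatment, which is presumably why the authors paired the regex parser with an NLP-based tree parser.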
Wh-filler-gap dependency formation guides reflexive antecedent search
Frazier, Michael; Ackerman, Lauren; Baumann, Peter; Potter, David; Yoshida, Masaya
2015-01-01
Prior studies on online sentence processing have shown that the parser can resolve non-local dependencies rapidly and accurately. This study investigates the interaction between the processing of two such non-local dependencies: wh-filler-gap dependencies (WhFGD) and reflexive-antecedent dependencies. We show that reflexive-antecedent dependency resolution is sensitive to the presence of a WhFGD, and argue that the filler-gap dependency established by WhFGD resolution is selected online as the antecedent of a reflexive dependency. We investigate the processing of constructions like (1), where two NPs might be possible antecedents for the reflexive, namely which cowgirl and Mary. Even though Mary is linearly closer to the reflexive, the only grammatically licit antecedent for the reflexive is the more distant wh-NP, which cowgirl. (1). Which cowgirl did Mary expect to have injured herself due to negligence? Four eye-tracking text-reading experiments were conducted on examples like (1), differing in whether the embedded clause was non-finite (1 and 3) or finite (2 and 4), and in whether the tail of the wh-dependency intervened between the reflexive and its closest overt antecedent (1 and 2) or the wh-dependency was associated with a position earlier in the sentence (3 and 4). The results of Experiments 1 and 2 indicate the parser accesses the result of WhFGD formation during reflexive antecedent search. The resolution of a wh-dependency alters the representation that reflexive antecedent search operates over, allowing the grammatical but linearly distant antecedent to be accessed rapidly. In the absence of a long-distance WhFGD (Experiments 3 and 4), wh-NPs were not found to impact reading times of the reflexive, indicating that the parser's ability to select distant wh-NPs as reflexive antecedents crucially involves syntactic structure. PMID:26500579
Construction of a menu-based system
NASA Technical Reports Server (NTRS)
Noonan, R. E.; Collins, W. R.
1985-01-01
The development of the user interface to a software code management system is discussed. The user interface was specified using a grammar and implemented using an LR parser generator. This was found to be an effective method for the rapid prototyping of a menu-based system.
Identifying the null subject: evidence from event-related brain potentials.
Demestre, J; Meltzer, S; García-Albea, J E; Vigil, A
1999-05-01
Event-related brain potentials (ERPs) were recorded during spoken language comprehension to study the on-line effects of gender agreement violations in controlled infinitival complements. Spanish sentences were constructed in which the complement clause contained a predicate adjective marked for syntactic gender. By manipulating the gender of the antecedent (i.e., the controller) of the implicit subject while holding constant the gender of the adjective, pairs of grammatical and ungrammatical sentences were created. The detection of such a gender agreement violation would indicate that the parser had established the coreference relation between the null subject and its antecedent. The results showed a complex biphasic ERP (i.e., an early negativity with prominence at anterior and central sites, followed by a centroparietal positivity) in the violating condition as compared to the non-violating conditions. The brain reacts to NP-adjective gender agreement violations within a few hundred milliseconds of their occurrence. The data imply that the parser has properly coindexed the null subject of an infinitive clause with its antecedent.
Competing explanations for cosmic acceleration or why is the expansion of the universe accelerating?
NASA Astrophysics Data System (ADS)
Ishak, Mustapha
2012-06-01
For more than a decade, a number of cosmological observations have been indicating that the expansion of the universe is accelerating. Cosmic acceleration and the questions associated with it have become one of the most challenging and puzzling problems in cosmology and physics. Cosmic acceleration can be caused by (i) a repulsive dark energy pervading the universe, (ii) an extension to General Relativity that takes effect at cosmological scales of distance, or (iii) an apparent effect due to the fact that the expansion rate of space-time is uneven from one region to another in the universe. I will review the basics of these possibilities and present some recent results, including our own, on these questions.
NASA Astrophysics Data System (ADS)
Barletta, William A.
2009-03-01
Only a handful of universities in the US offer any formal training in accelerator science. The United States Particle Accelerator School (USPAS) is a national graduate educational program that has developed a highly successful educational paradigm and, over the past twenty years, has granted more university credit in accelerator/beam science and technology than any university in the world. Sessions are held twice annually, hosted by major US research universities that certify the USPAS faculty and grant course credit. The USPAS paradigm is readily extensible to other rapidly developing, cross-disciplinary research areas such as high energy density physics.
Multimedia CALLware: The Developer's Responsibility.
ERIC Educational Resources Information Center
Dodigovic, Marina
The early computer-assisted-language-learning (CALL) programs were silent and mostly limited to screen or printer supported written text as the prevailing communication resource. The advent of powerful graphics, sound and video combined with AI-based parsers and sound recognition devices gradually turned the computer into a rather anthropomorphic…
Mention Detection: Heuristics for the OntoNotes Annotations
2011-01-01
Mention Detection: Heuristics for the OntoNotes annotations. Jonathan K. Kummerfeld, Mohit Bansal, David Burkett and Dan Klein. …considered the provided parses and parses produced by the Berkeley parser (Petrov et al., 2006) trained on the provided training data. We added a…
Overview of graduate training program of John Adams Institute for Accelerator Science
NASA Astrophysics Data System (ADS)
Seryi, Andrei
The John Adams Institute for Accelerator Science is a center of excellence in the UK for advanced and novel accelerator technology, providing expertise, research, development and training in accelerator techniques, and promoting advanced accelerator applications in science and society. In JAI we work on the design of novel light sources, upgrades of 3rd-generation sources and novel FELs, on plasma acceleration and its application to industrial and medical fields, on novel energy-recovery compact linacs and advanced beam diagnostics, and many other projects. The JAI is based at three universities: University of Oxford, Imperial College London and Royal Holloway University of London. Every year 6 to 10 accelerator science experts, trained via research on cutting-edge projects, defend their PhD theses at JAI partner universities. In this presentation we will overview the research and, in particular, the highly successful graduate training program at JAI.
Effective Teaching in Accelerated Learning Programs
ERIC Educational Resources Information Center
Boyd, Drick
2004-01-01
According to Wlodkowski (2003), "accelerated learning programs are one of the fastest growing transformations in higher education" (p. 5). The Center for the Study of Accelerated Learning at Regis University has documented at least 250 colleges or universities that offer accelerated learning programs for working adults. By definition, accelerated…
Grammar as a Programming Language. Artificial Intelligence Memo 391.
ERIC Educational Resources Information Center
Rowe, Neil
Student projects that involve writing generative grammars in the computer language, "LOGO," are described in this paper, which presents a grammar-running control structure that allows students to modify and improve the grammar interpreter itself while learning how a simple kind of computer parser works. Included are procedures for…
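The exercise described above translates naturally into a few lines of code. A hedged sketch, in Python rather than LOGO, of running a generative grammar as a program (the toy grammar itself is invented for illustration):

```python
# A generative grammar as data plus a tiny interpreter: rules map a symbol
# to alternative right-hand sides; expansion recurses until only words remain.
import random

GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["the", "N"]],
    "VP": [["V", "NP"]],
    "N":  [["dog"], ["cat"]],
    "V":  [["sees"], ["chases"]],
}

def generate(symbol, rng):
    if symbol not in GRAMMAR:            # terminal: a plain word
        return [symbol]
    rhs = rng.choice(GRAMMAR[symbol])    # pick one alternative
    return [w for part in rhs for w in generate(part, rng)]

print(" ".join(generate("S", random.Random(0))))
```

Students can then "improve the interpreter" exactly as the memo suggests, for instance by adding weighted alternatives or agreement features to the rules.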
The Effect of Syntactic Constraints on the Processing of Backwards Anaphora
ERIC Educational Resources Information Center
Kazanina, Nina; Lau, Ellen F.; Lieberman, Moti; Yoshida, Masaya; Phillips, Colin
2007-01-01
This article presents three studies that investigate when syntactic constraints become available during the processing of long-distance backwards pronominal dependencies ("backwards anaphora" or "cataphora"). Earlier work demonstrated that in such structures the parser initiates an active search for an antecedent for a pronoun, leading to gender…
A natural language interface to databases
NASA Technical Reports Server (NTRS)
Ford, D. R.
1988-01-01
The development of a Natural Language Interface which is semantic-based and uses Conceptual Dependency representation is presented. The system was developed using Lisp and currently runs on a Symbolics Lisp machine. A key point is that the parser handles morphological analysis, which expands its capability to understand more words.
Brain Responses to Filled Gaps
ERIC Educational Resources Information Center
Hestvik, Arild; Maxfield, Nathan; Schwartz, Richard G.; Shafer, Valerie
2007-01-01
An unresolved issue in the study of sentence comprehension is whether the process of gap-filling is mediated by the construction of empty categories (traces), or whether the parser relates fillers directly to the associated verb's argument structure. We conducted an event-related potentials (ERP) study that used the violation paradigm to examine…
Robo-Sensei's NLP-Based Error Detection and Feedback Generation
ERIC Educational Resources Information Center
Nagata, Noriko
2009-01-01
This paper presents a new version of Robo-Sensei's NLP (Natural Language Processing) system which updates the version currently available as the software package "ROBO-SENSEI: Personal Japanese Tutor" (Nagata, 2004). Robo-Sensei's NLP system includes a lexicon, a morphological generator, a word segmentor, a morphological parser, a syntactic…
HTSeq--a Python framework to work with high-throughput sequencing data.
Anders, Simon; Pyl, Paul Theodor; Huber, Wolfgang
2015-01-15
A large choice of tools exists for many standard tasks in the analysis of high-throughput sequencing (HTS) data. However, once a project deviates from standard workflows, custom scripts are needed. We present HTSeq, a Python library to facilitate the rapid development of such scripts. HTSeq offers parsers for many common data formats in HTS projects, as well as classes to represent data, such as genomic coordinates, sequences, sequencing reads, alignments, gene model information and variant calls, and provides data structures that allow for querying via genomic coordinates. We also present htseq-count, a tool developed with HTSeq that preprocesses RNA-Seq data for differential expression analysis by counting the overlap of reads with genes. HTSeq is released as an open-source software under the GNU General Public Licence and available from http://www-huber.embl.de/HTSeq or from the Python Package Index at https://pypi.python.org/pypi/HTSeq. © The Author 2014. Published by Oxford University Press.
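The counting step performed by htseq-count can be illustrated without the library itself. A pure-Python sketch of the default union-mode idea, with invented gene and read coordinates:

```python
# For each gene, count reads whose aligned interval overlaps that gene's
# interval; reads overlapping more than one gene are ambiguous and skipped
# (htseq-count's default behavior). Coordinates are invented for illustration.
def overlaps(a_start, a_end, b_start, b_end):
    return a_start < b_end and b_start < a_end   # half-open intervals

genes = {"geneA": (100, 500), "geneB": (450, 900)}
reads = [(120, 170), (480, 530), (950, 1000)]

counts = {g: 0 for g in genes}
for r_start, r_end in reads:
    hits = [g for g, (s, e) in genes.items() if overlaps(r_start, r_end, s, e)]
    if len(hits) == 1:                 # unambiguous assignment only
        counts[hits[0]] += 1

print(counts)   # {'geneA': 1, 'geneB': 0}
```

Here the second read falls in the geneA/geneB overlap and is discarded as ambiguous; HTSeq's `GenomicArrayOfSets` makes the same lookup efficient over whole genomes.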
Coaching versus Direct Service Models for University Training to Accelerated Schools.
ERIC Educational Resources Information Center
Kirby, Peggy C.; Meza, James, Jr.
This paper examines the changing roles and relationships of schools, central offices, and university facilitators at 11 schools that implemented the nationally recognized Accelerated Schools process. The schools joined the Louisiana Accelerated Schools Network in the summer of 1994. The paper begins with an overview of the Accelerated Schools…
Psychological Adjustment in a College-Level Program of Marked Academic Acceleration.
ERIC Educational Resources Information Center
Robinson, Nancy M.; Janos, Paul M.
1986-01-01
The questionnaire responses of 24 markedly accelerated young students at the University of Washington were compared with those of 24 regular-aged university students, 23 National Merit Scholars, and 27 students who had qualified for acceleration but instead elected to participate in high school. Accelerants appeared as well adjusted as all…
ERIC Educational Resources Information Center
Jared, Debra; Jouravlev, Olessia; Joanisse, Marc F.
2017-01-01
Decomposition theories of morphological processing in visual word recognition posit an early morpho-orthographic parser that is blind to semantic information, whereas parallel distributed processing (PDP) theories assume that the transparency of orthographic-semantic relationships influences processing from the beginning. To test these…
ERIC Educational Resources Information Center
Maxfield, Nathan D.; Lyon, Justine M.; Silliman, Elaine R.
2009-01-01
Bailey and Ferreira (2003) hypothesized and reported behavioral evidence that disfluencies (filled and silent pauses) undesirably affect sentence processing when they appear before disambiguating verbs in Garden Path (GP) sentences. Disfluencies here cause the parser to "linger" on, and apparently accept as correct, an erroneous parse. Critically,…
Intelligent interfaces for expert systems
NASA Technical Reports Server (NTRS)
Villarreal, James A.; Wang, Lui
1988-01-01
Vital to the success of an expert system is an interface to the user which performs intelligently. A generic intelligent interface is being developed for expert systems. This intelligent interface was developed around the in-house-developed Expert System for the Flight Analysis System (ESFAS). The Flight Analysis System (FAS) is comprised of 84 configuration-controlled FORTRAN subroutines that are used in the preflight analysis of the space shuttle. In order to use FAS proficiently, a person must be knowledgeable in the areas of flight mechanics, the procedures involved in deploying a certain payload, and an overall understanding of the FAS. ESFAS, still in its developmental stage, is taking into account much of this knowledge. The generic intelligent interface involves the integration of a speech recognizer and synthesizer, a preparser, and a natural language parser to ESFAS. The speech recognizer being used is capable of recognizing 1000 words of connected speech. The natural language parser is a commercial software package which uses caseframe instantiation in processing the streams of words from the speech recognizer or the keyboard. The system's configuration is described along with its capabilities and drawbacks.
Two models of minimalist, incremental syntactic analysis.
Stabler, Edward P
2013-07-01
Minimalist grammars (MGs) and multiple context-free grammars (MCFGs) are weakly equivalent in the sense that they define the same languages, a large mildly context-sensitive class that properly includes context-free languages. But in addition, for each MG, there is an MCFG which is strongly equivalent in the sense that it defines the same language with isomorphic derivations. However, the structure-building rules of MGs but not MCFGs are defined in a way that generalizes across categories. Consequently, MGs can be exponentially more succinct than their MCFG equivalents, and this difference shows in parsing models too. An incremental, top-down beam parser for MGs is defined here, sound and complete for all MGs, and hence also capable of parsing all MCFG languages. But since the parser represents its grammar transparently, the relative succinctness of MGs is again evident. Although the determinants of MG structure are narrowly and discretely defined, probabilistic influences from a much broader domain can influence even the earliest analytic steps, allowing frequency and context effects to come early and from almost anywhere, as expected in incremental models. Copyright © 2013 Cognitive Science Society, Inc.
Supernovae, an accelerating universe and the cosmological constant
Kirshner, Robert P.
1999-01-01
Observations of supernova explosions halfway back to the Big Bang give plausible evidence that the expansion of the universe has been accelerating since that epoch, approximately 8 billion years ago, and suggest that energy associated with the vacuum itself may be responsible for the acceleration. PMID:10200242
Reading Orthographically Strange Nonwords: Modelling Backup Strategies in Reading
ERIC Educational Resources Information Center
Perry, Conrad
2018-01-01
The latest version of the connectionist dual process model of reading (CDP++.parser) was tested on a set of nonwords, many of which were orthographically strange (e.g., PSIZ). A grapheme-by-grapheme read-out strategy was used because the normal strategy produced many poor responses. The new strategy allowed the model to produce results similar to…
Working Memory in the Processing of Long-Distance Dependencies: Interference and Filler Maintenance
ERIC Educational Resources Information Center
Ness, Tal; Meltzer-Asscher, Aya
2017-01-01
During the temporal delay between the filler and gap sites in long-distance dependencies, the "active filler" strategy can be implemented in two ways: the filler phrase can be actively maintained in working memory ("maintenance account"), or it can be retrieved only when the parser posits a gap ("retrieval account").…
ERIC Educational Resources Information Center
Heift, Trude; Schulze, Mathias
2012-01-01
This book provides the first comprehensive overview of theoretical issues, historical developments and current trends in ICALL (Intelligent Computer-Assisted Language Learning). It assumes a basic familiarity with Second Language Acquisition (SLA) theory and teaching, CALL and linguistics. It is of interest to upper undergraduate and/or graduate…
NASA Astrophysics Data System (ADS)
Zhu, Zong-Hong; Alcaniz, Jailson S.
2005-02-01
There is mounting observational evidence that the expansion of our universe is undergoing an acceleration. A dark energy component has usually been invoked as the most feasible mechanism for the acceleration. However, it is desirable to explore alternative possibilities motivated by particle physics before adopting such an untested entity. In this work, we focus our attention on an acceleration mechanism arising from gravitational leakage into extra dimensions. We test this scenario with high-z Type Ia supernovae compiled by Tonry and coworkers and recent measurements of the X-ray gas mass fractions in clusters of galaxies published by Allen and coworkers. A combination of the two databases gives, at a 99% confidence level, Ωm = 0.29 (+0.04, −0.02), Ωrc = 0.21 ± 0.08, and Ωk = −0.36 (+0.31, −0.35), indicating a closed universe. We then constrain the model using the test of the turnaround redshift z_(q=0), at which the universe switches from deceleration to acceleration. We show that, in order to explain that acceleration happened earlier than z_(q=0) = 0.6 within the framework of gravitational leakage into extra dimensions, a low matter density, Ωm < 0.27, or a closed universe is necessary.
Development of the Accelerator Mass Spectrometry technology at the Comenius University in Bratislava
NASA Astrophysics Data System (ADS)
Povinec, Pavel P.; Masarik, Jozef; Ješkovský, Miroslav; Kaizer, Jakub; Šivo, Alexander; Breier, Robert; Pánik, Ján; Staníček, Jaroslav; Richtáriková, Marta; Zahoran, Miroslav; Zeman, Jakub
2015-10-01
An Accelerator Mass Spectrometry (AMS) laboratory has been established at the Centre for Nuclear and Accelerator Technologies (CENTA) at the Comenius University in Bratislava, comprising an MC-SNICS ion source, a 3 MV Pelletron tandem accelerator, and an analyzer of accelerated ions. The preparation of targets for 14C and 129I AMS measurements is described in detail. The development of AMS techniques for potassium, uranium and thorium analysis in radiopure materials required for ultra-low background underground experiments is briefly mentioned.
Accelerated cosmos in a nonextensive setup
NASA Astrophysics Data System (ADS)
Moradpour, H.; Bonilla, Alexander; Abreu, Everton M. C.; Neto, Jorge Ananias
2017-12-01
Here we consider a flat FRW universe whose horizon entropy meets the Rényi entropy of nonextensive systems. In our model, the ordinary energy-momentum conservation law is not always valid. By applying the Clausius relation as well as the Cai-Kim temperature to the apparent horizon of a flat FRW universe, we obtain modified Friedmann equations. Fitting the model to the observational data on the current accelerated universe, some values for the model parameters are also addressed. Our study shows that the current accelerating phase of universe expansion may be described by a geometrical fluid, originated from the nonextensive aspects of geometry, which models a varying dark energy source interacting with the matter field in the Rastall way. Moreover, our results indicate that the probable nonextensive features of spacetime may also be used to model a varying dark energy source which does not interact with the matter field and is compatible with the current accelerated phase of the Universe.
Lazzarato, F; Franceschinis, G; Botta, M; Cordero, F; Calogero, R A
2004-11-01
RRE allows the extraction of non-coding regions surrounding a coding sequence [i.e. gene upstream region, 5'-untranslated region (5'-UTR), introns, 3'-UTR, downstream region] from annotated genomic datasets available at NCBI. RRE parser and web-based interface are accessible at http://www.bioinformatica.unito.it/bioinformatics/rre/rre.html
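The region extraction RRE performs can be pictured as simple coordinate slicing around a coding sequence. The sketch below is an illustration only, not the RRE tool; the function name and the flat genome-string representation are hypothetical, and coordinates are 0-based half-open:

```python
# Illustrative sketch (not the RRE tool itself): slicing the non-coding
# regions around a coding sequence out of a genomic string.
# Coordinates are 0-based half-open; all names are hypothetical.
def surrounding_regions(genome, cds_start, cds_end, flank=3):
    """Return the regions an RRE-style tool extracts around a CDS:
    the upstream flank, the coding sequence, and the downstream flank."""
    return {
        "upstream":   genome[max(0, cds_start - flank):cds_start],
        "cds":        genome[cds_start:cds_end],
        "downstream": genome[cds_end:cds_end + flank],
    }
```

The real tool works from NCBI annotation and additionally distinguishes 5'-UTR, introns, and 3'-UTR, which requires exon coordinates rather than a single CDS interval.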
Sorry Dave, I’m Afraid I Can’t Do That: Explaining Unachievable Robot Tasks using Natural Language
2013-06-24
processing components used by Brooks et al. [6]: the Bikel parser [3] combined with the null element (understood subject) restoration of Gabbard et al…Intelligent Robots and Systems (IROS), pages 1988-1993, 2010. [12] Ryan Gabbard, Mitch Marcus, and Seth Kulick. Fully parsing the Penn Treebank. In Human
ERIC Educational Resources Information Center
Metzner, Paul; von der Malsburg, Titus; Vasishth, Shravan; Rösler, Frank
2017-01-01
How important is the ability to freely control eye movements for reading comprehension? And how does the parser make use of this freedom? We investigated these questions using coregistration of eye movements and event-related brain potentials (ERPs) while participants read either freely or in a computer-controlled word-by-word format (also known…
Integrated Intelligence: Robot Instruction via Interactive Grounded Learning
2016-02-14
ADDRESS(ES): U.S. Army Research Office, P.O. Box 12211, Research Triangle Park, NC 27709-2211. Robotics; Natural Language Processing; Grounded Language …Logical Forms for Referring Expression Generation, Empirical Methods in Natural Language Processing (EMNLP), 18-OCT-13. Tom Kwiatkowski, Eunsol Choi, Yoav Artzi, Luke Zettlemoyer. Scaling Semantic Parsers with On-the-fly Ontology Matching, Empirical Methods in Natural Language Processing
ACPYPE - AnteChamber PYthon Parser interfacE.
Sousa da Silva, Alan W; Vranken, Wim F
2012-07-23
ACPYPE (or AnteChamber PYthon Parser interfacE) is a wrapper script around the ANTECHAMBER software that simplifies the generation of small molecule topologies and parameters for a variety of molecular dynamics programmes like GROMACS, CHARMM and CNS. It is written in the Python programming language and was developed as a tool for interfacing with other Python based applications such as the CCPN software suite (for NMR data analysis) and ARIA (for structure calculations from NMR data). ACPYPE is open source code, under GNU GPL v3, and is available as a stand-alone application at http://www.ccpn.ac.uk/acpype and as a web portal application at http://webapps.ccpn.ac.uk/acpype. We verified the topologies generated by ACPYPE in three ways: by comparing with default AMBER topologies for standard amino acids; by generating and verifying topologies for a large set of ligands from the PDB; and by recalculating the structures for 5 protein-ligand complexes from the PDB. ACPYPE is a tool that simplifies the automatic generation of topology and parameters in different formats for different molecular mechanics programmes, including calculation of partial charges, while being object oriented for integration with other applications.
Griss, Johannes; Reisinger, Florian; Hermjakob, Henning; Vizcaíno, Juan Antonio
2012-03-01
We here present the jmzReader library: a collection of Java application programming interfaces (APIs) to parse the most commonly used peak list and XML-based mass spectrometry (MS) data formats: DTA, MS2, MGF, PKL, mzXML, mzData, and mzML (based on the already existing API jmzML). The library is optimized to be used in conjunction with mzIdentML, the recently released standard data format for reporting protein and peptide identifications, developed by the HUPO proteomics standards initiative (PSI). mzIdentML files do not contain spectra data but contain references to different kinds of external MS data files. As a key functionality, all parsers implement a common interface that supports the various methods used by mzIdentML to reference external spectra. Thus, when developing software for mzIdentML, programmers no longer have to support multiple MS data file formats but only this one interface. The library (which includes a viewer) is open source and, together with detailed documentation, can be downloaded from http://code.google.com/p/jmzreader/. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
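The key design point of jmzReader is that every format-specific parser implements one common interface, so mzIdentML-oriented code is written once against that interface. The library is Java; the sketch below is a schematic Python analogue with hypothetical names, not the jmzReader API:

```python
# Schematic analogue (hypothetical names, Python rather than Java) of the
# jmzReader design: each format parser implements one common interface, so
# code written against the interface works with any format that mzIdentML
# can reference.
from abc import ABC, abstractmethod

class SpectraReader(ABC):
    """Common interface every format-specific parser implements."""

    @abstractmethod
    def get_spectrum(self, spectrum_id):
        """Return the peak list for one referenced spectrum."""

class MgfReader(SpectraReader):
    """Stand-in for one concrete format parser (e.g. MGF)."""

    def __init__(self, spectra):
        self._spectra = spectra          # id -> list of (m/z, intensity)

    def get_spectrum(self, spectrum_id):
        return self._spectra[spectrum_id]

def total_intensity(reader: SpectraReader, spectrum_id):
    """Format-agnostic client code: only the common interface is used."""
    return sum(intensity for _, intensity in reader.get_spectrum(spectrum_id))
```

Swapping in a reader for DTA, mzXML, or mzML would leave `total_intensity` untouched, which is exactly the benefit the abstract describes.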
Expressions Module for the Satellite Orbit Analysis Program
NASA Technical Reports Server (NTRS)
Edmonds, Karina
2008-01-01
The Expressions Module is a software module that has been incorporated into the Satellite Orbit Analysis Program (SOAP). The module includes an expressions-parser submodule built on top of an analytical system, enabling the user to define logical and numerical variables and constants. The variables can capture output from SOAP orbital-prediction and geometric-engine computations. The module can combine variables and constants with built-in logical operators (such as Boolean AND, OR, and NOT), relational operators (such as >, <, or =), and mathematical operators (such as addition, subtraction, multiplication, division, modulus, exponentiation, differentiation, and integration). Parentheses can be used to specify precedence of operations. The module contains a library of mathematical functions and operations, including logarithms, trigonometric functions, Bessel functions, minimum/maximum operations, and floating-point-to-integer conversions. The module supports combinations of time, distance, and angular units and has a dimensional-analysis component that checks for correct usage of units. A parser based on the Flex language and the Bison program looks for and indicates errors in syntax. SOAP expressions can be built using other expressions as arguments, thus enabling the user to build analytical trees. A graphical user interface facilitates use.
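The precedence handling such an expression parser performs (and that Flex/Bison generate machinery for) can be sketched by hand. The recursive-descent evaluator below covers only a hypothetical tiny subset (+, -, *, / and parentheses), far smaller than SOAP's real grammar:

```python
import re

# Hand-written sketch of the kind of parsing a Flex/Bison-built expression
# parser performs: a recursive-descent evaluator for +, -, *, / and
# parentheses with the usual precedence (term binds tighter than expr).
# Illustrative only; SOAP's actual grammar and operator set are far richer.
TOKEN = re.compile(r"\s*(?:(\d+\.?\d*)|(.))")

def tokenize(src):
    for num, op in TOKEN.findall(src):
        yield float(num) if num else op

def evaluate(src):
    toks = list(tokenize(src))
    pos = 0

    def peek():
        return toks[pos] if pos < len(toks) else None

    def expr():            # expr := term (('+'|'-') term)*
        nonlocal pos
        val = term()
        while peek() in ("+", "-"):
            op = toks[pos]; pos += 1
            rhs = term()
            val = val + rhs if op == "+" else val - rhs
        return val

    def term():            # term := atom (('*'|'/') atom)*
        nonlocal pos
        val = atom()
        while peek() in ("*", "/"):
            op = toks[pos]; pos += 1
            rhs = atom()
            val = val * rhs if op == "*" else val / rhs
        return val

    def atom():            # atom := number | '(' expr ')'
        nonlocal pos
        tok = toks[pos]; pos += 1
        if tok == "(":
            val = expr()
            pos += 1       # consume ')'
            return val
        return tok

    return expr()
```

Each grammar rule becomes one function, and precedence falls out of the call structure: `expr` defers to `term`, which defers to `atom`, so multiplication binds tighter than addition and parentheses restart the hierarchy.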
Perruchet, Pierre; Tillmann, Barbara
2010-03-01
This study investigates the joint influences of three factors on the discovery of new word-like units in a continuous artificial speech stream: the statistical structure of the ongoing input, the initial word-likeness of parts of the speech flow, and the contextual information provided by the earlier emergence of other word-like units. Results of an experiment conducted with adult participants show that these sources of information have strong and interactive influences on word discovery. The authors then examine the ability of different models of word segmentation to account for these results. PARSER (Perruchet & Vinter, 1998) is compared to the view that word segmentation relies on the exploitation of transitional probabilities between successive syllables, and with the models based on the Minimum Description Length principle, such as INCDROP. The authors submit arguments suggesting that PARSER has the advantage of accounting for the whole pattern of data without ad-hoc modifications, while relying exclusively on general-purpose learning principles. This study strengthens the growing notion that nonspecific cognitive processes, mainly based on associative learning and memory principles, are able to account for a larger part of early language acquisition than previously assumed. Copyright © 2009 Cognitive Science Society, Inc.
Lexical and sublexical units in speech perception.
Giroux, Ibrahima; Rey, Arnaud
2009-03-01
Saffran, Newport, and Aslin (1996a) found that human infants are sensitive to statistical regularities corresponding to lexical units when hearing an artificial spoken language. Two sorts of segmentation strategies have been proposed to account for this early word-segmentation ability: bracketing strategies, in which infants are assumed to insert boundaries into continuous speech, and clustering strategies, in which infants are assumed to group certain speech sequences together into units (Swingley, 2005). In the present study, we test the predictions of two computational models instantiating each of these strategies (i.e., Simple Recurrent Networks: Elman, 1990; and Parser: Perruchet & Vinter, 1998) in an experiment where we compare the lexical and sublexical recognition performance of adults after hearing 2 or 10 min of an artificial spoken language. The results are consistent with Parser's predictions and the clustering approach, showing that performance on words is better than performance on part-words only after 10 min. This result suggests that word segmentation abilities are not merely due to stronger associations between sublexical units but to the emergence of stronger lexical representations during the development of speech perception processes. Copyright © 2009, Cognitive Science Society, Inc.
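The bracketing strategy these studies contrast with PARSER's clustering can be sketched concretely: estimate the transitional probability between adjacent syllables and posit a word boundary wherever it dips. The code below is an illustrative toy, not any of the cited models; the stream, threshold, and function names are invented for the example:

```python
from collections import Counter

# Toy sketch of the bracketing strategy (boundary insertion at transitional-
# probability dips), as contrasted with PARSER's clustering approach.
# Stream, threshold, and names are invented for illustration.
def transitional_probabilities(syllables):
    """P(next | current) estimated from adjacent-syllable counts."""
    pairs = Counter(zip(syllables, syllables[1:]))
    firsts = Counter(syllables[:-1])
    return {(a, b): n / firsts[a] for (a, b), n in pairs.items()}

def segment(syllables, threshold):
    """Insert a word boundary wherever the transitional probability
    between two adjacent syllables falls below the threshold."""
    tp = transitional_probabilities(syllables)
    words, current = [], [syllables[0]]
    for a, b in zip(syllables, syllables[1:]):
        if tp[(a, b)] < threshold:
            words.append("".join(current))
            current = []
        current.append(b)
    words.append("".join(current))
    return words
```

On a stream built from the invented two-syllable "words" babi, tuta, and gogu, within-word transitions are certain (probability 1.0) while between-word transitions are not, so boundaries land between words. PARSER instead builds and reinforces chunk representations directly, which is the distinction the experiment probes.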
ChemicalTagger: A tool for semantic text-mining in chemistry.
Hawizy, Lezan; Jessop, David M; Adams, Nico; Murray-Rust, Peter
2011-05-16
The primary method for scientific communication is in the form of published scientific articles and theses which use natural language combined with domain-specific terminology. As such, they contain free-flowing unstructured text. Given the usefulness of data extraction from unstructured literature, we aim to show how this can be achieved for the discipline of chemistry. The highly formulaic style of writing most chemists adopt makes their contributions well suited to high-throughput Natural Language Processing (NLP) approaches. We have developed the ChemicalTagger parser as a medium-depth, phrase-based semantic NLP tool for the language of chemical experiments. Tagging is based on a modular architecture and uses a combination of OSCAR, domain-specific regex and English taggers to identify parts-of-speech. The ANTLR grammar is used to structure this into tree-based phrases. Using a metric that allows for overlapping annotations, we achieved machine-annotator agreements of 88.9% for phrase recognition and 91.9% for phrase-type identification (Action names). It is possible to parse chemical experimental text using rule-based techniques in conjunction with a formal grammar parser. ChemicalTagger has been deployed for over 10,000 patents and has identified solvents from their linguistic context with >99.5% precision.
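The domain-specific regex tagging the abstract mentions can be sketched in miniature. The patterns and labels below are invented for illustration and are vastly simpler than the OSCAR/ANTLR machinery ChemicalTagger actually uses:

```python
import re

# Toy sketch of regex-based tagging in the ChemicalTagger spirit: a few
# domain-specific patterns (invented for this example, far simpler than the
# real OSCAR + ANTLR pipeline) label tokens of an experimental sentence.
PATTERNS = [
    ("QUANTITY", re.compile(r"^\d+(\.\d+)?$")),
    ("UNIT",     re.compile(r"^(mL|mg|g|mol)$")),
    ("ACTION",   re.compile(r"^(add(ed)?|stir(red)?|heat(ed)?|dissolve[d]?)$", re.I)),
]

def tag(tokens):
    """Label each token with the first matching pattern, else OTHER."""
    tagged = []
    for tok in tokens:
        for label, pat in PATTERNS:
            if pat.match(tok):
                tagged.append((tok, label))
                break
        else:
            tagged.append((tok, "OTHER"))
    return tagged
```

A grammar layer (ANTLR in the real tool) would then group tagged runs like ACTION QUANTITY UNIT into tree-based action phrases.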
Saul Perlmutter, Distant Supernovae, Dark Energy, and the Accelerating Expansion of the Universe
Resources with additional material on 'the nature of dark energy.' 'The accelerating expansion means that the universe could expand forever until, in the distant future, it is cold and dark.' The teams' discovery led to speculation that there is a…
Universality of accelerating change
NASA Astrophysics Data System (ADS)
Eliazar, Iddo; Shlesinger, Michael F.
2018-03-01
On large time scales the progress of human technology follows an exponential growth trend that is termed accelerating change. The exponential growth trend is commonly considered to be the amalgamated effect of consecutive technology revolutions - where the progress carried by each technology revolution follows an S-curve, and where the aging of each technology revolution drives humanity to push for the next technology revolution. Thus, as a collective, mankind is the 'intelligent designer' of accelerating change. In this paper we establish that the exponential growth trend - and only this trend - emerges universally, on large time scales, from systems that combine together two elements: randomness and amalgamation. Hence, the universal generation of accelerating change can be attained by systems with no 'intelligent designer'.
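The amalgamation picture can be checked numerically: sum logistic S-curves whose carrying capacities grow geometrically, and the total tracks an exponential even though each component saturates. All parameters below (doubling capacities, unit rate, spacing of 4) are illustrative choices, not taken from the paper:

```python
import math

# Numerical sketch of the amalgamation picture: successive technology
# revolutions follow logistic S-curves whose carrying capacities double,
# and their sum tracks an exponential trend. All parameters (capacity
# doubling, rate=1, spacing=4) are illustrative, not from the paper.
def logistic(t, midpoint, capacity, rate=1.0):
    """A single S-curve: slow start, rapid growth, saturation at capacity."""
    return capacity / (1.0 + math.exp(-rate * (t - midpoint)))

def amalgamated_progress(t, revolutions=10, spacing=4.0):
    """Sum of S-curves: revolution i has capacity 2**i, midpoint i*spacing."""
    return sum(logistic(t, i * spacing, 2.0 ** i) for i in range(revolutions))
```

Well inside the sequence, total progress roughly doubles over each spacing interval, i.e. the amalgamated trend is approximately exponential, which is the qualitative claim of the paper.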
Quality control of concrete at the stage of designing its composition and technology
NASA Astrophysics Data System (ADS)
Kudyakov, A.; Prischepa, I.; Kiselev, D.; Prischepa, B.
2016-01-01
The results of tests on samples of foam concrete with a hardening accelerator are presented. The following chemical additives were used as setting and hardening accelerators: Universal P-2 and Asilin 12. All additives were added to the insulating foam concrete mix of grade D 400 in amounts of 0.5% to 1% of cement weight. Using the hardening accelerators Asilin 12 (0.5% by weight of cement) and Universal P-2 (1.0%), foam concrete structure formation is accelerated and strength increases by 60%. Technological regulations for the industrial preparation of the foam concrete mix were developed, recommending the hardening accelerators Asilin 12 at 0.5% and Universal P-2 at 1% of cement weight.
Linking Semantic and Knowledge Representations in a Multi-Domain Dialogue System
2007-06-01
accuracy evaluation presented in the next section shows that the generic version of the grammar performs similarly well on two evaluation domains…of extra insertions; for example, discourse adverbials such as 'now' were inserted if present in the lattice. In addition, different tense and pronoun…automatic lexicon specialization technique improves parser speed and accuracy. 1 Introduction. This paper presents an architecture of a language
The Hermod Behavioral Synthesis System
1988-06-08
[Unrecoverable OCR of a block diagram: technology-independent transformation and parser, optimization, hardware generator, datapath and hardware/controller libraries.] …Proc. 22nd Design Automation Conference, ACM/IEEE, June 1985, pp. 475-481. [7] G. De Micheli, "Synthesis of Control Systems", in Design Systems for VLSI Circuits: Logic Synthesis and Silicon Compilation, G. De Micheli, A. Sangiovanni-Vincentelli, and P. Antognetti (editors), Martinus Nijhoff
Acceleration of black hole universe
NASA Astrophysics Data System (ADS)
Zhang, T. X.; Frederick, C.
2014-01-01
Recently, Zhang slightly modified the standard big bang theory and developed a new cosmological model called the black hole universe, which is consistent with Mach's principle, governed by Einstein's general theory of relativity, and able to explain all observations of the universe. Previous studies accounted for the origin, structure, evolution, expansion, and cosmic microwave background radiation of the black hole universe, which grew from a star-like black hole with several solar masses through a supermassive black hole with billions of solar masses to the present state with hundred billion-trillions of solar masses by accreting ambient matter and merging with other black holes. This paper investigates acceleration of the black hole universe and provides an alternative explanation for the redshift and luminosity distance measurements of type Ia supernovae. The results indicate that the black hole universe accelerates its expansion when it accretes the ambient matter at an increasing rate, i.e., when the second-order derivative of the mass of the black hole universe with respect to time is positive. For a constant deceleration parameter, we can perfectly explain the type Ia supernova measurements with a reduced chi-square very close to unity, χ²_red ≈ 1.0012. The expansion and acceleration of the black hole universe are driven by external energy.
The new 6 MV multi-nuclide AMS facility at the University of Tsukuba
NASA Astrophysics Data System (ADS)
Sasa, Kimikazu; Takahashi, Tsutomu; Matsumura, Masumi; Matsunaka, Tetsuya; Satou, Yukihiko; Izumi, Daiki; Sueki, Keisuke
2015-10-01
The former accelerator mass spectrometry (AMS) system installed on the 12UD Pelletron tandem accelerator at the University of Tsukuba was completely destroyed by the Great East Japan Earthquake on 11 March 2011. A replacement has been designed and constructed at the university as part of the post-quake reconstruction project. It consists of a 6 MV Pelletron tandem accelerator, two multiple cathode AMS ion sources (MC-SNICSs), and a rare-particle detection system. The 6 MV Pelletron tandem accelerator will be applied not only to AMS, but also to areas such as nanotechnology, ion beam analysis, heavy ion irradiation, and nuclear physics. The rare-particle detection system will be capable of measuring environmental levels of long-lived radioisotopes of 10Be, 14C, 26Al, 36Cl, 41Ca, and 129I. It is also expected to measure other radioisotopes such as 32Si and 90Sr. The 6 MV Pelletron tandem accelerator was installed in the spring of 2014 at the University of Tsukuba. Routine beam delivery and AMS experiments will start in 2015.
Bulk viscous quintessential inflation
NASA Astrophysics Data System (ADS)
Haro, Jaume; Pan, Supriya
In a spatially-flat Friedmann-Lemaître-Robertson-Walker universe, the incorporation of bulk viscous process in general relativity leads to an appearance of a nonsingular background of the universe that both at early and late times depicts an accelerated universe. These early and late scenarios of the universe can be analytically calculated and mimicked, in the context of general relativity, by a single scalar field whose potential could also be obtained analytically where the early inflationary phase is described by a one-dimensional Higgs potential and the current acceleration is realized by an exponential potential. We show that the early inflationary universe leads to a power spectrum of the cosmological perturbations which match with current observational data, and after leaving the inflationary phase, the universe suffers a phase transition needed to explain the reheating of the universe via gravitational particle production. Furthermore, we find that at late times, the universe enters into the de Sitter phase that can explain the current cosmic acceleration. Finally, we also find that such bulk viscous-dominated universe attains the thermodynamical equilibrium, but in an asymptotic manner.
Controllability in Multi-Stage Laser Ion Acceleration
NASA Astrophysics Data System (ADS)
Kawata, S.; Kamiyama, D.; Ohtake, Y.; Barada, D.; Ma, Y. Y.; Kong, Q.; Wang, P. X.; Gu, Y. J.; Li, X. F.; Yu, Q.
2015-11-01
The present paper shows a concept for a future laser ion accelerator, which should have an ion source, ion collimators, ion beam bunchers and ion post acceleration devices. Based on the laser ion accelerator components, the ion particle energy and the ion energy spectrum are controlled, and a future compact laser ion accelerator would be designed for ion cancer therapy or for ion material treatment. In this study each component is designed to control the ion beam quality. The energy efficiency from the laser to ions is improved by using a solid target with a fine sub-wavelength structure or a near-critical density gas plasma. The ion beam collimation is performed by holes behind the solid target or a multi-layered solid target. The control of the ion energy spectrum and the ion particle energy, and the ion beam bunching are successfully realized by a multi-stage laser-target interaction. A combination of each component provides a high controllability of the ion beam quality to meet variable requirements in various purposes in the laser ion accelerator. The work was partly supported by MEXT, JSPS, ASHULA project/ILE, Osaka University, CORE (Center for Optical Research and Education, Utsunomiya University, Japan), Fudan University and CDI (Creative Dept. for Innovation) in CCRD, Utsunomiya University.
ERIC Educational Resources Information Center
Ouellon, Conrad, Comp.
Presentations from a colloquium on applications of research on natural languages to computer science address the following topics: (1) analysis of complex adverbs; (2) parser use in computerized text analysis; (3) French language utilities; (4) lexicographic mapping of official language notices; (5) phonographic codification of Spanish; (6)…
Xyce Parallel Electronic Simulator : reference guide, version 2.0.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoekstra, Robert John; Waters, Lon J.; Rankin, Eric Lamont
This document is a reference guide to the Xyce Parallel Electronic Simulator, and is a companion document to the Xyce Users' Guide. The focus of this document is to list, as exhaustively as possible, device parameters, solver options, parser options, and other usage details of Xyce. This document is not intended to be a tutorial. Users who are new to circuit simulation are better served by the Xyce Users' Guide.
Effective Cyber Situation Awareness (CSA) Assessment and Training
2013-11-01
activity/scenario. y. Save Wireshark captures. z. Save SNORT logs. aa. Save MySQL databases. 4. After the completion of the scenario, the reversion…line or from custom Java code. • Cisco ASA Parser: builds normalized vendor-neutral firewall rule specifications from Cisco ASA and PIX firewall…The Service tool lets analysts build Cauldron models from either the command line or from custom Java code. Functionally, it corresponds to the
Xyce™ Parallel Electronic Simulator Reference Guide Version 6.8
DOE Office of Scientific and Technical Information (OSTI.GOV)
Keiter, Eric R.; Aadithya, Karthik Venkatraman; Mei, Ting
This document is a reference guide to the Xyce Parallel Electronic Simulator, and is a companion document to the Xyce Users' Guide. The focus of this document is to list, as exhaustively as possible, device parameters, solver options, parser options, and other usage details of Xyce. This document is not intended to be a tutorial. Users who are new to circuit simulation are better served by the Xyce Users' Guide.
A Formal Model of Ambiguity and its Applications in Machine Translation
2010-01-01
structure indicates linguistically implausible segmentation that might be generated using dictionary-driven approaches...derivation. As was done in the monolingual case, the functions LHS, RHSi, RHSo and υ can be extended to a derivation δ. D(q) where q ∈ V denotes the...monolingual parses. My algorithm runs more efficiently than O(n^6) with many grammars (including those that required using heuristic search with other parsers
Xyce Parallel Electronic Simulator Reference Guide Version 6.6.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Keiter, Eric R.; Aadithya, Karthik Venkatraman; Mei, Ting
This document is a reference guide to the Xyce Parallel Electronic Simulator, and is a companion document to the Xyce Users' Guide [1]. The focus of this document is to list, as exhaustively as possible, the device parameters, solver options, parser options, and other usage details of Xyce. This document is not intended to be a tutorial. Users who are new to circuit simulation are better served by the Xyce Users' Guide [1]. The information herein is subject to change without notice. Copyright © 2002-2016 Sandia Corporation. All rights reserved. Acknowledgements: The BSIM Group at the University of California, Berkeley developed the BSIM3, BSIM4, BSIM6, BSIM-CMG and BSIM-SOI models. The BSIM3 is Copyright © 1999, Regents of the University of California. The BSIM4 is Copyright © 2006, Regents of the University of California. The BSIM6 is Copyright © 2015, Regents of the University of California. The BSIM-CMG is Copyright © 2012 and 2016, Regents of the University of California. The BSIM-SOI is Copyright © 1990, Regents of the University of California. All rights reserved. The Mextram model has been developed by NXP Semiconductors until 2007, Delft University of Technology from 2007 to 2014, and Auburn University since April 2015. Copyrights © of Mextram are with Delft University of Technology, NXP Semiconductors and Auburn University. The MIT VS Model Research Group developed the MIT Virtual Source (MVS) model. Copyright © 2013 Massachusetts Institute of Technology (MIT). The EKV3 MOSFET model was developed by the EKV Team of the Electronics Laboratory-TUC of the Technical University of Crete. Trademarks: Xyce™ Electronic Simulator and Xyce™ are trademarks of Sandia Corporation. Orcad, Orcad Capture, PSpice and Probe are registered trademarks of Cadence Design Systems, Inc. Microsoft, Windows and Windows 7 are registered trademarks of Microsoft Corporation. Medici, DaVinci and Taurus are registered trademarks of Synopsys Corporation.
Amtec and TecPlot are trademarks of Amtec Engineering, Inc. All other trademarks are property of their respective owners. Contacts: World Wide Web: http://xyce.sandia.gov, https://info.sandia.gov/xyce (Sandia only). Email: xyce@sandia.gov (outside Sandia), xyce-sandia@sandia.gov (Sandia only). Bug Reports (Sandia only): http://joseki-vm.sandia.gov/bugzilla, http://morannon.sandia.gov/bugzilla
Predicting complex syntactic structure in real time: Processing of negative sentences in Russian.
Kazanina, Nina
2017-11-01
In Russian negative sentences the verb's direct object may appear either in the accusative case, which is licensed by the verb (as is common cross-linguistically), or in the genitive case, which is licensed by the negation (Russian-specific "genitive-of-negation" phenomenon). Such sentences were used to investigate whether case marking is employed for anticipating syntactic structure, and whether lexical heads other than the verb can be predicted on the basis of a case-marked noun phrase. Experiment 1, a completion task, confirmed that genitive-of-negation is part of Russian speakers' active grammatical repertoire. In Experiments 2 and 3, the genitive/accusative case manipulation on the preverbal object led to shorter reading times at the negation and verb in the genitive versus accusative condition. Furthermore, Experiment 3 manipulated linear order of the direct object and the negated verb in order to distinguish whether the abovementioned facilitatory effect was predictive or integrative in nature, and concluded that the parser actively predicts a verb and (otherwise optional) negation on the basis of a preceding genitive-marked object. Similarly to a head-final language, case-marking information on preverbal noun phrases (NPs) is used by the parser to enable incremental structure building in a free-word-order language such as Russian.
ChemicalTagger: A tool for semantic text-mining in chemistry
2011-01-01
Background: The primary method for scientific communication is in the form of published scientific articles and theses which use natural language combined with domain-specific terminology. As such, they contain free-flowing unstructured text. Given the usefulness of data extraction from unstructured literature, we aim to show how this can be achieved for the discipline of chemistry. The highly formulaic style of writing most chemists adopt makes their contributions well suited to high-throughput Natural Language Processing (NLP) approaches. Results: We have developed the ChemicalTagger parser as a medium-depth, phrase-based semantic NLP tool for the language of chemical experiments. Tagging is based on a modular architecture and uses a combination of OSCAR, domain-specific regex and English taggers to identify parts-of-speech. An ANTLR grammar is used to structure this into tree-based phrases. Using a metric that allows for overlapping annotations, we achieved machine-annotator agreements of 88.9% for phrase recognition and 91.9% for phrase-type identification (Action names). Conclusions: It is possible to parse chemical experimental text using rule-based techniques in conjunction with a formal grammar parser. ChemicalTagger has been deployed for over 10,000 patents and has identified solvents from their linguistic context with >99.5% precision. PMID:21575201
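The tagging approach described above, combining domain-specific regex taggers into a modular pipeline, can be sketched minimally as follows (a toy illustration, not the actual ChemicalTagger/OSCAR pipeline; the tag names and patterns are assumptions):

```python
import re

# A minimal sketch of regex-based token tagging for experimental text.
# Tag names and patterns below are illustrative assumptions, not the
# real ChemicalTagger tag set.
TAGGERS = [
    ("QUANTITY", re.compile(r"^\d+(\.\d+)?$")),
    ("UNIT", re.compile(r"^(mL|mg|g|mol|mmol)$")),
    ("ACTION", re.compile(r"^(add(ed)?|stir(red)?|heat(ed)?|wash(ed)?|dry|dried)$", re.I)),
]

def tag(tokens):
    """Assign the first matching tag to each token, else 'OTHER'."""
    out = []
    for tok in tokens:
        label = next((name for name, rx in TAGGERS if rx.match(tok)), "OTHER")
        out.append((tok, label))
    return out

tokens = "The mixture was stirred and heated , then 5 mL water was added".split()
print(tag(tokens))
```

A real pipeline would layer a formal grammar (ChemicalTagger uses ANTLR) over such token labels to build tree-based phrases.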
GENPLOT: A formula-based Pascal program for data manipulation and plotting
NASA Astrophysics Data System (ADS)
Kramer, Matthew J.
Geochemical processes involving alteration, differentiation, fractionation, or migration of elements may be elucidated by a number of discrimination or variation diagrams (e.g., AFM, Harker, Pearce, and many others). The construction of these diagrams involves arithmetic combination of selected elements (major, minor, or trace). GENPLOT utilizes a formula-based algorithm (an expression parser) which enables the program to manipulate multiparameter databases and plot XY, ternary, tetrahedron, and REE-type plots without needing to change the source code or rearrange databases. Formulae may be any quadratic expression whose variables are the column headings of the data matrix. A full-screen editor with limited equation and arithmetic functions (a spreadsheet) has been incorporated into the program to aid data entry and editing. Data are stored as ASCII files to facilitate interchange of data between other programs and computers. GENPLOT was developed in Turbo Pascal for the IBM and compatible computers but is also available in Apple Pascal for the Apple IIe and III. Because the source code is too extensive to list here (about 5200 lines of Pascal code), the expression parsing routine, which is central to GENPLOT's flexibility, is incorporated into a smaller demonstration program named SOLVE. The following paper includes a discussion of how the expression parser works and a detailed description of GENPLOT's capabilities.
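The expression-parser idea at the core of GENPLOT and SOLVE, evaluating a formula whose variables are the column headings of a data matrix, can be sketched as a small recursive-descent parser (an illustrative reconstruction in Python, not GENPLOT's Pascal code; the oxide column names are assumptions):

```python
import operator, re

# Evaluate an arithmetic formula whose variables are column headings
# of a data row, in the spirit of GENPLOT's expression parser.
def evaluate(expr, row):
    tokens = re.findall(r"\d+\.?\d*|[A-Za-z_]\w*|[()+\-*/]", expr)
    pos = 0

    def peek():
        return tokens[pos] if pos < len(tokens) else None

    def take():
        nonlocal pos
        tok = tokens[pos]
        pos += 1
        return tok

    def atom():
        tok = take()
        if tok == "(":
            val = addsub()
            take()  # consume the closing ")"
            return val
        if re.match(r"^\d", tok):
            return float(tok)
        return row[tok]  # variable = column heading

    def muldiv():
        val = atom()
        while peek() in ("*", "/"):
            op = operator.mul if take() == "*" else operator.truediv
            val = op(val, atom())
        return val

    def addsub():
        val = muldiv()
        while peek() in ("+", "-"):
            op = operator.add if take() == "+" else operator.sub
            val = op(val, muldiv())
        return val

    return addsub()

# Example: a derived quantity from (assumed) oxide columns.
row = {"SiO2": 52.0, "MgO": 8.0, "FeO": 10.0}
print(evaluate("SiO2 + 2*(MgO - FeO/2)", row))  # 52 + 2*(8 - 5) = 58.0
```

Because variables resolve against the row dictionary, the same formula works on any database whose headings match, which is the flexibility the abstract describes.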
Transition from AdS universe to DS universe in the BPP model
NASA Astrophysics Data System (ADS)
Kim, Wontae; Yoon, Myungseok
2007-04-01
It can be shown that in the BPP model the smooth phase transition from the asymptotically decelerated AdS universe to the asymptotically accelerated DS universe is possible by solving the modified semiclassical equations of motion. This transition comes from noncommutative Poisson algebra, which gives the constant curvature scalars asymptotically. The decelerated expansion of the early universe is due to the negative energy density with the negative pressure induced by quantum back reaction, and the accelerated late-time universe comes from the positive energy and the negative pressure which behave like dark energy source in recent cosmological models.
The solutions and thermodynamic dark energy in the accelerating universe
DOE Office of Scientific and Technical Information (OSTI.GOV)
Demirel, E. C. Günay
Recently, Tachyonic matter expressed in terms of scalar field is suggested to be the reason of acceleration of the universe as dark energy [1]-[3]. In this study, dynamic solutions and thermodynamic properties of matters such as Tachyonic matters were investigated.
Marshak Lectureship: The Turkish Accelerator Center, TAC
NASA Astrophysics Data System (ADS)
Yavas, Omer
2012-02-01
The Turkish Accelerator Center (TAC) project is comprised of five different electron and proton accelerator complexes, to be built over 15 years with a phased approach. The Turkish Government funds the project. Currently there are 23 universities in Turkey associated with the TAC project. The currently funded project, which is to run until 2013, aims: (1) to establish a superconducting linac based infrared free electron laser and Bremsstrahlung facility (TARLA) at the Golbasi Campus of Ankara University; (2) to establish the Institute of Accelerator Technologies in Ankara University; and (3) to complete the Technical Design Report of TAC. The proposed facilities are a 3rd generation Synchrotron Radiation facility, a SASE-FEL facility, a GeV-scale Proton Accelerator facility, and an electron-positron collider as a super charm factory. In this talk, an overview of the general status and road map of the TAC project will be given. The national and regional importance of TAC will be expressed and the structure of national and international collaborations will be explained.
ERIC Educational Resources Information Center
Melendez, Edwin; Suarez, Carlos
This document describes the Accelerated Associate's Degree Program for Licensed Practical Nurses (LPN) at the Inter-American University of Puerto Rico. The program, targeting unemployed LPNs living in San Juan, Puerto Rico, allows students to complete an associate's degree in one year. Fifty-four students enrolled during the first year and 50% of…
NASA Astrophysics Data System (ADS)
Ureña-López, L. Arturo; Robles, Victor H.; Matos, T.
2017-08-01
Recent analysis of the rotation curves of a large sample of galaxies with very diverse stellar properties reveals a relation between the radial acceleration purely due to the baryonic matter and the one inferred directly from the observed rotation curves. Assuming the dark matter (DM) exists, this acceleration relation is tantamount to an acceleration relation between DM and baryons. This leads us to a universal maximum acceleration for all halos. Using the latter in DM profiles that predict inner cores implies that the central surface density μ_DM = ρ_s r_s must be a universal constant, as suggested by previous studies of selected galaxies, revealing a strong correlation between the density ρ_s and scale r_s parameters in each profile. We then explore the consequences of the constancy of μ_DM in the context of the ultralight scalar field dark matter model (SFDM). We find that for this model μ_DM = 648 M⊙ pc⁻² and that the so-called WaveDM soliton profile should be a universal feature of the DM halos. Comparing with the data from the Milky Way and Andromeda satellites, we find that they are all consistent with a boson mass of the scalar field particle of the order of 10⁻²¹ eV/c², which puts the SFDM model in agreement with recent cosmological constraints.
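A hedged numeric illustration of the quoted constancy: taking μ_DM = ρ_s·r_s = 648 M⊙ pc⁻² from the abstract, the central density follows directly from a scale radius (the example radii below are illustrative assumptions, not values from the paper):

```python
# Illustrative arithmetic for the relation mu_DM = rho_s * r_s, using the
# universal constant quoted in the abstract. The example scale radii are
# assumptions chosen only to show the scaling.
MU_DM = 648.0  # Msun / pc^2

def central_density(r_s_pc):
    """Central density rho_s (Msun/pc^3) implied by mu_DM = rho_s * r_s."""
    return MU_DM / r_s_pc

for r_s in (100.0, 1000.0):  # pc
    print(r_s, central_density(r_s))
```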
NASA Astrophysics Data System (ADS)
Sapone, Domenico
In this paper we review some of the approaches that have been considered to explain the extraordinary discovery of the late-time acceleration of the Universe. We discuss the arguments that have led physicists and astronomers to accept dark energy as the currently preferred candidate to explain the acceleration. We highlight the problems and the attempts to overcome the difficulties related to such a component. We also consider alternative theories capable of explaining the acceleration of the Universe, such as modifications of gravity. We compare the two approaches and point out the observational consequences, reaching the sad but foresightful conclusion that we will not be able to distinguish between a Universe filled by dark energy and a Universe where gravity is different from General Relativity. We review the present observations and discuss the future experiments that will help us to learn more about our Universe. This is not intended to be a complete list of all the dark energy models; rather, this paper should be seen as a review of the phenomena responsible for the acceleration. Moreover, in a landscape of hardly compelling theories, it is an important task to build simple measurable parameters useful for future experiments that will help us to understand more about the evolution of the Universe.
2016-02-01
In addition, the parser updates some parameters based on uncertainties. For example, Analytica was very slow to update Pk values based on...moderate range. The additional security environments helped to fill gaps in lower severity. Weapons Effectiveness Pk values were modified to account for two...project is to help improve the value and character of defense resource planning in an era of growing uncertainty and complex strategic challenges
NASA Astrophysics Data System (ADS)
Derriere, Sebastien; Gray, Norman; Demleitner, Markus; Louys, Mireille; Ochsenbein, Francois; Derriere, Sebastien; Gray, Norman
2014-05-01
This document describes a recommended syntax for writing the string representation of unit labels ("VOUnits"). In addition, it describes a set of recognised and deprecated units, which is as far as possible consistent with other relevant standards (BIPM, ISO/IEC and the IAU). The intention is that units written to conform to this specification will likely also be parsable by other well-known parsers. To this end, we include machine-readable grammars for other units syntaxes.
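The kind of unit-label grammar the document standardizes can be illustrated with a deliberately simplified parser (an assumption-laden sketch, not the full VOUnits grammar: only '.'-separated factors, one optional '/', and integer '**' powers are handled here):

```python
import re

# Parse a simplified VOUnits-like unit string into (unit, exponent) pairs.
# This grammar is a toy subset chosen for illustration; the real VOUnits
# specification defines a much richer syntax.
def parse_unit(s):
    def factors(part, sign):
        out = []
        for f in filter(None, part.split(".")):
            m = re.fullmatch(r"([A-Za-z]+)(?:\*\*(-?\d+))?", f)
            if not m:
                raise ValueError(f"bad factor: {f!r}")
            exp = int(m.group(2)) if m.group(2) else 1
            out.append((m.group(1), sign * exp))
        return out
    num, _, den = s.partition("/")
    return factors(num, 1) + factors(den, -1)

print(parse_unit("km.s**-1"))  # [('km', 1), ('s', -1)]
print(parse_unit("m/s"))       # [('m', 1), ('s', -1)]
```

Normalizing both spellings to the same (unit, exponent) pairs is what lets different writers' unit strings interoperate, which is the goal of the recommendation.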
Xyce parallel electronic simulator reference guide, Version 6.0.1.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Keiter, Eric R; Mei, Ting; Russo, Thomas V.
2014-01-01
This document is a reference guide to the Xyce Parallel Electronic Simulator, and is a companion document to the Xyce Users Guide [1]. The focus of this document is to list, as exhaustively as possible, the device parameters, solver options, parser options, and other usage details of Xyce. This document is not intended to be a tutorial. Users who are new to circuit simulation are better served by the Xyce Users Guide [1].
Xyce parallel electronic simulator reference guide, version 6.0.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Keiter, Eric R; Mei, Ting; Russo, Thomas V.
2013-08-01
This document is a reference guide to the Xyce Parallel Electronic Simulator, and is a companion document to the Xyce Users Guide [1]. The focus of this document is to list, as exhaustively as possible, the device parameters, solver options, parser options, and other usage details of Xyce. This document is not intended to be a tutorial. Users who are new to circuit simulation are better served by the Xyce Users Guide [1].
2000-01-01
for flight test data, and both generic and specialized tools of data filtering, data calibration, modeling, system identification, and simulation...GRAMMATICAL MODEL AND PARSER FOR AIR TRAFFIC CONTROLLER'S COMMANDS 11 A SPEECH-CONTROLLED INTERACTIVE VIRTUAL ENVIRONMENT FOR SHIP FAMILIARIZATION 12...MODELING AND SIMULATION IN THE 21ST CENTURY 23 NEW COTS HARDWARE AND SOFTWARE REDUCE THE COST AND EFFORT IN REPLACING AGING FLIGHT SIMULATORS SUBSYSTEMS
Criteria for Evaluating the Performance of Compilers
1974-10-01
cannot be made to fit, then an auxiliary mechanism outside the parser might be used. Finally, changing the choice of parsing technique to a...was not useful in providing a basis for compiler evaluation. The study of the first question established criteria and methods for assigning four...program. The study of the second question established criteria for defining a "compiler Gibson mix", and established methods for using this "mix" to
Intelligent Agents as a Basis for Natural Language Interfaces
1988-01-01
language analysis component of UC, which produces a semantic representation of the input. This representation is in the form of a KODIAK network (see...Appendix A). Next, UC's Concretion Mechanism performs concretion inferences ([Wilensky, 1983] and [Norvig, 1983]) based on the semantic network...The first step in UC's processing is done by UC's parser/understander component, which produces a KODIAK semantic network representation of
Learning for Semantic Parsing with Kernels under Various Forms of Supervision
2007-08-01
natural language sentences to their formal executable meaning representations. This is a challenging problem and is critical for developing computing...sentences are semantically tractable. This indicates that Geoquery is a more challenging domain for semantic parsing than ATIS. In the past, there have been a...Combining parsers. In Proceedings of the Conference on Empirical Methods in Natural Language Processing and Very Large Corpora (EMNLP/VLC-99), pp. 187-194
Xyce Parallel Electronic Simulator Reference Guide Version 6.4
DOE Office of Scientific and Technical Information (OSTI.GOV)
Keiter, Eric R.; Mei, Ting; Russo, Thomas V.
This document is a reference guide to the Xyce Parallel Electronic Simulator, and is a companion document to the Xyce Users' Guide [1]. The focus of this document is to list, as exhaustively as possible, the device parameters, solver options, parser options, and other usage details of Xyce. This document is not intended to be a tutorial. Users who are new to circuit simulation are better served by the Xyce Users' Guide [1]. Trademarks: The information herein is subject to change without notice. Copyright © 2002-2015 Sandia Corporation. All rights reserved. Xyce™ Electronic Simulator and Xyce™ are trademarks of Sandia Corporation. Portions of the Xyce™ code are: Copyright © 2002, The Regents of the University of California. Produced at the Lawrence Livermore National Laboratory. Written by Alan Hindmarsh, Allan Taylor, Radu Serban. UCRL-CODE-2002-59. All rights reserved. Orcad, Orcad Capture, PSpice and Probe are registered trademarks of Cadence Design Systems, Inc. Microsoft, Windows and Windows 7 are registered trademarks of Microsoft Corporation. Medici, DaVinci and Taurus are registered trademarks of Synopsys Corporation. Amtec and TecPlot are trademarks of Amtec Engineering, Inc. Xyce's expression library is based on that inside Spice 3F5 developed by the EECS Department at the University of California. The EKV3 MOSFET model was developed by the EKV Team of the Electronics Laboratory-TUC of the Technical University of Crete. All other trademarks are property of their respective owners. Contacts: Bug Reports (Sandia only): http://joseki.sandia.gov/bugzilla, http://charleston.sandia.gov/bugzilla. World Wide Web: http://xyce.sandia.gov, http://charleston.sandia.gov/xyce (Sandia only). Email: xyce@sandia.gov (outside Sandia), xyce-sandia@sandia.gov (Sandia only)
A new AMS facility at Inter University Accelerator Centre, New Delhi
NASA Astrophysics Data System (ADS)
Kumar, Pankaj; Chopra, S.; Pattanaik, J. K.; Ojha, S.; Gargari, S.; Joshi, R.; Kanjilal, D.
2015-10-01
The Inter University Accelerator Centre (IUAC), a national facility of the Government of India, has a 15UD Pelletron accelerator for multidisciplinary ion-beam-based research programs. Recently, a new accelerator mass spectrometry (AMS) facility has been developed after incorporating many changes in the existing 15UD Pelletron accelerator. A clean chemistry laboratory for 10Be and 26Al with all the modern facilities has also been developed for the chemical processing of samples. 10Be measurements on sediment samples, inter-laboratory comparison results, and 26Al measurements on standard samples are presented in this paper. In addition to the 10Be and 26Al AMS facilities, a new 14C AMS facility based on a dedicated 500 kV tandem ion accelerator with two cesium sputter ion sources is also being set up at IUAC.
Method for Direct Measurement of Cosmic Acceleration by 21-cm Absorption Systems
NASA Astrophysics Data System (ADS)
Yu, Hao-Ran; Zhang, Tong-Jie; Pen, Ue-Li
2014-07-01
So far there is only indirect evidence that the Universe is undergoing an accelerated expansion. The evidence for cosmic acceleration is based on the observation of different objects at different distances and requires invoking the Copernican cosmological principle and Einstein's equations of motion. We examine the direct observability using recession velocity drifts (Sandage-Loeb effect) of 21-cm hydrogen absorption systems in upcoming radio surveys. This measures the change in velocity of the same objects separated by a time interval and is a model-independent measure of acceleration. We forecast that for a CHIME-like survey with a decade time span, we can detect the acceleration of a ΛCDM universe with 5σ confidence. This acceleration test requires modest data analysis and storage changes from the normal processing and cannot be recovered retroactively.
"DIANA" - A New, Deep-Underground Accelerator Facility for Astrophysics Experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leitner, M.; Leitner, D.; Lemut, A.
2009-05-28
The DIANA project (Dakota Ion Accelerators for Nuclear Astrophysics) is a collaboration between the University of Notre Dame, University of North Carolina, Western Michigan University, and Lawrence Berkeley National Laboratory to build a nuclear astrophysics accelerator facility 1.4 km below ground. DIANA is part of the US proposal DUSEL (Deep Underground Science and Engineering Laboratory) to establish a cross-disciplinary underground laboratory in the former gold mine of Homestake in South Dakota, USA. DIANA would consist of two high-current accelerators: a 30 to 400 kV variable, high-voltage platform, and a second, dynamitron accelerator with a voltage range of 350 kV to 3 MV. As a unique feature, both accelerators are planned to be equipped with either high-current microwave ion sources or multi-charged ECR ion sources producing ions from protons to oxygen. Electrostatic quadrupole transport elements will be incorporated in the dynamitron high voltage column. Compared to current astrophysics facilities, DIANA could increase the available beam densities on target by orders of magnitude: up to 100 mA on the low-energy accelerator and several mA on the high-energy accelerator. An integral part of the DIANA project is the development of a high-density supersonic gas-jet target which can handle these anticipated beam powers. The paper will explain the main components of the DIANA accelerators and their beam transport lines and will discuss related technical challenges.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taylor, R.C.
This thesis involved the construction of (1) a grammar that incorporates knowledge on base invariancy and secondary structure in a molecule and (2) a parser engine that uses the grammar to position bases into the structural subunits of the molecule. These concepts were combined with a novel pinning technique to form a tool that semi-automates insertion of a new species into the alignment for the 16S rRNA molecule (a component of the ribosome) maintained by Dr. Carl Woese's group at the University of Illinois at Urbana. The tool was tested on species extracted from the alignment and on a group of entirely new species. The results were very encouraging, and the tool should be a substantial aid to the curators of the 16S alignment. The construction of the grammar was itself automated, allowing application of the tool to alignments for other molecules. The logic programming language Prolog was used to construct all programs involved. The computational linguistics approach used here was found to be a useful way to attack the problem of insertion into an alignment.
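The kind of secondary-structure constraint such a grammar encodes can be illustrated, in Python rather than the thesis's Prolog, by a standard stack-based check that a sequence is consistent with a dot-bracket structure (a generic textbook technique, not the thesis's parser engine):

```python
# Check that paired positions in a dot-bracket secondary structure hold
# Watson-Crick (or G-U wobble) partners. This is a generic illustration
# of grammar-style structural constraints on RNA, not the thesis's code.
PAIRS = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G"), ("G", "U"), ("U", "G")}

def consistent(seq, structure):
    stack = []
    for base, sym in zip(seq, structure):
        if sym == "(":
            stack.append(base)
        elif sym == ")":
            if not stack or (stack.pop(), base) not in PAIRS:
                return False
    return not stack  # every opened pair must be closed

print(consistent("GGGAAACCC", "(((...)))"))  # True
print(consistent("GGGAAAGGG", "(((...)))"))  # False: G cannot pair with G
```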
Recon2Neo4j: applying graph database technologies for managing comprehensive genome-scale networks.
Balaur, Irina; Mazein, Alexander; Saqi, Mansoor; Lysenko, Artem; Rawlings, Christopher J; Auffray, Charles
2017-04-01
The goal of this work is to offer a computational framework for exploring data from the Recon2 human metabolic reconstruction model. Advanced user access features have been developed using the Neo4j graph database technology, and this paper describes key features such as efficient management of the network data, examples of querying the network to address particular tasks, and how query results are converted back to the Systems Biology Markup Language (SBML) standard format. The Neo4j-based metabolic framework facilitates exploration of highly connected and comprehensive human metabolic data and identification of metabolic subnetworks of interest. A Java-based parser component has been developed to convert query results (available in the JSON format) into the SBML and SIF formats in order to facilitate further exploration, enhancement, or sharing of results. The Neo4j-based metabolic framework is freely available from: https://diseaseknowledgebase.etriks.org/metabolic/browser/ . The Java code files developed for this work are available from the following URL: https://github.com/ibalaur/MetabolicFramework . ibalaur@eisbm.org. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
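The parser component's role, converting JSON query results into the SIF edge-list format, can be sketched as follows (a Python illustration of the general JSON-to-SIF conversion, not the project's Java parser; the JSON field names and example metabolites are assumptions):

```python
import json

# Convert graph-query results (JSON) into SIF format:
# "source <tab> interaction <tab> target", one edge per line.
# Field names here are illustrative assumptions.
def json_to_sif(payload):
    edges = json.loads(payload)["edges"]
    return "\n".join(
        f'{e["source"]}\t{e["interaction"]}\t{e["target"]}' for e in edges
    )

payload = json.dumps({
    "edges": [
        {"source": "glucose", "interaction": "reactant_of", "target": "HEX1"},
        {"source": "HEX1", "interaction": "produces", "target": "g6p"},
    ]
})
print(json_to_sif(payload))
```

SIF's one-edge-per-line shape is what makes query results easy to share and load into network tools, which is the motivation the abstract gives for the conversion step.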
Improvement of the High Fluence Irradiation Facility at the University of Tokyo
NASA Astrophysics Data System (ADS)
Murakami, Kenta; Iwai, Takeo; Abe, Hiroaki; Sekimura, Naoto
2016-08-01
This paper reports the modification of the High Fluence Irradiation Facility at the University of Tokyo (HIT). The HIT facility was severely damaged during the 2011 earthquake, which occurred off the Pacific coast of Tohoku. A damaged 1.0 MV tandem Cockcroft-Walton accelerator was replaced with a 1.7 MV accelerator, which was formerly used in another campus of the university. A decision was made to maintain dual-beam irradiation capability by repairing the 3.75 MV single-ended Van de Graaff accelerator and reconstructing the related beamlines. A new beamline was connected with a 200 kV transmission electron microscope (TEM) to perform in-situ TEM observation under ion irradiation.
Teaching and Research with Accelerators at Tarleton State University
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marble, Daniel K.
2009-03-10
Tarleton State University students began performing both research and laboratory experiments using accelerators in 1998 through visitation programs at the University of North Texas, the US Army Research Laboratory, and the Naval Surface Warfare Center at Carderock. In 2003, Tarleton outfitted its new science building with a 1 MV pelletron that was donated by the California Institute of Technology. The accelerator has been upgraded and supports a wide range of classes for both the Physics program and the ABET-accredited Engineering Physics program, as well as supplying undergraduate research opportunities on campus. A discussion of various laboratory activities and research projects performed by Tarleton students will be presented.
Local expansion flows of galaxies: quantifying acceleration effect of dark energy
NASA Astrophysics Data System (ADS)
Chernin, A. D.; Teerikorpi, P.
2013-08-01
The nearest expansion flow of galaxies observed around the Local Group is studied as an archetypical example of the newly discovered local expansion flows around groups and clusters of galaxies in the nearby Universe. The flow is accelerating due to the antigravity produced by the universal dark energy background. We introduce a new acceleration measure of the flow, the dimensionless "acceleration parameter" Q(x) = x - x^-2, which depends on the normalized distance x only. The parameter is zero at the zero-gravity distance x = 1, and Q(x) ∝ x when x ≫ 1. At the distance x = 3, the parameter Q = 2.9. Since the expansion flows have a self-similar structure in normalized variables, we expect that the result is valid as well for all the other expansion flows around groups and clusters of galaxies on spatial scales from ~1 to ~10 Mpc everywhere in the Universe.
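The quoted acceleration parameter is simple enough to evaluate directly; a minimal check reproduces the abstract's values at x = 1 and x = 3:

```python
# The abstract's dimensionless acceleration parameter Q(x) = x - x**-2,
# where x is the distance normalized to the zero-gravity distance.
def Q(x):
    return x - x**-2

print(Q(1.0))  # 0.0 at the zero-gravity distance x = 1
print(Q(3.0))  # 2.888..., matching the abstract's Q = 2.9 at x = 3
```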
The status and road map of Turkish Accelerator Center (TAC)
NASA Astrophysics Data System (ADS)
Yavaş, Ö.
2012-02-01
The Turkish Accelerator Center (TAC) project is supported by the State Planning Organization (SPO) of Turkey and coordinated by Ankara University. After completion of the Feasibility Report (FR) in 2000 and the Conceptual Design Report (CDR) in 2005, the third phase of the project started in 2006 as an inter-university project including ten Turkish universities with the support of the SPO. The third phase of the project has two main scientific goals: to prepare the Technical Design Report (TDR) of TAC and, as a first step, to establish an Infrared Free Electron Laser (IR FEL) facility, named the Turkish Accelerator and Radiation Laboratory at Ankara (TARLA). The facility is planned to be completed in 2015 and will be based on a 15-40 MeV superconducting linac. In this paper, the main aims, national and regional importance, main parts, main parameters, status, and road map of the Turkish Accelerator Center will be presented.
Accelerator Science: Circular vs. Linear
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lincoln, Don
Particle accelerators are scientific instruments that allow scientists to collide particles together at incredible energies to study the secrets of the universe. However, there are many ways in which particle accelerators can be constructed. In this video, Fermilab's Dr. Don Lincoln explains the pros and cons of circular and linear accelerators.
A new IBA-AMS laboratory at the Comenius University in Bratislava (Slovakia)
NASA Astrophysics Data System (ADS)
Povinec, Pavel P.; Masarik, Jozef; Kúš, Peter; Holý, Karol; Ješkovský, Miroslav; Breier, Robert; Staníček, Jaroslav; Šivo, Alexander; Richtáriková, Marta; Kováčik, Andrej; Szarka, Ján; Steier, Peter; Priller, Alfred
2015-01-01
A Centre for Nuclear and Accelerator Technologies (CENTA) has been established at the Comenius University in Bratislava, comprising a tandem laboratory designed for Ion Beam Analysis (IBA), Ion Beam Modification (IBM) of materials, and Accelerator Mass Spectrometry (AMS). The main equipment of the laboratory, i.e. the Alphatross and MC-SNICS ion sources, the 3 MV Pelletron tandem accelerator, and the analyzers of accelerated ions, is described. Optimization of ion beam characteristics for different ion sources with gas and solid targets, for transmission of accelerated ions with different energy and charge state, and for different parameters of the high-energy ion analyzers, as well as first AMS results, are presented. The scientific program of the CENTA will be devoted mainly to nuclear, environmental, life and material sciences.
Effects of Tasks on BOLD Signal Responses to Sentence Contrasts: Review and Commentary
Caplan, David; Gow, David
2010-01-01
Functional neuroimaging studies of syntactic processing have been interpreted as identifying the neural locations of parsing and interpretive operations. However, current behavioral studies of sentence processing indicate that many operations occur simultaneously with parsing and interpretation. In this review, we point to issues that arise in discriminating the effects of these concurrent processes from those of the parser/interpreter in neural measures and to approaches that may help resolve them. PMID:20932562
Analysis of the Impact of Data Normalization on Cyber Event Correlation Query Performance
2012-03-01
2003). Organizations use it in planning, target marketing, decision-making, data analysis, and customer services (Shin, 2003). Organizations that...Following this IP address is a router message sequence number. This is a globally unique number for each router terminal and can range from...Appendix G, invokes the PERL parser for the log files from a particular USAF base, and invokes the CTL file that loads the resultant CSV file into the
Xyce parallel electronic simulator reference guide, version 6.1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Keiter, Eric R; Mei, Ting; Russo, Thomas V.
2014-03-01
This document is a reference guide to the Xyce Parallel Electronic Simulator, and is a companion document to the Xyce Users Guide [1]. The focus of this document is to list, as exhaustively as possible, device parameters, solver options, parser options, and other usage details of Xyce. This document is not intended to be a tutorial. Users who are new to circuit simulation are better served by the Xyce Users Guide [1].
System Data Model (SDM) Source Code
2012-08-23
CROSS_COMPILE=/opt/gumstix/build_arm_nofpu/staging_dir/bin/arm-linux-uclibcgnueabi- CC=$(CROSS_COMPILE)gcc CXX=$(CROSS_COMPILE)g++ AR...and flags to pass to it LEX=flex LEXFLAGS=-B ## The parser generator to invoke and flags to pass to it YACC=bison YACCFLAGS...# Point to default PetaLinux root directory ifndef ROOTDIR ROOTDIR=$(PETALINUX)/software/petalinux-dist endif PATH:=$(PATH
Understanding and Capturing People’s Mobile App Privacy Preferences
2013-10-28
The entire apps’ metadata takes up about 500MB of storage space when stored in a MySQL database and all the binary files take approximately 300GB of...functionality that can decompile Dalvik bytecodes to Java source code faster than other decompilers. Given the scale of the app analysis we planned on...Java libraries, such as parsers, SQL connectors, etc. Targeted Ads 137 admob, adwhirl, greystripe… Provided by mobile behavioral ads company to
DSS 13 Microprocessor Antenna Controller
NASA Technical Reports Server (NTRS)
Gosline, R. M.
1984-01-01
A microprocessor-based antenna controller system developed as part of the unattended station project for DSS 13 is described. Both the hardware and software top-level designs are presented and the major problems encountered are discussed. Developments useful to related projects include a JPL standard 15-line interface using a single-board computer, a general-purpose parser, a fast floating-point-to-ASCII conversion technique, and experience gained in using off-board floating-point processors with the 8080 CPU.
Intelligent Information Retrieval for a Multimedia Database Using Captions
1992-07-23
The user was allowed to retrieve any of several multimedia types depending on the descriptors entered. An example mentioned was the assembly of a...statistics showed some performance improvements over a keyword search. Similar work was described by Wong et al. (1987), where a vector space representation...keyword) lists for searching the lexicon (a syntactic parser is not used); a type hierarchy of terms was used in the process. The system then checked the
Extracting BI-RADS Features from Portuguese Clinical Texts
Nassif, Houssam; Cunha, Filipe; Moreira, Inês C.; Cruz-Correia, Ricardo; Sousa, Eliana; Page, David; Burnside, Elizabeth; Dutra, Inês
2013-01-01
In this work we build the first BI-RADS parser for Portuguese free texts, modeled after existing approaches to extract BI-RADS features from English medical records. Our concept finder uses a semantic grammar based on the BI-RADS lexicon and on iteratively transferred expert knowledge. We compare the performance of our algorithm to manual annotation by a specialist in mammography. Our results show that our parser’s performance is comparable to the manual method. PMID:23797461
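The abstract above describes a lexicon-driven concept finder for BI-RADS features in Portuguese reports. A minimal sketch of that idea is a dictionary lookup over a term lexicon; the entries and labels below are invented for illustration and are not the paper's actual grammar or lexicon.

```python
# Hypothetical mini-lexicon mapping Portuguese BI-RADS-style terms to
# English labels (illustrative only; not the published grammar).
LEXICON = {
    "margem espiculada": "spiculated margin",
    "margem circunscrita": "circumscribed margin",
    "nodulo": "mass",
}

def find_concepts(text):
    """Return (term, english_label) pairs whose term occurs in the report."""
    text = text.lower()
    return [(term, label) for term, label in LEXICON.items() if term in text]

report = "Nodulo denso com margem espiculada no quadrante superior."
print(find_concepts(report))
# [('margem espiculada', 'spiculated margin'), ('nodulo', 'mass')]
```

A real semantic-grammar approach would additionally handle negation, inflection, and context, which simple substring matching ignores.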
Friederici, A D
1995-09-01
This paper presents a model describing the temporal and neurotopological structure of syntactic processes during comprehension. It postulates three distinct phases of language comprehension, two of which are primarily syntactic in nature. During the first phase the parser assigns the initial syntactic structure on the basis of word category information. These early structural processes are assumed to be subserved by the anterior parts of the left hemisphere, as event-related brain potentials show this area to be maximally activated when phrase structure violations are processed and as circumscribed lesions in this area lead to an impairment of the on-line structural assignment. During the second phase lexical-semantic and verb-argument structure information is processed. This phase is neurophysiologically manifest in a negative component in the event-related brain potential around 400 ms after stimulus onset which is distributed over the left and right temporo-parietal areas when lexical-semantic information is processed and over left anterior areas when verb-argument structure information is processed. During the third phase the parser tries to map the initial syntactic structure onto the available lexical-semantic and verb-argument structure information. In case of an unsuccessful match between the two types of information reanalyses may become necessary. These processes of structural reanalysis are correlated with a centroparietally distributed late positive component in the event-related brain potential.(ABSTRACT TRUNCATED AT 250 WORDS)
Gauging the cosmic acceleration with recent type Ia supernovae data sets
NASA Astrophysics Data System (ADS)
Velten, Hermano; Gomes, Syrios; Busti, Vinicius C.
2018-04-01
We revisit a model-independent estimator for cosmic acceleration based on type Ia supernovae distance measurements. This approach does not rely on any specific theory of gravity, energy content, or parametrization for the scale factor or deceleration parameter, and is based on falsifying the null hypothesis that the Universe never expanded in an accelerated way. By generating mock catalogs of known cosmologies, we test the robustness of this estimator, establishing its limits of applicability. We detail the pros and cons of such an approach. For example, we find that there are specific counterexamples in which the estimator wrongly provides evidence against acceleration in accelerating cosmologies. The dependence of the estimator on the H0 value is also discussed. Finally, we update the evidence for acceleration using the recent UNION2.1 and Joint Light-Curve Analysis samples. Contrary to recent claims, available data strongly favor an accelerated expansion of the Universe, in complete agreement with the standard ΛCDM model.
Recent results from the University of Washington's 38 mm ram accelerator
NASA Technical Reports Server (NTRS)
De Turenne, J. A.; Chew, G.; Bruckner, A. P.
1992-01-01
The ram accelerator is a propulsive device that accelerates projectiles using gasdynamic cycles similar to those which generate thrust in airbreathing ramjets. The projectile, analogous to the centerbody of a ramjet, travels supersonically through a stationary tube containing a gaseous fuel and oxidizer mixture. The projectile itself carries no onboard propellant. A combustion zone follows the projectile and stabilizes the shock structure. The resulting pressure distribution continuously accelerates the projectile. Several modes of ram accelerator operation have been investigated experimentally and theoretically. At velocities below the Chapman-Jouguet (C-J) detonation speed of the propellant mixture, the thermally choked propulsion mode accelerates the projectiles. At projectile velocities between approximately 90 and 110 percent of the C-J speed, a transdetonative propulsion mode occurs. At velocities beyond 110 percent of the C-J speed, projectiles experience superdetonative propulsion. This paper presents recent experimental results from these propulsion modes obtained with the University of Washington's 38-mm bore ram accelerator. Data from investigations with hydrogen diluted-gas mixtures are also introduced.
A bridge between unified cosmic history by f(R)-gravity and BIonic system
NASA Astrophysics Data System (ADS)
Sepehri, Alireza; Capozziello, Salvatore; Setare, Mohammad Reza
2016-04-01
Recently, the cosmological deceleration-acceleration transition redshift in f(R) gravity has been considered in order to address consistently the problem of cosmic evolution. It is possible to show that the deceleration parameter changes sign at a given redshift according to observational data. Furthermore, an f(R) gravity cosmological model can be constructed in a brane-antibrane system starting from the very early universe and accounting for the cosmological redshift at all phases of cosmic history, from inflation to late-time acceleration. Here we propose an f(R) model where transition redshifts correspond to the inflation-deceleration and deceleration-late-time-acceleration transitions, starting from a BIon system. At the point where the universe was born, due to the transition of k black fundamental strings to the BIon configuration, the redshift is approximately infinite and decreases with reducing temperature (z ~ T^2). The BIon is a configuration in flat space of a universe-brane and a parallel anti-universe-brane connected by a wormhole. This wormhole is a channel for flowing energy from extra dimensions into our universe, occurring at inflation and decreasing with redshift as z ~ T^{4+1/7}. The dynamics is consistent with the fact that the wormhole loses its energy and vanishes as soon as inflation ends and deceleration begins. As the two universe branes approach each other, a tachyon originates, grows, and causes the formation of a wormhole. We show that, in the framework of f(R) gravity, the cosmological redshift depends on the tachyonic potential and decreases significantly at the deceleration-late-time-acceleration transition point (z ~ T^{2/3}). As soon as today's acceleration approaches, the redshift tends to zero and the cosmological model reduces to the standard ΛCDM cosmology.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perlmutter, Saul
2012-01-13
The Department of Energy (DOE) hosted an event Friday, January 13, with 2011 Physics Nobel Laureate Saul Perlmutter. Dr. Perlmutter, a physicist at the Department’s Lawrence Berkeley National Laboratory and a professor of physics at the University of California at Berkeley, won the 2011 Nobel Prize in Physics “for the discovery of the accelerating expansion of the Universe through observations of distant supernovae.” DOE’s Office of Science has supported Dr. Perlmutter’s research at Berkeley Lab since 1983. After the introduction from Secretary of Energy Steven Chu, Dr. Perlmutter delivered a presentation entitled "Supernovae, Dark Energy and the Accelerating Universe: How DOE Helped to Win (yet another) Nobel Prize." [Copied with editing from DOE Media Advisory issued January 10th, found at http://energy.gov/articles/energy-department-host-event-2011-physics-nobel-laureate-saul-perlmutter]
The Naples University 3 MV tandem accelerator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Campajola, L.; Brondi, A.
2013-07-18
The 3 MV tandem accelerator of the Naples University is used for research activities and applications in many fields. At the beginning of operation (1977) the main utilization was in the field of nuclear physics. Later, the realization of new beam lines allowed the development of applied activities such as radiocarbon dating, ion beam analysis, biophysics, ion implantation, etc. At present, the availability of different ion sources and many improvements to the accelerator make it possible to run experiments in a wide range of subjects. An overview of the characteristics and major activities of the laboratory is presented.
The accelerated residency program: the Marshall University family practice 9-year experience.
Petrany, Stephen M; Crespo, Richard
2002-10-01
In 1989, the American Board of Family Practice (ABFP) approved the first of 12 accelerated residency programs in family practice. These experimental programs provide a 1-year experience for select medical students that combines the requirements of the fourth year of medical school with those of the first year of residency, reducing the total training time by 1 year. This paper reports on the achievements and limitations of the Marshall University accelerated residency program over a 9-year period that began in 1992. Several parameters have been monitored since the inception of the accelerated program and provide the basis for comparison of accelerated and traditional residents. These include initial resident characteristics, performance outcomes, and practice choices. A total of 16 students were accepted into the accelerated track from 1992 through 1998. During the same time period, 44 residents entered the traditional residency program. Accelerated residents tended to be older and had more career experience than their traditional counterparts. As a group, the accelerated residents scored an average of 30 points higher on the final in-training exams provided by the ABFP. All residents in both groups remained at Marshall to complete the full residency training experience, and all those who have taken the ABFP certifying exam have passed. Accelerated residents were more likely to practice in West Virginia, consistent with one of the initial goals for the program. In addition, accelerated residents were more likely to be elected chief resident and to choose an academic career than those in the traditional group. Both groups opted for small town or rural practice equally. The Marshall University family practice 9-year experience with the accelerated residency track demonstrates that for carefully selected candidates, the program can provide an overall shortened path to board certification and attract students who excel academically and have high leadership potential.
Reports from other accelerated programs are needed to fully assess the outcomes of this experiment in postgraduate medical education.
Accelerated testing for studying pavement design and performance (FY 2002) : research summary.
DOT National Transportation Integrated Search
2004-01-01
This report covers the Fiscal Year 2002 project conducted at the Accelerated Testing : Laboratory at Kansas State University. The project was selected and funded by the : Midwest States Accelerated Testing Pooled Fund Program, which includes Iowa, Ka...
Accelerator Science: Proton vs. Electron
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lincoln, Don
Particle accelerators are one of the most powerful ways to study the fundamental laws that govern the universe. However, there are many design considerations that go into selecting and building a particular accelerator. In this video, Fermilab’s Dr. Don Lincoln explains the pros and cons of building an accelerator that collides pairs of protons versus one that collides electrons.
The Organization of Knowledge in a Multi-Lingual, Integrated Parser.
1984-11-01
presunto maniático sexual que dio muerte a golpes y a puñaladas a una mujer de 55 años, informaron fuentes allegadas a la investigación. Literally in...el hospital la joven Rosa Areas, la que fue herida de bala por un uniformado. English: Rosa Areas is still in the hospital after being shot and wounded...by a soldier. In this sentence, the subject, "joven" (young person), is found after the verb, "se encuentra" (finds herself). To handle situations
Extract and visualize geolocation from any text file
NASA Astrophysics Data System (ADS)
Boustani, M.
2015-12-01
There are a variety of text file formats, such as PDF, HTML, and more, which contain words about locations (countries, cities, regions, and more). GeoParser was developed as one of the sub-projects under DARPA Memex to help find geolocation information in crawled website data. It is a web application benefiting from Apache Tika to extract locations from any text file format and visualize the geolocations on a map. https://github.com/MBoustani/GeoParser https://github.com/chrismattmann/tika-python http://www.darpa.mil/program/memex
Numerical Function Generators Using LUT Cascades
2007-06-01
either algebraically (for example, sin(x)) or as a table of input/output values. The user defines the numerical function by using the syntax of Scilab ...defined function in Scilab or specify it directly. Note that, by changing the parser of our system, any format can be used for the design entry. First...Methods for Multiple-Valued Input Address Generators,” Proc. 36th IEEE Int’l Symp. Multiple-Valued Logic (ISMVL ’06), May 2006. [29] Scilab 3.0, INRIA-ENPC
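The record above concerns generating numerical functions such as sin(x) from lookup tables (LUTs). A minimal software analogue of a LUT-based function generator, sampling the function into a table and evaluating by linear interpolation, can be sketched as follows (the segment count and interval are arbitrary choices, not parameters from the paper):

```python
import math

def build_lut(f, lo, hi, n):
    """Sample f at n+1 evenly spaced points on [lo, hi]."""
    step = (hi - lo) / n
    return [f(lo + i * step) for i in range(n + 1)], lo, step

def lut_eval(lut, x):
    """Evaluate via the table, linearly interpolating between entries."""
    table, lo, step = lut
    i = min(int((x - lo) / step), len(table) - 2)  # clamp to last segment
    frac = (x - lo) / step - i
    return table[i] + frac * (table[i + 1] - table[i])

# 256 segments over [0, pi] keep the linear-interpolation error well
# below 1e-4 for sin(x).
lut = build_lut(math.sin, 0.0, math.pi, 256)
print(abs(lut_eval(lut, 1.0) - math.sin(1.0)) < 1e-4)  # True
```

Hardware LUT-cascade designs trade table size against segment count in the same way this sketch trades `n` against interpolation error.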
DBPQL: A view-oriented query language for the Intel Data Base Processor
NASA Technical Reports Server (NTRS)
Fishwick, P. A.
1983-01-01
An interactive query language (DBPQL) for the Intel Data Base Processor (DBP) is defined. DBPQL includes a parser generator package which permits the analyst to easily create and manipulate the query statement syntax and semantics. The prototype language, DBPQL, includes trace and performance commands to aid the analyst when implementing new commands and analyzing the execution characteristics of the DBP. The DBPQL grammar file and associated key procedures are included as an appendix to this report.
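The abstract's actual DBPQL grammar is not reproduced here, but the general mechanism, parsing a query statement against a small grammar, can be illustrated with a tiny hand-written parser for an invented SELECT-style grammar:

```python
import re

# Invented toy grammar (not the real DBPQL syntax):
#   query := "SELECT" field ("," field)* "FROM" name
TOKEN = re.compile(r"[A-Za-z_]\w*|,")

def parse_query(text):
    """Parse a toy SELECT statement into its fields and table name."""
    toks = TOKEN.findall(text)
    if not toks or toks[0].upper() != "SELECT":
        raise ValueError("expected SELECT")
    fields, i = [toks[1]], 2
    while i < len(toks) and toks[i] == ",":   # comma-separated field list
        fields.append(toks[i + 1]); i += 2
    if i >= len(toks) or toks[i].upper() != "FROM":
        raise ValueError("expected FROM")
    return {"fields": fields, "table": toks[i + 1]}

print(parse_query("SELECT name, age FROM users"))
# {'fields': ['name', 'age'], 'table': 'users'}
```

A parser generator, as the abstract describes, would derive this kind of routine automatically from a grammar file instead of hand-coding it.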
Catalog Descriptions Using VOTable Files
NASA Astrophysics Data System (ADS)
Thompson, R.; Levay, K.; Kimball, T.; White, R.
2008-08-01
Additional information is frequently required to describe database table contents and make them understandable to users. For this reason, the Multimission Archive at Space Telescope (MAST) creates "description files" for each table/catalog. After trying various XML and CSV formats, we finally chose VOTable. These files are easy to update via an HTML form, are easily read using an XML parser such as (in our case) the PHP5 SimpleXML extension, and have found multiple uses in our data access/retrieval process.
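The record above reads VOTable description files with a stock XML parser (PHP5 SimpleXML). The same pattern in Python's standard library looks like this; the sample document is a simplified, namespace-free VOTable invented for the sketch (real VOTables carry an IVOA namespace and more elements).

```python
import xml.etree.ElementTree as ET

# Simplified, namespace-free VOTable fragment (illustrative only).
VOTABLE = """<VOTABLE>
  <RESOURCE>
    <TABLE name="catalog">
      <FIELD name="ra" datatype="double" unit="deg"/>
      <FIELD name="dec" datatype="double" unit="deg"/>
    </TABLE>
  </RESOURCE>
</VOTABLE>"""

root = ET.fromstring(VOTABLE)
# Collect each column's name and unit, the kind of metadata MAST's
# description files supply to users.
fields = [(f.get("name"), f.get("unit")) for f in root.iter("FIELD")]
print(fields)  # [('ra', 'deg'), ('dec', 'deg')]
```

With a namespaced document, the `iter` call would need the qualified tag name (e.g. `{http://www.ivoa.net/xml/VOTable/v1.3}FIELD`).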
Parser for Sabin-to-Mahoney Transition Model of Quasispecies Replication
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ecale Zhou, Carol
2016-01-03
This code is a data parser for preparing output from the Qspp agent-based stochastic simulation model for plotting in Excel. This code is specific to a set of simulations that were run for the purpose of preparing data for a publication. It is necessary to make this code open-source in order to publish the model code (Qspp), which has already been released. There is a necessity of assuring that results from using Qspp for a publication
Open Radio Communications Architecture Core Framework V1.1.0 Volume 1 Software Users Manual
2005-02-01
on a PC utilizing the KDE desktop that comes with Red Hat Linux. The default desktop for most Red Hat Linux installations is the GNOME desktop. The...SCA) v2.2. The software was designed for a desktop computer running the Linux operating system (OS). It was developed in C++, uses ACE/TAO for CORBA...middleware, Xerces for the XML parser, and Red Hat Linux for the operating system. The software is referred to as Open Radio Communication
1990-01-01
Identification of Syntactic Units Exemplar I.A. (#l) Problem (1) The tough coach the young. (2) The tough coach married a star. (3) The tough coach married...(a) "the tough" vs. "the tough coach" and (b) "people" vs. "married people." The problem could also be considered a problem of determining lexical...and "married" in example (2). Once the parser specifies a verb, the structure of the rest of the sentence is determined: specifying "coach" as a
Attrition and success rates of accelerated students in nursing courses: a systematic review.
Doggrell, Sheila Anne; Schaffer, Sally
2016-01-01
There is a comprehensive literature on the academic outcomes (attrition and success) of students in traditional/baccalaureate nursing programs, but much less is known about the academic outcomes of students in accelerated nursing programs. The aim of this systematic review is to report on the attrition and success rates (either internal examination or NCLEX-RN) of accelerated students, compared to traditional students. For the systematic review, the databases (Pubmed, Cinahl and PsychINFO) and Google Scholar were searched using the search terms 'accelerated' or 'accreditation for prior learning', 'fast-track' or 'top up' and 'nursing' with 'attrition' or 'retention' or 'withdrawal' or 'success' from 1994 to January 2016. All relevant articles were included, regardless of quality. The findings of 19 studies of attrition rates and/or success rates for accelerated students are reported. For international accelerated students, there were only three studies, which are heterogeneous, and have major limitations. One of three studies has lower attrition rates, and one has shown higher success rates, than traditional students. In contrast, another study has shown high attrition and low success for international accelerated students. For graduate accelerated students, most of the studies are high quality, and showed that they have rates similar or better than traditional students. Thus, five of six studies have shown similar or lower attrition rates. Four of these studies with graduate accelerated students and an additional seven studies of success rates only, have shown similar or better success rates, than traditional students. There are only three studies of non-university graduate accelerated students, and these had weaknesses, but were consistent in reporting higher attrition rates than traditional students. The paucity and weakness of information available makes it unclear as to the attrition and/or success of international accelerated students in nursing programs. 
The good information available suggests that accelerated programs may be working reasonably well for the graduate students. However, the limited information available for non-university graduate students is weak, but consistent, in suggesting they may struggle in accelerated courses. Further studies are needed to determine the attrition and success rates of accelerated students, particularly for international and non-university graduate students.
Lambda-universe in scalar-tensor gravity
NASA Astrophysics Data System (ADS)
Berman, Marcelo Samuel
2009-09-01
We present a lambda-Universe in scalar-tensor gravity, reviewing Berman and Trevisan’s inflationary case (Berman and Trevisan in Int. J. Theor. Phys., 2009), and then we find a solution for an accelerating power-law scale factor. The negativity of cosmic pressure implies acceleration of the expansion, even with Λ<0. The cosmological term and the coupling “constant” are, in fact, time-varying.
Doggrell, Sheila Anne; Schaffer, Sally
2016-02-01
To reduce nursing shortages, accelerated nursing programs are available for domestic and international students. However, the withdrawal and failure rates from these programs may be different than for the traditional programs. The main aim of our study was to improve the retention and experience of accelerated nursing students. The academic background, age, withdrawal and failure rates of the accelerated and traditional students were determined. Data from 2009 and 2010 were collected prior to intervention. In an attempt to reduce the withdrawal of accelerated students, we set up an intervention, which was available to all students. The assessment of the intervention was a pre-post-test design with non-equivalent groups (the traditional and the accelerated students). The elements of the intervention were a) a formative website activity of some basic concepts in anatomy, physiology and pharmacology, b) a workshop addressing study skills and online resources, and c) resource lectures in anatomy/physiology and microbiology. The formative website and workshop was evaluated using questionnaires. The accelerated nursing students were five years older than the traditional students (p < 0.0001). The withdrawal rates from a pharmacology course are higher for accelerated nursing students, than for traditional students who have undertaken first year courses in anatomy and physiology (p = 0.04 in 2010). The withdrawing students were predominantly the domestic students with non-university qualifications or equivalent experience. The failure rates were also higher for this group, compared to the traditional students (p = 0.05 in 2009 and 0.03 in 2010). In contrast, the withdrawal rates for the international and domestic graduate accelerated students were very low. After the intervention, the withdrawal and failure rates in pharmacology for domestic accelerated students with non-university qualifications were not significantly different than those of traditional students. 
The accelerated international and domestic graduate nursing students have low withdrawal rates and high success rates in a pharmacology course. However, domestic students with non-university qualifications have higher withdrawal and failure rates than other nursing students and may be underprepared for university study in pharmacology in nursing programs. The introduction of an intervention was associated with reduced withdrawal and failure rates for these students in the pharmacology course.
Entangle Accelerating Universe
NASA Astrophysics Data System (ADS)
González-Díaz, Pedro F.; Robles-Pérez, Salvador
We show that there exists a T-duality symmetry between two-dimensional warp drives and two-dimensional Tolman-Hawking and Giddings-Strominger baby universes, respectively correlated in pairs, so that the creation of warp drives is also equivalent to space-time squeezing. It has also been seen that the nucleation of warp drives entails a violation of Bell's inequalities. These results are generalized to the case of any dynamically accelerating universe whose creation is also physically equivalent to spacetime squeezing and to the violation of Bell's inequalities, so that the universe we are living in should be governed by essentially sharp quantum theory laws and must be a quantum entangled system.
NASA Astrophysics Data System (ADS)
Del McDaniel, Floyd; Doyle, Barney L.
Jerry Duggan was an experimental MeV-accelerator-based nuclear and atomic physicist who, over the past few decades, played a key role in the important transition of this field from basic to applied physics. His fascination for and application of particle accelerators spanned almost 60 years, and led to important discoveries in the following fields: accelerator-based analysis (accelerator mass spectrometry, ion beam techniques, nuclear-based analysis, nuclear microprobes, neutron techniques); accelerator facilities, stewardship, and technology development; accelerator applications (industrial, medical, security and defense, and teaching with accelerators); applied research with accelerators (advanced synthesis and modification, radiation effects, nanosciences and technology); physics research (atomic and molecular physics, and nuclear physics); and many other areas and applications. Here we describe Jerry’s physics education at the University of North Texas (B. S. and M. S.) and Louisiana State University (Ph.D.). We also discuss his research at UNT, LSU, and Oak Ridge National Laboratory, his involvement with the industrial aspects of accelerators, and his impact on many graduate students, colleagues at UNT and other universities, national laboratories, and industry and acquaintances around the world. Along the way, we found it hard not to also talk about his love of family, sports, fishing, and other recreational activities. While these were significant accomplishments in his life, Jerry will be most remembered for his insight in starting and his industry in maintaining and growing what became one of the most diverse accelerator conferences in the world — the International Conference on the Application of Accelerators in Research and Industry, or what we all know as CAARI. 
Through this conference, which he ran almost single-handed for decades, Jerry came to know, and became well known by, literally thousands of atomic and nuclear physicists, accelerator engineers and vendors, medical doctors, cultural heritage experts... the list goes on and on. While thousands of his acquaintances already miss Jerry, this is being felt most by his family and us (B.D. and F.D.M).
Iowa Acceleration Scale Manual: A Guide for Whole-Grade Acceleration K-8. (3rd Edition, Manual)
ERIC Educational Resources Information Center
Assouline, Susan G.; Colangelo, Nicholas; Lupkowski-Shoplik, Ann; Forstadt, Leslie; Lipscomb, Jonathon
2009-01-01
Feedback from years of nationwide use has resulted in a 3rd Edition of this unique, systematic, and objective guide to considering and implementing academic acceleration. Developed and tested by the Belin-Blank Center at the University of Iowa, the IAS ensures that acceleration decisions are systematic, thoughtful, well reasoned, and defensible.…
Accelerator Science: Proton vs. Electron
Lincoln, Don
2018-06-12
Particle accelerators are one of the most powerful ways to study the fundamental laws that govern the universe. However, many design considerations go into selecting and building a particular accelerator. In this video, Fermilab's Dr. Don Lincoln explains the pros and cons of building an accelerator that collides pairs of protons versus one that collides electrons.
The Beginning and End of the Universe
NASA Technical Reports Server (NTRS)
Gardner, Jonathan P.
2011-01-01
Cosmology is the scientific study of how the Universe began more than 13 billion years ago, how its properties have changed from that time to the present, and what its eventual fate might be. Observational cosmology uses telescopes like the Hubble to reach back in time to find the faint echoes of the Big Bang. In this lecture, I will give an overview of cosmology, highlighting the very rapid progress this field has made in the last decade, and the role that NASA space telescopes have played and will continue to play in the years to come. I will then focus on two of the most intriguing of those recent discoveries: inflation and dark energy. Our universe began in an extremely rapid accelerated expansion, called inflation, which removed all traces of anything that may have existed before, flattened the geometry of space-time, and turned microscopic quantum fluctuations into the largest structures in the universe. At the present time, more than 70% of the mass-energy in the Universe consists of a mysterious substance called dark energy. The dark energy causes the expansion of the Universe to accelerate, and I will discuss the ways that we might be able to measure that acceleration more accurately, revealing the nature of the dark energy and learning the eventual fate of the Universe.
The Beginning and End of the Universe
NASA Technical Reports Server (NTRS)
Gardner, Jonathan P.
2012-01-01
Cosmology is the scientific study of how the Universe began more than 13 billion years ago, how its properties have changed from that time to the present, and what its eventual fate might be. Observational cosmology uses telescopes like the Hubble to reach back in time to find the faint echoes of the Big Bang. In this lecture, I will give an overview of cosmology, highlighting the very rapid progress this field has made in the last decade, and the role that NASA space telescopes have played and will continue to play in the years to come. I will then focus on two of the most intriguing of those recent discoveries: inflation and dark energy. Our universe began in an extremely rapid accelerated expansion, called inflation, which removed all traces of anything that may have existed before, flattened the geometry of space-time, and turned microscopic quantum fluctuations into the largest structures in the universe. At the present time, more than 70% of the mass-energy in the Universe consists of a mysterious substance called dark energy. The dark energy causes the expansion of the Universe to accelerate, and I will discuss the ways that we might be able to measure that acceleration more accurately, revealing the nature of the dark energy and learning the eventual fate of the Universe.
Evolution and dynamics of a matter creation model
NASA Astrophysics Data System (ADS)
Pan, S.; de Haro, J.; Paliathanasis, A.; Slagter, R. J.
2016-08-01
In a flat Friedmann-Lemaître-Robertson-Walker (FLRW) geometry, we consider the expansion of the universe powered by gravitationally induced `adiabatic' matter creation. To demonstrate how matter creation works with the expanding universe, we have considered a general creation rate and analysed it in the framework of dynamical analysis. The analysis hints at the presence of a non-singular universe (without the big bang singularity) with two successive accelerated phases, one at the very early stage of the universe (i.e. inflation) and one describing the current accelerating universe, where these early and late accelerated phases are associated with an unstable fixed point (i.e. a repeller) and a stable fixed point (an attractor), respectively. We have described these phenomena by analytic solutions for the Hubble function and the scale factor of the FLRW universe. Using the Jacobi last multiplier method, we have found a Lagrangian for this matter creation rate describing this scenario of the universe. To connect with our early-universe results, we introduce an equivalent dynamics driven by a single scalar field, discuss the associated observable parameters, and compare them with the latest Planck data sets. Finally, introducing teleparallel modified gravity, we establish an equivalent gravitational theory in the framework of matter creation.
ERIC Educational Resources Information Center
Shaw, Stuart D.; Werno, Magda A.
2016-01-01
This case study sought to gain a better understanding of the impact of the Cambridge Acceleration Program on students' transition from high school to college at one American university. The findings from an online questionnaire indicate that many participants develop a range of skills that are perceived as important in the context of university…
DIANA - A deep underground accelerator for nuclear astrophysics experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Winklehner, Daniel; Leitner, Daniela; Lemut, Alberto
DIANA (Dakota Ion Accelerator for Nuclear Astrophysics) is a proposed facility designed to be operated deep underground. The DIANA collaboration includes nuclear astrophysics groups from Lawrence Berkeley National Laboratory, Michigan State University, Western Michigan University, Colorado School of Mines, and the University of North Carolina, and is led by the University of Notre Dame. The scientific goals of the facility are measurements of low-energy nuclear cross-sections associated with the sun and pre-supernova stars, in a laboratory setup at energies close to those in stars. Because of the low stellar temperatures associated with these environments, and the high Coulomb barrier, the reaction cross-sections are extremely low, and the measurements are therefore hampered by small signal-to-background ratios. By going underground, the background due to cosmic rays can be reduced by several orders of magnitude. We report on the design status of the DIANA facility, with a focus on the 3 MV electrostatic accelerator.
Dark energy two decades after: observables, probes, consistency tests.
Huterer, Dragan; Shafer, Daniel L
2018-01-01
The discovery of the accelerating universe in the late 1990s was a watershed moment in modern cosmology, as it indicated the presence of a fundamentally new, dominant contribution to the energy budget of the universe. Evidence for dark energy, the new component that causes the acceleration, has since become extremely strong, owing to an impressive variety of increasingly precise measurements of the expansion history and the growth of structure in the universe. Still, one of the central challenges of modern cosmology is to shed light on the physical mechanism behind the accelerating universe. In this review, we briefly summarize the developments that led to the discovery of dark energy. Next, we discuss the parametric descriptions of dark energy and the cosmological tests that allow us to better understand its nature. We then review the cosmological probes of dark energy. For each probe, we briefly discuss the physics behind it and its prospects for measuring dark energy properties. We end with a summary of the current status of dark energy research.
Perlmutter, Saul; Chu, Steven
2018-05-31
The Department of Energy (DOE) hosted an event Friday, January 13, with 2011 Physics Nobel Laureate Saul Perlmutter. Dr. Perlmutter, a physicist at the Department's Lawrence Berkeley National Laboratory and a professor of physics at the University of California at Berkeley, won the 2011 Nobel Prize in Physics "for the discovery of the accelerating expansion of the Universe through observations of distant supernovae." DOE's Office of Science has supported Dr. Perlmutter's research at Berkeley Lab since 1983. After the introduction from Secretary of Energy Steven Chu, Dr. Perlmutter delivered a presentation entitled "Supernovae, Dark Energy and the Accelerating Universe: How DOE Helped to Win (yet another) Nobel Prize." [Copied with editing from DOE Media Advisory issued January 10th, found at http://energy.gov/articles/energy-department-host-event-2011-physics-nobel-laureate-saul-perlmutter]
DOT National Transportation Integrated Search
2004-08-01
This report covers the Fiscal Year 2002 project conducted at the Accelerated Testing Laboratory at Kansas State University. The project was selected and funded by the Midwest Accelerated Testing Pooled Fund Program, which includes Iowa, Kansas, ...
Cosmological models constructed by van der Waals fluid approximation and volumetric expansion
NASA Astrophysics Data System (ADS)
Samanta, G. C.; Myrzakulov, R.
The universe is modeled with a van der Waals fluid approximation, where the van der Waals equation of state contains a single parameter ωv. Analytical solutions to Einstein's field equations are obtained by assuming that the mean scale factor of the metric follows volumetric exponential and power-law expansions. The model describes a rapid expansion in which the acceleration grows exponentially and the van der Waals fluid drives an inflationary phase in the initial epoch of the universe. The model also describes how, as time progresses, the acceleration remains positive but decreases to zero, with the van der Waals fluid approximation behaving like the present accelerated phase of the universe. Finally, it is observed that the model contains a type-III future singularity for volumetric power-law expansion.
Some Consequences of the Expansion of the Universe on Small Scales
NASA Astrophysics Data System (ADS)
Harutyunian, H. A.
2017-12-01
The possibility of detecting the accelerated expansion of the universe at all its points is examined. Observational data indicative of Hubble expansion on small scales are adduced for this purpose. The validity of current opinion on the equilibrium of systems of cosmic objects is also discussed. It is noted that this opinion is a simple consequence of the unproved Kant-Laplace hypothesis on the formation of cosmic objects and systems of them. It is proposed that a system attached to the cosmological horizon be used as a reference system. It is noted that every point on this sphere is an initial point from which the expansion of the given observer's observable universe began. The numerical value of the acceleration obtained in this way is almost the same as the anomalous acceleration found by space probes.
ERIC Educational Resources Information Center
Gupta, Himani
2017-01-01
The Accelerated Study in Associate Programs (ASAP), developed by the City University of New York (CUNY), is an uncommonly comprehensive and long-term program designed to address low graduation rates among community college students. MDRC has been studying the effects of ASAP on low-income students with developmental (remedial) education needs at…
Supernovae and the Accelerating Universe
NASA Technical Reports Server (NTRS)
Wood, H. John
2003-01-01
Orbiting high above the turbulence of the earth's atmosphere, the Hubble Space Telescope (HST) has provided breathtaking views of astronomical objects never before seen in such detail. The steady diffraction-limited images allow this medium-size telescope to reach faint galaxies of 30th stellar magnitude. Some of these galaxies are seen as early as 2 billion years after the Big Bang in a 15 billion year old universe. Up until recently, astronomers assumed that all of the laws of physics and astronomy applied back then as they do today. Now, using the discovery that certain supernovae are standard candles, astronomers have found that the universe is expanding faster today than it was back then: the universe is accelerating in its expansion.
DEEPEN: A negation detection system for clinical text incorporating dependency relation into NegEx
Mehrabi, Saeed; Krishnan, Anand; Sohn, Sunghwan; Roch, Alexandra M; Schmidt, Heidi; Kesterson, Joe; Beesley, Chris; Dexter, Paul; Schmidt, C. Max; Liu, Hongfang; Palakal, Mathew
2018-01-01
In Electronic Health Records (EHRs), much of the valuable information regarding patients' conditions is embedded in free-text format. Natural language processing (NLP) techniques have been developed to extract clinical information from free text. One challenge faced in clinical NLP is that the meaning of clinical entities is heavily affected by modifiers such as negation. A negation detection algorithm, NegEx, applies a simple approach that has been shown to be powerful in clinical NLP. However, because it fails to consider the contextual relationship between words within a sentence, NegEx fails to correctly capture the negation status of concepts in complex sentences. Incorrect negation assignment could cause inaccurate diagnosis of patients' conditions or contaminated study cohorts. We developed a negation algorithm called DEEPEN to decrease NegEx's false positives by taking into account the dependency relationship between negation words and concepts within a sentence, using the Stanford dependency parser. The system was developed and tested using EHR data from Indiana University (IU), and it was further evaluated on a Mayo Clinic dataset to assess its generalizability. The evaluation results demonstrate that DEEPEN, which incorporates dependency parsing into NegEx, can reduce the number of incorrect negation assignments for patients with positive findings, and therefore improve the identification of patients with the target clinical findings in EHRs. PMID:25791500
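The core idea can be made concrete with a toy example. The sketch below is pure Python with a tiny hand-built parse standing in for the Stanford dependency parser the paper uses; the cue list, relations, and scoping rule are illustrative assumptions, not the published algorithm. It shows a NegEx-style token-window check wrongly negating "cough" in "Patient denies fever but reports cough," while a dependency-scoped check does not:

```python
# Toy illustration of DEEPEN's idea: use dependency structure, not just a
# token window (as in NegEx), to decide whether a clinical concept is negated.
# The parse below is hand-built; a real system would use a full parser.

NEG_CUES = {"no", "not", "denies", "without"}

# Each token: (text, head_index, dependency_relation); the root points to itself.
# Hand-built parse of: "Patient denies fever but reports cough."
SENT = [
    ("Patient", 1, "nsubj"),
    ("denies",  1, "ROOT"),
    ("fever",   1, "dobj"),
    ("but",     1, "cc"),
    ("reports", 1, "conj"),
    ("cough",   4, "dobj"),
]

def negex_window(words, concept_idx, window=5):
    """NegEx-style check: is any negation cue in the preceding token window?"""
    lo = max(0, concept_idx - window)
    return any(w.lower() in NEG_CUES for w in words[lo:concept_idx])

def dependency_negated(tokens, concept_idx):
    """Negate the concept only if a cue governs it within its own clause.

    Climb the head chain from the concept, stopping at a 'conj' edge,
    since a coordinated clause ("but reports ...") starts a new scope.
    """
    idx = concept_idx
    scope = {idx}
    while tokens[idx][1] != idx and tokens[idx][2] != "conj":
        idx = tokens[idx][1]
        scope.add(idx)
    return any(tokens[i][0].lower() in NEG_CUES for i in scope)
```

Here the window check marks both "fever" and "cough" as negated, while the dependency check stops climbing at the `conj` edge, so "cough" falls outside the scope of "denies" — the kind of false positive DEEPEN is designed to remove.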
Can superhorizon cosmological perturbations explain the acceleration of the universe?
NASA Astrophysics Data System (ADS)
Hirata, Christopher M.; Seljak, Uroš
2005-10-01
We investigate the recent suggestions by Barausse et al. and Kolb et al. that the acceleration of the universe could be explained by large superhorizon fluctuations generated by inflation. We show that no acceleration can be produced by this mechanism. We begin by showing how applying the Raychaudhuri equation to inhomogeneous cosmologies results in several "no go" theorems for accelerated expansion. Next we derive an exact solution for a specific case of initial perturbations, for which application of the Kolb et al. expressions leads to an acceleration, while the exact solution reveals that no acceleration is present. We show that the discrepancy can be traced to higher-order terms that were dropped in the Kolb et al. analysis. We proceed with the analysis of the initial value formulation of general relativity to argue that causality severely limits what observable effects can be derived from superhorizon perturbations. By constructing a Riemann normal coordinate system on the initial slice we show that no infrared divergence terms arise in this coordinate system. Thus any divergences found previously can be eliminated by a local rescaling of coordinates and are unobservable. We perform an explicit analysis of the variance of the deceleration parameter for the case of single-field inflation using the usual coordinates and show that the infrared-divergent terms found by Barausse et al. and Kolb et al. cancel against several additional terms not considered in their analysis. Finally, we argue that introducing isocurvature perturbations does not alter our conclusion that the accelerating expansion of the universe cannot be explained by superhorizon modes.
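For reference, the "no go" arguments mentioned in the abstract rest on the Raychaudhuri equation; in its standard form for a congruence with four-velocity u^μ (conventional notation, reproduced here rather than quoted from the paper):

```latex
\frac{d\theta}{d\tau} = -\frac{1}{3}\theta^{2}
  - \sigma_{\mu\nu}\sigma^{\mu\nu}
  + \omega_{\mu\nu}\omega^{\mu\nu}
  - R_{\mu\nu}u^{\mu}u^{\nu}
```

For irrotational dust, \(\omega_{\mu\nu} = 0\) and \(R_{\mu\nu}u^{\mu}u^{\nu} = 4\pi G\rho \ge 0\), so the local volume acceleration \(3\ddot{\ell}/\ell = \dot{\theta} + \theta^{2}/3 = -\sigma_{\mu\nu}\sigma^{\mu\nu} - 4\pi G\rho\) is non-positive, which is the essence of such no-go results.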
What makes the Universe accelerate? A review on what dark energy could be and how to test it.
Brax, Philippe
2018-01-01
Explaining the origin of the acceleration of the expansion of the Universe remains as challenging as ever. In this review, we present different approaches from dark energy to modified gravity. We also emphasize the quantum nature of the problem and the need for an explanation which should violate Weinberg's no go theorem. This might involve a self-tuning mechanism or the acausal sequestering of the vacuum energy. Laboratory tests of the coupling to matter of nearly massless scalar fields, which could be one of the features required to explain the cosmic acceleration, are also reviewed.
An ion beam facility based on a 3 MV tandetron accelerator in Sichuan University, China
NASA Astrophysics Data System (ADS)
Han, Jifeng; An, Zhu; Zheng, Gaoqun; Bai, Fan; Li, Zhihui; Wang, Peng; Liao, Xiaodong; Liu, Mantian; Chen, Shunli; Song, Mingjiang; Zhang, Jun
2018-03-01
A new ion beam facility based on a 3 MV tandetron accelerator system has been installed at Sichuan University, China. The facility was developed by High Voltage Engineering Europa and consists of three high-energy beam lines comprising the ion beam analysis, ion implantation, and nuclear physics experiment end stations, respectively. The terminal voltage stability of the accelerator is better than ±30 V, and the brightness of the proton beam is approximately 5.06 A/rad²/m²/eV. The system demonstrates great application potential in fields such as nuclear, materials, and environmental studies.
What makes the Universe accelerate? A review on what dark energy could be and how to test it
NASA Astrophysics Data System (ADS)
Brax, Philippe
2018-01-01
Explaining the origin of the acceleration of the expansion of the Universe remains as challenging as ever. In this review, we present different approaches from dark energy to modified gravity. We also emphasize the quantum nature of the problem and the need for an explanation which should violate Weinberg’s no go theorem. This might involve a self-tuning mechanism or the acausal sequestering of the vacuum energy. Laboratory tests of the coupling to matter of nearly massless scalar fields, which could be one of the features required to explain the cosmic acceleration, are also reviewed.
Niimi, Shingo; Umezu, Mitsuo; Iseki, Hiroshi; Harada, Hiroshi; Kasanuki, Noboru; Mitsuishi, Mamoru; Kitamori, Takehiko; Tei, Yuichi; Nakaoka, Ryusuke; Haishima, Yuji
2014-01-01
The Division of Medical Devices has been conducting projects to accelerate the practical use of innovative medical devices, in collaboration with TWIns, the Center for Advanced Biomedical Sciences at Waseda University, and the School of Engineering, The University of Tokyo. TWIns has been working to establish preclinical evaluation methods based on "Engineering Based Medicine" and has established the Regulatory Science Institute for Medical Devices. The School of Engineering, The University of Tokyo, has been working to establish assessment methodology for innovative minimally invasive therapeutic devices, materials, and nanobio diagnostic devices. This report reviews the personnel exchanges, implementation systems, and research progress of these projects.
Theory of unfolded cyclotron accelerator
NASA Astrophysics Data System (ADS)
Rax, J.-M.; Robiche, J.
2010-10-01
An acceleration process based on the interaction between an ion, a tapered periodic magnetic structure, and a circularly polarized oscillating electric field is identified and analyzed, and its potential is evaluated. A Hamiltonian analysis is developed in order to describe the interplay between the cyclotron motion, the electric acceleration, and the magnetic modulation. The parameters of this universal class of magnetic modulation leading to continuous acceleration without Larmor radius increase are expressed analytically. Thus, this study provides the basic scaling of what appears as a compact unfolded cyclotron accelerator.
Inhomogeneity-induced cosmic acceleration in a dust universe
NASA Astrophysics Data System (ADS)
Chuang, Chia-Hsun; Gu, Je-An; Hwang, W.-Y. P.
2008-09-01
It is the common consensus that the expansion of a universe always slows down if the gravity provided by the energy sources therein is attractive, and accordingly one needs to invoke dark energy as a source of anti-gravity to understand the cosmic acceleration. To examine this point we find counterexamples for a spherically symmetric dust fluid described by the Lemaître-Tolman-Bondi solution without singularity. Thus, the validity of this naive consensus is indeed doubtful and the effects of inhomogeneities should be restudied. These counter-intuitive examples open a new perspective on the understanding of the evolution of our universe.
Accelerated Districts--The Next Step. A Summary of Research and Design.
ERIC Educational Resources Information Center
Driver, Cyrus; And Others
The National Center for the Accelerated Schools Project at Stanford University has recognized that district-level change is necessary if changes at accelerated schools are to gain permanence and become widespread. The Center has therefore initiated a research and development project to design a set of models on which districts can reconstitute…
Analyzing Collision Processes with the Smartphone Acceleration Sensor
ERIC Educational Resources Information Center
Vogt, Patrik; Kuhn, Jochen
2014-01-01
It has been illustrated several times how the built-in acceleration sensors of smartphones can be used gainfully for quantitative experiments in school and university settings (see the overview in Ref. 1 ). The physical issues in that case are manifold and apply, for example, to free fall, radial acceleration, several pendula, or the exploitation…
Accelerator-based techniques for the support of senior-level undergraduate physics laboratories
NASA Astrophysics Data System (ADS)
Williams, J. R.; Clark, J. C.; Isaacs-Smith, T.
2001-07-01
Approximately three years ago, Auburn University replaced its aging Dynamitron accelerator with a new 2MV tandem machine (Pelletron) manufactured by the National Electrostatics Corporation (NEC). This new machine is maintained and operated for the University by Physics Department personnel, and the accelerator supports a wide variety of materials modification/analysis studies. Computer software is available that allows the NEC Pelletron to be operated from a remote location, and an Internet link has been established between the Accelerator Laboratory and the Upper-Level Undergraduate Teaching Laboratory in the Physics Department. Additional software supplied by Canberra Industries has also been used to create a second Internet link that allows live-time data acquisition in the Teaching Laboratory. Our senior-level undergraduates and first-year graduate students perform a number of experiments related to radiation detection and measurement as well as several standard accelerator-based experiments that have been added recently. These laboratory exercises will be described, and the procedures used to establish the Internet links between our Teaching Laboratory and the Accelerator Laboratory will be discussed.
The CSU Accelerator and FEL Facility
NASA Astrophysics Data System (ADS)
Biedron, Sandra; Milton, Stephen; D'Audney, Alex; Edelen, Jonathan; Einstein, Josh; Harris, John; Hall, Chris; Horovitz, Kahren; Martinez, Jorge; Morin, Auralee; Sipahi, Nihan; Sipahi, Taylan; Williams, Joel
2014-03-01
The Colorado State University (CSU) Accelerator Facility will include a 6-MeV L-Band electron linear accelerator (linac) with a free-electron laser (FEL) system capable of producing Terahertz (THz) radiation, a laser laboratory, a microwave test stand, and a magnetic test stand. The photocathode drive linac will be used in conjunction with a hybrid undulator capable of producing THz radiation. Details of the systems used in CSU Accelerator Facility are discussed.
The Beginning and End of the Universe
NASA Technical Reports Server (NTRS)
Gardner, Jonathan P.
2007-01-01
Cosmology is the scientific study of how the Universe began more than 13 billion years ago, how its properties have changed, and what its future might be. The balance of forces and energy cause the Universe to expand, first accelerating, then decelerating and then accelerating again. Within this overall structure, the interplay of atoms and light with the mysterious dark matter and dark energy causes stars and galaxies to form and evolve, leading to galaxies like our own home, the Milky Way. Observational cosmology uses telescopes on Earth and in space to reach back in time to find the faint remaining echoes of the Big Bang and to trace the formation and evolution of the galaxies and structures that fill the Universe. In this lecture, Dr. Gardner will give an overview of cosmology, outlining the 13-billion year history of the Universe, and highlighting the very rapid progress this field has made in the last decade. He will discuss the role that NASA space telescopes have played in this progress and will continue to play in the years to come. He will give a time-based history of the Universe, discussing the successive processes that formed matter, particles, atoms, stars and galaxies. In particular, he will focus on cosmological inflation, the rapid accelerated expansion that marks the beginning of the Universe, and dark energy, a tenuous substance that overcomes gravity and whose properties will determine its final fate.
The Beginning and End of the Universe
NASA Technical Reports Server (NTRS)
Gardner, Jonathan
2008-01-01
Cosmology is the scientific study of how the Universe began more than 13 billion years ago, how its properties have changed, and what its future might be. The balance of forces and energy cause the Universe to expand, first accelerating, then decelerating and then accelerating again. Within this overall structure, the interplay of atoms and light with the mysterious dark matter and dark energy causes stars and galaxies to form and evolve, leading to galaxies like our own home, the Milky Way. Observational cosmology uses telescopes on Earth and in space to reach back in time to find the faint remaining echoes of the Big Bang and to trace the formation and evolution of the galaxies and structures that fill the Universe. In this lecture, Dr. Gardner will give an overview of cosmology, outlining the 13-billion year history of the Universe, and highlighting the very rapid progress this field has made in the last decade. He will discuss the role that NASA space telescopes have played in this progress and will continue to play in the years to come. He will give a time-based history of the Universe, discussing the successive processes that formed matter, particles, atoms, stars and galaxies. In particular, he will focus on cosmological inflation, the rapid accelerated expansion that marks the beginning of the Universe, and dark energy, a tenuous substance that overcomes gravity and whose properties will determine its final fate.
Constraints on backreaction in dust universes
NASA Astrophysics Data System (ADS)
Räsänen, Syksy
2006-03-01
We study backreaction in dust universes using exact equations which do not rely on perturbation theory, concentrating on theoretical and observational constraints. In particular, we discuss the recent suggestion (Kolb et al 2005 Preprint hep-th/0503117) that superhorizon perturbations could explain present-day accelerated expansion as a useful example which can be ruled out. We note that a backreaction explanation of late-time acceleration will have to involve spatial curvature and subhorizon perturbations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thompson, Kelly; Budge, Kent; Lowrie, Rob
2016-03-03
Draco is an object-oriented component library geared towards numerically intensive, radiation (particle) transport applications built for parallel computing hardware. It consists of semi-independent packages and a robust build system. The packages in Draco provide a set of components that can be used by multiple clients to build transport codes. The build system can also be extracted for use in clients. Software includes smart pointers, Design-by-Contract assertions, unit test framework, wrapped MPI functions, a file parser, unstructured mesh data structures, a random number generator, root finders and an angular quadrature component.
Xyce™ Parallel Electronic Simulator Reference Guide, Version 6.5
DOE Office of Scientific and Technical Information (OSTI.GOV)
Keiter, Eric R.; Aadithya, Karthik V.; Mei, Ting
2016-06-01
This document is a reference guide to the Xyce Parallel Electronic Simulator and is a companion document to the Xyce Users' Guide. The focus of this document is to list, as exhaustively as possible, the device parameters, solver options, parser options, and other usage details of Xyce. This document is not intended to be a tutorial. Users who are new to circuit simulation are better served by the Xyce Users' Guide. The information herein is subject to change without notice. Copyright © 2002-2016 Sandia Corporation. All rights reserved.
jmzML, an open-source Java API for mzML, the PSI standard for MS data.
Côté, Richard G; Reisinger, Florian; Martens, Lennart
2010-04-01
We here present jmzML, a Java API for the Proteomics Standards Initiative mzML data standard. Based on the Java Architecture for XML Binding and an XPath-based random-access XML indexer and parser, jmzML can handle arbitrarily large files in minimal memory, allowing easy and efficient processing of mzML files using the Java programming language. jmzML also automatically resolves internal XML references on-the-fly. The library (which includes a viewer) can be downloaded from http://jmzml.googlecode.com.
Sterling Software: An NLToolset-based System for MUC-6
1995-11-01
... "Coca-Cola". Since we weren't using the parser, the part-of-speech obtained by a lexical lookup was of interest mainly if it was something like city-name ... without any contextual clues (such as "White House", "Fannie Mae", "Big Board", "Coca-Cola" and "Coke", "Macy's", "Exxon", etc.).
Natural-Language Parser for PBEM
NASA Technical Reports Server (NTRS)
James, Mark
2010-01-01
A computer program called "Hunter" accepts, as input, a colloquial-English description of a set of policy-based-management rules, and parses that description into a form usable by policy-based enterprise management (PBEM) software. PBEM is a rules-based approach suitable for automating some management tasks. PBEM simplifies the management of a given enterprise through establishment of policies addressing situations that are likely to occur. Hunter was developed to have a unique capability to extract the intended meaning instead of focusing on parsing the exact ways in which individual words are used.
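A rough flavor of the task Hunter performs can be sketched with a toy pattern-based translator from a colloquial policy sentence to a machine-usable rule. The regex and the {condition, action} schema below are invented for illustration; Hunter's actual grammar and internal rule representation are not described in the abstract.

```python
import re

# Toy sketch: map an "if/when ... then ..." policy sentence to a rule dict.
# Pattern and schema are illustrative assumptions, not Hunter's design.
POLICY = re.compile(r"(?:if|when)\s+(.+?),?\s+then\s+(.+)", re.IGNORECASE)

def parse_policy(sentence):
    """Extract a condition/action pair from an 'if/when ... then ...' sentence."""
    match = POLICY.search(sentence)
    if not match:
        return None  # sentence does not follow the assumed pattern
    return {
        "condition": match.group(1).strip(),
        "action": match.group(2).strip().rstrip("."),
    }
```

A real natural-language front end would of course go well beyond a single pattern, which is precisely why Hunter focuses on intended meaning rather than surface word order.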
Signal Processing Expert Code (SPEC)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ames, H.S.
1985-12-01
The purpose of this paper is to describe a prototype expert system called SPEC, which was developed to demonstrate the utility of providing an intelligent interface for users of SIG, a general-purpose signal processing code. The expert system is written in NIL, runs on a VAX 11/750, and consists of a backward-chaining inference engine and an English-like parser. The inference engine uses knowledge encoded as rules about the formats of SIG commands and about how to perform frequency analyses using SIG. The system demonstrated that expert systems can be used to control existing codes.
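Backward chaining, the inference strategy the abstract names, reduces a goal to known facts through rules. SPEC itself was written in NIL with rules about SIG command formats; the rule and fact names below are invented purely to illustrate the mechanism.

```python
# Generic backward-chaining sketch. A goal is proved if it is a known fact
# or if some rule body for it has all of its subgoals provable.
RULES = {
    "run_fft": [["signal_loaded", "window_chosen"]],
    "window_chosen": [["user_wants_frequency_analysis"]],
}
FACTS = {"signal_loaded", "user_wants_frequency_analysis"}

def prove(goal, facts=FACTS, rules=RULES):
    """Backward chaining: recursively reduce a goal to known facts via rules."""
    if goal in facts:
        return True
    return any(all(prove(sub, facts, rules) for sub in body)
               for body in rules.get(goal, []))
```

With these toy rules, `prove("run_fft")` succeeds because both of its subgoals bottom out in facts, mirroring how an inference engine can decide which SIG commands a user's request requires.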
Parallel File System I/O Performance Testing On LANL Clusters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wiens, Isaac Christian; Green, Jennifer Kathleen
2016-08-18
These are slides from a presentation on parallel file system I/O performance testing on LANL clusters. I/O is a known bottleneck for HPC applications. Performance optimization of I/O is often required. This summer project entailed integrating IOR under Pavilion and automating the results analysis. The slides cover the following topics: scope of the work, tools utilized, IOR-Pavilion test workflow, build script, IOR parameters, how parameters are passed to IOR, *run_ior: functionality, Python IOR-Output Parser, Splunk data format, Splunk dashboard and features, and future work.
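The slides mention a Python IOR-output parser feeding a Splunk data format. A minimal sketch of that step might look like the following; the sample text and the key=value field names are assumptions for illustration, since the real parser's input and schema are not shown in the slides.

```python
import re

# Hypothetical text resembling an IOR summary section (format assumed).
SAMPLE = """\
Max Write: 1297.98 MiB/sec (1361.05 MB/sec)
Max Read:  5795.97 MiB/sec (6077.66 MB/sec)
"""

LINE = re.compile(r"Max (Write|Read):\s+([\d.]+)\s+MiB/sec")

def to_splunk_events(text):
    """Turn IOR-style summary lines into Splunk-friendly key=value strings."""
    return [f"op={op.lower()} rate_mib_per_sec={rate}"
            for op, rate in LINE.findall(text)]
```

Emitting flat key=value pairs is a common choice for Splunk ingestion because its search language extracts such fields automatically.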
NASA Astrophysics Data System (ADS)
Nesbet, Robert K.
2018-05-01
Velocities in stable circular orbits about galaxies, a measure of centripetal gravitation, exceed the expected Kepler/Newton velocity as orbital radius increases. Standard Λ cold dark matter (ΛCDM) attributes this anomaly to galactic dark matter. McGaugh et al. have recently shown for 153 disc galaxies that observed radial acceleration is an apparently universal function of classical acceleration computed for observed galactic baryonic mass density. This is consistent with the empirical modified Newtonian dynamics (MOND) model, not requiring dark matter. It is shown here that suitably constrained ΛCDM and conformal gravity (CG) also produce such a universal correlation function. ΛCDM requires a very specific dark matter distribution, while the implied CG non-classical acceleration must be independent of galactic mass. All three constrained radial acceleration functions agree with the empirical baryonic v4 Tully-Fisher relation. Accurate rotation data in the nominally flat velocity range could distinguish between MOND, ΛCDM, and CG.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lincoln, Don
While the LHC is currently the highest-energy particle accelerator ever built, nothing is forever. In this video, Fermilab's Dr. Don Lincoln discusses a new particle accelerator currently under discussion. This accelerator would dwarf the LHC, running fully 60 miles around, and would accelerate protons to seven times higher energy. The project is merely in the discussion stages and is a staggering endeavor, but it is the next natural step in our millennium-long journey to understand the universe.
Beam Position Monitoring in the CSU Accelerator Facility
NASA Astrophysics Data System (ADS)
Einstein, Joshua; Vankeuren, Max; Watras, Stephen
2014-03-01
A Beam Position Monitoring (BPM) system is an integral part of an accelerator beamline, and modern accelerators can take advantage of newer technologies and designs when creating a BPM system. The Colorado State University (CSU) Accelerator Facility will include four stripline detectors mounted around the beamline, a low-noise analog front-end, and digitization and interface circuitry. The design will support a sampling rate greater than 10 Hz and sub-100 μm accuracy.
Implications of an Absolute Simultaneity Theory for Cosmology and Universe Acceleration
Kipreos, Edward T.
2014-01-01
An alternate Lorentz transformation, Absolute Lorentz Transformation (ALT), has similar kinematics to special relativity yet maintains absolute simultaneity in the context of a preferred reference frame. In this study, it is shown that ALT is compatible with current experiments to test Lorentz invariance only if the proposed preferred reference frame is locally equivalent to the Earth-centered non-rotating inertial reference frame, with the inference that in an ALT framework, preferred reference frames are associated with centers of gravitational mass. Applying this theoretical framework to cosmological data produces a scenario of universal time contraction in the past. In this scenario, past time contraction would be associated with increased levels of blueshifted light emissions from cosmological objects when viewed from our current perspective. The observation that distant Type Ia supernovae are dimmer than predicted by linear Hubble expansion currently provides the most direct evidence for an accelerating universe. Adjusting for the effects of time contraction on a redshift–distance modulus diagram produces a linear distribution of supernovae over the full redshift spectrum that is consistent with a non-accelerating universe. PMID:25536116
The principle of phase stability and the accelerator program at Berkeley, 1945--1954
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lofgren, E.J.
1994-07-01
The discovery of the Principle of Phase Stability by Vladimir Veksler and Edwin McMillan and the end of the war released a surge of accelerator activity at the Lawrence Berkeley Laboratory (then the University of California Radiation Laboratory). Six accelerators incorporating the Principle of Phase Stability were built in the period 1945--1954.
Sustained performance of 8 MeV Microtron
NASA Astrophysics Data System (ADS)
Sanjeev, Ganesh
2012-11-01
Energetic electrons and intense bremsstrahlung radiation from the 8 MeV Microtron are being utilized in a variety of collaborative research programs in radiation physics and allied sciences involving premier institutions of the country and sister universities of the region. The first electron accelerator of its kind in the country, set up at Mangalore University in collaboration with RRCAT Indore and BARC Mumbai, has been facilitating researchers since its inception with its inherent simplicity, ease of construction, low cost, and excellent beam quality. This paper presents a bird's-eye view of the reliability aspects of the machine, the efforts behind its continuous operation, and important applications of the accelerator in the physical and biological sciences.
Status of the University of Rochester tandem upgrade
NASA Astrophysics Data System (ADS)
Cross, Clinton; Miller, Thomas
1986-05-01
The status of the University of Rochester tandem Van de Graaff accelerator upgrade is reviewed. The accelerator upgrade to 18 MV consists of extended tubes, shielded resistors, dead-section ion pumping, two rotating insulating power shaft systems to provide power to the dead sections and terminal, and a pelletron charging system to replace the charging belt. Control of many of the accelerator operating systems will be done by two IBM personal computers. The negative ion injector diffusion pump, isolation transformer, preacceleration high-voltage power supply, and high-voltage corona enclosure will all be replaced. Finally, the SF6 gas handling system will be improved with the addition of a second set of gas dryers and a larger recirculating pump.
Tsallis holographic dark energy
NASA Astrophysics Data System (ADS)
Tavayef, M.; Sheykhi, A.; Bamba, Kazuharu; Moradpour, H.
2018-06-01
Employing the modified entropy-area relation suggested by Tsallis and Cirto [1], and the holographic hypothesis, a new holographic dark energy (HDE) model is proposed. Considering a flat Friedmann-Robertson-Walker (FRW) universe in which there is no interaction between the cosmos sectors, the cosmic implications of the proposed HDE are investigated. Interestingly enough, we find that the identification of IR-cutoff with the Hubble radius, can lead to the late time accelerated Universe even in the absence of interaction between two dark sectors of the Universe. This is in contrast to the standard HDE model with Hubble cutoff, which does not imply the accelerated expansion, unless the interaction is taken into account.
Accelerated testing for studying pavement design and performance (FY 2004) : research summary.
DOT National Transportation Integrated Search
2009-03-01
The thirteenth full-scale Accelerated Pavement Test (APT) experiment at the Civil Infrastructure Laboratory (CISL) of Kansas State University aimed to determine the response and the failure mode of thin concrete overlays.
Testing Einstein's Gravity on Large Scales
NASA Technical Reports Server (NTRS)
Prescod-Weinstein, Chandra
2011-01-01
A little over a decade has passed since two teams studying high redshift Type Ia supernovae announced the discovery that the expansion of the universe was accelerating. After all this time, we're still not sure how cosmic acceleration fits into the theory that tells us about the large-scale universe: General Relativity (GR). As part of our search for answers, we have been forced to question GR itself. But how will we test our ideas? We are fortunate enough to be entering the era of precision cosmology, where the standard model of gravity can be subjected to more rigorous testing. Various techniques will be employed over the next decade or two in the effort to better understand cosmic acceleration and the theory behind it. In this talk, I will describe cosmic acceleration, current proposals to explain it, and weak gravitational lensing, an observational effect that allows us to do the necessary precision cosmology.
Friedman, Lee; Rigas, Ioannis; Abdulin, Evgeny; Komogortsev, Oleg V
2018-05-15
Nyström and Holmqvist have published a method for the classification of eye movements during reading (ONH) (Nyström & Holmqvist, 2010). When we applied this algorithm to our data, the results were not satisfactory, so we modified the algorithm (now the MNH) to better classify our data. The changes included: (1) reducing the amount of signal filtering, (2) excluding a new type of noise, (3) removing several adaptive thresholds and replacing them with fixed thresholds, (4) changing the way that the start and end of each saccade was determined, (5) employing a new algorithm for detecting post-saccadic oscillations (PSOs), and (6) allowing a fixation period to either begin or end with noise. A new method for the evaluation of classification algorithms is presented. It was designed to provide comprehensive feedback to an algorithm developer, in a time-efficient manner, about the types and numbers of classification errors that an algorithm produces. This evaluation was conducted by three expert raters independently, across 20 randomly chosen recordings, each classified by both algorithms. The MNH made many fewer errors in determining when saccades start and end, and it also detected some fixations and saccades that the ONH did not. The MNH fails to detect very small saccades. We also evaluated two additional algorithms: the EyeLink Parser and a more current, machine-learning-based algorithm. The EyeLink Parser tended to find more saccades that ended too early than did the other methods, and we found numerous problems with the output of the machine-learning-based algorithm.
Replacing Fortran Namelists with JSON
NASA Astrophysics Data System (ADS)
Robinson, T. E., Jr.
2017-12-01
Maintaining a log of input parameters for a climate model is very important to understanding potential causes for answer changes during the development stages. Additionally, since modern Fortran is now interoperable with C, a more modern approach to software infrastructure to include code written in C is necessary. Merging these two separate facets of climate modeling requires a quality control for monitoring changes to input parameters and model defaults that can work with both Fortran and C. JSON will soon replace namelists as the preferred key/value pair input in the GFDL model. By adding a JSON parser written in C into the model, the input can be used by all functions and subroutines in the model, errors can be handled by the model instead of by the internal namelist parser, and the values can be output into a single file that is easily parsable by readily available tools. Input JSON files can handle all of the functionality of a namelist while being portable between C and Fortran. Fortran wrappers using unlimited polymorphism are crucial to allow for simple and compact code which avoids the need for many subroutines contained in an interface. Errors can be handled with more detail by providing information about location of syntax errors or typos. The output JSON provides a ground truth for values that the model actually uses by providing not only the values loaded through the input JSON, but also any default values that were not included. This kind of quality control on model input is crucial for maintaining reproducibility and understanding any answer changes resulting from changes in the input.
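The defaults-merging behavior described above (recording in a single output file every value the model actually uses, whether user-supplied or defaulted) can be sketched briefly. This is an illustrative Python sketch with invented parameter names, not the GFDL model's C parser:

```python
import json

# Hypothetical model defaults; real GFDL parameter names will differ.
DEFAULTS = {"dt_atmos": 1800, "use_hydrostatic": True, "n_layers": 32}

def load_model_input(text, defaults=DEFAULTS):
    """Parse a JSON 'namelist' and fill in any defaults the user omitted."""
    user = json.loads(text)  # syntax errors surface here, with position info
    merged = dict(defaults)
    merged.update(user)
    return merged

# The merged result is the "ground truth" the abstract describes: it records
# both the user-supplied values and the defaults that were silently applied.
input_json = '{"dt_atmos": 900, "n_layers": 64}'
actual = load_model_input(input_json)
print(json.dumps(actual, sort_keys=True))
```

Unlike a Fortran namelist, the same file can be read unchanged by C and Fortran components, and a malformed file is rejected by the JSON parser with a specific error rather than by the compiler runtime's namelist machinery.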
Educating and Training Accelerator Scientists and Technologists for Tomorrow
NASA Astrophysics Data System (ADS)
Barletta, William; Chattopadhyay, Swapan; Seryi, Andrei
2012-01-01
Accelerator science and technology is inherently an integrative discipline that combines aspects of physics, computational science, electrical and mechanical engineering. As few universities offer full academic programs, the education of accelerator physicists and engineers for the future has primarily relied on a combination of on-the-job training supplemented with intensive courses at regional accelerator schools. This article describes the approaches being used to satisfy the educational curiosity of a growing number of interested physicists and engineers.
Application of particle accelerators in research.
Mazzitelli, Giovanni
2011-07-01
Since the beginning of the past century, accelerators have played a fundamental role as powerful tools to discover the world around us and how the universe has evolved since the big bang, and to develop fundamental instruments for everyday life. Although more than 15,000 accelerators are operating around the world, only a very few of them are dedicated to fundamental research. An overview of the present status and prospects of high energy physics (HEP) accelerators is presented.
Educating and Training Accelerator Scientists and Technologists for Tomorrow
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barletta, William A.; Chattopadhyay, Swapan; Seryi, Andrei
2012-07-01
Accelerator science and technology is inherently an integrative discipline that combines aspects of physics, computational science, electrical and mechanical engineering. As few universities offer full academic programs, the education of accelerator physicists and engineers for the future has primarily relied on a combination of on-the-job training supplemented with intense courses at regional accelerator schools. This paper describes the approaches being used to satisfy the educational interests of a growing number of interested physicists and engineers.
NASA Astrophysics Data System (ADS)
Dossett, Jason Nicholas
Since its discovery more than a decade ago, the problem of cosmic acceleration has become one of the largest open problems in cosmology and in physics as a whole. An unknown dark energy component of the universe is often invoked to explain this observation. Mathematically, this works because inserting a cosmic fluid with a negative equation of state into Einstein's equations produces an accelerated expansion. There are, however, alternative explanations for the observed cosmic acceleration. Perhaps the most promising of the alternatives is that, on the very largest cosmological scales, general relativity needs to be extended or a new, modified gravity theory must be used. Indeed, many modified gravity models are not only able to replicate the observed accelerated expansion without dark energy, but are also more compatible with a unified theory of physics. Thus it is the goal of this dissertation to develop and study robust tests that will be able to distinguish between these alternative theories of gravity and the need for a dark energy component of the universe. We will study multiple approaches using the growth history of large-scale structure in the universe as a way to accomplish this task. These approaches include studying what is known as the growth index parameter, a parameter that describes the logarithmic growth rate of structure in the universe, which in turn describes the rate of formation of clusters and superclusters of galaxies over the entire age of the universe. We will explore the effectiveness of this parameter to distinguish between general relativity and modifications to gravity physics given realistic expectations of results from future experiments. Next, we will explore the modified growth formalism wherein deviations from the growth expected in general relativity are parameterized via changes to the growth equations, i.e. the perturbed Einstein's equations. We will also explore the impact of spatial curvature on these tests.
Finally, we will study how dark energy with some unusual properties will affect the conclusiveness of these tests.
Ghosts in the self-accelerating brane universe
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koyama, Kazuya; Institute of Cosmology and Gravitation, Portsmouth University, Portsmouth, PO1 2EG
2005-12-15
We study the spectrum of gravitational perturbations about a vacuum de Sitter brane with the induced 4D Einstein-Hilbert term, in a 5D Minkowski spacetime (DGP model). We consider solutions that include a self-accelerating universe, where the accelerating expansion of the universe is realized without introducing a cosmological constant on the brane. The mass of the discrete mode for the spin-2 graviton is calculated for various Hr_c, where H is the Hubble parameter and r_c is the crossover scale determined by the ratio between the 5D Newton constant and the 4D Newton constant. We show that, if we introduce a positive cosmological constant on the brane (Hr_c > 1), the spin-2 graviton has mass in the range 0
Introduction to Particle Acceleration in the Cosmos
NASA Technical Reports Server (NTRS)
Gallagher, D. L.; Horwitz, J. L.; Perez, J.; Quenby, J.
2005-01-01
Accelerated charged particles have been used on Earth since 1930 to explore the very essence of matter, for industrial applications, and for medical treatments. Throughout the universe nature employs a dizzying array of acceleration processes to produce particles spanning twenty orders of magnitude in energy range, while shaping our cosmic environment. Here, we introduce and review the basic physical processes causing particle acceleration, in astrophysical plasmas from geospace to the outer reaches of the cosmos. These processes are chiefly divided into four categories: adiabatic and other forms of non-stochastic acceleration, magnetic energy storage and stochastic acceleration, shock acceleration, and plasma wave and turbulent acceleration. The purpose of this introduction is to set the stage and context for the individual papers comprising this monograph.
Accelerated Schools Centers: How To Address Challenges to Institutionalization and Growth.
ERIC Educational Resources Information Center
Meza, James, Jr.
The Accelerated Schools Project (ASP) at the University of New Orleans (UNO) was established in spring 1990, funded by a 3-year grant from Chevron. Beginning with 1 pilot school in 1991, the UNO Accelerated Schools Center has expanded to 36 schools representing 19 school districts in Louisiana and 3 schools from the Memphis City Schools district.…
Radioactivities in returned lunar materials and in meteorites
NASA Technical Reports Server (NTRS)
Fireman, E. L.
1986-01-01
A preliminary C-14 study on lunar soil was carried out with the University of Toronto Iso Trace accelerator mass spectrometer. This accelerator was recommended for C-14 work by Dr. R. Schneider of A.S. and E., who was the field engineer during the assemblage and start-up operation of the accelerator. After the preliminary study using CO2 from 10084,937 soil, which had previously been counted with low-level mini-proportional counters, it became clear that the Toronto accelerator could carry out C-14/C-13/C-12 ratio measurements on 1 gram meteorite and lunar samples and that the C-14 measurements are done with higher precision and better reliability than elsewhere. A collaborative program was established with the University of Toronto Iso Trace accelerator group, which is expected to be scientifically fruitful. Arrangements have been made for Dr. R.P. Beukens of the Toronto Accelerator Group to extract the carbon compounds from Antarctic meteorite and lunar samples and to convert the compounds to CO2. During the past two years, a uranium-series dating method was developed for polar ice; this method is being applied to ice from the Allan Hills site, Byrd core, and the Beardmore Glacier.
NASA Astrophysics Data System (ADS)
Ochoa, Rosibel; DeLong, Hal; Kenyon, Jessica; Wilson, Eli
2011-06-01
The von Liebig Center for Entrepreneurism and Technology Advancement at UC San Diego (vonliebig.ucsd.edu) is focused on accelerating technology transfer and commercialization through programs and education on entrepreneurism. Technology Acceleration Projects (TAPs), which offer pre-venture grants and extensive mentoring on technology commercialization, are a key component of its model, which has been developed over the past ten years with the support of a grant from the von Liebig Foundation. In 2010, the von Liebig Entrepreneurism Center partnered with the U.S. Army Telemedicine and Advanced Technology Research Center (TATRC) to develop a regional model of the Technology Acceleration Program, initially focused on military research, to be deployed across the nation to increase awareness of military medical needs and to accelerate the commercialization of novel technologies to treat the patient. Participants in these challenges are multi-disciplinary teams of graduate students and faculty in engineering, medicine, and business representing universities and research institutes in a region, selected via a competitive process, who receive commercialization assistance and funding grants to support translation of their research discoveries into products or services. To validate this model, a pilot program focused on commercialization of wireless healthcare technologies targeting campuses in Southern California has been conducted with the additional support of Qualcomm, Inc. Three projects representing three different universities in Southern California were selected out of forty-five applications from ten different universities and research institutes. Over the next twelve months, these teams will conduct proof-of-concept studies, technology development, and preliminary market research to determine the commercial feasibility of their technologies.
This first regional program will help build the needed tools and processes to adapt and replicate this model across other regions in the Country.
2010-10-01
A Publication of the Defense Acquisition University, http://www.dau.mil. Keywords: Acceleration Test... Generally speaking, medical devices are designed to... However, because the devices are designed for a controlled environment, concerns they may adversely affect the operation of aircraft systems must be
Dynamical analysis of tachyonic chameleon
NASA Astrophysics Data System (ADS)
Banijamali, Ali; Solbi, Milad
2017-08-01
In the present paper we investigate the tachyonic chameleon scalar field and present the phase space analysis for four different combinations of the tachyonic potential V(φ) and the coupling function f(φ) of the chameleon field with matter. We find stable solutions in which accelerated expansion of the universe is realized. In one case, where both f(φ) and V(φ) are exponential, a scaling attractor was found that can give rise to the late-time acceleration of the universe and alleviate the coincidence problem.
Optimal time travel in the Gödel universe
NASA Astrophysics Data System (ADS)
Natário, José
2012-04-01
Using the theory of optimal rocket trajectories in general relativity, recently developed in Henriques and Natário (2011), we present a candidate for the minimum total integrated acceleration closed timelike curve in the Gödel universe, and give evidence for its minimality. The total integrated acceleration of this curve is lower than Malament's conjectured value (Malament 1984), as was already implicit in the work of Manchak (Gen. Relativ. Gravit. 51-60, 2011); however, Malament's conjecture does seem to hold for periodic closed timelike curves.
Education in a rapidly advancing technology: Accelerators and beams
NASA Astrophysics Data System (ADS)
Month, Mel
2000-06-01
The field of accelerators and beams (A&B) is one of today's fast changing technologies. Because university faculties have not been able to keep pace with the associated advancing knowledge, universities have not been able to play their traditional role of educating the scientists and engineers needed to sustain this technology for use in science, industry, commerce, and defense. This problem for A&B is described and addressed. The solution proposed, a type of "distance" education, is the U.S. Particle Accelerator School (USPAS) created in the early 1980s. USPAS provides the universities with a means of serving the education needs of the institutions using A&B, primarily but not exclusively the national laboratories. The field of A&B is briefly summarized. The need for education outside the university framework, the raison d'être for USPAS, the USPAS method, program structure, and curriculum, and particular USPAS-university connections are explained. The management of USPAS is analyzed, including its unique administrative structure, its institutional ties, and its operations, finance, marketing, and governmental relations. USPAS performance over the years is documented and a business assessment is made. Finally, there is a brief discussion of the future potential for this type of educational program, including possible extrapolation to new areas and/or different environments, in particular, its extra-government potential and its international possibilities.
Seventy Five Years of Particle Accelerators (LBNL Summer Lecture Series)
Sessler, Andy [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)
2017-12-09
Summer Lecture Series 2006: Andy Sessler, Berkeley Lab director from 1973 to 1980, sheds light on the Lab's nearly eight-decade history of inventing and refining particle accelerators, which continue to illuminate the nature of the universe.
Seventy Five Years of Particle Accelerators
Sessler, Andy
2017-12-09
Andy Sessler, Berkeley Lab director from 1973 to 1980, sheds light on the Lab's nearly eight-decade history of inventing and refining particle accelerators, which continue to illuminate the nature of the universe. His talk was presented July 26, 2006.
The Mysterious Universe - Exploring Our World with Particle Accelerators
Brau, James E [University of Oregon
2018-04-24
The universe is dark and mysterious, more so than even Einstein imagined. While modern science has established deep understanding of ordinary matter, unidentified elements ("Dark Matter" and "Dark Energy") dominate the structure of the universe, its behavior and its destiny. What are these curious elements? We are now working on answers to these and other challenging questions posed by the universe with experiments at particle accelerators on Earth. Results of this research may revolutionize our view of nature as dramatically as the advances of Einstein and other quantum pioneers one hundred years ago. Professor Brau will explain for the general audience the mysteries, introduce facilities which explore them experimentally and discuss our current understanding of the underlying science. The presentation is at an introductory level, appropriate for anyone interested in physics and astronomy.
ERIC Educational Resources Information Center
Wadenya, Rose O.; Schwartz, Susan; Lopez, Naty; Fonseca, Raymond
2003-01-01
Describes the university's focus on leadership, financial support, institutional commitment, and creation of an inclusive environment for minority students; an accelerated program leading to combined bachelor's and dental degrees, which includes agreements with Xavier University and Hampton University; and peer mentorship and minority mentorship…
Self-accelerating universe in scalar-tensor theories after GW170817
NASA Astrophysics Data System (ADS)
Crisostomi, Marco; Koyama, Kazuya
2018-04-01
The recent simultaneous detection of gravitational waves and a gamma-ray burst from a neutron star merger significantly shrank the space of viable scalar-tensor theories by demanding that the speed of gravity be equal to that of light. The surviving theories belong to the class of degenerate higher-order scalar-tensor theories. We study whether these theories are suitable as dark energy candidates. We find scaling solutions in the matter-dominated universe that lead to de Sitter solutions at late times without the cosmological constant, realizing self-acceleration. We evaluate quasistatic perturbations around self-accelerating solutions and show that the stringent constraints coming from astrophysical objects and gravitational waves can be satisfied, leaving interesting possibilities to test these theories with cosmological observations.
Projectile Combustion Effects on Ram Accelerator Performance
NASA Astrophysics Data System (ADS)
Chitale, Saarth Anjali
University of Washington. Chair of the Supervisory Committee: Prof. Carl Knowlen, William E. Boeing Department of Aeronautics and Astronautics. The ram accelerator facility at the University of Washington is used to propel projectiles at supersonic velocities. This concept is similar to an air-breathing ramjet engine in that sub-caliber projectiles, shaped like the ramjet engine center-body, are shot through smooth-bore steel-walled tubes having an internal diameter of 38 mm. The ram accelerator propulsive cycles operate between Mach 2 and 10 and have the potential to accelerate projectiles to velocities greater than 8 km/s. The theoretical thrust versus Mach number characteristics can be obtained using the knowledge of gas dynamics and thermodynamics that goes into the design of the ram accelerator. The corresponding velocity versus distance profiles obtained from test runs at the University of Washington, however, are often not consistent with the theoretical predictions after the projectiles reach in-tube Mach numbers greater than 4. The experimental velocities are typically greater than the theoretical predictions, which has led to the proposition that the combustion process may be moving up onto the projectile. An alternative explanation for higher-than-predicted thrust, which is explored here, is that the performance differences can be attributed to ablation of the projectile body, which results in molten metal being added to the flow of the gaseous combustible mixture around the projectile. This molten metal is assumed to mix uniformly and react with the gaseous propellant, thereby enhancing the propellant energy release and altering the predicted thrust-Mach characteristics. This theory predicts at what Mach number the projectile will first experience enhanced thrust and the corresponding velocity-distance profile.
Preliminary results are in good agreement with projectiles operating in methane/oxygen/nitrogen propellants. Effects of projectile surface-to-volume ratio are also explored by applying the model to experimental results from smaller (Tohoku University, 25-mm-bore) and larger (Institute of Saint-Louis, 90-mm-bore) ram accelerators. Due to their lower surface-to-volume ratio, large-diameter projectiles are predicted to need to reach higher Mach numbers than smaller-diameter projectiles before experiencing thrust enhancement due to metal ablation and burning. This proposition was supported by published experimental data. The theoretical modeling of projectile ablation, metal combustion, and subsequent ram accelerator thrust characteristics is presented along with comparisons to experiments from three different-sized ram accelerator facilities.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-12
... DEPARTMENT OF COMMERCE International Trade Administration National Superconducting Cyclotron Laboratory of Michigan State University; Notice of Decision on Applications for Duty-Free Entry of Scientific... Cyclotron Laboratory of Michigan State University. Instrument: Radio Frequency Quadrupole Accelerator (RFQ...
The entangled accelerating universe
NASA Astrophysics Data System (ADS)
González-Díaz, Pedro F.; Robles-Pérez, Salvador
2009-08-01
Using the known result that the nucleation of baby universes in correlated pairs is equivalent to spacetime squeezing, we show in this Letter that there exists a T-duality symmetry between two-dimensional warp drives, which are physically expressible as localized de Sitter little universes, and two-dimensional Tolman-Hawking and Giddings-Strominger baby universes respectively correlated in pairs, so that the creation of warp drives is also equivalent to spacetime squeezing. Perhaps more importantly, it has also been seen that the nucleation of warp drives entails a violation of Bell's inequalities, and hence the phenomena of quantum entanglement, complementarity and wave function collapse. These results are generalized to the case of any dynamically accelerating universe filled with dark or phantom energy, whose creation is also physically equivalent to spacetime squeezing and to the violation of Bell's inequalities, so that the universe we are living in should be governed by essentially sharp quantum theory laws and must be a quantum entangled system.
The Scanning Electron Microscope As An Accelerator For The Undergraduate Advanced Physics Laboratory
NASA Astrophysics Data System (ADS)
Peterson, Randolph S.; Berggren, Karl K.; Mondol, Mark
2011-06-01
Few universities or colleges have an accelerator for use with advanced physics laboratories, but many of these institutions have a scanning electron microscope (SEM) on site, often in the biology department. As an accelerator for the undergraduate advanced physics laboratory, the SEM is an excellent substitute for an ion accelerator. Although there are no nuclear physics experiments that can be performed with a typical 30 kV SEM, there is an opportunity for experimental work on accelerator physics, atomic physics, electron-solid interactions, and the basics of modern e-beam lithography.
ERIC Educational Resources Information Center
Colangelo, Nicholas, Ed.; Assouline, Susan G., Ed.; Gross, Miraca U. M., Ed.
2004-01-01
With support from the John Templeton Foundation, the editors held a Summit on Acceleration at The University of Iowa in May 2003. They invited distinguished scholars and educators from around the country to help them formulate a national report on acceleration. Together, they deliberated about what schools need to know in order to make the best…
Neuromuscular Control of Rapid Linear Accelerations in Fish
2016-06-22
Approved for Public Release; Distribution Unlimited. Final Report: Neuromuscular Control of Rapid Linear Accelerations in Fish (Tufts University). In this project, we measured muscle activity, body movements, and flow patterns during linear
ERIC Educational Resources Information Center
Johnson-Campbell, Tanisha
2018-01-01
This study is an investigation into a 15-month accelerated undergraduate nursing program and the minority student experience. Using a mixed methods approach, this research addressed the following questions: 1. What was the retention rate for students enrolled in the accelerated nursing bachelor's program and how did that differ by race? 2. What…
Accelerating dark energy cosmological model in two fluids with hybrid scale factor
NASA Astrophysics Data System (ADS)
Mishra, B.; Sahoo, P. K.; Ray, Pratik P.
In this paper, we have investigated the anisotropic behavior of the accelerating universe in Bianchi V spacetime in the framework of General Relativity (GR). The matter field we have considered consists of two non-interacting fluids, i.e. the usual string fluid and a dark energy (DE) fluid. In order to represent the pressure anisotropy, skewness parameters are introduced along three different spatial directions. To achieve physically realistic solutions to the field equations, we have considered a scale factor, known as the hybrid scale factor, which is generated by a time-varying deceleration parameter. This simulates a cosmic transition from early deceleration to late-time acceleration. It is observed that the string fluid dominates the universe at the early deceleration phase but does not substantially affect the cosmic dynamics at the late phase, whereas the DE fluid dominates the universe at present, in accordance with observational results. Hence, we have analyzed the role of the two fluids in the transitional phases of the universe with respect to time, which depicts the reason behind the cosmic expansion and DE. The role of DE with a variable equation of state (EoS) parameter and skewness parameters is also discussed along with the physical and geometrical properties.
pymzML--Python module for high-throughput bioinformatics on mass spectrometry data.
Bald, Till; Barth, Johannes; Niehues, Anna; Specht, Michael; Hippler, Michael; Fufezan, Christian
2012-04-01
pymzML is an extension to Python that offers (i) easy access to mass spectrometry (MS) data that allows the rapid development of tools, (ii) a very fast parser for mzML data, the standard data format in MS, and (iii) a set of functions to compare or handle spectra. pymzML requires Python 2.6.5+ and is fully compatible with Python 3. The module is freely available at http://pymzml.github.com or PyPI, is published under the LGPL license and requires no additional modules to be installed. Contact: christian@fufezan.net.
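As a sketch of the kind of spectrum access such a parser provides, the following standard-library-only Python reads a simplified, hypothetical mzML-like document. The element names, attributes, and peak encoding here are illustrative assumptions, not the real mzML schema or pymzML's API.

```python
# Minimal sketch of spectrum iteration over a simplified, hypothetical
# mzML-like document. NOT pymzML's API; stdlib only.
import xml.etree.ElementTree as ET

DOC = """
<run>
  <spectrum id="scan=1" msLevel="1">
    <peaks>100.0:12.5 200.1:3.0</peaks>
  </spectrum>
  <spectrum id="scan=2" msLevel="2">
    <peaks>55.2:9.9</peaks>
  </spectrum>
</run>
"""

def read_spectra(xml_text):
    """Yield (id, ms_level, [(mz, intensity), ...]) per spectrum."""
    root = ET.fromstring(xml_text)
    for spec in root.iter("spectrum"):
        peaks = []
        for pair in spec.findtext("peaks", "").split():
            mz, intensity = pair.split(":")
            peaks.append((float(mz), float(intensity)))
        yield spec.get("id"), int(spec.get("msLevel")), peaks

for sid, level, peaks in read_spectra(DOC):
    print(sid, level, len(peaks))
```

A real mzML file base64-encodes its peak arrays and carries controlled-vocabulary metadata, which is exactly the complexity a dedicated parser like pymzML hides behind an interface of this shape.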
KEGGParser: parsing and editing KEGG pathway maps in Matlab.
Arakelyan, Arsen; Nersisyan, Lilit
2013-02-15
The KEGG pathway database is a collection of manually drawn pathway maps accompanied by KGML format files intended for use in automatic analysis. KGML files, however, do not contain the information required for complete reproduction of all the events indicated in the static image of a pathway map. Several parsers and editors of KEGG pathways exist for processing KGML files. We introduce KEGGParser, a MATLAB-based tool for KEGG pathway parsing, semiautomatic fixing, editing, visualization and analysis in the MATLAB environment. It also works with Scilab. The source code is available at http://www.mathworks.com/matlabcentral/fileexchange/37561.
Using a CLIPS expert system to automatically manage TCP/IP networks and their components
NASA Technical Reports Server (NTRS)
Faul, Ben M.
1991-01-01
An expert system that can directly manage network components on a Transmission Control Protocol/Internet Protocol (TCP/IP) network is described. Previous expert systems for managing networks have focused on managing network faults after they occur. This proactive expert system, however, can monitor and control network components in near real time. The ability to directly manage network elements from the C Language Integrated Production System (CLIPS) is accomplished by the integration of the Simple Network Management Protocol (SNMP) and an Abstract Syntax Notation (ASN) parser into the CLIPS artificial intelligence language.
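The core of a CLIPS-style system is a forward-chaining match-fire cycle over asserted facts. A minimal Python sketch of that cycle is below; the fact names, rules, and thresholds are made up for illustration (a real deployment would assert facts from SNMP polls, as the abstract describes).

```python
# Hedged sketch of the forward-chaining match-fire cycle a CLIPS-style
# expert system runs over network facts. Facts and rules here are
# illustrative, not taken from the paper.

def run_rules(facts, rules):
    """Fire every rule whose condition set is satisfied, adding its
    conclusion fact, until no new facts appear (a fixed point)."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for condition, conclusion in rules:
            if condition <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

# Illustrative rules: (set of required facts, derived fact)
RULES = [
    ({"iface-down(eth0)"}, "alert(eth0)"),
    ({"alert(eth0)", "redundant-path(eth0)"}, "reroute(eth0)"),
]

result = run_rules({"iface-down(eth0)", "redundant-path(eth0)"}, RULES)
print(sorted(result))
```

Chaining the second rule off the first's conclusion is what makes the system proactive: a raw SNMP observation can trigger a control action several inference steps away.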
Applications of the Strategic Defense Initiative's compact accelerators
NASA Technical Reports Server (NTRS)
Montanarelli, Nick; Lynch, Ted
1991-01-01
The Strategic Defense Initiative's (SDI) investment in particle accelerator technology for its directed energy weapons program has produced breakthroughs in the size and power of new accelerators. These accelerators, in turn, have produced spinoffs in several areas: the radio frequency quadrupole linear accelerator (RFQ linac) was recently incorporated into the design of a cancer therapy unit at the Loma Linda University Medical Center, an SDI-sponsored compact induction linear accelerator may replace Cobalt-60 radiation and hazardous ethylene-oxide as a method for sterilizing medical products, and other SDIO-funded accelerators may be used to produce the radioactive isotopes oxygen-15, nitrogen-13, carbon-11, and fluorine-18 for positron emission tomography (PET). Other applications of these accelerators include bomb detection, non-destructive inspection, decomposing toxic substances in contaminated ground water, and eliminating nuclear waste.
Use of General-purpose Negation Detection to Augment Concept Indexing of Medical Documents
Mutalik, Pradeep G.; Deshpande, Aniruddha; Nadkarni, Prakash M.
2001-01-01
Objectives: To test the hypothesis that most instances of negated concepts in dictated medical documents can be detected by a strategy that relies on tools developed for the parsing of formal (computer) languages—specifically, a lexical scanner (“lexer”) that uses regular expressions to generate a finite state machine, and a parser that relies on a restricted subset of context-free grammars, known as LALR(1) grammars. Methods: A diverse training set of 40 medical documents from a variety of specialties was manually inspected and used to develop a program (Negfinder) that contained rules to recognize a large set of negated patterns occurring in the text. Negfinder's lexer and parser were developed using tools normally used to generate programming language compilers. The input to Negfinder consisted of medical narrative that was preprocessed to recognize UMLS concepts: the text of a recognized concept had been replaced with a coded representation that included its UMLS concept ID. The program generated an index with one entry per instance of a concept in the document, where the presence or absence of negation of that concept was recorded. This information was used to mark up the text of each document by color-coding it to make it easier to inspect. The parser was then evaluated in two ways: 1) a test set of 60 documents (30 discharge summaries, 30 surgical notes) marked-up by Negfinder was inspected visually to quantify false-positive and false-negative results; and 2) a different test set of 10 documents was independently examined for negatives by a human observer and by Negfinder, and the results were compared. Results: In the first evaluation using marked-up documents, 8,358 instances of UMLS concepts were detected in the 60 documents, of which 544 were negations detected by the program and verified by human observation (true-positive results, or TPs). 
Thirteen instances were wrongly flagged as negated (false-positive results, or FPs), and the program missed 27 instances of negation (false-negative results, or FNs), yielding a sensitivity of 95.3 percent and a specificity of 97.7 percent. In the second evaluation using independent negation detection, 1,869 concepts were detected in 10 documents, with 135 TPs, 12 FPs, and 6 FNs, yielding a sensitivity of 95.7 percent and a specificity of 91.8 percent. One of the words “no,” “denies/denied,” “not,” or “without” was present in 92.5 percent of all negations. Conclusions: Negation of most concepts in medical narrative can be reliably detected by a simple strategy. The reliability of detection depends on several factors, the most important being the accuracy of concept matching. PMID:11687566
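A toy version of trigger-based negation detection can be sketched in a few lines. Negfinder itself uses a lexer and LALR(1) parser over UMLS-coded text; the sketch below only checks whether a concept falls within a small token window after one of the four triggers the paper found in 92.5% of negations. The window size is an illustrative assumption, not a parameter from the paper.

```python
# Toy sketch of trigger-based negation detection in the spirit of
# Negfinder. WINDOW is an illustrative choice, not from the paper.
import re

TRIGGERS = {"no", "not", "denies", "denied", "without"}
WINDOW = 3  # tokens after a trigger considered under negation

def negated_concepts(sentence, concepts):
    """Return the subset of `concepts` inside a trigger's scope."""
    tokens = re.findall(r"[a-z']+", sentence.lower())
    negated = set()
    for i, tok in enumerate(tokens):
        if tok in TRIGGERS:
            scope = tokens[i + 1 : i + 1 + WINDOW]
            negated.update(c for c in concepts if c.lower() in scope)
    return negated

print(negated_concepts("Patient denies fever and reports cough.",
                       ["fever", "cough"]))  # -> {'fever'}
```

The fixed window is exactly where such a naive scheme fails and a grammar pays off: the parser can end a negation's scope at a conjunction like "and reports", while the window can only approximate it.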
The Layered Structure of The Universe
NASA Astrophysics Data System (ADS)
Kursunoglu, Behram N.
2003-06-01
It has now become a habit for cosmologists to introduce attraction- or repulsion-generating substances to describe the observed cosmological behavior of matter. Examples are dark energy, which provides a repulsive force to cause the increasing acceleration accompanying the expansion of the universe, and quintessence, which also provides a repulsive force. In this paper we argue that what is needed in the final analysis is both attraction and repulsion. We show here that the universe can be conceived to consist of attractive and repulsive layers of matter expanding with increasing acceleration. The generalized theory of gravitation as developed originally by Einstein and Schrödinger as a non-symmetric theory was modified by this author using the Bianchi-Einstein identities, yielding coupling between the field and electric charge as well as between the field and magnetic charge, and there appears a fundamental length parameter r₀, where quintessence constitutes magnetic repulsive layers while dark energy and all the other kinds of names invented by cosmologists refer to attractive electric layers. This layered structure of the universe resembles the layered structure of the elementary particle predicted by this theory decades ago (1, 3, and 6). This implies a layered doughnut structure of the universe. We have, therefore, obtained a unification of the structure of the universe and the structure of elementary particles. Overall the forces consist of long-range attractive, long-range repulsive, short-range attractive, and short-range repulsive varieties. We further discovered the existence of space oscillations, whose role in the expansion of the universe with increasing acceleration, and further their impact on the propagation of gravitational waves, can be expected to play a part in their observation.
Cosmic acceleration in a dust only universe via energy-momentum powered gravity
NASA Astrophysics Data System (ADS)
Akarsu, Özgür; Katırcı, Nihan; Kumar, Suresh
2018-01-01
We propose a modified theory of gravitation constructed by the addition of the term f(TμνT^μν) to the Einstein-Hilbert action, and elaborate a particular case f(TμνT^μν) = α(TμνT^μν)^η, where α and η are real constants, dubbed energy-momentum powered gravity (EMPG). We search for viable cosmologies arising from EMPG, especially in the context of the late-time accelerated expansion of the Universe. We investigate the ranges of the EMPG parameters (α, η) on theoretical as well as observational grounds leading to the late-time acceleration of the Universe with pressureless matter only, while keeping the successes of standard general relativity at early times. We find that η = 0 corresponds to the ΛCDM model, whereas η ≠ 0 leads to a wCDM-type model. However, the underlying physics of the EMPG model is entirely different in the sense that the energy in the EMPG Universe is sourced by pressureless matter only. Moreover, the energy of the pressureless matter is not conserved; namely, in general it does not dilute as ρ ∝ a^(-3) with the expansion of the Universe. Finally, we constrain the parameters of an EMPG-based cosmology with a recent compilation of 28 Hubble parameter measurements, and find that this model describes an evolution of the Universe similar to that in the ΛCDM model. We briefly discuss that EMPG can be unified with Starobinsky gravity to describe the complete history of the Universe including the inflationary era.
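Schematically, the EMPG action described in the abstract can be written as below; the overall sign and normalization conventions are assumptions for illustration, not taken from the paper.

```latex
S = \int \left[ \frac{1}{2\kappa}\, R
      + \alpha \left( T_{\mu\nu} T^{\mu\nu} \right)^{\eta}
      + \mathcal{L}_{m} \right] \sqrt{-g}\; \mathrm{d}^{4}x ,
```

where R is the Ricci scalar, Tμν the energy-momentum tensor of the pressureless matter described by Lm, and (α, η) the EMPG parameters; η = 0 reduces the new term to a constant, recovering a ΛCDM-like contribution.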
The Age of Precision Cosmology
NASA Technical Reports Server (NTRS)
Chuss, David T.
2012-01-01
In the past two decades, our understanding of the evolution and fate of the universe has increased dramatically. This "Age of Precision Cosmology" has been ushered in by measurements that have both elucidated the details of the Big Bang cosmology and set the direction for future lines of inquiry. Our universe appears to consist of 5% baryonic matter; 23% of the universe's energy content is dark matter which is responsible for the observed structure in the universe; and 72% of the energy density is so-called "dark energy" that is currently accelerating the expansion of the universe. In addition, our universe has been measured to be geometrically flat to 1%. These observations and related details of the Big Bang paradigm have hinted that the universe underwent an epoch of accelerated expansion known as "inflation" early in its history. In this talk, I will review the highlights of modern cosmology, focusing on the contributions made by measurements of the cosmic microwave background, the faint afterglow of the Big Bang. I will also describe new instruments designed to measure the polarization of the cosmic microwave background in order to search for evidence of cosmic inflation.
Southern California Regional Technology Acceleration Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ochoa, Rosibel; Rasochova, Lada
2014-09-30
UC San Diego and San Diego State University are partnering to address these deficiencies in the renewable energy space in the greater San Diego region, accelerating the movement of clean energy innovation from the university laboratory into the marketplace. The effort builds on the proven model of the William J. von Liebig Center's (vLC's) Proof of Concept (POC) program, virtualizing it to enable a more inclusive environment for energy innovation and to expand the number of clean energy start-ups and/or technology licenses in greater California.
Time arrow is influenced by the dark energy.
Allahverdyan, A E; Gurzadyan, V G
2016-05-01
The arrow of time and the accelerated expansion are two fundamental empirical facts of the universe. We advance the viewpoint that the dark energy (positive cosmological constant) accelerating the expansion of the universe also supports the time asymmetry. It is related to the decay of metastable states under generic perturbations, as we show using the example of a microcanonical ensemble. These states would not be metastable without dark energy. The latter also ensures a hyperbolic motion leading to dynamic entropy production with a rate determined by the cosmological constant.
Quantum computational complexity, Einstein's equations and accelerated expansion of the Universe
NASA Astrophysics Data System (ADS)
Ge, Xian-Hui; Wang, Bin
2018-02-01
We study the relation between quantum computational complexity and general relativity. The quantum computational complexity is proposed to be quantified by the shortest length of geodesic quantum curves. We examine the complexity/volume duality in a geodesic causal ball in the framework of Fermi normal coordinates and derive the full non-linear Einstein equation. Using insights from the complexity/action duality, we argue that the accelerated expansion of the universe could be driven by the quantum complexity and be free from the coincidence and fine-tuning problems.
Dark Energy and the Fate of the Universe
NASA Astrophysics Data System (ADS)
Linde, A.
2002-12-01
The present stage of acceleration of the universe may continue forever. However, we have found a broad class of theories of dark energy that lead to a global collapse of the universe 10-30 billion years from now. I will discuss the possibility to find our destiny using cosmological observations.
Assuring Quality in Online Offerings: Insights from a University's Faculty
ERIC Educational Resources Information Center
Budden, Connie B.; Budden, Heather L.; Hall, Michelle; Longman, Debbie G.
2015-01-01
As the growth of online education offered by universities accelerates and spreads, universities are increasingly grappling with concerns related to widespread availability and the maintenance of academic quality. The "Quality Matters at Southeastern" Program fosters quality through a peer review process and offers a certification process…
Exploring Entrepreneurial Activity at Cape Town and Stellenbosch Universities, South Africa
ERIC Educational Resources Information Center
Jafta, Rachel; Uctu, Ramazan
2013-01-01
Entrepreneurial activity at universities, especially spin-off formation, has emerged as an important mechanism for accelerating the transfer of technology and knowledge to commercial markets. With some exceptions, such as China, studies on university entrepreneurship have tended to concentrate on the experiences of developed countries. Perhaps…
ERIC Educational Resources Information Center
Davidson, Betty M.; Allen-Haynes, Leetta
Critical milestones in the university facilitation of meaningful school reform in schools serving at-risk students--schoolwide assessment, cadre-based planning, and pilot testing of new strategies--are examined in this paper. A training and facilitation mechanism developed by the University of New Orleans' (UNO) Louisiana Accelerated Schools…
Brane with variable tension as a possible solution to the problem of the late cosmic acceleration
NASA Astrophysics Data System (ADS)
García-Aspeitia, Miguel A.; Hernandez-Almada, A.; Magaña, Juan; Amante, Mario H.; Motta, V.; Martínez-Robles, C.
2018-05-01
Braneworld models have been proposed as a possible solution to the problem of the accelerated expansion of the Universe. The idea is to dispense with dark energy (DE) and drive the late-time cosmic acceleration with a five-dimensional geometry. We investigate a brane model with variable brane tension as a function of redshift, called a chrono-brane. We propose the polynomial function λ = (1+z)^n, inspired by tracker-scalar-field potentials. To constrain the exponent n we use the latest observational Hubble data from cosmic chronometers, Type Ia supernovae from the full joint-light-analysis sample, baryon acoustic oscillations and the posterior distance from the cosmic microwave background of Planck 2015 measurements. A joint analysis of these data estimates n ≃ 6.19 ± 0.12, which generates a DE-like term (cosmological-constant-like at late times) in the Friedmann equation arising from the extra dimensions. This model is consistent with these data and can drive the Universe to an accelerated phase at late times.
Education in a rapidly advancing technology: Accelerators and beams
DOE Office of Scientific and Technical Information (OSTI.GOV)
Month, Mel
2000-06-01
The field of accelerators and beams (A and B) is one of today's fast changing technologies. Because university faculties have not been able to keep pace with the associated advancing knowledge, universities have not been able to play their traditional role of educating the scientists and engineers needed to sustain this technology for use in science, industry, commerce, and defense. This problem for A and B is described and addressed. The solution proposed, a type of "distance" education, is the U.S. Particle Accelerator School (USPAS) created in the early 1980s. USPAS provides the universities with a means of serving the education needs of the institutions using A and B, primarily but not exclusively the national laboratories. The field of A and B is briefly summarized. The need for education outside the university framework, the raison d'être for USPAS, the USPAS method, program structure, and curriculum, and particular USPAS-university connections are explained. The management of USPAS is analyzed, including its unique administrative structure, its institutional ties, and its operations, finance, marketing, and governmental relations. USPAS performance over the years is documented and a business assessment is made. Finally, there is a brief discussion of the future potential for this type of educational program, including possible extrapolation to new areas and/or different environments, in particular, its extra-government potential and its international possibilities. (c) 2000 American Association of Physics Teachers.
The Origin Of Most Cosmic Rays: The Acceleration By E(parallel)
NASA Astrophysics Data System (ADS)
Colgate, Stirling A.; Li, H.
2008-03-01
We suggest a universal view of the origin of almost all cosmic rays. We propose that nearly every accelerated CR was initially part of the parallel current that maintains almost all force-free, twisted magnetic fields. We point out that the greatest fraction of the free energy of magnetic fields in the universe likely resides in force-free fields as opposed to force-bounded ones, because the velocity of twisting, the ponderomotive force, is small compared to the local Alfvén speed. We suggest that these helical fields and the particles that they accelerate are distributed nearly uniformly and consequently are nearly space-filling, with some notable exceptions. Charged particles are accelerated by E∥ (the electric field parallel to the magnetic field B) produced by the dissipation of the free energy of these fields through the progressive diffusive loss of "run-away" accelerated current-carrying charged particles from the "core" of the helical fields. Such diffusive loss is first identified as reconnection, but instead potentiates a much larger irreversible loss of highly accelerated anisotropic run-away current-carrier particles. We suggest, as in fusion confinement experiments, that there exists a universal, highly robust diffusion coefficient D of about 1% of Bohm diffusion, as has been found in all confinement experiments, possibly driven by drift waves and/or collisionless tearing modes. The consequential current-carrier loss along the resulting tangled field lines is sufficient to account for the energy, number and spectrum of nearly all CR acceleration, both galactic as well as extragalactic. The spectrum is determined by a loss fraction dn/n ∝ -dE/E, where dn ∝ D ∝ E^(-3/2), resulting in dn/dE ∝ (E/E0)^(-2.5) up to 10^22 eV. Only mass accretion onto SMBHs can supply the energy necessary, 10^60 ergs, to fill the IGM with a CR spectrum of Γ ≈ 2.6. (Supported by the DOE)
Can Accelerators Accelerate Learning?
NASA Astrophysics Data System (ADS)
Santos, A. C. F.; Fonseca, P.; Coelho, L. F. S.
2009-03-01
The 'Young Talented' education program developed by the Brazilian State Funding Agency (FAPERJ) [1] makes it possible for high-schools students from public high schools to perform activities in scientific laboratories. In the Atomic and Molecular Physics Laboratory at Federal University of Rio de Janeiro (UFRJ), the students are confronted with modern research tools like the 1.7 MV ion accelerator. Being a user-friendly machine, the accelerator is easily manageable by the students, who can perform simple hands-on activities, stimulating interest in physics, and getting the students close to modern laboratory techniques.
The Early Universe: Searching for Evidence of Cosmic Inflation
NASA Technical Reports Server (NTRS)
Chuss, David T.
2012-01-01
In the past two decades, our understanding of the evolution and fate of the universe has increased dramatically. This "Age of Precision Cosmology" has been ushered in by measurements that have both elucidated the details of the Big Bang cosmology and set the direction for future lines of inquiry. Our universe appears to consist of 5% baryonic matter; 23% of the universe's energy content is dark matter which is responsible for the observed structure in the universe; and 72% of the energy density is so-called "dark energy" that is currently accelerating the expansion of the universe. In addition, our universe has been measured to be geometrically flat to 1 %. These observations and related details of the Big Bang paradigm have hinted that the universe underwent an epoch of accelerated expansion known as "inflation" early in its history. In this talk, I will review the highlights of modern cosmology, focusing on the contributions made by measurements of the cosmic microwave background, the faint afterglow of the Big Bang. I will also describe new instruments designed to measure the polarization of the cosmic microwave background in order to search for evidence of cosmic inflation.
Almas, Muhammad Shoaib; Vanfretti, Luigi
2017-01-01
Synchrophasor measurements from Phasor Measurement Units (PMUs) are the primary sensors used to deploy Wide-Area Monitoring, Protection and Control (WAMPAC) systems. PMUs stream out synchrophasor measurements through the IEEE C37.118.2 protocol using TCP/IP or UDP/IP. The proposed method establishes a direct communication between two PMUs, thus eliminating the requirement of an intermediate phasor data concentrator, data mediator and/or protocol parser and thereby ensuring minimum communication latency, not considering communication link delays. This method allows utilizing synchrophasor measurements internally in a PMU to deploy custom protection and control algorithms. These algorithms are deployed using protection logic equations, which are supported by all PMU vendors. Moreover, this method reduces overall equipment cost, as the algorithms execute internally in a PMU and therefore do not require any additional controller for their deployment. The proposed method can be utilized for fast prototyping of wide-area-measurement-based protection and control applications. It is tested by coupling commercial PMUs as Hardware-in-the-Loop (HIL) with Opal-RT's eMEGAsim Real-Time Simulator (RTS). As an illustrative example, an anti-islanding protection application is deployed using the proposed method and its performance is assessed. The essential points of the method are:
• Intermediate phasor data concentrators and protocol parsers are bypassed, as the synchrophasors are communicated directly between the PMUs (minimizing communication delays).
• The wide-area protection and control algorithm is deployed using logic equations in the client PMU, eliminating the requirement for an external hardware controller (cost curtailment).
• It offers an effortless means to exploit PMU measurements in an environment familiar to protection engineers.
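The kind of protection logic the method deploys can be illustrated with a toy check on two synchrophasor streams: trip if the phase-angle difference between the local and remote PMU exceeds a threshold. The threshold and sample angles below are made up for illustration, not values from the paper, and a real PMU expresses this as vendor logic equations rather than Python.

```python
# Illustrative sketch of a PMU-to-PMU angle-difference protection check.
# Threshold and sample data are hypothetical.
THRESHOLD_DEG = 10.0

def angle_difference(local_deg, remote_deg):
    """Smallest signed difference between two phase angles, in degrees,
    wrapped into (-180, 180]."""
    return (local_deg - remote_deg + 180.0) % 360.0 - 180.0

def should_trip(local_deg, remote_deg, threshold=THRESHOLD_DEG):
    """Trip when the wrapped angle separation exceeds the threshold."""
    return abs(angle_difference(local_deg, remote_deg)) > threshold

print(should_trip(12.0, 350.0))  # 22 degrees apart
```

The wrap-around in angle_difference matters: 12° and 350° are only 22° apart, not 338°, and a naive subtraction would trip spuriously.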
Towards comprehensive syntactic and semantic annotations of the clinical narrative
Albright, Daniel; Lanfranchi, Arrick; Fredriksen, Anwen; Styler, William F; Warner, Colin; Hwang, Jena D; Choi, Jinho D; Dligach, Dmitriy; Nielsen, Rodney D; Martin, James; Ward, Wayne; Palmer, Martha; Savova, Guergana K
2013-01-01
Objective To create annotated clinical narratives with layers of syntactic and semantic labels to facilitate advances in clinical natural language processing (NLP). To develop NLP algorithms and open source components. Methods Manual annotation of a clinical narrative corpus of 127 606 tokens following the Treebank schema for syntactic information, PropBank schema for predicate-argument structures, and the Unified Medical Language System (UMLS) schema for semantic information. NLP components were developed. Results The final corpus consists of 13 091 sentences containing 1772 distinct predicate lemmas. Of the 766 newly created PropBank frames, 74 are verbs. There are 28 539 named entity (NE) annotations spread over 15 UMLS semantic groups, one UMLS semantic type, and the Person semantic category. The most frequent annotations belong to the UMLS semantic groups of Procedures (15.71%), Disorders (14.74%), Concepts and Ideas (15.10%), Anatomy (12.80%), Chemicals and Drugs (7.49%), and the UMLS semantic type of Sign or Symptom (12.46%). Inter-annotator agreement results: Treebank (0.926), PropBank (0.891–0.931), NE (0.697–0.750). The part-of-speech tagger, constituency parser, dependency parser, and semantic role labeler are built from the corpus and released open source. A significant limitation uncovered by this project is the need for the NLP community to develop a widely agreed-upon schema for the annotation of clinical concepts and their relations. Conclusions This project takes a foundational step towards bringing the field of clinical NLP up to par with NLP in the general domain. The corpus creation and NLP components provide a resource for research and application development that would have been previously impossible. PMID:23355458
Synonym set extraction from the biomedical literature by lexical pattern discovery.
McCrae, John; Collier, Nigel
2008-03-24
Although there are a large number of thesauri for the biomedical domain, many of them lack coverage of terms and their variant forms. Automatic thesaurus construction based on patterns was first suggested by Hearst [1], but it is still not clear how to automatically construct such patterns for different semantic relations and domains. In particular it is not certain which patterns are useful for capturing synonymy. The assumption of extant resources such as parsers is also a limiting factor for many languages, so it is desirable to find patterns that do not use syntactic analysis. Finally, to give a more consistent and applicable result it is desirable to use these patterns to form synonym sets in a sound way. We present a method that automatically generates regular expression patterns by expanding seed patterns in a heuristic search and then develops a feature vector based on the occurrence of term pairs in each developed pattern. This allows for a binary classification of term pairs as synonymous or non-synonymous. We then model this result as a probability graph to find synonym sets, which is equivalent to the well-studied problem of finding an optimal set cover. Our method achieved 73.2% precision and 29.7% recall, outperforming hand-made resources such as MeSH and Wikipedia. We conclude that automatic methods can play a practical role in developing new thesauri or expanding on existing ones, and that this can be done with only a small amount of training data and no need for resources such as parsers. We also conclude that the accuracy can be improved by grouping into synonym sets.
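The seed-pattern idea can be sketched with two hand-written lexical patterns applied to raw text; the paper's method expands such seeds automatically and scores term pairs with a classifier, whereas the patterns and example sentences below are purely illustrative.

```python
# Toy sketch of lexical-pattern-based synonym pair extraction, in the
# spirit of Hearst-style patterns. The two seed patterns and the sample
# text are illustrative, not the paper's learned regexes.
import re

PATTERNS = [
    re.compile(r"(\w[\w-]*), also known as (\w[\w-]*)"),
    re.compile(r"(\w[\w-]*) \(abbreviated (\w[\w-]*)\)"),
]

def extract_pairs(text):
    """Return candidate synonym pairs matched by any seed pattern."""
    pairs = set()
    for pat in PATTERNS:
        for a, b in pat.findall(text):
            pairs.add((a, b))
    return pairs

text = ("Acetylsalicylic-acid, also known as aspirin, reduces fever. "
        "Deoxyribonucleic-acid (abbreviated DNA) stores genetic code.")
print(sorted(extract_pairs(text)))
```

Because the patterns are purely lexical, no parser is required, which is exactly the portability advantage the abstract claims for low-resource languages.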
Did Cosmology Trigger the Origin of the Solar System?
NASA Technical Reports Server (NTRS)
Blome, H.-J.; Wilson, T. L.
2011-01-01
It is a matter of curious coincidence that the Solar System formed 4.6 billion years ago, around the same epoch that the Friedmann-Lemaitre (FL) universe became Λ-dominated or dark-energy-dominated, where Λ is the cosmological constant. This observation was made in the context of known gravitational anomalies that affect spacecraft orbits during planetary flybys and the Pioneer anomaly, both possibly having connections with cosmology. In addition, it has been known for some time that the Universe is not only expanding but accelerating as well. Hence one must add the onset of cosmological acceleration in the FL universe as having a possible influence on the origin of the Solar System. These connections will now be examined in greater detail.
Stability of the accelerated expansion in nonlinear electrodynamics
NASA Astrophysics Data System (ADS)
Sharif, M.; Mumtaz, Saadia
2017-02-01
This paper is devoted to the phase space analysis of an isotropic and homogeneous model of the universe by taking a noninteracting mixture of the electromagnetic and viscous radiating fluids whose viscous pressure satisfies a nonlinear version of the Israel-Stewart transport equation. We establish an autonomous system of equations by introducing normalized dimensionless variables. In order to analyze the stability of the system, we find corresponding critical points for different values of the parameters. We also evaluate the power-law scale factor whose behavior indicates different phases of the universe in this model. It is concluded that the bulk viscosity as well as electromagnetic field enhances the stability of the accelerated expansion of the isotropic and homogeneous model of the universe.
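The procedure the abstract above outlines — write the cosmological equations as an autonomous system in dimensionless variables, find the critical points, and classify their stability from the linearization — can be illustrated generically. The toy two-dimensional system below is invented for the example (it is not the paper's viscous/electromagnetic model); the Jacobian is taken by central differences and stability is read off the signs of the eigenvalues' real parts.

```python
import math

# Toy autonomous system (illustrative only):  x' = x(1 - x),  y' = -y
def f(x, y):
    return x * (1.0 - x), -y

def jacobian(x, y, h=1e-6):
    """Central-difference Jacobian of f at the point (x, y)."""
    fx1, fy1 = f(x + h, y); fx0, fy0 = f(x - h, y)
    gx1, gy1 = f(x, y + h); gx0, gy0 = f(x, y - h)
    return [[(fx1 - fx0) / (2 * h), (gx1 - gx0) / (2 * h)],
            [(fy1 - fy0) / (2 * h), (gy1 - gy0) / (2 * h)]]

def eigenvalues_2x2(J):
    """Closed-form eigenvalues of a 2x2 matrix via trace and determinant."""
    a, b = J[0]; c, d = J[1]
    tr, det = a + d, a * d - b * c
    disc = tr * tr - 4 * det
    s = math.sqrt(abs(disc))
    if disc >= 0:
        return ((tr + s) / 2, (tr - s) / 2)
    return (complex(tr / 2, s / 2), complex(tr / 2, -s / 2))

def classify(x, y):
    """Label a critical point by the real parts of the Jacobian eigenvalues."""
    lams = eigenvalues_2x2(jacobian(x, y))
    re = [l.real if isinstance(l, complex) else l for l in lams]
    if all(r < 0 for r in re):
        return "stable"
    if all(r > 0 for r in re):
        return "unstable"
    return "saddle"
```

For this toy system the fixed point (1, 0) has eigenvalues (-1, -1) and is a stable node, while the origin is a saddle; the paper applies the same machinery to the normalized dimensionless variables of its cosmological equations.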
Supergravity, dark energy, and the fate of the universe
NASA Astrophysics Data System (ADS)
Kallosh, Renata; Linde, Andrei; Prokushkin, Sergey; Shmakova, Marina
2002-12-01
We propose a description of dark energy and acceleration of the universe in extended supergravities with de Sitter (dS) solutions. Some of them are related to M theory with noncompact internal spaces. Masses of ultralight scalars in these models are quantized in units of the Hubble constant: m2=nH2. If the dS solution corresponds to a minimum of the effective potential, the universe eventually becomes dS space. If the dS solution corresponds to a maximum or a saddle point, which is the case in all known models based on N=8 supergravity, the flat universe eventually stops accelerating and collapses to a singularity. We show that in these models, as well as in the simplest models of dark energy based on N=1 supergravity, the typical time remaining before the global collapse is comparable to the present age of the universe, t=O(1010) yr. We discuss the possibility of distinguishing between various models and finding our destiny using cosmological observations.
Is ΛCDM an effective CCDM cosmology?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lima, J.A.S.; Santos, R.C.; Cunha, J.V., E-mail: limajas@astro.iag.usp.br, E-mail: cliviars@gmail.com, E-mail: jvcunha@ufpa.br
We show that a cosmology driven by gravitationally induced particle production of all non-relativistic species existing in the present Universe mimics exactly the observed flat accelerating ΛCDM cosmology with just one dynamical free parameter. This kind of scenario includes the creation cold dark matter (CCDM) model [1] as a particular case and also provides a natural reduction of the dark sector since the vacuum component is not needed to accelerate the Universe. The new cosmic scenario is equivalent to ΛCDM both at the background and perturbative levels, and the associated creation process is also in agreement with the universality of the gravitational interaction and the equivalence principle. Implicitly, it also suggests that the present day astronomical observations cannot be considered the ultimate proof of cosmic vacuum effects in the evolved Universe because ΛCDM may be only an effective cosmology.
Presidents of Asian Universities Call for More International Partnerships
ERIC Educational Resources Information Center
Hvistendahl, Mara
2009-01-01
The global economic crisis has accelerated the need for Asian universities not only to engage internationally, but also to create regional mechanisms through which students and faculty members can move more easily from one country to another, said Asian university presidents at a conference here in April. The importance of internationalization to…
Bringing Career Support into the Undergraduate Academic Experience
ERIC Educational Resources Information Center
Davis, Aimée Eubanks
2017-01-01
Braven partners with universities to help students put their hard-earned degrees to work. The credit-bearing career acceleration course is embedded within the undergraduate experience at San José State University and Rutgers University-Newark. This format allows students--many of whom are commuters and work full-time outside school--to fit career…
ERIC Educational Resources Information Center
Tran, Tam; Bowman-Carpio, LeeAnna; Buscher, Nate; Davidson, Pamela; Ford, Jennifer J.; Jenkins, Erick; Kalay, Hillary Noll; Nakazono, Terry; Orescan, Helene; Sak, Rachael; Shin, Irene
2017-01-01
In 2013, the University of California, Biomedical Research, Acceleration, Integration, and Development (UC BRAID) convened a regional network of contracting directors from the five University of California (UC) health campuses to: (i) increase collaboration, (ii) operationalize and measure common metrics as a basis for performance improvement…
Beyond 2020: Envisioning the Future of Universities in America
ERIC Educational Resources Information Center
Darden, Mary Landon
2009-01-01
In a world progressing with dizzying acceleration into the Information Age, the slow, measured approach of the traditional university can place administrator, faculty member, and student alike at a disadvantage. To move into this brave new world, the academic animal needs tools. "Beyond 2020: Envisioning the Future of Universities in America" is…
Accelerating universe with time variation of G and Λ
NASA Astrophysics Data System (ADS)
Darabi, F.
2012-03-01
We study a gravitational model in which scale transformations play the key role in obtaining dynamical G and Λ. We take a non-scale-invariant gravitational action with a cosmological constant and a gravitational coupling constant. Then, by a scale transformation, through a dilaton field, we obtain a new action containing cosmological and gravitational coupling terms which are dynamically dependent on the dilaton field with a Higgs-type potential. The vacuum expectation value of this dilaton field, through spontaneous symmetry breaking on the basis of the anthropic principle, determines the time variations of G and Λ. The relevance of these time variations to the current acceleration of the universe, the coincidence problem, Mach's cosmological coincidence, and those problems of standard cosmology addressed by inflationary models, is discussed. The current acceleration of the universe is shown to be a result of the phase transition from the radiation- toward the matter-dominated era. No real coincidence problem between matter and vacuum energy densities exists in this model, and this apparent coincidence, together with Mach's cosmological coincidence, is shown to be a simple consequence of a new kind of scale-factor dependence of the energy-momentum density, ρ ∼ a⁻⁴. This model also provides the possibility of a super-fast expansion of the scale factor in the very early universe by introducing exotic-type matter like cosmic strings.
The Adelphi Experiment: Accelerating Social Work Education.
ERIC Educational Resources Information Center
Rosenblatt, Aaron; And Others
The educational program adopted at Adelphi University School of Social Work provides students interested in obtaining the master's degree in social work with an opportunity to accelerate their professional education. As undergraduate students they can elect to major in social welfare, and if they do, some courses usually available only to graduate…
Journal of Accelerative Learning and Teaching, 1995.
ERIC Educational Resources Information Center
Journal of Accelerative Learning and Teaching, 1995
1995-01-01
Issues 1 and 2 (combined) of the 1995 journal contain these articles: "Accelerated Learning in a Beginning College-Level French Class at the University of Houston" (Patrice Caux); "The Psychobiology of Learning and Memory" (Don Schuster); "Do the Seeds of Accelerated Language Learning and Teaching Lie in a Behavioral…
DOT National Transportation Integrated Search
2009-03-01
The thirteenth full-scale Accelerated Pavement Test (APT) experiment at the Civil Infrastructure Laboratory (CISL) of Kansas State University aimed to determine the response and the failure mode of thin concrete overlays. Four pavement structures...
Gifted Students' Perceptions of an Accelerated Summer Program and Social Support
ERIC Educational Resources Information Center
Lee, Seon-Young; Olszewski-Kubilius, Paula; Makel, Matthew C.; Putallaz, Martha
2015-01-01
Using survey responses from students who participated in the summer programs at two university-based gifted education institutions, this study examined changes in gifted students' perceptions of their learning environments, accelerated summer programs and regular schools, and social support in lives after participation in the summer programs. Our…
Robert's Rules for Optimal Learning: Model Development, Field Testing, Implications!
ERIC Educational Resources Information Center
McGinty, Robert L.
The value of accelerated learning techniques developed by the national organization for Suggestive Accelerated Learning Techniques (SALT) was tested in a study using Administrative Policy students taking the capstone course in the Eastern Washington University School of Business. Educators have linked the brain and how it functions to various…
ERIC Educational Resources Information Center
Hertzog, Nancy B.; Chung, Rachel U.
2015-01-01
Radical acceleration from middle school to university is an unusual option in the United States. The Early Entrance Program and the University of Washington (UW) Academy for Young Scholars housed in the Halbert and Nancy Robinson Center for Young Scholars are two of only 21 early university entrance programs offered in the United States. Due to…
Knowledge Acquisition and Management for the NASA Earth Exchange (NEX)
NASA Astrophysics Data System (ADS)
Votava, P.; Michaelis, A.; Nemani, R. R.
2013-12-01
NASA Earth Exchange (NEX) is a data, computing and knowledge collaboratory that houses NASA satellite, climate and ancillary data, where a focused community can come together to share modeling and analysis codes, scientific results, knowledge and expertise on a centralized platform with access to large supercomputing resources. As more and more projects are executed on NEX, we are increasingly focusing on capturing the knowledge of the NEX users and providing mechanisms for sharing it with the community in order to facilitate reuse and accelerate research. There are many possible knowledge contributions to NEX: a wiki entry on the NEX portal contributed by a developer, information extracted from a publication in an automated way, or a workflow captured during code execution on the supercomputing platform. The goal of the NEX knowledge platform is to capture and organize this information and make it easily accessible to the NEX community and beyond. The knowledge acquisition process consists of three main facets - data and metadata, workflows and processes, and web-based information. Once the knowledge is acquired, it is processed in a number of ways, ranging from custom metadata parsers to entity extraction using natural language processing techniques. The processed information is linked with existing taxonomies and aligned with an internal ontology (which heavily reuses a number of external ontologies). This forms a knowledge graph that can then be used to improve users' search query results as well as provide additional analytics capabilities to the NEX system. Such a knowledge graph will be an important building block in creating a dynamic knowledge base for the NEX community where knowledge is both generated and easily shared.
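A minimal sketch of the acquisition-and-linking step described above — extract entities from text, align them with a taxonomy, and emit knowledge-graph triples — might look as follows. The taxonomy, document ID, and dictionary-lookup "NLP" are invented stand-ins for NEX's actual parsers and ontologies.

```python
# Toy taxonomy mapping surface forms to semantic types (invented for the example).
TAXONOMY = {"modis": "Satellite Instrument", "ndvi": "Vegetation Index"}

def extract_entities(text):
    """Naive dictionary-based entity extraction (stand-in for real NLP)."""
    return [tok for tok in text.lower().replace(",", " ").split()
            if tok in TAXONOMY]

def build_graph(doc_id, text):
    """Return (subject, predicate, object) triples for one document."""
    triples = []
    for ent in extract_entities(text):
        triples.append((doc_id, "mentions", ent))       # link doc -> entity
        triples.append((ent, "is_a", TAXONOMY[ent]))    # align with taxonomy
    return triples

graph = build_graph("wiki:42", "We derived NDVI from MODIS data")
```

Queries over such triples (e.g., "all documents mentioning a Satellite Instrument") are what let a knowledge graph improve search results, as the abstract suggests.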
Learning to Understand Natural Language with Less Human Effort
2015-05-01
[OCR fragment from a thesis on semantic parsing.] ...if one of these has the correct logical form, ℓ_j = ℓ_i, then t_j is taken as the approximate maximizer. Training optimizes the semantic parser parameters θ to predict Y = y_j, Z = z_j given S = s_j, where j indexes entity tuples (e1, e2). [The remainder is an unrecoverable fragment of a CCG derivation for the phrase "beautiful London".]
How Architecture-Driven Modernization Is Changing the Game in Information System Modernization
2010-04-01
Health Administration: MUMPS to Java, 300K, 4 mo. State of OR Employee Retirement System: COBOL to C# .Net, 250K, 4 mo. Civilian: State of WA Off. of Super of... Legacy source languages include Jovial, MUMPS, MagnaX, Natural, PVL, PowerBuilder, SQL, VAX Basic, VB6, and others; the target ("to be") systems include C# and C. A new JANUS(TM) MUMPS parser was created; implementation deliverables included the final "to-be" documentation and the JANUS rules engine, and the project was successfully completed in 4 months.
Speed up of XML parsers with PHP language implementation
NASA Astrophysics Data System (ADS)
Georgiev, Bozhidar; Georgieva, Adriana
2012-11-01
In this paper, the authors introduce PHP5's XML implementation and show how to read, parse, and write a short and uncomplicated XML file using SimpleXML in a PHP environment. The possibilities for combined use of the PHP5 language and the XML standard are described. The details of the parsing process with SimpleXML are also clarified. A practical PHP-XML-MySQL project presents the advantages of XML implementation in PHP modules. This approach allows comparatively simple searching of XML hierarchical data by means of PHP software tools. The proposed project includes a database, which can be extended with new data and new XML parsing functions.
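The read-parse-search-write cycle the paper demonstrates with PHP5's SimpleXML has a direct analogue in other languages; here is a sketch using Python's standard-library ElementTree (the XML snippet and element names are invented for the example).

```python
import xml.etree.ElementTree as ET

# A short, uncomplicated XML document, analogous to the paper's example.
doc = """<catalog>
  <book id="b1"><title>XML Basics</title><price>10.50</price></book>
  <book id="b2"><title>PHP and XML</title><price>12.00</price></book>
</catalog>"""

root = ET.fromstring(doc)  # parse the document into a tree

# Search the hierarchical data, as the paper does with SimpleXML + MySQL.
titles = [b.findtext("title") for b in root.findall("book")]
total = sum(float(b.findtext("price")) for b in root.findall("book"))

# Modify the tree and write it back out as a string.
root.append(ET.fromstring(
    '<book id="b3"><title>New</title><price>1.00</price></book>'))
out = ET.tostring(root, encoding="unicode")
```

The design point is the same as in the PHP project: the parser exposes the XML hierarchy as native objects, so searching and extending the data needs no manual string handling.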
A Risk Assessment System with Automatic Extraction of Event Types
NASA Astrophysics Data System (ADS)
Capet, Philippe; Delavallade, Thomas; Nakamura, Takuya; Sandor, Agnes; Tarsitano, Cedric; Voyatzi, Stavroula
In this article we describe the joint effort of experts in linguistics, information extraction and risk assessment to integrate EventSpotter, an automatic event extraction engine, into ADAC, an automated early warning system. By detecting weak signals of emerging risks as early as possible, ADAC provides a dynamic synthetic picture of situations involving risk. The ADAC system calculates risk on the basis of fuzzy logic rules operated on a template graph whose leaves are event types. EventSpotter is based on a general purpose natural language dependency parser, XIP, enhanced with domain-specific lexical resources (Lexicon-Grammar). Its role is to automatically feed the leaves with input data.
QUEST/Ada: Query utility environment for software testing of Ada
NASA Technical Reports Server (NTRS)
Brown, David B.
1989-01-01
Results of research and development efforts are presented for Task 1, Phase 2 of a general project entitled, The Development of a Program Analysis Environment for Ada. A prototype of the QUEST/Ada system was developed to collect data to determine the effectiveness of the rule-based testing paradigm. The prototype consists of five parts: the test data generator, the parser/scanner, the test coverage analyzer, a symbolic evaluator, and a data management facility, known as the Librarian. These components are discussed at length. Also presented is an experimental design for the evaluations, an overview of the project, and a schedule for its completion.
Automatic Speech Recognition in Air Traffic Control: a Human Factors Perspective
NASA Technical Reports Server (NTRS)
Karlsson, Joakim
1990-01-01
The introduction of Automatic Speech Recognition (ASR) technology into the Air Traffic Control (ATC) system has the potential to improve overall safety and efficiency. However, because ASR technology is inherently a part of the man-machine interface between the user and the system, the human factors issues involved must be addressed. Here, some of the human factors problems are identified and related methods of investigation are presented. Research at M.I.T.'s Flight Transportation Laboratory is being conducted from a human factors perspective, focusing on intelligent parser design, presentation of feedback, error correction strategy design, and optimal choice of input modalities.
Meeting the needs of our best and brightest: curriculum acceleration in tertiary mathematics
NASA Astrophysics Data System (ADS)
Hannah, John; James, Alex; Montelle, Clemency; Nokes, Jacqui
2011-04-01
For many years, it has been a common practice to recognize students with high potential by according them targeted privileges and opportunities. This includes the practice of allowing students to accelerate their high school programme and, in the case of New Zealand students, to take university courses during their final school years. This work assesses the wisdom of this practice of acceleration in mathematics both for the student and the tertiary institution.
Reduced Contact Hour Accelerated Courses and Student Learning
ERIC Educational Resources Information Center
Thornton, Barry; Demps, Julius; Jadav, Arpita
2017-01-01
Undergraduate instruction in the Davis College of Business at Jacksonville University utilizes two course delivery methods. Traditional daytime classes are 15 weeks long and have approximately 40 contact hours, while evening courses are offered in the Accelerated Degree program in a compressed 8-week format with 24 contact hours. The curriculum is…
ERIC Educational Resources Information Center
Trekles, Anastasia M.; Sims, Roderick
2013-01-01
The purpose of this exploratory case study was to explore instructional design strategies and characteristics of online, asynchronous accelerated courses and students' choices of deep or surface learning approaches within this environment. An increasing number of university programs, particularly at the graduate level, are moving to an…
2016-09-01
AWARD NUMBER: W81XWH-13-1-0309. TITLE: Acceleration of Regeneration of Large-Gap Peripheral Nerve Injuries Using Acellular Nerve Allografts plus Amniotic Fluid Derived Stem Cells (AFS). PRINCIPAL INVESTIGATOR: Thomas L. Smith, PhD. RECIPIENT: Wake Forest University Health Sciences.
Osaka Symposium and New Accelerator Projects in Japan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wei, Jie
1997-04-25
The purpose of this trip was to participate as an invited speaker at the XV RCNP Osaka International Symposium on Multi-GeV High-Performance Accelerators and Related Technology, to collaborate with Kyoto University on laser cooling and beam crystallization projects, and to give seminars in Beijing and Shanghai on the Relativistic Heavy Ion Collider.
An Experiment in ''Less Time, More Options": A Study of Accelerated University Students.
ERIC Educational Resources Information Center
Litwin, James L.; And Others
This study investigated the characteristics and experiences of 59 college students accelerated from their freshman to their junior year. The students showed high academic performance and few social problems, but questions of personal identity remained problematic; the best single predictor of academic success was found to be freshman grade-point…
Dark Energy and the Cosmological Constant: A Brief Introduction
ERIC Educational Resources Information Center
Harvey, Alex
2009-01-01
The recently observed acceleration of the expansion of the universe is a topic of intense interest. The favoured causes are the "cosmological constant" or "dark energy". The former, which appears in the Einstein equations as the term λg_{μν}, provides an extremely simple, well-defined mechanism for the acceleration. However,…
ERIC Educational Resources Information Center
Gupta, Kalpana
2012-01-01
This study was focused on investigating inclusive learning environments in accelerated classroom formats. Three 8-week sections of an undergraduate course at Regis University were examined. Results from observations and surveys were analyzed to determine the effectiveness and consistency of 13 inclusive strategies derived from Wlodkowski and…
Learning at the Speed of Light: Deep Learning and Accelerated Online Graduate Courses
ERIC Educational Resources Information Center
Trekles, Anastasia M.
2013-01-01
An increasing number of university programs, particularly at the graduate level, are moving to an accelerated, time-compressed model for online degree offerings. However, the literature revealed that research in distance education effectiveness is still confounded by many variables, including course design and student approach to learning.…
Extended DBI massive gravity with generalized fiducial metric
NASA Astrophysics Data System (ADS)
Chullaphan, Tossaporn; Tannukij, Lunchakorn; Wongjun, Pitayuth
2015-06-01
We consider an extended model of DBI massive gravity by generalizing the fiducial metric to be an induced metric on the brane corresponding to a domain wall moving in five-dimensional Schwarzschild-Anti-de Sitter spacetime. The model admits all solutions of the FLRW metric, including flat, closed and open geometries, while the original one does not. The background solutions can be divided into two branches, namely the self-accelerating branch and the normal branch. For the self-accelerating branch, the graviton mass plays the role of a cosmological constant to drive the late-time acceleration of the universe. It is found that the number of degrees of freedom of the gravitational sector is not correct, as in the original DBI massive gravity: there are only two propagating degrees of freedom from the tensor modes. For the normal branch, we restrict our attention to a particular class of solutions which provides an accelerated expansion of the universe. It is found that the number of degrees of freedom in the model is correct. However, at least one of them is a ghost degree of freedom, which is always present at small scales, implying that the theory is not stable.
Gravitationally influenced particle creation models and late-time cosmic acceleration
NASA Astrophysics Data System (ADS)
Pan, Supriya; Kumar Pal, Barun; Pramanik, Souvik
In this work, we focus on the gravitationally influenced adiabatic particle creation process, a mechanism that does not need any dark energy or modified gravity models to explain the current accelerating phase of the universe. Introducing some particle creation models that generalize some previous models in the literature, we constrain the cosmological scenarios using the latest compilation of the Type Ia Supernovae data only, the first indicator of the accelerating universe. Aside from the observational constraints on the models, we examine the models using two model-independent diagnostics, namely cosmography and Om. Further, we establish the general conditions to test the thermodynamic viability of any particle creation model. Our analysis shows that at late times the models closely resemble ΛCDM cosmology, and the models always satisfy the generalized second law of thermodynamics under certain conditions.
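The Om diagnostic mentioned above has a simple closed form, Om(z) = (E²(z) − 1) / ((1 + z)³ − 1) with E(z) = H(z)/H₀; for flat ΛCDM it is constant and equal to Ω_m, so any z-dependence signals a departure from ΛCDM. A quick numerical check (Ω_m = 0.3 is an illustrative value):

```python
def E2_lcdm(z, om=0.3):
    """Dimensionless Hubble rate squared, E^2 = (H/H0)^2, for flat LCDM."""
    return om * (1 + z) ** 3 + (1 - om)

def om_diagnostic(z, E2):
    """Om(z) = (E^2(z) - 1) / ((1+z)^3 - 1); constant = Omega_m for flat LCDM."""
    return (E2(z) - 1.0) / ((1 + z) ** 3 - 1.0)

# For flat LCDM the diagnostic is flat in redshift, recovering Omega_m = 0.3.
vals = [om_diagnostic(z, E2_lcdm) for z in (0.5, 1.0, 2.0)]
```

Feeding the same function an E²(z) reconstructed from data and finding a tilt in Om(z) is exactly the model-independent test the abstract refers to.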
Accelerator Science: Collider vs. Fixed Target
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lincoln, Don
Particle physics experiments employ high energy particle accelerators to make their measurements. However, there are many kinds of particle accelerators with many interesting techniques. One important dichotomy is whether one takes a particle beam and has it hit a stationary target of atoms, or whether one takes two counter-rotating beams of particles and smashes them together head on. In this video, Fermilab's Dr. Don Lincoln explains the pros and cons of these two powerful methods of exploring the rules of the universe.
Is Cosmic Acceleration Telling Us Something About Gravity?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Trodden, Mark
2006-03-02
Among the possible explanations for the observed acceleration of the universe, perhaps the boldest is the idea that new gravitational physics might be the culprit. In this colloquium I will discuss some of the challenges of constructing a sensible phenomenological extension of General Relativity, give examples of some candidate models of modified gravity and survey existing observational constraints on this approach. I will conclude by discussing how we might hope to distinguish between modifications of General Relativity and dark energy as competing hypotheses to explain cosmic acceleration.
2016-09-01
AWARD NUMBER: W811XWH-13-1-0310. TITLE: Acceleration of Regeneration of Large-Gap Peripheral Nerve Injuries Using Acellular Nerve Allografts plus Amniotic Fluid Derived Stem Cells (AFS). PRINCIPAL INVESTIGATOR: Zhongyu Li, MD, PhD. RECIPIENT: Wake Forest University Health Sciences. REPORT DATE: September 2016. REPORT TYPE: Annual. DATES COVERED: 1 Sep 2015 - 31 Aug 2016.
High Energy Ion Acceleration by Extreme Laser Radiation Pressure
2017-03-14
and was published in Nuclear Instruments and Methods A [11]. For similar targets, it was found that by monitoring the divergence of a low-energy... (Final report AFRL-AFOSR-UK-TR-2017-0015; Paul McKenna, University of Strathclyde; period covered 01 May 2013 to 31 Dec 2016.)
Guided post-acceleration of laser-driven ions by a miniature modular structure
Kar, Satyabrata; Ahmed, Hamad; Prasad, Rajendra; Cerchez, Mirela; Brauckmann, Stephanie; Aurand, Bastian; Cantono, Giada; Hadjisolomou, Prokopis; Lewis, Ciaran L. S.; Macchi, Andrea; Nersisyan, Gagik; Robinson, Alexander P. L.; Schroer, Anna M.; Swantusch, Marco; Zepf, Matt; Willi, Oswald; Borghesi, Marco
2016-01-01
All-optical approaches to particle acceleration are currently attracting a significant research effort internationally. Although characterized by exceptional transverse and longitudinal emittance, laser-driven ion beams currently have limitations in terms of peak ion energy, bandwidth of the energy spectrum and beam divergence. Here we introduce the concept of a versatile, miniature linear accelerating module, which, by employing laser-excited electromagnetic pulses directed along a helical path surrounding the laser-accelerated ion beams, addresses these shortcomings simultaneously. In a proof-of-principle experiment on a university-scale system, we demonstrate post-acceleration of laser-driven protons from a flat foil at a rate of 0.5 GeV m⁻¹, already beyond what can be sustained by conventional accelerator technologies, with dynamic beam collimation and energy selection. These results open up new opportunities for the development of extremely compact and cost-effective ion accelerators for both established and innovative applications. PMID:27089200
NASA Astrophysics Data System (ADS)
Rout, Bibhudutta; Dhoubhadel, Mangal S.; Poudel, Prakash R.; Kummari, Venkata C.; Pandey, Bimal; Deoli, Naresh T.; Lakshantha, Wickramaarachchige J.; Mulware, Stephen J.; Baxley, Jacob; Manuel, Jack E.; Pacheco, Jose L.; Szilasi, Szabolcs; Weathers, Duncan L.; Reinert, Tilo; Glass, Gary A.; Duggan, Jerry L.; McDaniel, Floyd D.
2013-07-01
The Ion Beam Modification and Analysis Laboratory (IBMAL) at the University of North Texas includes several accelerator facilities with capabilities of producing a variety of ion beams from tens of keV to several MeV in energy. The four accelerators are used for research, graduate and undergraduate education, and industrial applications. The NEC 3MV Pelletron tandem accelerator has three ion sources for negative ions: a He Alphatross and two different SNICS-type sputter ion sources. Presently, the tandem accelerator has four high-energy beam transport lines and one low-energy beam transport line taken directly from the negative ion sources for different research experiments. For the low-energy beam line, the ion energy can be varied from ~20 to 80 keV for ion implantation/modification of materials. The four post-acceleration beam lines include a heavy-ion nuclear microprobe; multi-purpose PIXE, RBS, ERD, NRA, and broad-beam single-event upset; a high-energy ion implantation line; and trace-element accelerator mass spectrometry. The NEC 3MV single-ended Pelletron accelerator has an RF ion source mainly for hydrogen, helium and heavier inert gases. We recently installed a capacitive liner to the terminal potential stabilization system for high terminal voltage stability and high-resolution microprobe analysis. The accelerator serves a beam line for standard RBS and RBS/C. Another beamline for high energy focused ion beam applications using a magnetic quadrupole lens system is currently under construction. This beam line will also serve for developmental work on an electrostatic lens system. The third accelerator is a 200 kV Cockcroft-Walton accelerator with an RF ion source. The fourth accelerator is a 2.5 MV Van de Graaff accelerator, which was in operation for the last several decades and is currently planned to be used mainly for educational purposes.
Research projects that will be briefly discussed include materials synthesis/modification for photonic, electronic, and magnetic applications, surface sputtering and micro-fabrication of materials, development of high-energy ion microprobe systems, and educational and outreach activities.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rout, Bibhudutta; Dhoubhadel, Mangal S.; Poudel, Prakash R.
2013-07-03
The Ion Beam Modification and Analysis Laboratory (IBMAL) at the University of North Texas includes several accelerator facilities with capabilities of producing a variety of ion beams from tens of keV to several MeV in energy. The four accelerators are used for research, graduate and undergraduate education, and industrial applications. The NEC 3MV Pelletron tandem accelerator has three ion sources for negative ions: He Alphatross and two different SNICS-type sputter ion sources. Presently, the tandem accelerator has four high-energy beam transport lines and one low-energy beam transport line directly taken from the negative ion sources for different research experiments. Formore » the low-energy beam line, the ion energy can be varied from {approx}20 to 80 keV for ion implantation/modification of materials. The four post-acceleration beam lines include a heavy-ion nuclear microprobe; multi-purpose PIXE, RBS, ERD, NRA, and broad-beam single-event upset; high-energy ion implantation line; and trace-element accelerator mass spectrometry. The NEC 3MV single-ended Pelletron accelerator has an RF ion source mainly for hydrogen, helium and heavier inert gases. We recently installed a capacitive liner to the terminal potential stabilization system for high terminal voltage stability and high-resolution microprobe analysis. The accelerator serves a beam line for standard RBS and RBS/C. Another beamline for high energy focused ion beam application using a magnetic quadrupole lens system is currently under construction. This beam line will also serve for developmental work on an electrostatic lens system. The third accelerator is a 200 kV Cockcroft-Walton accelerator with an RF ion source. The fourth accelerator is a 2.5 MV Van de Graaff accelerator, which was in operation for last several decades is currently planned to be used mainly for educational purpose. 
Research projects that will be briefly discussed include materials synthesis/modification for photonic, electronic, and magnetic applications, surface sputtering and micro-fabrication of materials, development of high-energy ion microprobe systems, and educational and outreach activities.
NASA Astrophysics Data System (ADS)
Schlickeiser, R.; Oppotsch, J.
2017-12-01
The analytical theory of diffusive acceleration of cosmic rays at parallel stationary shock waves of arbitrary speed with magnetostatic turbulence is developed from first principles. The theory is based on the diffusion approximation to the gyrotropic cosmic-ray particle phase-space distribution functions in the respective rest frames of the up- and downstream medium. We derive the correct cosmic-ray jump conditions for the cosmic-ray current and density, and match the up- and downstream distribution functions at the position of the shock. It is essential to account for the different particle momentum coordinates in the up- and downstream media. Analytical expressions for the momentum spectra of shock-accelerated cosmic rays are calculated. These are valid for arbitrary shock speeds including relativistic shocks. The correctly taken limit for nonrelativistic shock speeds leads to a universal broken power-law momentum spectrum of accelerated particles with velocities well above the injection velocity threshold, where the universal power-law spectral index q ≃ 2 - γ_1 - 4 is independent of the flow compression ratio r. For nonrelativistic shock speeds, we calculate for the first time the injection velocity threshold, settling the long-standing injection problem for nonrelativistic shock acceleration.
In Defense of an Accelerating Universe: Model Insensitivity of the Hubble Diagram
NASA Astrophysics Data System (ADS)
Ringermacher, Harry I.; Mead, Lawrence R.
2018-01-01
A recently published paper by Nielsen, Guffanti & Sarkar (Sci. Rep. 6, 35596, Oct. 2016) argues that the evidence for cosmic acceleration is marginal and that a coasting universe - the "Milne Universe" - fits the same supernovae data in a Hubble diagram nearly as well. Other papers have since jumped on the bandwagon. The Milne Universe has negative spatial curvature, but is Riemann-flat. Nevertheless, we confirm that the Milne model fits the data just as well as LCDM. We show that this unexpected result points to a weakness in the Hubble diagram rather than to a failure in LCDM. It seems the Hubble diagram is insensitive to spatial curvature. To be specific, the spatial curvature dependences of the comoving radius in the luminosity distance nearly exactly cancel the energy density differences. That is, r(LCDM) = sinh[r(Milne)]. By transforming the distance modulus vs. redshift data to scale factor vs. cosmological time data, for each curvature, k = {+1, 0, -1}, the curvature dependence of the data is effectively separated, thus permitting a more precise fit of the Omega parameters to the scale factor data to decide the correct model. Here we present the data and both models in a scale factor vs. cosmological time plot. The difference of the means of the k = 0 and k = -1 data separates at a 2-sigma confidence level. The LCDM fit to the k = 0 data is consistent with an accelerating universe to 99% confidence. The Milne universe fits the k = -1 data to no better than about 70% confidence. This is consistent with independent CMB and BAO observations supporting a flat universe.
ILU industrial electron accelerators for medical-product sterilization and food treatment
NASA Astrophysics Data System (ADS)
Bezuglov, V. V.; Bryazgin, A. A.; Vlasov, A. Yu.; Voronin, L. A.; Panfilov, A. D.; Radchenko, V. M.; Tkachenko, V. O.; Shtarklev, E. A.
2016-12-01
Pulsed linear electron accelerators of the ILU type have been developed and produced by the Institute of Nuclear Physics, Siberian Branch, Russian Academy of Sciences, for more than 30 years. Their distinctive features are simplicity of design, convenience in operation, and reliability during long-term work under conditions of industrial production. ILU accelerators cover an energy range of 0.7-10 MeV at an accelerated-beam power of up to 100 kW, and they are optimally suited for use as universal sterilizing complexes. A notable feature of these accelerators is their capability to work both in the electron-treatment mode of production and in the bremsstrahlung generation mode, which has high penetrating power.
Anisotropic universe with magnetized dark energy
NASA Astrophysics Data System (ADS)
Goswami, G. K.; Dewangan, R. N.; Yadav, Anil Kumar
2016-04-01
In the present work we investigate the existence of late-time acceleration of the Universe filled with cosmic fluid and a uniform magnetic field as the source of matter in an anisotropic Heckmann-Schucking space-time. The observed acceleration of the universe is explained by introducing a positive cosmological constant Λ in Einstein's field equations, which is mathematically equivalent to vacuum energy with equation-of-state (EOS) parameter equal to -1. The present values of the matter and dark energy parameters (Ωm)0 and (Ω_Λ)0 are estimated using the latest 287 high-redshift (0.3 ≤ z ≤ 1.4) SN Ia supernova data of observed apparent magnitudes, along with their possible errors, taken from the Union 2.1 compilation. The best-fit values for (Ωm)0 and (Ω_Λ)0 are found to be 0.2820 and 0.7177, respectively, in good agreement with recent astrophysical observations from surveys such as WMAP (2001-2013), Planck (2015) and BOSS. Various physical parameters, such as the matter and dark energy densities, the present age of the universe and the deceleration parameter, have been obtained on the basis of these values. We also estimate that the acceleration began at z = 0.71131, approximately 6.2334 Gyr before the present.
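As a consistency check under the simplifying assumption of flat ΛCDM (the paper's model is anisotropic, so only approximate agreement is expected), the quoted onset redshift follows from the vanishing of the deceleration parameter:

```latex
q(z) \;=\; \frac{\Omega_m (1+z)^3 - 2\Omega_\Lambda}{2\left[\Omega_m (1+z)^3 + \Omega_\Lambda\right]} = 0
\;\;\Rightarrow\;\;
z_{\rm acc} = \left(\frac{2\Omega_\Lambda}{\Omega_m}\right)^{1/3} - 1
\approx \left(\frac{2 \times 0.7177}{0.2820}\right)^{1/3} - 1 \approx 0.72
```

which is within about 1% of the quoted z = 0.71131; the small residual plausibly reflects the anisotropic metric and the slight deviation of (Ωm)0 + (Ω_Λ)0 from unity.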
NASA Astrophysics Data System (ADS)
Conley, A.; Goldhaber, G.; Wang, L.; Aldering, G.; Amanullah, R.; Commins, E. D.; Fadeyev, V.; Folatelli, G.; Garavini, G.; Gibbons, R.; Goobar, A.; Groom, D. E.; Hook, I.; Howell, D. A.; Kim, A. G.; Knop, R. A.; Kowalski, M.; Kuznetsova, N.; Lidman, C.; Nobili, S.; Nugent, P. E.; Pain, R.; Perlmutter, S.; Smith, E.; Spadafora, A. L.; Stanishev, V.; Strovink, M.; Thomas, R. C.; Wood-Vasey, W. M.; Supernova Cosmology Project
2006-06-01
We present measurements of Ωm and ΩΛ from a blind analysis of 21 high-redshift supernovae using a new technique (CMAGIC) for fitting the multicolor light curves of Type Ia supernovae, first introduced by Wang and coworkers. CMAGIC takes advantage of the remarkably simple behavior of Type Ia supernovae on color-magnitude diagrams and has several advantages over current techniques based on maximum magnitudes. Among these are a reduced sensitivity to host galaxy dust extinction, a shallower luminosity-width relation, and the relative simplicity of the fitting procedure. This allows us to provide a cross-check of previous supernova cosmology results, despite the fact that current data sets were not observed in a manner optimized for CMAGIC. We describe the details of our novel blindness procedure, which is designed to prevent experimenter bias. The data are broadly consistent with the picture of an accelerating universe and agree with a flat universe within 1.7 σ, including systematics. We also compare the CMAGIC results directly with those of a maximum magnitude fit to the same supernovae, finding that CMAGIC favors more acceleration at the 1.6 σ level, including systematics and the correlation between the two measurements. A fit for w assuming a flat universe yields a value that is consistent with a cosmological constant within 1.2 σ.
A simple cosmology with a varying fine structure constant.
Sandvik, Håvard Bunes; Barrow, John D; Magueijo, João
2002-01-21
We investigate the cosmological consequences of a theory in which the electric charge e can vary. In this theory the fine structure "constant," alpha, remains almost constant in the radiation era, undergoes a small increase in the matter era, but approaches a constant value when the universe starts accelerating because of a positive cosmological constant. This model satisfies geonuclear, nucleosynthesis, and cosmic microwave background constraints on time variation in alpha, while fitting the observed accelerating Universe and evidence for small alpha variations in quasar spectra. It also places specific restrictions on the nature of the dark matter. Further tests, involving stellar spectra and Eötvös experiments, are proposed.
Explaining the Supernova Data Without Accelerating Expansion
NASA Astrophysics Data System (ADS)
Stuckey, W. M.; McDevitt, T. J.; Silberstein, M.
2012-10-01
The 2011 Nobel Prize in Physics was awarded "for the discovery of the accelerating expansion of the universe through observations of distant supernovae." However, it is not the case that the type Ia supernova data necessitates accelerating expansion. Since we do not have a successful theory of quantum gravity, we should not assume general relativity (GR) will survive unification intact, especially on cosmological scales where tests are scarce. We provide a simple example of how GR cosmology may be modified to produce a decelerating Einstein-de Sitter cosmology (EdS) that accounts for the Union2 Compilation data as well as the accelerating ΛCDM (EdS plus a cosmological constant).
Analyzing collision processes with the smartphone acceleration sensor
NASA Astrophysics Data System (ADS)
Vogt, Patrik; Kuhn, Jochen
2014-02-01
It has been illustrated several times how the built-in acceleration sensors of smartphones can be used gainfully for quantitative experiments in school and university settings (see the overview in Ref. 1). The physical topics in that case are manifold and include, for example, free fall (Ref. 2), radial acceleration (Ref. 3), several pendula, or the exploitation of everyday contexts (Ref. 6). This paper supplements these applications and presents an experiment to study elastic and inelastic collisions. In addition to the masses of the two impact partners, their velocities before and after the collision are of importance, and these velocities can be determined by numerical integration of the measured acceleration profile.
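The integration step described last can be sketched with a simple trapezoidal rule; the function and the sample data below are invented for illustration and are not from the paper.

```python
def velocity_from_acceleration(t, a, v0=0.0):
    """Integrate an acceleration profile a(t), sampled at times t,
    with the trapezoidal rule to obtain velocity at each sample."""
    v = [v0]
    for i in range(1, len(t)):
        dt = t[i] - t[i - 1]
        v.append(v[-1] + 0.5 * (a[i] + a[i - 1]) * dt)
    return v

# Invented example: constant 2 m/s^2 for 1 s, sampled every 0.1 s
t = [i * 0.1 for i in range(11)]
a = [2.0] * 11
v = velocity_from_acceleration(t, a)
print(round(v[-1], 6))  # 2.0 (m/s after 1 s)
```

In a real measurement, sensor bias should be subtracted before integration, since any constant offset grows linearly in the velocity.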
Digital Technologies as Education Innovation at Universities
ERIC Educational Resources Information Center
Kryukov, Vladimir; Gorin, Alexey
2017-01-01
This paper analyses the use of digital technology-based education innovations in higher education. It demonstrates that extensive implementation of digital technologies in universities is the main factor conditioning the acceleration of innovative changes in educational processes, while digital technologies themselves become one of the key…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pereira, S.H.; Guimarães, T.M., E-mail: shpereira@feg.unesp.br, E-mail: thiago.mogui@gmail.com
In this paper we construct the complete evolution of the universe driven by the mass dimension one dark spinor called Elko, starting with inflation, passing through the matter-dominated era and finishing with the recent accelerated expansion. The dynamics of the fermionic Elko field with a symmetry-breaking-type potential can reproduce all phases of the universe in a natural and elegant way. The dynamical equations in the general case and the slow-roll conditions in the limit H ≪ m_pl are also presented for the Elko system. Numerical analyses of the number of e-foldings during inflation, the energy density after inflation and at the present time, and the present size of the universe are in good agreement with the standard model of cosmology. An interpretation of the inflationary phase as a result of the Pauli exclusion principle is also possible if the Elko field is treated as an average value of its quantum analogue.
Sequestration of vacuum energy and the end of the universe.
Kaloper, Nemanja; Padilla, Antonio
2015-03-13
Recently, we proposed a mechanism for sequestering the standard model vacuum energy that predicts that the Universe will collapse. Here we present a simple mechanism for bringing about this collapse, employing a scalar field whose potential is linear and becomes negative, providing the negative energy density required to end the expansion. The slope of the potential is chosen to allow for the expansion to last until the current Hubble time, about 10^{10} years, to accommodate our Universe. Crucially, this choice is technically natural due to a shift symmetry. Moreover, vacuum energy sequestering selects radiatively stable initial conditions for the collapse, which guarantee that immediately before the turnaround the Universe is dominated by the linear potential which drives an epoch of accelerated expansion for at least an e-fold. Thus, a single, technically natural choice for the slope ensures that the collapse is imminent and is preceded by the current stage of cosmic acceleration, giving a new answer to the "why now?" question.
ERIC Educational Resources Information Center
Morreale, Sherwyn P.; And Others
This paper examines the impact of traditional and accelerated public speaking instruction on undergraduate-level students' self-perceptions of communication apprehension and self-esteem. Subjects, students at the University of Colorado at Colorado Springs, were enrolled in the same semester in either a 16-week traditional public speaking course…
NASA Astrophysics Data System (ADS)
Avagyan, R. M.; Harutyunyan, G. H.
2018-03-01
The cosmological dynamics of a quasi-de Sitter model is described in an "Einstein" representation of the modified Jordan theory using the qualitative theory of dynamical systems. An inflationary picture of the expansion is obtained for a range of the dimensionless acceleration parameter from one to zero.
PET - radiopharmaceutical facilities at Washington University Medical School - an overview
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dence, C.S.; Welch, M.J.
1994-12-31
The PET program at Washington University has evolved over more than three decades of research and development in the use of positron-emitting isotopes in medicine and biology. In 1962 the first hospital cyclotron in the USA was installed. This first machine was an Allis Chalmers (AC) cyclotron, and it was operated until July 1990. Alongside this cyclotron the authors also ran a Cyclotron Corporation (TCC) CS-15 cyclotron that was purchased in 1977. Both of these cyclotrons were maintained in-house and operated with relatively little downtime (approximately 3.5%). After the dismantling of the AC machine in 1990, a Japan Steel Works 16/8 (JSW-16/8) cyclotron was installed in the vault. Whereas the AC cyclotron could only accelerate deuterons (6.2 MeV), the JSW-16/8 machine can accelerate both protons and deuterons, so all of the radiopharmaceuticals can be produced on either of the two presently owned accelerators. At the end of May 1993, the medical school installed the first clinical Tandem Cascade Accelerator (TCA), a collaboration with Science Research Laboratories (SRL) of Somerville, MA. Preliminary target testing, design and development are presently under way. In 1973, the University installed the first operational PETT device in the country, and at present there is a large basic science and clinical research program involving more than a hundred staff in nuclear medicine, radiation sciences, neurology, neurosurgery, psychiatry, cardiology, pulmonary medicine, oncology, and surgery.
'Isotopo' a database application for facile analysis and management of mass isotopomer data.
Ahmed, Zeeshan; Zeeshan, Saman; Huber, Claudia; Hensel, Michael; Schomburg, Dietmar; Münch, Richard; Eylert, Eva; Eisenreich, Wolfgang; Dandekar, Thomas
2014-01-01
The composition of stable-isotope labelled isotopologues/isotopomers in metabolic products can be measured by mass spectrometry and supports the analysis of pathways and fluxes. As a prerequisite, the original mass spectra have to be processed, managed and stored to rapidly calculate, analyse and compare isotopomer enrichments to study, for instance, bacterial metabolism in infection. For such applications, we provide here the database application 'Isotopo'. This software package includes (i) a database to store and process isotopomer data, (ii) a parser to upload and translate different data formats for such data and (iii) an improved application to process and convert signal intensities from mass spectra of (13)C-labelled metabolites such as tert-butyldimethylsilyl derivatives of amino acids. Relative mass intensities and isotopomer distributions are calculated by applying a partial least squares method with iterative refinement for high-precision data. The data output includes formats such as graphs for overall enrichments in amino acids. The package is user-friendly for easy and robust data management of multiple experiments. The 'Isotopo' software is available at the following web link (section Download): http://spp1316.uni-wuerzburg.de/bioinformatics/isotopo/. The package contains three additional files: a software executable setup (installer), one data set file (discussed in this article) and one Excel file (which can be used to convert data from Excel to '.iso' format). The 'Isotopo' software is compatible only with the Microsoft Windows operating system. © The Author(s) 2014. Published by Oxford University Press.
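As a rough illustration of the quantities such a tool reports (this is not Isotopo's partial least squares algorithm, only the basic normalization and enrichment arithmetic; the intensities are invented):

```python
def isotopomer_distribution(intensities):
    """Normalize raw mass intensities (m+0, m+1, ..., m+n)
    into fractional abundances summing to 1."""
    total = sum(intensities)
    return [i / total for i in intensities]

def overall_enrichment(fractions):
    """Mean fraction of labelled carbon positions,
    sum(i * x_i) / n, for an (m+0 .. m+n) distribution."""
    n = len(fractions) - 1
    return sum(i * x for i, x in enumerate(fractions)) / n

# Invented intensities for a 3-carbon fragment (m+0 .. m+3)
raw = [600.0, 200.0, 150.0, 50.0]
dist = isotopomer_distribution(raw)
print([round(x, 3) for x in dist])       # [0.6, 0.2, 0.15, 0.05]
print(round(overall_enrichment(dist), 4))
```

Real data additionally require correction for natural isotope abundance and derivatization atoms, which is where the iterative least squares refinement comes in.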
NASA Astrophysics Data System (ADS)
Manzella, Giuseppe M. R.; Bartolini, Andrea; Bustaffa, Franco; D'Angelo, Paolo; De Mattei, Maurizio; Frontini, Francesca; Maltese, Maurizio; Medone, Daniele; Monachini, Monica; Novellino, Antonio; Spada, Andrea
2016-04-01
The MAPS (Marine Planning and Service Platform) project aims at building a computer platform supporting a Marine Information and Knowledge System. One of the main objectives of the project is to develop a repository that gathers, classifies and structures marine scientific literature and data, thus guaranteeing their accessibility to researchers and institutions by means of standard protocols. In oceanography the cost of data collection is very high, and the new paradigm is based on the concept of collecting once and re-using many times (for re-analysis, marine environment assessment, studies on trends, etc.). This concept requires access to quality-controlled data and to information that is provided in reports (grey literature) and/or in the relevant scientific literature. Hence, new technology needs to be created by integrating several disciplines such as data management, information systems and knowledge management. In one of the most important EC projects on data management, namely SeaDataNet (www.seadatanet.org), an initial example of knowledge management is provided through the Common Data Index, which provides links to data and (eventually) to papers. There are efforts to develop search engines to find authors' contributions to scientific literature or publications. This implies the use of persistent identifiers (such as DOI), as is done in ORCID. However, very few efforts are dedicated to linking publications to the data cited, used, or of importance for the published studies. This is the objective of MAPS. Full-text technologies are often unsuccessful since they assume the presence of specific keywords in the text; to fix this problem, the MAPS project uses different semantic technologies for retrieving text and data and thus obtaining much more relevant results.
The main parts of our design of the search engine are: • Syntactic parser - This module is responsible for the extraction of "rich words" from the text: the whole document is parsed to extract the words which are most meaningful for the main argument of the document, and the extraction is applied in the form of N-grams (mono-grams, bi-grams, tri-grams). • MAPS database - This module is a simple database which contains all the N-grams used by MAPS (physical parameters from SeaDataNet vocabularies) to define our marine "ontology". • Relation identifier - This module performs the most important task of identifying relationships between the N-grams extracted from the text by the parser and the provided oceanographic terminology. It checks N-grams supplied by the Syntactic parser and then matches them with the terms stored in the MAPS database. Found matches are returned to the parser with the inflected form appearing in the source text. • A "relaxed" extractor - This option can be activated when the search engine is launched. It was introduced to give the user a chance to create new N-grams combining existing mono-grams and bi-grams in the database with rich words found within the source text. The innovation of a semantic engine lies in the fact that the process is not just the retrieval of already known documents by means of a simple term query but rather the retrieval of a population of documents whose existence was unknown. The system answers by showing a list of results ordered according to the following criteria: • Relevance - of the document with respect to the concept that is searched • Date - of publication of the paper • Source - data provider as defined in the SeaDataNet Common Data Index • Matrix - environmental matrices as defined in the oceanographic field • Geographic area - area specified in the text • Clustering - the process of organizing objects into groups whose members are similar The clustering returns the related documents as output.
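The Syntactic parser and Relation identifier steps above can be sketched as plain n-gram extraction followed by matching against a controlled vocabulary; the vocabulary here is an invented stand-in for the SeaDataNet parameter lists:

```python
def ngrams(tokens, n):
    """Return all contiguous n-grams of a token list as strings."""
    return [" ".join(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def match_terms(text, vocabulary, max_n=3):
    """Extract mono-, bi- and tri-grams from the text and return
    those that appear in the controlled vocabulary."""
    tokens = text.lower().split()
    found = set()
    for n in range(1, max_n + 1):
        for gram in ngrams(tokens, n):
            if gram in vocabulary:
                found.add(gram)
    return found

# Invented stand-in for a SeaDataNet-style parameter vocabulary
vocab = {"sea surface temperature", "salinity", "wave height"}
text = "Measurements of sea surface temperature and salinity were collected"
print(sorted(match_terms(text, vocab)))  # ['salinity', 'sea surface temperature']
```

A production system would additionally normalize inflected forms and punctuation before matching, as the Relation identifier description implies.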
For each document the MAPS visualization provides: • Title, author, source/provider of data, web address • Tagging of key terms or concepts • Summary of the document • Visualization of the whole document The possibility of inserting the number of citations for each document among the criteria of the advanced search is currently under way; in this case the engine should be able to connect to any of the existing bibliographic citation systems (such as Google Scholar, Scopus, etc.).
Understanding of Particle Acceleration by Foreshock Transients (invited)
NASA Astrophysics Data System (ADS)
Liu, T. Z.; Angelopoulos, V.; Hietala, H.; Lu, S.; Wilson, L. B., III
2017-12-01
Although plasma shocks are known to be a major particle accelerator in Earth's environment (e.g., the bow shock) and elsewhere in the universe, how particles are accelerated to very large energies compared to the shock potential is still not fully understood. Significant new information on such acceleration in the vicinity of Earth's bow shock has recently emerged due to the availability of multi-point observations, in particular from Cluster and THEMIS. These have revealed numerous types of foreshock transients, formed by shock-reflected ions, which could play a crucial role in particle pre-acceleration, i.e. before the particles reach the shock to be subjected again to even further acceleration. Foreshock bubbles (FBs) and hot flow anomalies (HFAs) are a subset of such foreshock transients that are especially important due to their large spatial scale (1-10 Earth radii) and their ability to have global effects in Earth's geospace. These transients can accelerate particles that can become a particle source for the parent shock. Here we present our latest progress in understanding particle acceleration by foreshock transients, including their statistical characteristics and acceleration mechanisms.
An automatic indexing method for medical documents.
Wagner, M. M.
1991-01-01
This paper describes MetaIndex, an automatic indexing program that creates symbolic representations of documents for the purpose of document retrieval. MetaIndex uses a simple transition network parser to recognize a language that is derived from the set of main concepts in the Unified Medical Language System Metathesaurus (Meta-1). MetaIndex uses a hierarchy of medical concepts, also derived from Meta-1, to represent the content of documents. The goal of this approach is to improve document retrieval performance by better representation of documents. An evaluation method is described, and the performance of MetaIndex on the task of indexing the Slice of Life medical image collection is reported. PMID:1807564
NOBLAST and JAMBLAST: New Options for BLAST and a Java Application Manager for BLAST results.
Lagnel, Jacques; Tsigenopoulos, Costas S; Iliopoulos, Ioannis
2009-03-15
NOBLAST (New Options for BLAST) is an open source program that provides a new user-friendly tabular output format for various NCBI BLAST programs (Blastn, Blastp, Blastx, Tblastn, Tblastx, Mega BLAST and Psi BLAST) without any use of a parser, and provides E-value correction in the case of segmented BLAST databases. JAMBLAST, using the NOBLAST output, allows the user to manage, view and filter the BLAST hits using a number of selection criteria. A distribution package of NOBLAST and JAMBLAST, including a detailed installation procedure, is freely available from http://sourceforge.net/projects/JAMBLAST/ and http://sourceforge.net/projects/NOBLAST. Supplementary data are available at Bioinformatics online.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simunovic, Srdjan
2015-02-16
CASL's modeling and simulation technology, the Virtual Environment for Reactor Applications (VERA), incorporates coupled physics and science-based models, state-of-the-art numerical methods, modern computational science, integrated uncertainty quantification (UQ) and validation against data from operating pressurized water reactors (PWRs), single-effect experiments, and integral tests. The computational simulation component of VERA is the VERA Core Simulator (VERA-CS). The core simulator is the specific collection of multi-physics computer codes used to model and deplete an LWR core over multiple cycles. The core simulator has a single common input file that drives all of the different physics codes. The parser code, VERAIn, converts VERA Input into an XML file that is used as input to the different VERA codes.
XAFSmass: a program for calculating the optimal mass of XAFS samples
NASA Astrophysics Data System (ADS)
Klementiev, K.; Chernikov, R.
2016-05-01
We present a new implementation of the XAFSmass program that calculates the optimal mass of XAFS samples. It has several improvements compared to the old Windows-based program XAFSmass: 1) it is truly platform independent, being written in Python; 2) it has an improved parser of chemical formulas that enables parentheses and nested inclusion-to-matrix weight percentages. The program calculates the absorption edge height given the total optical thickness, operates with differently determined sample amounts (mass, pressure, density or sample area) depending on the aggregate state of the sample, and solves the inverse problem of finding the elemental composition given the experimental absorption edge jump and the chemical formula.
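A chemical-formula parser supporting parentheses, the kind of capability the improved XAFSmass parser adds, can be sketched as a small recursive-descent routine (an illustrative sketch, not the program's actual code):

```python
import re

def parse_formula(formula):
    """Parse a chemical formula with nested parentheses,
    e.g. 'Ca(OH)2', into a dict of element -> atom count."""
    tokens = re.findall(r"[A-Z][a-z]?|\d+|[()]", formula)

    def parse(i):
        counts = {}
        while i < len(tokens) and tokens[i] != ")":
            if tokens[i] == "(":
                inner, i = parse(i + 1)
                i += 1  # skip the closing ')'
                mult = 1
                if i < len(tokens) and tokens[i].isdigit():
                    mult, i = int(tokens[i]), i + 1
                for el, c in inner.items():
                    counts[el] = counts.get(el, 0) + c * mult
            else:
                el, i = tokens[i], i + 1
                n = 1
                if i < len(tokens) and tokens[i].isdigit():
                    n, i = int(tokens[i]), i + 1
                counts[el] = counts.get(el, 0) + n
        return counts, i

    return parse(0)[0]

print(parse_formula("Ca(OH)2"))    # {'Ca': 1, 'O': 2, 'H': 2}
print(parse_formula("Fe2(SO4)3"))
```

With element counts in hand, molar mass and absorption computations reduce to lookups in tabulated atomic weights and cross sections.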
Parsing Citations in Biomedical Articles Using Conditional Random Fields
Zhang, Qing; Cao, Yong-Gang; Yu, Hong
2011-01-01
Citations are used ubiquitously in biomedical full-text articles and play an important role in representing both the rhetorical structure and the semantic content of the articles. As a result, text mining systems will significantly benefit from a tool that automatically extracts the content of a citation. In this study, we applied the supervised machine-learning algorithm Conditional Random Fields (CRFs) to automatically parse a citation into its fields (e.g., Author, Title, Journal, and Year). With a subset of HTML-format open-access PubMed Central articles, we report an overall 97.95% F1-score. The citation parser can be accessed at: http://www.cs.uwm.edu/~qing/projects/cithit/index.html. PMID:21419403
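A CRF citation parser of this kind labels each token of the citation string using local features. Below is a minimal sketch of such feature extraction; the feature set and the sample citation are invented for illustration and are not the paper's:

```python
def token_features(tokens, i):
    """Features for token i of a tokenized citation string, of the
    kind commonly fed to a CRF sequence labeller."""
    tok = tokens[i]
    return {
        "lower": tok.lower(),
        "is_capitalized": tok[:1].isupper(),
        "is_digit": tok.isdigit(),
        # A 4-digit number in a plausible range suggests a Year field
        "is_year": tok.isdigit() and len(tok) == 4 and 1800 <= int(tok) <= 2100,
        "has_period": "." in tok,
        "prev": tokens[i - 1].lower() if i > 0 else "<BOS>",
        "next": tokens[i + 1].lower() if i + 1 < len(tokens) else "<EOS>",
    }

citation = "Zhang Q , Cao YG , Yu H . 2011 .".split()
feats = token_features(citation, 9)  # the token '2011'
print(feats["is_year"], feats["is_digit"])  # True True
```

The CRF then learns which feature patterns mark transitions between Author, Title, Journal and Year segments, rather than relying on hand-written rules.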
Development of clinical contents model markup language for electronic health records.
Yun, Ji-Hyun; Ahn, Sun-Ju; Kim, Yoon
2012-09-01
To develop a dedicated markup language for clinical contents models (CCM) to facilitate the active use of CCM in electronic health record systems. Based on analysis of the structure and characteristics of CCM in the clinical domain, we manually designed an extensible markup language (XML) based CCM markup language (CCML) schema. CCML faithfully reflects CCM in both the syntactic and semantic aspects. As this language is based on XML, it can be expressed and processed in computer systems and can be used in a technology-neutral way. CCML has the following strengths: it is machine-readable and highly human-readable, it does not require a dedicated parser, and it can be applied to existing electronic health record systems.
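Because CCML is plain XML, a generic XML library suffices to read a model instance, which is what "does not require a dedicated parser" means in practice. The element names below are invented for illustration and are not the actual CCML schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical CCM instance; element and attribute names are illustrative only.
ccml = """
<ccm name="BloodPressure">
  <element name="systolic" type="quantity" unit="mmHg"/>
  <element name="diastolic" type="quantity" unit="mmHg"/>
</ccm>
"""

root = ET.fromstring(ccml)
print(root.get("name"))                    # BloodPressure
for el in root.findall("element"):
    print(el.get("name"), el.get("unit"))  # systolic mmHg, then diastolic mmHg
```

Any stack with a standard XML parser (Java, C#, Python, JavaScript) can therefore consume CCML without extra tooling, supporting the technology-neutral claim.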
An efficient representation of spatial information for expert reasoning in robotic vehicles
NASA Technical Reports Server (NTRS)
Scott, Steven; Interrante, Mark
1987-01-01
The previous generation of robotic vehicles and drones was designed for a specific task, with limited flexibility in executing their mission. This limited flexibility arises because the robotic vehicles do not possess the intelligence and knowledge upon which to make significant tactical decisions. Current development of robotic vehicles is toward increased intelligence and capabilities, adapting to a changing environment and altering mission objectives. The latest techniques in artificial intelligence (AI) are being employed to increase the robotic vehicle's intelligent decision-making capabilities. This document describes the design of the SARA spatial database tool, which is composed of request parser, reasoning, computations, and database modules that collectively manage and derive information useful for robotic vehicles.
Betatron Application in Mobile and Relocatable Inspection Systems for Freight Transport Control
NASA Astrophysics Data System (ADS)
Chakhlov, S. V.; Kasyanov, S. V.; Kasyanov, V. A.; Osipov, S. P.; Stein, M. M.; Stein, A. M.; Xiaoming, Sun
2016-01-01
Accelerators with energies up to 4 MeV and a high level of penetrating ability (in steel equivalent) are popular for inspecting oversize cargo transported by road, railway and river. The betatron, as a cyclic induction accelerator, has some advantages in comparison with linear accelerators and other sources. Tomsk Polytechnic University has developed many types of betatrons, most of which are produced by the separate affiliated company "Foton". This article presents the results of applying these betatrons in customs inspection systems.
Acceleration of low order finite element computation with GPUs (Invited)
NASA Astrophysics Data System (ADS)
Knepley, M. G.
2010-12-01
Considerable effort has been focused on GPU acceleration of high-order spectral element methods and discontinuous Galerkin finite element methods. However, these methods are not universally applicable, and much of the existing FEM software base employs low-order methods. In this talk, we present a formulation of FEM, using the PETSc framework from ANL, that is amenable to GPU acceleration even at very low order. In addition, using the FEniCS system for FEM, we show that the relevant kernels can be automatically generated and optimized with a symbolic manipulation system.
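For readers unfamiliar with what a "low-order kernel" looks like, the following is a generic 1D piecewise-linear (P1) stiffness-matrix assembly, a minimal sketch and NOT the PETSc/FEniCS formulation described in the talk:

```python
def assemble_stiffness(nodes):
    """Assemble the global stiffness matrix for -u'' on a 1D mesh
    using P1 (piecewise-linear) elements -- the lowest-order case."""
    n = len(nodes)
    K = [[0.0] * n for _ in range(n)]
    for e in range(n - 1):                   # loop over elements
        h = nodes[e + 1] - nodes[e]          # element length
        k_local = [[ 1.0 / h, -1.0 / h],     # exact P1 element matrix
                   [-1.0 / h,  1.0 / h]]
        for i in range(2):                   # scatter into global matrix
            for j in range(2):
                K[e + i][e + j] += k_local[i][j]
    return K

K = assemble_stiffness([0.0, 0.5, 1.0])
# Interior row for a uniform mesh with h = 0.5: [-2.0, 4.0, -2.0]
```

The 2x2 element kernel here does so little arithmetic per memory access that naive GPU ports are bandwidth-bound, which is why a reformulation such as the one presented in the talk is needed at low order.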
The last large pelletron accelerator of the Herb era
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chopra, S.; Narayanan, M. M.; Joshi, R.
1999-04-26
Prof. Ray Herb pioneered the concept and design of the tandem Pelletron accelerator in the late sixties at NEC. The 15UD Pelletron at Nuclear Science Centre (NSC), upgraded for 16 MV operation using compressed-geometry accelerating tubes, is the last such large Pelletron. It has unique features: offset and matching quadrupoles after the stripper for charge-state selection inside the high-voltage terminal, and consequently the option of further stripping the selected charge states at the high-energy dead section; and an elaborate pulsing system in the pre-acceleration region consisting of a beam chopper, a travelling-wave deflector, a light-ion buncher (1-80 amu) and a heavy-ion buncher (>80 amu). NSC was established as a heavy-ion-accelerator-based inter-university centre in 1985. It became operational in July 1991 to cater to the research requirements of a large user community, which at present includes about fifty universities, twenty-eight colleges, and a dozen other academic institutes and research laboratories. The number of users in materials and allied sciences is about 500. Various important modifications have been made to improve the performance of the accelerator in the last seven years. These include replacement of the corona voltage-grading system by a resistor-based one, a pick-up loop to monitor charging-system performance, conversion from a basic double-unit structure to a singlet, installation of a spiral-cavity-based phase detector system with a post-accelerator stripper after the analyzing magnet, and a high-efficiency multi-harmonic buncher. Installation of a turbo-pump-based stripper-gas recirculation system in the terminal is also planned. A brief description of the utilization of the machine will be given.
Implementing Accelerated Schools in New Orleans: The Satellite Center Project as an Agent of Change.
ERIC Educational Resources Information Center
Miron, Louis F.; And Others
An overview is provided of the Accelerated Schools Project (ASP) as implemented in one urban elementary school in New Orleans, emphasizing the role of the University of New Orleans Satellite Center. The present student population of the school studied is 405 students in grades pre-kindergarten through six. The ASP is a non-traditional strategy for…
More on ghosts in the Dvali-Gabadadze-Porrati model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gorbunov, Dmitry; Sibiryakov, Sergei; Koyama, Kazuya
2006-02-15
It is shown by an explicit calculation that the excitations about the self-accelerating cosmological solution of the Dvali-Gabadadze-Porrati model contain a ghost mode. This raises serious doubts about the viability of this solution. Our analysis reveals the similarity between the quadratic theory for the perturbations around the self-accelerating universe and an Abelian gauge model with two Stueckelberg fields.
ERIC Educational Resources Information Center
Lee, Michael C.
2017-01-01
The purpose of this quantitative experimental posttest-only control group research study was to determine the degree to which differences exist in outcomes between students using a video game-based instruction and students using a traditional non-video game-based instruction in accelerated degree program courses at a 4-year university in Illinois…
ERIC Educational Resources Information Center
Kolenovic, Zineta; Linderman, Donna; Karp, Melinda Mechur
2013-01-01
Community colleges are grappling with low rates of degree completion and transfer. The City University of New York's (CUNY) Accelerated Study in Associate Programs (ASAP) aims to improve graduation rates by providing a range of comprehensive support services to community college students in select majors. Using student-unit record data, we…
ERIC Educational Resources Information Center
Collins, Anita; Hay, Iain; Heiner, Irmgard
2013-01-01
In response to changes in government funding and policies over the past five years, the Australian tertiary sector has entered an increasingly competitive climate. This has forced many universities to become more strategic in attracting increased numbers of PSTs. Providing accelerated learning opportunities for PSTs is viewed as one way to gain…
Boundary-Work between Work and Life in the High-Speed University
ERIC Educational Resources Information Center
Ylijoki, Oili-Helena
2013-01-01
Drawing upon the notion of acceleration of time in late capitalism, the article addresses the different forms and driving forces of the speeding up of the tempo and rhythm in research work in academia, and the impact of the temporal acceleration on how academics perceive their work and its connection to the private sphere of life. Based on 40…
The Transformative Role of Universities in a Knowledge Society
ERIC Educational Resources Information Center
Walshok, Mary Lindenstein
2005-01-01
This article is an edited version of the Bynum Tudor Lecture given by Mary L. Walshok in November 2004 during a Visiting Fellowship at Oxford University's Kellogg College. Against the background of ever-accelerating change -- technological, social, economic, geopolitical and cultural -- and the consequent need for constant adjustment and new…
ERIC Educational Resources Information Center
Baker, Sally; Stirling, Eve
2016-01-01
As technological developments accelerate, and neoliberal ideologies shift the ways that universities "do business," higher education is facing radical changes. Within this context, students' need to 'succeed' at university is more important than ever. Consequently, understanding students' transitions within this shifting higher education…
ERIC Educational Resources Information Center
Gray, Denis; Sundstrom, Eric; Tornatzky, Louis G.; McGowen, Lindsey
2011-01-01
Cooperative research centres (CRCs) increasingly foster Triple Helix (industry-university-government) collaboration and represent significant vehicles for cooperation across sectors, the promotion of knowledge and technology transfer and ultimately the acceleration of innovation. A growing social science literature on CRCs focuses on their…
ERIC Educational Resources Information Center
Ulbricht, Kurt; Zimmermann, Peter
1981-01-01
Problems encountered in testing in aerospace engineering courses in an accelerated technical program of a German military university are outlined. Four common grading procedures are compared, and the optimum length of written tests is discussed. (MSE)
The Problem of Inertia in a Friedmann Universe
NASA Technical Reports Server (NTRS)
Kazanas, Demosthenes
2012-01-01
In this talk I will discuss the origin of inertia in a curved spacetime, particularly the spatially flat, open and closed Friedmann universes. This is done using Sciama's law of inertial induction, which is based on Mach's principle and expresses the analogy between the retarded far fields of electrodynamics and those of gravitation. After obtaining covariant expressions for the electromagnetic fields due to an accelerating point charge in Friedmann models, we adopt Sciama's law to obtain the inertial force on an accelerating mass $m$ by integrating over the contributions from all the matter in the universe. The resulting inertial force has the form $F = -kma$, where the constant $k < 1$ depends on the choice of cosmological parameters such as $\Omega_{M}$, $\Omega_{\Lambda}$, and $\Omega_{R}$. The values of $k$ obtained suggest that the inertial contribution from dark matter can be the source of the missing part of the inertial force.
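The quoted force law can be displayed explicitly. The following is only a schematic of Sciama-type inertial induction in the weak-field, electrodynamic analogy, not the paper's full covariant calculation:

```latex
\[
\vec{g} \;=\; -\nabla\Phi \;-\; \frac{1}{c}\,\frac{\partial \vec{A}_{g}}{\partial t},
\qquad
\vec{F} \;=\; m\,\vec{g}\,\Big|_{\text{all matter}} \;=\; -\,k\,m\,\vec{a},
\qquad
k = k(\Omega_{M}, \Omega_{\Lambda}, \Omega_{R}) < 1 ,
\]
```

where $\vec{A}_{g}$ is the gravitational analogue of the electromagnetic vector potential sourced, with retardation, by all the matter in the universe; $k = 1$ would correspond to full Machian inertial induction, and $k < 1$ leaves the "missing" fraction that the abstract attributes to dark matter.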
REVIEWS OF TOPICAL PROBLEMS: Cosmic vacuum
NASA Astrophysics Data System (ADS)
Chernin, Artur D.
2001-11-01
Recent observational studies of distant supernovae have suggested the existence of a cosmic vacuum whose energy density exceeds the total density of all other energy components in the Universe. The vacuum produces a field of antigravity that causes the cosmological expansion to accelerate. It is this accelerated expansion that has been discovered in the observations. The discovery of cosmic vacuum radically changes our current understanding of the present state of the Universe. It also poses new challenges to both cosmology and fundamental physics. Why is the density of the vacuum what it is? Why do the densities of the cosmic energy components differ in exact value but agree in order of magnitude? On the other hand, the discovery, made at large cosmological distances of hundreds and thousands of Mpc, provides new insights into the dynamics of the nearby Universe, the motions of galaxies in the local volume of 10-20 Mpc where the cosmological expansion was originally discovered.
Cosmological implications of scalar field dark energy models in f(T,𝒯 ) gravity
NASA Astrophysics Data System (ADS)
Salako, Ines G.; Jawad, Abdul; Moradpour, Hooman
After reviewing f(T,𝒯) gravity, in which T is the torsion scalar and 𝒯 is the trace of the energy-momentum tensor, we refer to two cosmological models of this theory that agree with observational data. Thereafter, we consider a flat Friedmann-Robertson-Walker (FRW) universe filled with a pressureless source and treat the terms other than the Einstein terms in the corresponding Friedmann equations as the dark energy (DE) candidate. In addition, some cosmological features of the models, including equations of state and deceleration parameters, are addressed, which helps us obtain the accelerated expansion of the universe in the quintessence era. Finally, we extract the scalar field as well as the potential of quintessence, tachyon, K-essence and dilatonic fields for both f(T,𝒯) models. It is observed that the dynamics of the scalar field, as well as the scalar potential of these models, indicate an accelerating, expanding universe.
Scalar field cosmology in f(R,T) gravity via Noether symmetry
NASA Astrophysics Data System (ADS)
Sharif, M.; Nawazish, Iqra
2018-04-01
This paper investigates the existence of Noether symmetries of the isotropic universe model in f(R,T) gravity admitting minimal coupling of matter and scalar fields. The scalar field incorporates two dark energy models, quintessence and phantom. We determine symmetry generators and the corresponding conserved quantities for two particular f(R,T) models. We also evaluate exact solutions and investigate their physical behavior via different cosmological parameters. For the first model, the graphical behavior of these parameters indicates consistency with recent observations, representing accelerated expansion of the universe. For the second model, these parameters identify a transition from accelerated to decelerated expansion of the universe. The potential function is found to be constant for the first model, while it becomes V(φ) ≈ φ² for the second model. We conclude that Noether symmetry generators and corresponding conserved quantities appear in all cases.
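The Noether-symmetry machinery invoked above can be summarized schematically. This is the standard textbook condition for a point-like Lagrangian, not the paper's specific f(R,T) generators:

```latex
\[
X \;=\; \alpha^{i}(q)\,\frac{\partial}{\partial q^{i}}
      \;+\; \dot{\alpha}^{i}(q,\dot{q})\,\frac{\partial}{\partial \dot{q}^{i}},
\qquad
\mathcal{L}_{X} L = 0
\;\Longrightarrow\;
\Sigma \;=\; \alpha^{i}\,\frac{\partial L}{\partial \dot{q}^{i}} \;=\; \text{const},
\]
```

where $L(q^{i},\dot{q}^{i})$ is the point-like cosmological Lagrangian, $X$ is the symmetry generator, and $\Sigma$ is the associated conserved quantity; demanding $\mathcal{L}_{X} L = 0$ is what selects admissible forms of the theory and of the scalar potential.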
NASA Astrophysics Data System (ADS)
Rout, Bibhudutta; Dhoubhadel, Mangal S.; Poudel, Prakash R.; Kummari, Venkata C.; Lakshantha, Wickramaarachchige J.; Manuel, Jack E.; Bohara, Gyanendra; Szilasi, Szabolcs Z.; Glass, Gary A.; McDaniel, Floyd D.
2014-02-01
The University of North Texas (UNT) Ion Beam Modification and Analysis Laboratory (IBMAL) has four particle accelerators, including a National Electrostatics Corporation (NEC) 9SDH-2 3 MV tandem Pelletron, a NEC 9SH 3 MV single-ended Pelletron, and a 200 kV Cockcroft-Walton. A fourth, an HVEC AK 2.5 MV Van de Graaff accelerator, is presently being refurbished as an educational training facility. These accelerators can produce and accelerate almost any ion in the periodic table at energies from a few keV to tens of MeV. They are used to modify materials by ion implantation and to analyze materials by numerous atomic and nuclear physics techniques. The NEC 9SH accelerator was recently installed in the IBMAL and subsequently upgraded with the addition of a capacitive liner and terminal-potential stabilization system to reduce ion energy spread and thereby improve the spatial resolution of the probing ion beam to hundreds of nanometers. Research involves materials modification and synthesis by ion implantation for photonic, electronic, and magnetic applications; micro-fabrication by high-energy (MeV) ion beam lithography; microanalysis of biomedical and semiconductor materials; development of high-energy ion nanoprobe focusing systems; and educational and outreach activities. An overview of the IBMAL facilities and some of the current research projects are discussed.
Studies of industrial emissions by accelerator-based techniques: A review of applications at CEDAD
NASA Astrophysics Data System (ADS)
Calcagnile, L.; Quarta, G.
2012-04-01
Different research activities are in progress at the Centre for Dating and Diagnostics (CEDAD), University of Salento, in the field of environmental monitoring, exploiting the potential of the different experimental beam lines implemented on the 3 MV Tandetron accelerator and dedicated to AMS (Accelerator Mass Spectrometry) radiocarbon dating and IBA (Ion Beam Analysis). An overview of these activities is presented, showing how accelerator-based analytical techniques can be a powerful tool for monitoring anthropogenic carbon dioxide emissions from industrial sources and for assessing the biogenic content of SRF (Solid Recovered Fuel) burned in WTE (Waste to Energy) plants.
Spinor Field Nonlinearity and Space-Time Geometry
NASA Astrophysics Data System (ADS)
Saha, Bijan
2018-03-01
Within the scope of Bianchi type VI, VI0, V, III, I, LRSBI and FRW cosmological models, we have studied the role of the nonlinear spinor field in the evolution of the Universe and of the spinor field itself. It was found that, due to the presence of non-trivial non-diagonal components of the energy-momentum tensor of the spinor field in anisotropic space-time, severe restrictions arise both on the metric functions and on the components of the spinor field. In this report we have considered a polynomial nonlinearity which is a function of invariants constructed from the bilinear spinor forms. It is found that in the case of a Bianchi type-VI space-time, depending on the sign of the self-coupling constants, the model allows either late-time acceleration or an oscillatory mode of evolution. In the case of a Bianchi type-VI0 space-time, the specific behavior of the spinor field leads to two different scenarios. In one case the invariants constructed from bilinear spinor forms become trivial, giving rise to a massless and linear spinor field Lagrangian; this case is equivalent to the vacuum solution of the Bianchi type-VI0 space-time. The second case allows non-vanishing massive and nonlinear terms and, depending on the sign of the coupling constants, gives rise either to an accelerating mode of expansion or to one that, after reaching some maximum value, contracts and ends in a big crunch, consequently generating a space-time singularity. In the case of a Bianchi type-V model there are two possibilities. In one case we found that the metric functions are similar to each other; here the Universe expands with acceleration if the self-coupling constant is positive, whereas a negative coupling constant gives rise to a cyclic or periodic solution. In the second case the spinor mass and the spinor field nonlinearity vanish and the Universe expands linearly in time.
In the case of a Bianchi type-III model the space-time remains locally rotationally symmetric at all times, though isotropy of the space-time can be attained for a large proportionality constant. As far as evolution is concerned, depending on the sign of the coupling constant the model allows both accelerated and oscillatory modes of expansion: a negative coupling constant leads to an oscillatory mode of expansion, whereas a positive coupling constant generates an expanding Universe with late-time acceleration. Both the deceleration parameter and the EoS parameter in this case vary with time and are in agreement with the modern picture of space-time evolution. In the case of a Bianchi type-I space-time the non-diagonal components lead to three different possibilities. For a full BI space-time we find that the spinor field nonlinearity and the mass term vanish, hence the spinor field Lagrangian becomes massless and linear. In the two other cases the space-time evolves into either an LRSBI or an FRW Universe. If we consider a locally rotationally symmetric BI (LRSBI) model, neither the mass term nor the spinor field nonlinearity vanishes; in this case, depending on the sign of the coupling constant, we have either a late-time accelerated mode of expansion or an oscillatory mode of evolution, and for an expanding Universe we have asymptotic isotropization. Finally, in the case of an FRW model neither the mass term nor the spinor field nonlinearity vanishes; as in the LRSBI case, we have either late-time acceleration or a cyclic mode of evolution. These findings allow us to conclude that the spinor field is very sensitive to the gravitational field.
NASA Astrophysics Data System (ADS)
Mazzitelli, Francisco D.; Trombetta, Leonardo G.
2018-03-01
In a recent paper [Q. Wang, Z. Zhu, and W. G. Unruh, Phys. Rev. D 95, 103504 (2017), 10.1103/PhysRevD.95.103504] it was argued that, due to the fluctuations around its mean value, vacuum energy gravitates differently from what was previously assumed. As a consequence, the Universe would accelerate with a small Hubble expansion rate, solving the cosmological constant and dark energy problems. We point out here that the results depend on the type of cutoff used to evaluate the vacuum energy. In particular, they are not valid when one uses a covariant cutoff such that the zero-point energy density is positive definite.
NASA Astrophysics Data System (ADS)
Petit, J. P.; D'Agostini, G.
2014-10-01
An extension of a previously published bimetric model of the Universe is presented, in which the speeds of light associated with the positive- and negative-mass species are different. As shown earlier, the asymmetry of the model explains the acceleration of the positive species, while the negative one slows down. The asymmetry affects the scale factors linked to lengths, times and speeds of light, so that if a mass inversion of a craft could be achieved, interstellar travel would become possible at a velocity below the speed of light of the negative sector, and possibly much higher than that of the positive sector.
Report of the Dark Energy Task Force
DOE R&D Accomplishments Database
Albrecht, Andreas; Bernstein, Gary; Cahn, Robert; Freedman, Wendy L.; Hewitt, Jacqueline; Hu, Wayne; Huth, John; Kamionkowski, Marc; Kolb, Edward W.; Knox, Lloyd; Mather, John C.
2006-01-01
Dark energy appears to be the dominant component of the physical Universe, yet there is no persuasive theoretical explanation for its existence or magnitude. The acceleration of the Universe is, along with dark matter, the observed phenomenon that most directly demonstrates that our theories of fundamental particles and gravity are either incorrect or incomplete. Most experts believe that nothing short of a revolution in our understanding of fundamental physics will be required to achieve a full understanding of the cosmic acceleration. For these reasons, the nature of dark energy ranks among the most compelling of all outstanding problems in physical science. These circumstances demand an ambitious observational program to determine the dark energy properties as precisely as possible.
NASA Astrophysics Data System (ADS)
Mellier, Yannick
2016-07-01
The ESA Euclid mission aims to understand why the expansion of the Universe is accelerating and to pin down the source responsible for the acceleration. It will uncover the very nature of dark energy and gravitation by measuring with exquisite accuracy the expansion rate of the Universe and the growth rate of structure formation in the Universe. To achieve its objectives, Euclid will observe the distribution of dark matter in the Universe by measuring the shapes of weakly distorted distant galaxies lensed by foreground cosmic structures with the VIS imaging instrument. In parallel, Euclid will analyse the clustering of galaxies and the distribution of clusters of galaxies using spectroscopy, measuring redshifts of galaxies with the NISP photometer and spectrometer instrument. The Euclid mission will observe one third of the sky (15,000 deg2) to collect data on several billion galaxies spread over the last ten billion years. In this presentation I will report, on behalf of the Euclid Collaboration, on the considerable technical and scientific progress made since COSPAR 2014. The recent mission PDR, which was passed successfully, shows that Euclid should meet its requirements and achieve its primary scientific objective of mapping the dark universe. The most recent forecasts and constraints on dark energy, gravity, dark matter and inflation will be presented.
Beyond Inflation:. A Cyclic Universe Scenario
NASA Astrophysics Data System (ADS)
Turok, Neil; Steinhardt, Paul J.
2005-08-01
Inflation has been the leading early universe scenario for two decades, and has become an accepted element of the successful 'cosmic concordance' model. However, there are many puzzling features of the resulting theory. It requires both high energy and low energy inflation, with energy densities differing by a hundred orders of magnitude. The questions of why the universe started out undergoing high energy inflation, and why it will end up in low energy inflation, are unanswered. Rather than resort to anthropic arguments, we have developed an alternative cosmology, the cyclic universe [1], in which the universe exists in a very long-lived attractor state determined by the laws of physics. The model shares inflation's phenomenological successes without requiring an epoch of high energy inflation. Instead, the universe is made homogeneous and flat, and scale-invariant adiabatic perturbations are generated during an epoch of low energy acceleration like that seen today, but preceding the last big bang. Unlike inflation, the model requires low energy acceleration in order for a periodic attractor state to exist. The key challenge facing the scenario is that of passing through the cosmic singularity at t = 0. Substantial progress has been made at the level of linearised gravity, which is reviewed here. The challenge of extending this to nonlinear gravity and string theory remains.
Λ(t) CDM and the present accelerating expansion of the universe from 5D scalar vacuum
NASA Astrophysics Data System (ADS)
Madriz Aguilar, José Edgar; Zamarripa, J.; Peraza, A.; Licea, J. A.
2017-12-01
In this letter we investigate some consequences of considering our 4D observable universe as locally and isometrically embedded in a 5D spacetime, where gravity is described by a Brans-Dicke theory in vacuum. Once we impose the embedding conditions, we find that gravity on the 4D spacetime is governed by the Einstein field equations modified by an extra term that can play the role of a dynamical cosmological constant. Two examples are studied. In the first we derive a cosmological model of a universe filled only with a cosmological constant. In the second we obtain a cosmological solution describing a universe filled with matter, radiation and a dynamical cosmological constant. We constrain the model using the current observational data combination Planck + WP + BAO + SN. The present acceleration of the expansion of the universe is explained by the geometrically induced dynamical cosmological constant, avoiding the introduction of a dark energy component and without addressing the underlying cosmological constant problem. Moreover, all 4D matter sources are geometrically induced in the same manner as in Wesson's induced matter theory.
Steinberg, Michael; Morin, Anna K
2015-10-25
Objective. To evaluate the impact of admission characteristics on graduation in an accelerated doctor of pharmacy (PharmD) program. Methods. Selected prematriculation characteristics of students entering the graduation class years of 2009-2012 on the Worcester and Manchester campuses of MCPHS University were analyzed and compared for on-time graduation. Results. Eighty-two percent of evaluated students (699 of 852) graduated on time. Students who were most likely to graduate on-time attended a 4-year school, previously earned a bachelor's degree, had an overall prematriculation grade point average (GPA) greater than or equal to 3.6, and graduated in the spring just prior to matriculating to the university. Factors that reduced the likelihood of graduating on time were also identified. Work experience had a marginal impact on graduating on time. Conclusion. Although there is no certainty in college admission decisions, prematriculation characteristics can help predict the likelihood for academic success of students in an accelerated PharmD program.
Cosmology of a holographic induced gravity model with curvature effects
NASA Astrophysics Data System (ADS)
Bouhmadi-López, Mariam; Errahmani, Ahmed; Ouali, Taoufiq
2011-10-01
We present a holographic model of the Dvali-Gabadadze-Porrati scenario with a Gauss-Bonnet term in the bulk. We concentrate on the solution that generalizes the normal Dvali-Gabadadze-Porrati branch. It is well known that this branch cannot describe the late-time acceleration of the universe even with the inclusion of a Gauss-Bonnet term. Here, we show that this branch in the presence of a Gauss-Bonnet curvature effect and a holographic dark energy with the Hubble scale as the infrared cutoff can describe the late-time acceleration of the universe. It is worthwhile to stress that such an energy density component cannot do the same job on the normal Dvali-Gabadadze-Porrati branch (without Gauss-Bonnet modifications) nor in a standard four-dimensional relativistic model. The acceleration on the brane is also presented as being induced through an effective dark energy which corresponds to a balance between the holographic one and geometrical effects encoded through the Hubble parameter.
Laboratory Astrophysics Prize: Laboratory Astrophysics with Nuclei
NASA Astrophysics Data System (ADS)
Wiescher, Michael
2018-06-01
Nuclear astrophysics is concerned with nuclear reaction and decay processes from the Big Bang to the present generation of stars, controlling the chemical evolution of our universe. Such nuclear reactions maintain stellar life, determine stellar evolution, and finally drive stellar explosions in the circle of stellar life. Laboratory nuclear astrophysics seeks to simulate and understand the underlying processes using a broad portfolio of nuclear instrumentation, from reactors to accelerators, and from stable to radioactive beams, to map the broad spectrum of nucleosynthesis processes. This talk focuses on only two aspects of the broad field: the need for deep underground accelerator facilities in cosmic-ray-free environments in order to understand nucleosynthesis in stars, and the need for high-intensity radioactive beam facilities to recreate the conditions found in stellar explosions. These concepts represent the two main frontiers of the field, which are being pursued in the US with the CASPAR accelerator at the Sanford Underground Research Facility in South Dakota and the FRIB facility at Michigan State University.
COBRA accelerator for Sandia ICF diode research at Cornell University
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, D.L.; Ingwersen, P.; Bennett, L.F.
1995-05-01
The new COBRA accelerator is being built in stages at the Laboratory of Plasma Studies at Cornell University, where its applications will include extraction diode and ion beam research in support of the light-ion inertial confinement fusion (ICF) program at Sandia National Laboratories. The 4- to 5-MV, 125- to 250-kA accelerator is based on a four-cavity inductive voltage adder (IVA) design. It combines new ferromagnetically isolated cavities and self-magnetically insulated transmission line (MITL) hardware with components from existing Sandia and Cornell facilities: Marx generator capacitors, hardware, and power supply from the DEMON facility; water pulse forming lines (PFL) and gas switch from the Subsystem Test Facility (STF); a HERMES-III intermediate store capacitor (ISC); and a modified ion diode from Cornell's LION. The present accelerator consists of a single modified cavity similar to those of the Sandia SABRE accelerator and will be used to establish an operating system for initial lower-voltage testing in the first stage. Four new cavities will be fabricated and delivered in the first half of FY96 to complete the COBRA accelerator. COBRA is unique in that each cavity is driven by a single pulse forming line, and the IVA output polarity may be reversed by rotating the cavities 180 degrees about their vertical axis. The site preparations, tank construction, and diode design and development are taking place at Cornell with growing enthusiasm as this machine becomes a reality. Preliminary results with the single-cavity and short positive inner-cylinder MITL configuration will soon be available.
Can-AMS: The New Accelerator Mass Spectrometry Facility At The University Of Ottawa
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kieser, W. E.; Zhao, X.-L.; Clark, I. D.
2011-06-01
The Canadian Centre for Accelerator Mass Spectrometry (AMS) at the University of Ottawa will be equipped with a new, 3 MV tandem accelerator with peripheral equipment for the analysis of elements ranging from tritium to the actinides. This facility, along with a wide array of support instrumentation recently funded by the Canada Foundation for Innovation, will be located in a new science building on the downtown campus of the University of Ottawa. In addition to providing the standard AMS measurements on 14C, 10Be, 26Al, 36Cl and 129I for earth, environmental, cultural and biomedical sciences, this facility will incorporate the new technologies of anion isobar separation at low energies using RFQ chemical reaction cells for 36Cl and new heavy element applications, integrated sample combustion and a gas ion source for biomedical and environmental 14C analysis, and the use of novel target matrices for expanding the range of applicable elements and simplifying sample preparation, all currently being developed at IsoTrace. This paper will outline the design goals for the new facility, present some details of the new AMS technologies, in particular the Isobar Separator for Anions, and discuss the design of the AMS system resulting from these requirements.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Over a full two day period, February 2–3, 2016, the Office of High Energy Physics convened a workshop in Gaithersburg, MD to seek community input on development of an Advanced Accelerator Concepts (AAC) research roadmap. The workshop was in response to a recommendation by the HEPAP Accelerator R&D Subpanel [1] [2] to “convene the university and laboratory proponents of advanced acceleration concepts to develop R&D roadmaps with a series of milestones and common down selection criteria towards the goal for constructing a multi-TeV e+e– collider” (the charge to the workshop can be found in Appendix A). During the workshop, proponents of laser-driven plasma wakefield acceleration (LWFA), particle-beam-driven plasma wakefield acceleration (PWFA), and dielectric wakefield acceleration (DWFA), along with a limited number of invited university and laboratory experts, presented and critically discussed individual concept roadmaps. The roadmap workshop was preceded by several preparatory workshops. The first day of the workshop featured presentation of three initial individual roadmaps with ample time for discussion. The individual roadmaps covered a time period extending until roughly 2040, with the end date assumed to be roughly appropriate for initial operation of a multi-TeV e+e– collider. The second day of the workshop comprised talks on synergies between the roadmaps and with global efforts, potential early applications, diagnostics needs, simulation needs, and beam issues and challenges related to a collider. During the last half of the day the roadmaps were revisited but with emphasis on the next five to ten years (as specifically requested in the charge) and on common challenges. The workshop concluded with critical and unanimous endorsement of the individual roadmaps and an extended discussion on the characteristics of the common challenges. (For the agenda and list of participants see Appendix B.)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Zhirong; Hogan, Mark
Essentially all we know today and will learn in the future about the fundamental nature of matter is derived from probing it with directed beams of particles such as electrons, protons, neutrons, heavy ions, and photons. The resulting ability to “see” the building blocks of matter has had an immense impact on society and our standard of living. Over the last century, particle accelerators have changed the way we look at nature and the universe we live in and have become an integral part of the Nation’s technical infrastructure. Today, particle accelerators are essential tools of modern science and technology. The cost and capabilities of accelerators would be greatly enhanced by breakthroughs in acceleration methods and technology. For the last 32 years, the Advanced Accelerator Concepts (AAC) Workshop has acted as the focal point for discussion and development of the most promising acceleration physics and technology. It is a particularly effective forum where the discussion is leveraged and promoted by the unique and demanding feature of the AAC Workshop: the working group structure, in which participants are asked to consider their contributions in terms of even larger problems to be solved. The 16th Advanced Accelerator Concepts (AAC2014) Workshop was organized by Stanford University from July 13 - 18, 2014 at the Dolce Hays Mansion in San Jose, California. The conference had a record 282 attendees including 62 students. Attendees came from 11 countries representing 66 different institutions. The workshop format consisted of plenary sessions in the morning with topical leaders from around the world presenting the latest breakthroughs to the entire workshop. In the late morning and afternoons attendees broke out into eight different working groups for more detailed presentations and discussions that were summarized on the final day of the workshop.
In addition, there were student tutorial presentations on two afternoons to provide in-depth education and training for the next generation of accelerator scientists. This is the final technical report on the organization and outcome of AAC2014.
Searching for a Cosmological Preferred Direction with 147 Rotationally Supported Galaxies
NASA Astrophysics Data System (ADS)
Zhou, Yong; Zhao, Zhi-Chao; Chang, Zhe
2017-10-01
It is well known that Milgrom's modified Newtonian dynamics (MOND) explains well the mass discrepancy problem in galaxy rotation curves. MOND predicts a universal acceleration scale below which Newtonian dynamics is no longer valid. We obtain a universal acceleration scale of 1.02 × 10^-10 m s^-2 using the Spitzer Photometry and Accurate Rotation Curves (SPARC) data set. Milgrom suggested that the acceleration scale may be a fingerprint of cosmology on local dynamics, related to the Hubble constant as g† ∼ cH0. In this paper, we use the hemisphere comparison method with the SPARC data set to investigate possible spatial anisotropy in the acceleration scale. It is found that the hemisphere of maximum acceleration scale is in the direction (l, b) = (175.5° (+6°, -10°), -6.5° (+9°, -3°)) with g†,max = 1.10 × 10^-10 m s^-2, while the hemisphere of minimum acceleration scale is in the opposite direction (l, b) = (355.5° (+6°, -10°), 6.5° (+3°, -9°)) with g†,min = 0.76 × 10^-10 m s^-2. The level of anisotropy reaches up to 0.37 ± 0.04. Robust tests show that such an anisotropy cannot be reproduced by a statistically isotropic data set. We also show that the spatial anisotropy in the acceleration scale is only weakly correlated with the non-uniform distribution of the SPARC data points on the sky. In addition, we confirm that the anisotropy of the acceleration scale does not depend significantly on other physical parameters of the SPARC galaxies. It is interesting to note that the maximum anisotropy direction found in this paper is close to other cosmological preferred directions, particularly the direction of the “Australia dipole” for the fine structure constant.
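The coincidence g† ∼ cH0 quoted in the abstract can be checked on the back of an envelope; a minimal sketch, assuming H0 = 70 km/s/Mpc (the value of H0 is an assumption here, not taken from the paper):

```python
# Compare the SPARC-fitted acceleration scale with c*H0.
# Assumes H0 = 70 km/s/Mpc; g_sparc is the value quoted in the abstract.
c = 2.998e8                 # speed of light, m/s
Mpc = 3.086e22              # metres per megaparsec
H0 = 70e3 / Mpc             # Hubble constant, s^-1
g_sparc = 1.02e-10          # fitted acceleration scale, m/s^2

cH0 = c * H0                # ~6.8e-10 m/s^2: same order of magnitude as g_sparc
ratio = cH0 / g_sparc       # ~6.7, i.e. g_sparc is close to c*H0/(2*pi)
print(cH0, ratio)
```

The ratio coming out near 2π is why the relation is often written g† ≈ cH0/(2π).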
Semi-automated ontology generation and evolution
NASA Astrophysics Data System (ADS)
Stirtzinger, Anthony P.; Anken, Craig S.
2009-05-01
Extending the notion of data models or object models, an ontology can provide rich semantic definition not only to the meta-data but also to the instance data of domain knowledge, making these semantic definitions available in machine-readable form. However, the generation of an effective ontology is a difficult task involving considerable labor and skill. This paper discusses an Ontology Generation and Evolution Processor (OGEP) aimed at automating this process, requesting user input only when unresolvable ambiguous situations occur. OGEP directly attacks the main barrier which prevents automated (or self-learning) ontology generation: the ability to understand the meaning of artifacts and the relationships the artifacts have to the domain space. OGEP leverages existing lexical-to-ontological mappings in the form of WordNet and the Suggested Upper Merged Ontology (SUMO), integrated with a semantic pattern-based structure referred to as the Semantic Grounding Mechanism (SGM) and implemented as a Corpus Reasoner. The OGEP processing is initiated by a Corpus Parser performing a lexical analysis of the corpus, reading in a document (or corpus) and preparing it for processing by annotating words and phrases. After the Corpus Parser is done, the Corpus Reasoner uses the part-of-speech output to determine the semantic meaning of a word or phrase. The Corpus Reasoner is the crux of the OGEP system, analyzing, extrapolating, and evolving data from free text into cohesive semantic relationships. The Semantic Grounding Mechanism provides a basis for identifying and mapping semantic relationships. By blending together the WordNet lexicon and the SUMO ontological layout, the SGM is given breadth and depth in its ability to extrapolate semantic relationships between domain entities. The combination of all these components results in an innovative approach to user-assisted semantic-based ontology generation.
This paper will describe the OGEP technology in the context of the architectural components referenced above and identify a potential technology transition path to Scott AFB's Tanker Airlift Control Center (TACC) which serves as the Air Operations Center (AOC) for the Air Mobility Command (AMC).
ERIC Educational Resources Information Center
Scrivener, Susan; Weiss, Michael J.
2013-01-01
This policy brief presents results from a random assignment evaluation of the City University of New York's Accelerated Study in Associate Programs (ASAP). An ambitious and promising endeavor, ASAP provides a comprehensive array of services and supports to help community college students graduate and to help them graduate sooner. The evaluation…
Cosmological solutions in spatially curved universes with adiabatic particle production
NASA Astrophysics Data System (ADS)
Aresté Saló, Llibert; de Haro, Jaume
2017-03-01
We perform a qualitative and thermodynamic study of two models when one takes into account adiabatic particle production. In the first one, there is a constant particle production rate, which leads to solutions depicting the current cosmic acceleration but without inflation. The other one has solutions that unify the early and late time acceleration. These solutions converge asymptotically to the thermal equilibrium.
Collaborative Model for Acceleration of Individualized Therapy of Colon Cancer
2015-12-01
Preclinical models are representative of actual patient samples and may be useful in early drug development and predictive biomarker discovery. Award Number: W81XWH-11-1-0527. Title: Collaborative Model for Acceleration of Individualized Therapy of Colon Cancer. Principal Investigator: Aik Choon Tan. Contracting Organization: University of Colorado Denver, Aurora, CO 80045-2505. Report Date: December 2015. Type of Report: Final Report.
ERIC Educational Resources Information Center
Brunner, Ilse; And Others
Organizations are the product of the ideas and interactions of those who work in them. The challenge for learning in organizations is to have a shared purpose and vision of the organization, to develop new ideas arising out of the vision and purpose, to test the ideas in the organizational reality, and to communicate that knowledge to other…
ERIC Educational Resources Information Center
Weiss, Michael; Scrivener, Susan; Fresques, Hannah; Ratledge, Alyssa; Rudd, Tim; Sommo, Colleen
2014-01-01
The City University of New York's (CUNY's) Accelerated Study in Associate Programs (ASAP) combines many of the ideas from a range of programs into a comprehensive model that requires students to attend school full-time, and provides supports and incentives for three years. ASAP's financial aid reforms, enhanced student services, and scheduling…
Optimization of a ΔE - E detector for 41Ca AMS
NASA Astrophysics Data System (ADS)
Hosoya, Seiji; Sasa, Kimikazu; Matsunaka, Tetsuya; Takahashi, Tsutomu; Matsumura, Masumi; Matsumura, Hiroshi; Sundquist, Mark; Stodola, Mark; Sueki, Keisuke
2017-09-01
A series of nuclides (14C, 26Al, and 36Cl) was measured using the 12UD Pelletron tandem accelerator before its replacement by the horizontal 6 MV tandem accelerator at the University of Tsukuba Tandem Accelerator Complex (UTTAC). This paper considers the modification of the accelerator mass spectrometry (AMS) measurement parameters to suit the current 6 MV tandem accelerator setup (e.g., terminal voltage, detected ion charge state, gas pressure, and entrance window material in the detector). The Particle and Heavy Ion Transport code System (PHITS) was also used to simulate the AMS measurement and determine the best conditions to suppress isobaric interference. The spectra of 41Ca and 41K were then successfully separated and identified; the system achieved a background level of 41Ca/40Ca ∼ 6 × 10^-14.
Transforming America's Universities to Compete in the "New Normal"
ERIC Educational Resources Information Center
Bruininks, Robert H.; Keeney, Brianne; Thorp, Jim
2010-01-01
The challenges faced today by U.S. colleges and universities have been accelerated by the current economic downturn, but they are not the result of it. Consequently, we should not expect a sudden reversal of fortune when the economy rebounds. Changing demographics and spending priorities coupled with increasing competition and demands for…
The Efficacy of Strategy in Higher Education--A Methodology. Professional File. Number 27
ERIC Educational Resources Information Center
Litwin, Jeffrey
2006-01-01
All research-intensive universities (RIUs) want to expand their scope of operations. Research performance is a key driver of institutional reputation, which underpins a university's ability to generate revenues from all sources. Achieving an accelerating rate of growth of the virtuous cycle, in which increasing research performance enhances…
2015-10-01
Amniotic Fluid-Derived Stem Cells (AFS). Principal Investigator: Thomas L. Smith, PhD. Performing Organization: Wake Forest University Health Sciences, Medical Center Boulevard, Winston-Salem, NC 27157.
ERIC Educational Resources Information Center
Scaramanga, Jonny; Reiss, Michael J.
2017-01-01
Increasing numbers of students are applying to university with the International Certificate of Christian Education (ICCE), an alternative to mainstream qualifications based on a biblically-based, individualised curriculum called Accelerated Christian Education (ACE). No formal validity arguments exist for the ICCE, but it claims to prepare…
Teaching Universal Gravitation with Vector Games
ERIC Educational Resources Information Center
Lowry, Matthew
2008-01-01
Like many high school and college physics teachers, I have found playing vector games to be a useful way of illustrating the concepts of inertia, velocity, and acceleration. Like many, I have also had difficulty in trying to get students to understand Newton's law of universal gravitation, specifically the inverse-square law and its application to…
Facing Financial Difficulties, African Virtual U. Revamps Itself
ERIC Educational Resources Information Center
Kigotho, Wachira
2006-01-01
This article talks about serious financial problems faced by the African Virtual University, the continent's largest online institution, forcing it to accelerate a major restructuring. The university was established in 1997 by the World Bank as a link between foreign and African institutions, and has been impeded by insufficient funds. As a…
Understanding a Resistance to Change: A Challenge for Universities
ERIC Educational Resources Information Center
Caruth, Gail D.; Caruth, Donald L.
2013-01-01
Change is inevitable. Today more than ever the pace of change is accelerating. Where there is organizational change there will be resistance to this change. To deal with the resistance effectively university administrators must understand the nature and causes of resistance to change. Only by dealing effectively with resistance to change can…
Central State University: Phase I Report
ERIC Educational Resources Information Center
Ohio Board of Regents, 2012
2012-01-01
In December of 2011, a team of eight consultants authored a report to the Ohio Board of Regents and Central State University titled "Accentuating Strengths/Accelerating Progress (AS/AP)." AS/AP provided a road map for the administration, faculty, and staff of CSU to achieve the excellence it has sought under the leadership of President…
NASA Astrophysics Data System (ADS)
Jaiswal, Rekha; Zia, Rashid
2018-04-01
In this paper, we have proposed a cosmological model consistent with the new findings of 'The Supernova Cosmology Project', headed by Saul Perlmutter, and the 'High-Z Supernova Search Team', headed by Brian Schmidt. According to these findings, the universe is undergoing expansion at an increasing rate, in contrast to the earlier belief that the rate of expansion is constant or that the expansion is slowing down. We have considered a spatially homogeneous and anisotropic Bianchi-V dark energy model in the Brans-Dicke theory of gravitation. We have taken the scale factor a(t) = k t^α e^{β t}, which results in a variable deceleration parameter (DP). The graph of the DP shows a transition from positive to negative, indicating that the universe has passed from a past decelerated expansion phase to the current accelerated expansion phase. In this context, we have also calculated and plotted various parameters and observed that they are in good agreement with the physical and kinematic properties of the universe and are also consistent with recent observations.
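The claimed sign change of the deceleration parameter follows directly from the quoted scale factor; a short symbolic sketch (a generic check of q = -a·ä/ȧ², not code from the paper) makes this explicit:

```python
import sympy as sp

# Deceleration parameter q = -a*a''/a'^2 for a(t) = k * t**alpha * exp(beta*t).
# k, alpha, beta are free positive symbols; this only verifies the qualitative claim.
t, k, alpha, beta = sp.symbols('t k alpha beta', positive=True)
a = k * t**alpha * sp.exp(beta * t)
q = sp.simplify(-a * sp.diff(a, t, 2) / sp.diff(a, t)**2)

# q reduces to alpha/(alpha + beta*t)**2 - 1: positive at early times when
# alpha < 1 (deceleration), tending to -1 as t grows (acceleration).
q_closed = alpha / (alpha + beta * t)**2 - 1
assert sp.simplify(q - q_closed) == 0
```

Substituting, e.g., alpha = 1/2 and beta = 1 gives q > 0 at t = 0.1 and q < 0 at t = 10, the positive-to-negative transition described above.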
NASA Astrophysics Data System (ADS)
Akarsu, Özgür; Dereli, Tekin
2013-02-01
We present cosmological solutions for (1+3+n)-dimensional steady state universe in dilaton gravity with an arbitrary dilaton coupling constant w and exponential dilaton self-interaction potentials in the string frame. We focus particularly on the class in which the 3-space expands with a time varying deceleration parameter. We discuss the number of the internal dimensions and the value of the dilaton coupling constant to determine the cases that are consistent with the observed universe and the primordial nucleosynthesis. The 3-space starts with a decelerated expansion rate and evolves into accelerated expansion phase subject to the values of w and n, but ends with a Big Rip in all cases. We discuss the cosmological evolution in further detail for the cases w = 1 and w = ½ that permit exact solutions. We also comment on how the universe would be conceived by an observer in four dimensions who is unaware of the internal dimensions and thinks that the conventional general relativity is valid at cosmological scales.
Universal properties from a local geometric structure of a Killing horizon
NASA Astrophysics Data System (ADS)
Koga, Jun-ichirou
2007-06-01
We consider universal properties that arise from a local geometric structure of a Killing horizon, and analyse whether such universal properties give rise to degeneracy of classical configurations. We first introduce a non-perturbative definition of such a local geometric structure, which we call an asymptotic Killing horizon. It is then shown that infinitely many asymptotic Killing horizons reside on a common null hypersurface, once there exists one asymptotic Killing horizon, which is thus considered as degeneracy. In order to see how this degeneracy is physically meaningful, we analyse also the acceleration of the orbits of the vector that generates an asymptotic Killing horizon. It is shown that there exists the diff(S1) or diff(R1) sub-algebra on an asymptotic Killing horizon universally, which is picked out naturally, based on the behaviour of the acceleration. We argue that the discrepancy between string theory and the Euclidean approach in the entropy of an extreme black hole may be resolved, if the microscopic states responsible for black hole thermodynamics are connected with asymptotic Killing horizons.
Development of a Dielectric-Loaded Accelerator Test Facility Based on an X-Band Magnicon Amplifier
NASA Astrophysics Data System (ADS)
Gold, S. H.; Kinkead, A. K.; Gai, W.; Power, J. G.; Konecny, R.; Jing, C.; Tantawi, S. G.; Nantista, C. D.; Hu, Y.; Du, X.; Tang, C.; Lin, Y.; Bruce, R. W.; Bruce, R. L.; Fliflet, A. W.; Lewis, D.
2006-01-01
The Naval Research Laboratory (NRL) and Argonne National Laboratory (ANL), in collaboration with the Stanford Linear Accelerator Center (SLAC), are developing a dielectric-loaded accelerator (DLA) test facility powered by the 11.424-GHz magnicon amplifier that was developed jointly by NRL and Omega-P, Inc. Thus far, DLA structures developed by ANL have been tested at the NRL Magnicon Facility without injected electrons, including tests of alumina and magnesium calcium titanate structures at gradients up to ∼8 MV/m. The next step is to inject electrons in order to build a compact DLA test accelerator. The Accelerator Laboratory of Tsinghua University in Beijing, China has developed a 5-MeV electron injector for the accelerator, and SLAC is developing a means to combine the two magnicon output arms, and to drive the injector and an accelerator section with separate control of the power ratio and relative phase. Also, RWBruce Associates, working with NRL, is developing a means to join ceramic tubes to produce long accelerating sections using a microwave brazing process. The installation and commissioning of the first dielectric-loaded test accelerator, including injector, DLA structure, and spectrometer, should take place within the next year.
Field Work Proposal: PUBLIC OUTREACH EVENT FOR ACCELERATOR STEWARDSHIP TEST FACILITY PILOT PROGRAM
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hutton, Andrew; Areti, Hari
2015-03-05
Jefferson Lab’s outreach efforts towards the goals of the Accelerator Stewardship Test Facility Pilot Program consist of the lab’s efforts in three venues. The first venue, at the end of March, is to meet with the members of the Virginia Tech Corporate Research Center (VTCRC) (http://www.vtcrc.com/tenant-directory/) in Blacksburg, Virginia. Of the nearly 160 members, we expect that many engineering companies (including mechanical, electrical, bio, and software) will be present. To this group, we will describe the capabilities of Jefferson Lab’s accelerator infrastructure. The description will include not only the facilities but also the intellectual expertise. No funding is requested for this effort. The second venue is to reach the industrial exhibitors at the 6th International Particle Accelerator Conference (IPAC’15). Jefferson Lab will host a booth at the conference to reach out to the >75 industrial exhibitors (https://www.jlab.org/conferences/ipac2015/SponsorsExhibitors.php) who represent a wide range of technologies. A number of these industries could benefit if they can access Jefferson Lab’s accelerator infrastructure. In addition to the booth, where written material will be available, we plan to arrange a session A/V presentation to the industry exhibitors. The booth will be hosted by Jefferson Lab’s Public Relations staff, assisted on a rotating basis by the lab’s scientists and engineers. The budget with IPAC’15 designations represents the request for funds for this effort. The third venue is the gathering of Southeastern Universities Research Association (SURA) university presidents. Here we plan to reach the research departments of the universities, which can benefit by availing themselves of the infrastructure (materials science, engineering, and medical schools, to name a few). Funding is requested to allow for attendance at the SURA Board Meeting.
We are coordinating with DOE regarding these costs to raise the projected conference management cost ceiling in the Conference Management Tool.
Particle-in-cell/accelerator code for space-charge dominated beam simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
2012-05-08
Warp is a multidimensional discrete-particle beam simulation program designed to be applicable where the beam space-charge is non-negligible or dominant. It is being developed in a collaboration among LLNL, LBNL, and the University of Maryland. It was originally designed and optimized for heavy ion fusion accelerator physics studies, but has received use in a broader range of applications, including, for example, laser wakefield accelerators, e-cloud studies in high energy accelerators, particle traps, and other areas. At present it incorporates 3-D, axisymmetric (r,z), planar (x-z), and transverse slice (x,y) descriptions, with both electrostatic and electromagnetic fields, and a beam envelope model. The code is built atop the Python interpreter language.
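The core particle-in-cell loop a code like Warp builds on (deposit charge to a grid, solve for the field, gather it back, push particles) can be sketched in one dimension; this is a generic electrostatic illustration in normalized units, not Warp's actual API:

```python
import numpy as np

# Minimal 1D periodic electrostatic PIC loop in normalized units.
# Generic illustration only; Warp itself offers 3-D, (r,z), (x-z), and (x,y)
# descriptions plus electromagnetic fields and an envelope model.
ng, L, dt, steps, n_part = 64, 1.0, 0.05, 100, 1000
qm = 1.0                                   # charge-to-mass ratio of macro-particles
dx = L / ng
rng = np.random.default_rng(0)
x = rng.uniform(0.0, L, n_part)            # particle positions
v = np.zeros(n_part)                       # particle velocities

def efield_at_particles(x):
    # Deposit charge (nearest grid point) against a neutralizing background,
    # solve Poisson's equation with an FFT, then gather E back to the particles.
    cells = (x / dx).astype(int) % ng
    rho = np.bincount(cells, minlength=ng).astype(float)
    rho = rho / rho.mean() - 1.0
    k = 2.0 * np.pi * np.fft.fftfreq(ng, d=dx)
    k[0] = 1.0                             # placeholder; the mean mode is zeroed below
    phi_k = np.fft.fft(rho) / k**2         # -k^2 phi_k = -rho_k  =>  phi_k = rho_k / k^2
    phi_k[0] = 0.0
    E = np.real(np.fft.ifft(-1j * k * phi_k))  # E = -d(phi)/dx
    return E[cells]

for _ in range(steps):                     # leapfrog push with periodic wrap
    v += qm * efield_at_particles(x) * dt
    x = (x + v * dt) % L
```

Production codes layer higher-order deposition, electromagnetic solvers, and boundary handling on top of exactly this deposit-solve-gather-push cycle.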
Bianchi type string cosmological models in f(R,T) gravity
NASA Astrophysics Data System (ADS)
Sahoo, P. K.; Mishra, B.; Sahoo, Parbati; Pacif, S. K. J.
2016-09-01
In this work we have studied Bianchi-III and - VI 0 cosmological models with string fluid source in f( R, T) gravity (T. Harko et al., Phys. Rev. D 84, 024020 (2011)), where R is the Ricci scalar and T the trace of the stress energy-momentum tensor in the context of late time accelerating expansion of the universe as suggested by the present observations. The exact solutions of the field equations are obtained by using a time-varying deceleration parameter. The universe is anisotropic and free from initial singularity. Our model initially shows acceleration for a certain period of time and then decelerates consequently. Several dynamical and physical behaviors of the model are also discussed in detail.
Recent Progress on Supernova Remnants - Progenitors, Evolution, Cosmic-ray Acceleration
NASA Astrophysics Data System (ADS)
Bamba, A.
2017-10-01
Supernova remnants supply heavy elements, kinetic and thermal energy, and cosmic rays to the universe, and are key sources of its diversity. On the other hand, fundamental issues about supernova remnants remain open, such as (1) what their main progenitors are, (2) how they evolve into the realistic (non-uniform) interstellar medium, and (3) which types of supernova remnants can accelerate cosmic rays to the knee energy. Recent X-ray studies with XMM-Newton, Chandra, Suzaku, NuSTAR, and Hitomi have advanced our understanding of these issues and shown that they are tightly connected. In this paper, we overview this progress, focusing on the above three topics, and discuss what should be done next.
Heuer, R.-D.
2018-02-19
Summer Student Lecture Programme Introduction. The mission of CERN: push back the frontiers of knowledge, e.g. the secrets of the Big Bang...what was the matter like within the first moments of the Universe's existence? You have to develop new technologies for accelerators and detectors (also information technology, such as the Web and the Grid, and medicine, for diagnosis and therapy). There are three key technology areas at CERN: accelerating particles, particle detection, and large-scale computing.
ERIC Educational Resources Information Center
Therrien, Mona; Calder, Beth L.; Castonguay, Zakkary J.
2018-01-01
Students in the Didactic Program in Dietetics (DPD) at the University of Maine were exposed to the cheese-making process, within a lab setting of two hours, utilizing an accelerated recipe for a Queso Fresco-style cheese. The purpose of this project was to provide students with a novel, hands-on learning experience, which covered concepts of…
Brain responses to filled gaps.
Hestvik, Arild; Maxfield, Nathan; Schwartz, Richard G; Shafer, Valerie
2007-03-01
An unresolved issue in the study of sentence comprehension is whether the process of gap-filling is mediated by the construction of empty categories (traces), or whether the parser relates fillers directly to the associated verb's argument structure. We conducted an event-related potentials (ERP) study that used the violation paradigm to examine the time course and spatial distribution of brain responses to ungrammatically filled gaps. The results indicate that the earliest brain response to the violation is an early left anterior negativity (eLAN). This ERP indexes an early phase of pure syntactic structure building, temporally preceding ERPs that reflect semantic integration and argument structure satisfaction. The finding is interpreted as evidence that gap-filling is mediated by structurally predicted empty categories, rather than directly by argument structure operations.
Development of Clinical Contents Model Markup Language for Electronic Health Records
Yun, Ji-Hyun; Kim, Yoon
2012-01-01
Objectives: To develop a dedicated markup language for clinical contents models (CCM) to facilitate the active use of CCM in electronic health record systems. Methods: Based on an analysis of the structure and characteristics of CCM in the clinical domain, we manually designed an extensible markup language (XML)-based CCM markup language (CCML) schema. Results: CCML faithfully reflects CCM in both the syntactic and semantic aspects. As this language is based on XML, it can be expressed and processed in computer systems and can be used in a technology-neutral way. Conclusions: CCML has the following strengths: it is machine-readable and highly human-readable, it does not require a dedicated parser, and it can be applied to existing electronic health record systems. PMID:23115739
Modeling target normal sheath acceleration using handoffs between multiple simulations
NASA Astrophysics Data System (ADS)
McMahon, Matthew; Willis, Christopher; Mitchell, Robert; King, Frank; Schumacher, Douglass; Akli, Kramer; Freeman, Richard
2013-10-01
We present a technique to model the target normal sheath acceleration (TNSA) process using full-scale LSP PIC simulations. The technique allows for a realistic laser, a full-size target and pre-plasma, and sufficient propagation length for the accelerated ions and electrons. A first simulation using a 2D Cartesian grid models the laser-plasma interaction (LPI) self-consistently and includes field ionization. Electrons accelerated by the laser are imported into a second simulation using a 2D cylindrical grid optimized for the initial TNSA process and incorporating an equation of state. Finally, all of the particles are imported into a third simulation optimized for the propagation of the accelerated ions and utilizing a static field solver for initialization. We also show use of 3D LPI simulations. Simulation results are compared to recent ion acceleration experiments using the SCARLET laser at The Ohio State University. This work was performed with support from AFOSR under contract # FA9550-12-1-0341, DARPA, and allocations of computing time from the Ohio Supercomputing Center.
NASA Astrophysics Data System (ADS)
Poehlman, W. F. S.; Garland, Wm. J.; Stark, J. W.
1993-06-01
In an era of downsizing and a limited pool of skilled accelerator personnel from which to draw replacements for an aging workforce, the impetus to integrate intelligent computer automation into the accelerator operator's repertoire is strong. However, successful deployment of an "Operator's Companion" is not trivial. Both graphical and human factors need to be recognized as critical areas that require extra care when formulating the Companion: an interactive graphical user interface must mimic, for the operator, familiar accelerator controls; knowledge acquisition phases during development must acknowledge the expert's mental model of machine operation; and automated operations must be seen as improvements to the operator's environment rather than threats of ultimate replacement. Experiences with the PACES Accelerator Operator Companion, developed at two sites over the past three years, are related and graphical examples are given. The scale of the work involves multi-computer control of various start-up/shutdown and tuning procedures for Model FN and KN Van de Graaff accelerators. The response from licensing agencies has been encouraging.
A proton medical accelerator by the SBIR route — an example of technology transfer
NASA Astrophysics Data System (ADS)
Martin, R. L.
1989-04-01
Medical facilities for radiation treatment of cancer with protons have been established in many laboratories throughout the world. Because of the requirement for protons of up to 250 MeV, however, essentially all of these have been designed as physics facilities. Most of the experience in this branch of accelerator technology lies in the national laboratories and a few large universities. A major issue is the transfer of this technology to the commercial sector to provide hospitals with simple, reliable and relatively inexpensive accelerators for this application. The author has chosen the SBIR route to accomplish this goal. ACCTEK Associates has received grants from the National Cancer Institute for development of the medical accelerator and beam delivery systems. Considerable encouragement and help have been received from Argonne National Laboratory and the Department of Energy. The experiences to date and the pros and cons of this approach to commercializing medical accelerators are described.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vargas, M.; Schumaker, W.; He, Z.-H.
2014-04-28
High intensity, short pulse lasers can be used to accelerate electrons to ultra-relativistic energies via laser wakefield acceleration (LWFA) [T. Tajima and J. M. Dawson, Phys. Rev. Lett. 43, 267 (1979)]. Recently, it was shown that separating the injection and acceleration processes into two distinct stages could prove beneficial in obtaining stable, high energy electron beams [Gonsalves et al., Nat. Phys. 7, 862 (2011); Liu et al., Phys. Rev. Lett. 107, 035001 (2011); Pollock et al., Phys. Rev. Lett. 107, 045001 (2011)]. Here, we use a stereolithography-based 3D printer to produce two-stage gas targets for LWFA experiments on the HERCULES laser system at the University of Michigan. We demonstrate substantial improvements to the divergence, pointing stability, and energy spread of a laser wakefield accelerated electron beam compared with a single-stage gas cell or gas jet target.
Peculiar motions, accelerated expansion, and the cosmological axis
NASA Astrophysics Data System (ADS)
Tsagas, Christos G.
2011-09-01
Peculiar velocities change the expansion rate of any observer moving relative to the smooth Hubble flow. As a result, observers in a galaxy like our Milky Way can experience accelerated expansion within a globally decelerating universe, even when the drift velocities are small. The effect is local, but the affected scales can be large enough to give the false impression that the whole cosmos has recently entered an accelerating phase. Generally, peculiar velocities are also associated with dipolelike anisotropies, triggered by the fact that they introduce a preferred spatial direction. This implies that observers experiencing locally accelerated expansion, as a result of their own drift motion, may also find that the acceleration is maximized in one direction and minimized in the opposite. We argue that, typically, such a dipole anisotropy should be relatively small and the axis should probably lie fairly close to the one seen in the spectrum of the cosmic microwave background.
An introduction to the physics of high energy accelerators
DOE Office of Scientific and Technical Information (OSTI.GOV)
Edwards, D.A.; Syphers, M.J.
1993-01-01
This book is an outgrowth of a course given by the authors at various universities and particle accelerator schools. It starts from the basic physics principles governing particle motion inside an accelerator, and leads to a full description of the complicated phenomena and analytical tools encountered in the design and operation of a working accelerator. The book covers acceleration and longitudinal beam dynamics, transverse motion and nonlinear perturbations, intensity dependent effects, emittance preservation methods and synchrotron radiation. These subjects encompass the core concerns of a high energy synchrotron. The authors do not assume the reader has much previous knowledge about accelerator physics. Hence, they take great care to introduce the physical phenomena encountered and the concepts used to describe them. The mathematical formulae and derivations are deliberately kept at a level suitable for beginners. After mastering this course, any interested reader will not find it difficult to follow subjects of more current interest. Useful homework problems are provided at the end of each chapter. Many of the problems are based on actual activities associated with the design and operation of existing accelerators.
Distribution uniformity of laser-accelerated proton beams
NASA Astrophysics Data System (ADS)
Zhu, Jun-Gao; Zhu, Kun; Tao, Li; Xu, Xiao-Han; Lin, Chen; Ma, Wen-Jun; Lu, Hai-Yang; Zhao, Yan-Ying; Lu, Yuan-Rong; Chen, Jia-Er; Yan, Xue-Qing
2017-09-01
Compared with conventional accelerators, laser plasma accelerators can generate high energy ions at a greatly reduced scale, due to their TV/m acceleration gradient. A compact laser plasma accelerator (CLAPA) has been built at the Institute of Heavy Ion Physics at Peking University. It will be used for applied research such as biological irradiation, astrophysics simulations, etc. A beamline system with multiple quadrupoles and an analyzing magnet for laser-accelerated ions is proposed here. Since laser-accelerated ion beams have broad energy spectra and large angular divergence, the parameters of the beamline system (beam waist position in the Y direction, beamline layout, drift distance, magnet angles, etc.) are carefully designed and optimised to obtain a radially symmetric proton distribution at the irradiation platform. Requirements of energy selection and differences in focusing or defocusing in application systems greatly influence the evolution of proton distributions. With optimal parameters, radially symmetric proton distributions can be achieved, and protons with energy spreads within ±5% have similar transverse areas at the experimental target. Supported by National Natural Science Foundation of China (11575011, 61631001) and National Grand Instrument Project (2012YQ030142)
Transforming Community College Education at The City University of New York
ERIC Educational Resources Information Center
Schmidt, Benno
2013-01-01
The City University of New York (CUNY) developed and implemented two evidence-based, educational initiatives at its community colleges. Accelerated Study in Associate Programs (ASAP), on six campuses, helped 55 percent of students who enter with one or two developmental needs earn an associate degree within three years. This compares with 20…
A Study of the Semiannual Admissions System at the University of Tennessee College of Medicine.
ERIC Educational Resources Information Center
Rittenhouse, Carl H.; Weiner, Samuel
This report describes the program and examines the advantages and disadvantages of the semiannual admissions system used by the University of Tennessee College of Medicine (UTCM). It also considers effects of an accelerated program which together with the use of a semiannual admissions system permit more efficient use of facilities and the…
University Governance, Leadership and Management in a Decade of Diversification and Uncertainty
ERIC Educational Resources Information Center
Shattock, Michael
2013-01-01
The last decade has seen an acceleration of change in the way British universities have been governed, led and managed. This has substantially been driven by the instability of the external environment, which has encouraged a greater centralisation of decision-making leading to less governance and more management, but it is also a consequence of…
Promotion at Canadian Universities: The Intersection of Gender, Discipline, and Institution
ERIC Educational Resources Information Center
Ornstein, Michael; Stewart, Penni; Drakich, Janice
2007-01-01
Statistics Canada's annual census of full-time faculty at all Canadian universities, from 1984 to 1999, is used to measure the effect of gender, discipline, and institution on promotion from assistant to associate professor and from associate to full professor. Accelerated failure time models show that gender has some effect on rates of…
Growth of matter perturbation in quintessence cosmology
NASA Astrophysics Data System (ADS)
Mulki, Fargiza A. M.; Wulandari, Hesti R. T.
2017-01-01
Big bang theory states that the universe emerged from a singularity with very high temperature and density, then expanded homogeneously and isotropically. This theory gives rise to the standard cosmological principle, which states that the universe is homogeneous and isotropic on large scales. However, the universe is not perfectly homogeneous and isotropic on small scales: there exist structures ranging from clusters and galaxies down to stars and planetary systems. Cosmological perturbation theory is the fundamental theory that explains the origin of these structures. According to this theory, the structures can be regarded as small perturbations in the early universe, which evolve as the universe expands. In addition to the problem of the inhomogeneity of the universe, observations of type Ia supernovae suggest that the expansion of our universe is accelerating. Various models of dark energy have been proposed to explain cosmic acceleration, one of them being the cosmological constant. Because several problems arise from the cosmological constant, alternative models have been proposed; one of these is quintessence. We reconstruct the growth of structure following the quintessence scenario at several epochs of the universe, each specified by the effective equation of state parameter for that stage. The discussion begins with the dynamics of quintessence, for which the exponential potential case is derived analytically, leading to various conditions of the universe. We then focus on the scaling and quintessence-dominated solutions. Subsequently, we review the basics of cosmological perturbation theory and derive formulas to investigate how matter perturbations evolve with time on subhorizon scales, which leads to structure formation, and we also analyze the influence of quintessence on structure formation. From this analytical exploration, we obtain the growth rate of matter perturbations and find that the existence of quintessence as a dark energy slows down the growth of structure formation in the universe.
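The quintessence dynamics described above are commonly recast as an autonomous system in the kinetic and potential energy variables x and y introduced by Copeland, Liddle and Wands. The sketch below integrates that standard system for an exponential potential with a pressureless matter background; it is an illustrative reconstruction, not the authors' own calculation, and the value of λ and the RK4 driver are choices made here:

```python
import numpy as np

# State: x ~ kinetic energy fraction, y ~ potential energy fraction of the
# scalar field (Copeland-Liddle-Wands variables); N = ln(a) counts e-folds.
# lam is the slope of the exponential potential V ~ exp(-lam * kappa * phi);
# the background fluid is pressureless matter.
def rhs(s, lam):
    x, y = s
    common = 1.5 * (1.0 + x * x - y * y)   # = (3/2)[2x^2 + (1 - x^2 - y^2)]
    dx = -3.0 * x + (np.sqrt(6.0) / 2.0) * lam * y * y + x * common
    dy = -(np.sqrt(6.0) / 2.0) * lam * x * y + y * common
    return np.array([dx, dy])

def rk4(s, lam, h, steps):
    """Classical fourth-order Runge-Kutta integration in N = ln(a)."""
    for _ in range(steps):
        k1 = rhs(s, lam)
        k2 = rhs(s + 0.5 * h * k1, lam)
        k3 = rhs(s + 0.5 * h * k2, lam)
        k4 = rhs(s + h * k3, lam)
        s = s + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
    return s

lam = 1.0                       # lam^2 < 2: late-time accelerated expansion
s0 = np.array([0.01, 0.01])     # start deep in matter domination
x, y = rk4(s0, lam, h=0.01, steps=3000)   # evolve for 30 e-folds
omega_phi = x**2 + y**2                   # quintessence density parameter
w_phi = (x**2 - y**2) / (x**2 + y**2)     # its equation of state
# For lam^2 < 3 the trajectory should approach the scalar-field dominated
# attractor x = lam/sqrt(6), y = sqrt(1 - lam^2/6).
print(omega_phi, w_phi)
```

On the attractor, Ω_φ = 1 and w_φ = −1 + λ²/3, so λ = 1 gives w_φ = −2/3, i.e. an accelerating universe; for λ² > 3 the scaling solution takes over instead, and structure growth is modified accordingly.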
Essay: Robert H. Siemann As Leader of the Advanced Accelerator Research Department
DOE Office of Scientific and Technical Information (OSTI.GOV)
Colby, Eric R.; Hogan, Mark J.; /SLAC
Robert H. Siemann originally conceived of the Advanced Accelerator Research Department (AARD) as an academic, experimental group dedicated to probing the technical limitations of accelerators while providing excellent educational opportunities for young scientists. The early years of the Accelerator Research Department B, as it was then known, were dedicated to a wealth of mostly student-led experiments to examine the promise of advanced accelerator techniques. High-gradient techniques including millimeter-wave rf acceleration, beam-driven plasma acceleration, and direct laser acceleration were pursued, including tests of materials under rf pulsed heating and short-pulse laser radiation, to establish the ultimate limitations on gradient. As the department and program grew, so did the motivation to found an accelerator research center that brought experimentalists together in a test facility environment to conduct a broad range of experiments. The Final Focus Test Beam and later the Next Linear Collider Test Accelerator provided unique experimental facilities for AARD staff and collaborators to carry out advanced accelerator experiments. Throughout the evolution of this dynamic program, Bob maintained a department atmosphere and culture more reminiscent of a university research group than a national laboratory department. His exceptional ability to balance multiple roles as scientist, professor, and administrator enabled the creation and preservation of an environment that fostered technical innovation and scholarship.
NASA Astrophysics Data System (ADS)
Romaniuk, Ryszard S.
2013-10-01
Accelerator science and technology is one of the key enablers of developments in particle physics, photon physics, and applications in medicine and industry. The paper presents a digest of research results in the domain of accelerator science and technology in Europe, obtained during the realization of CARE (Coordinated Accelerator R&D) and EuCARD (European Coordination of Accelerator R&D), and during the national annual review meeting of TIARA, the Test Infrastructure of European Research Area in Accelerator R&D. The European projects on accelerator technology started in 2003 with CARE. TIARA is a European collaboration on accelerator technology which, by running research, technical, networking and infrastructural projects, is tasked with integrating the research and technical communities and infrastructures on a European scale. The Collaboration gathers all research centers with large accelerator infrastructures; others, such as universities, are affiliated as associate members. TIARA-PP (preparatory phase) is a European infrastructural project run by this Consortium and realized inside EU-FP7. The paper presents a general overview of CARE, EuCARD and especially TIARA activities, with an introduction containing a portrait of contemporary accelerator technology and a digest of its applications in modern society. CARE, EuCARD and TIARA have integrated the European accelerator community in a very effective way, and continuation of these projects is much anticipated.
NASA Astrophysics Data System (ADS)
Symon, Keith R.
2005-04-01
In the late 1950's and the 1960's the MURA (Midwestern Universities Research Association) working group developed fixed field alternating gradient (FFAG) particle accelerators. FFAG accelerators are a natural corollary of the invention of alternating gradient focusing. The fixed guide field accommodates all orbits from the injection to the final energy. For this reason, the transverse motion in the guide field is nearly decoupled from the longitudinal acceleration. This allows a wide variety of acceleration schemes, using betatron or rf accelerating fields, beam stacking, bucket lifts, phase displacement, etc. It also simplifies theoretical and experimental studies of accelerators. Theoretical studies included an extensive analysis of rf acceleration processes, nonlinear orbit dynamics, and collective instabilities. Two FFAG designs, radial sector and spiral sector, were invented. The MURA team built small electron models of each type, and used them to study orbit dynamics, acceleration processes, orbit instabilities, and space charge limits. A practical result of these studies was the invention of the spiral sector cyclotron. Another was beam stacking, which led to the first practical way of achieving colliding beams. A 50 MeV two-way radial sector model was built in which it proved possible to stack a beam of over 10 amperes of electrons.
Viscous cosmology for early- and late-time universe
NASA Astrophysics Data System (ADS)
Brevik, Iver; Grøn, Øyvind; de Haro, Jaume; Odintsov, Sergei D.; Saridakis, Emmanuel N.
From a hydrodynamicist's point of view the inclusion of viscosity concepts in the macroscopic theory of the cosmic fluid would appear most natural, as an ideal fluid is after all an abstraction (excluding special cases such as superconductivity). Making use of modern observational results for the Hubble parameter plus the standard Friedmann formalism, we may extrapolate the description of the universe back in time up to the inflationary era, or we may go to the opposite extreme and analyze the probable ultimate fate of the universe. In this review, we discuss a variety of topics in cosmology when it is enlarged in order to contain a bulk viscosity. Various forms of this viscosity, when expressed in terms of the fluid density or the Hubble parameter, are discussed. Furthermore, we consider homogeneous as well as inhomogeneous equations of state. We investigate viscous cosmology in the early universe, examining the viscosity effects on the various inflationary observables. Additionally, we study viscous cosmology in the late universe, containing the current acceleration and the possible future singularities, and we investigate how one may even unify inflationary and late-time acceleration. Finally, we analyze the viscosity-induced crossing through the quintessence-phantom divide, we examine the realization of viscosity-driven cosmological bounces, and we briefly discuss how the Cardy-Verlinde formula is affected by viscosity.
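In the simplest first-order (Eckart) treatment, the bulk viscosity ζ enters the Friedmann equations only through an effective pressure. As a compact reminder of the standard formalism behind such reviews (flat FRW background, with κ² = 8πG and H = ȧ/a):

```latex
% Flat FRW cosmology with a bulk viscous fluid (Eckart first-order theory);
% \zeta may depend on \rho or on H, as discussed in the review.
H^{2} = \frac{\kappa^{2}}{3}\,\rho , \qquad
p_{\rm eff} = p - 3\zeta H , \qquad
\dot{\rho} + 3H\left(\rho + p - 3\zeta H\right) = 0 .
```

Since the viscous term −3ζH acts as a negative pressure for ζ > 0, a sufficiently large bulk viscosity can mimic or drive accelerated expansion, which is the mechanism exploited in both the early- and late-time regimes discussed above.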
Dark energy as a fixed point of the Einstein Yang-Mills Higgs equations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rinaldi, Massimiliano, E-mail: massimiliano.rinaldi@unitn.it
We study the Einstein Yang-Mills Higgs equations in the SO(3) representation on an isotropic and homogeneous flat Universe, in the presence of radiation and matter fluids. We map the equations of motion into an autonomous dynamical system of first-order differential equations and we find the equilibrium points. We show that there is only one stable fixed point that corresponds to an accelerated expanding Universe in the future. In the past, instead, there is an unstable fixed point that implies a stiff-matter domination. In between, we find three other unstable fixed points, corresponding, in chronological order, to radiation domination, to matter domination, and, finally, to a transition from decelerated expansion to accelerated expansion. We solve the system numerically and we confirm that there are smooth trajectories that correctly describe the evolution of the Universe, from a remote past dominated by radiation to a remote future dominated by dark energy, passing through a matter-dominated phase.
NASA Astrophysics Data System (ADS)
Oses, Corey; Isayev, Olexandr; Toher, Cormac; Curtarolo, Stefano; Tropsha, Alexander
Historically, materials discovery has been driven by a laborious trial-and-error process. The growth of materials databases and emerging informatics approaches finally offer the opportunity to transform this practice into data- and knowledge-driven rational design, accelerating the discovery of novel materials exhibiting desired properties. Using data from the AFLOW repository of high-throughput ab-initio calculations, we have generated Quantitative Materials Structure-Property Relationship (QMSPR) models to predict critical materials properties, including the metal/insulator classification, band gap energy, and bulk modulus. The prediction accuracy obtained with these QMSPR models approaches that of the training data for virtually any stoichiometric inorganic crystalline material. We attribute the success and universality of these models to the construction of new materials descriptors, referred to as universal Property-Labeled Material Fragments (PLMF). This representation affords straightforward model interpretation in terms of simple heuristic design rules that could guide rational materials design. This proof-of-concept study demonstrates the power of materials informatics to dramatically accelerate the search for new materials.
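The structure-property workflow sketched above, i.e. represent each material as a descriptor vector and learn a map from descriptors to a property label, can be illustrated with a deliberately tiny stand-in. Everything below is synthetic: the "descriptors", the labeling rule, and the nearest-centroid classifier are placeholders, not the AFLOW/PLMF machinery or the paper's actual models:

```python
import numpy as np

# Toy sketch only: random vectors stand in for fragment descriptors, and a
# made-up rule (positive mean descriptor value) stands in for the
# metal/insulator label. A nearest-centroid classifier is about the simplest
# possible structure-property model.
rng = np.random.default_rng(1)
n_train, n_test, d = 1000, 200, 16
X_train = rng.standard_normal((n_train, d))
X_test = rng.standard_normal((n_test, d))

def label(X):                      # hypothetical property rule
    return (X.mean(axis=1) > 0).astype(int)

y_train, y_test = label(X_train), label(X_test)

# Nearest-centroid prediction: classify each test "material" by the closer
# class centroid in descriptor space.
mu0 = X_train[y_train == 0].mean(axis=0)
mu1 = X_train[y_train == 1].mean(axis=0)
d0 = ((X_test - mu0) ** 2).sum(axis=1)
d1 = ((X_test - mu1) ** 2).sum(axis=1)
y_pred = (d1 < d0).astype(int)
accuracy = (y_pred == y_test).mean()
print(accuracy)
```

The point of the sketch is only the shape of the pipeline (descriptors in, property labels out, accuracy checked on held-out materials); the interpretability claims in the abstract rest on the PLMF descriptors themselves, not on any particular learner.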
Dark energy as a fixed point of the Einstein Yang-Mills Higgs equations
NASA Astrophysics Data System (ADS)
Rinaldi, Massimiliano
2015-10-01
We study the Einstein Yang-Mills Higgs equations in the SO(3) representation on an isotropic and homogeneous flat Universe, in the presence of radiation and matter fluids. We map the equations of motion into an autonomous dynamical system of first-order differential equations and we find the equilibrium points. We show that there is only one stable fixed point that corresponds to an accelerated expanding Universe in the future. In the past, instead, there is an unstable fixed point that implies a stiff-matter domination. In between, we find three other unstable fixed points, corresponding, in chronological order, to radiation domination, to matter domination, and, finally, to a transition from decelerated expansion to accelerated expansion. We solve the system numerically and we confirm that there are smooth trajectories that correctly describe the evolution of the Universe, from a remote past dominated by radiation to a remote future dominated by dark energy, passing through a matter-dominated phase.
Inhomogeneities in dusty universe — a possible alternative to dark energy?
NASA Astrophysics Data System (ADS)
Chatterjee, S.
2011-03-01
There have been of late renewed debates on the role of inhomogeneities in explaining the observed late acceleration of the universe. We have looked into the problem analytically with the help of the well-known spherically symmetric but inhomogeneous Lemaitre-Tolman-Bondi (LTB) model generalised to higher dimensions. It is observed that, in contrast to the claim made by Kolb et al., the presence of inhomogeneities as well as extra dimensions cannot reverse the signature of the deceleration parameter if the matter field obeys the energy conditions. The well-known Raychaudhuri equation also points to the same result. Without solving the field equations explicitly it can, however, be shown that although the total deceleration is positive everywhere, this does not exclude the possibility of radial acceleration, even in a pure dust universe, if the angular scale factor is decelerating fast enough, and vice versa. Moreover, it is found that the introduction of extra dimensions cannot reverse this scenario; on the contrary, it actually assists the decelerating process.
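The Raychaudhuri argument invoked above can be stated compactly. For the irrotational, geodesic dust of an LTB model, the volume expansion θ obeys (standard four-dimensional form, with κ² = 8πG; the higher-dimensional version used in the paper differs only in numerical coefficients):

```latex
% Raychaudhuri equation for irrotational geodesic dust
% (\omega_{ab} = 0,\ \dot{u}^{a} = 0,\ p = 0):
\dot{\theta} + \tfrac{1}{3}\theta^{2}
  = -\,\sigma_{ab}\sigma^{ab} - R_{ab}u^{a}u^{b} ,
\qquad
R_{ab}u^{a}u^{b} = \tfrac{\kappa^{2}}{2}\,\rho \;\ge\; 0 .
```

Every term on the right-hand side is non-positive once the energy conditions hold, so the averaged volume expansion decelerates; acceleration of the radial scale factor alone is not excluded, provided the angular scale factor decelerates fast enough to compensate, which is exactly the loophole the abstract describes.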
Elastic and inelastic scattering of neutrons from 56Fe
NASA Astrophysics Data System (ADS)
Ramirez, Anthony Paul; McEllistrem, M. T.; Liu, S. H.; Mukhopadhyay, S.; Peters, E. E.; Yates, S. W.; Vanhoy, J. R.; Harrison, T. D.; Rice, B. G.; Thompson, B. K.; Hicks, S. F.; Howard, T. J.; Jackson, D. T.; Lenzen, P. D.; Nguyen, T. D.; Pecha, R. L.
2015-10-01
The differential cross sections for elastic and inelastic scattered neutrons from 56Fe have been measured at the University of Kentucky Accelerator Laboratory (www.pa.uky.edu/accelerator) for incident neutron energies between 2.0 and 8.0 MeV and for the angular range 30° to 150°. Time-of-flight techniques and pulse-shape discrimination were employed for enhancing the neutron energy spectra and for reducing background. An overview of the experimental procedures and data analysis for the conversion of neutron yields to differential cross sections will be presented. These include the determination of the energy-dependent detection efficiencies, the normalization of the measured differential cross sections, and the attenuation and multiple scattering corrections. Our results will also be compared to evaluated cross section databases and reaction model calculations using the TALYS code. This work is supported by grants from the U.S. Department of Energy-Nuclear Energy Universities Program: NU-12-KY-UK-0201-05, and the Donald A. Cowan Physics Institute at the University of Dallas.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wohlgemuth, John; Silverman, Timothy; Miller, David C.
This paper describes an effort to inspect and evaluate PV modules in order to determine what failure or degradation modes are occurring in field installations. This paper will report on the results of six site visits, including the Sacramento Municipal Utility District (SMUD) Hedge Array, Tucson Electric Power (TEP) Springerville, Central Florida Utility, Florida Solar Energy Center (FSEC), the TEP Solar Test Yard, and University of Toledo installations. The effort here makes use of a recently developed field inspection data collection protocol, and the results were input into a corresponding database. The results of this work have also been used to develop a draft of the IEC standard for climate and application specific accelerated stress testing beyond module qualification.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ritz, Steve; Jeltema, Tesla
One of the greatest mysteries in modern cosmology is the fact that the expansion of the universe is observed to be accelerating. This acceleration may stem from dark energy, an additional energy component of the universe, or may indicate that the theory of general relativity is incomplete on cosmological scales. The growth rate of large-scale structure in the universe, and particularly of the largest collapsed structures, clusters of galaxies, is highly sensitive to the underlying cosmology. Clusters will provide one of the single most precise methods of constraining dark energy with the ongoing Dark Energy Survey (DES). The accuracy of the cosmological constraints derived from DES clusters necessarily depends on having an optimized and well-calibrated algorithm for selecting clusters, as well as an optical richness estimator whose mean relation and scatter compared to cluster mass are precisely known. Calibrating the galaxy cluster richness-mass relation and its scatter was the focus of the funded work. Specifically, we employ X-ray observations and optical spectroscopy with the Keck telescopes of optically-selected clusters to calibrate the relationship between optical richness (the number of galaxies in a cluster) and underlying mass. This work also probes aspects of cluster selection, such as the accuracy of cluster centering, which are critical to weak lensing cluster studies.
NASA Astrophysics Data System (ADS)
Shibata, Hiromi; Kobayashi, Koichi; Iwai, Takeo; Hamabe, Yoshimi; Sasaki, Sho; Hasegawa, Sunao; Yano, Hajime; Fujiwara, Akira; Ohashi, Hideo; Kawamura, Toru; Nogami, Ken-ichi
2001-01-01
A microparticle (dust) ion source has been installed in the 3.75 MV Van de Graaff electrostatic accelerator and a new beam line for microparticle experiments has been built at the HIT facility of Research Center for Nuclear Science and Technology, the University of Tokyo. Microparticle acceleration has been successful in obtaining expected velocities of 1-20 km/s or more for micron- or submicron-sized particles. Development of in situ dust detectors on board satellites and spacecraft in the expected mass and velocity range of micrometeoroids and investigation of hypervelocity impact phenomena by using time-of-flight mass spectrometry, impact flash measurement and scanning electron microscope observation for metals, polymers and semiconductors bombarded by micron-sized particles have been started.
NASA Astrophysics Data System (ADS)
Assmann, R. W.; Ferrario, M.
2016-09-01
Particle accelerators are a field of continuing and growing success. Today about 30,000 accelerators are operated with various types of particles, including electrons, positrons, protons, neutrinos and various kinds of ions. These particles are used for the investigation of fundamental particles and forces in our universe. In parallel, a fast-growing field of accelerator-based photon science has developed since the 1970s. Modern particle beams produce unique photon pulses that are used in ground-breaking studies on fast processes in chemistry and biology, on the structures of viruses and bacteria, on the phenomenon of multi-resistance to medication, on the functioning of photosynthesis at the electronic level, and on other important challenges for humankind. Last but not least, numerous particle accelerators are being used every day for industrial and medical applications, including the irradiation of tumors in human patients.
Choosing order of operations to accelerate strip structure analysis in parameter range
NASA Astrophysics Data System (ADS)
Kuksenko, S. P.; Akhunov, R. R.; Gazizov, T. R.
2018-05-01
The paper considers the use of iterative methods in solving the sequence of linear algebraic systems obtained in the quasistatic analysis of strip structures with the method of moments. Through the analysis of four strip structures, the authors show that an additional acceleration (up to 2.21 times) of the iterative process can be obtained when solving linear systems repeatedly by choosing a proper order of operations and a suitable preconditioner. The obtained results can be used to accelerate the computer-aided design of various strip structures. The choice of the order of operations to accelerate the process is simple and universal, and could be applied not only to strip structure analysis but also to a wide range of computational problems.
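The reuse idea, keeping one preconditioner (and the previous solution as a warm start) across a whole sequence of slightly different systems, can be illustrated on a toy symmetric positive-definite sweep. This is a generic sketch in the spirit of the abstract, not the authors' method-of-moments solver; the matrices, the conjugate-gradient routine and the iteration counts here do not correspond to their structures or to the 2.21× figure:

```python
import numpy as np

def pcg(A, b, M_apply, x0=None, tol=1e-10, max_it=500):
    """Preconditioned conjugate gradients; returns (x, iteration count)."""
    x = np.zeros_like(b) if x0 is None else x0.copy()
    r = b - A @ x
    z = M_apply(r)
    p = z.copy()
    rz = r @ z
    for k in range(1, max_it + 1):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol * np.linalg.norm(b):
            return x, k
        z = M_apply(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x, max_it

rng = np.random.default_rng(0)
n = 200
Q = rng.standard_normal((n, n))
A0 = Q @ Q.T + n * np.eye(n)        # SPD base matrix (stand-in for a MoM matrix)
b = rng.standard_normal(n)

M0 = np.linalg.inv(A0)              # preconditioner built once, from the first system
plain_its, prec_its = [], []
x_prev = None
for k in range(5):                  # parameter sweep: slightly perturbed systems
    Ak = A0 + 0.01 * k * np.diag(rng.uniform(0.0, 1.0, n))
    _, it_plain = pcg(Ak, b, lambda r: r)                      # no preconditioner
    x_prev, it_prec = pcg(Ak, b, lambda r: M0 @ r, x0=x_prev)  # reuse M0 + warm start
    plain_its.append(it_plain)
    prec_its.append(it_prec)
print(plain_its, prec_its)
```

Because each matrix in the sweep differs only slightly from the first, a preconditioner factored once stays effective for the whole sequence, and the previous solution is an excellent starting guess; both effects cut the iteration count, which is the mechanism behind the speedup the abstract reports.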
SuperB Progress Report for Accelerator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Biagini, M.E.; Boni, R.; Boscolo, M.
2012-02-14
This report details the progress made by the SuperB Project in the area of the Collider since the publication of the SuperB Conceptual Design Report in 2007 and the Proceedings of SuperB Workshop VI in Valencia in 2008. With this document we propose a new electron-positron colliding beam accelerator to be built in Italy to study flavor physics in the B-meson system at an energy of 10 GeV in the center-of-mass. This facility is called a high luminosity B-factory, with the project name 'SuperB'. This project builds on a long history of successful e+e- colliders built around the world, as illustrated in Figure 1.1. The key advances in the design of this accelerator come from recent successes at the DAFNE collider at INFN in Frascati, Italy, at PEP-II at SLAC in California, USA, and at KEKB at KEK in Tsukuba, Japan, and from new concepts in beam manipulation at the interaction region (IP) called 'crab waist'. This new collider comprises two colliding beam rings, one at 4.2 GeV and one at 6.7 GeV, a common interaction region, a new injection system at full beam energies, and one of the two beams longitudinally polarized at the IP. Most of the new accelerator techniques needed for this collider have been achieved at other recently completed accelerators, including the new PETRA-3 light source at DESY in Hamburg (Germany) and the upgraded DAFNE collider at the INFN laboratory at Frascati (Italy), or during design studies of CLIC or the International Linear Collider (ILC). The project is to be designed and constructed by a worldwide collaboration of accelerator and engineering staff along with ties to industry. To save significant construction costs, many components from the PEP-II collider at SLAC will be recycled and used in this new accelerator. The interaction region will be designed in collaboration with the particle physics detector to guarantee successful mutual use.
The accelerator collaboration will consist of several groups at universities and national laboratories. In Italy these may include INFN Frascati and the University of Pisa; in the United States, SLAC, LBNL, BNL and several universities; in France, IN2P3, LAPP, and Grenoble; in Russia, BINP; in Poland, Krakow University; and in the UK, the Cockcroft Institute. The construction time for this collider is a total of about four years. The new tunnel can be bored in about a year. The new accelerator components can be built and installed in about four years. The shipping of components from PEP-II at SLAC to Italy will take about a year. A new linac and damping ring complex for the injector for the rings can be built in about three years. The commissioning of this new accelerator will take about a year, including the new electron and positron sources, new linac, new damping ring, new beam transport lines, two new collider rings and the Interaction Region. The new particle physics detector can be commissioned simultaneously with the accelerator. Once beam collisions start for particle physics, the luminosity will increase with time, likely reaching full design specifications after about two to three years of operation. After construction, the operation of the collider will be the responsibility of the Italian INFN governmental agency. The intent is to run this accelerator about ten months each year, with about one month for accelerator turn-on and nine months for colliding beams. The collider will need to operate for about 10 years to provide the required 50 ab⁻¹ requested by the detector collaboration. Both beams as anticipated in this collider will have properties that are excellent for use as sources of synchrotron radiation (SR). The expected photon properties are comparable to those of PETRA-3 or NSLS-II. The beam lines and user facilities needed to carry out this SR program are being investigated.
NASA Astrophysics Data System (ADS)
Yang, Chao Yuan
2012-05-01
Anomalous decelerations of the spacecraft Pioneer 10, Pioneer 11, etc. could be interpreted as a signal-delay effect between the speed of gravity and that of light, as reflected in virtual scale, similar to the covarying virtual-scale effect in relative motion (http://arxiv.org/html/math-ph/0001019v5). A finite speed of gravity faster than light could be inferred (http://arXiv.org/html/physics/0001034v2). Measurements of gravitational variations by a paraconical pendulum during a total solar eclipse infer the same (http://arXiv.org/html/physics/0001034v9). A finite superluminal speed of gravity is the necessary condition to imply that there exists a gravitational horizon (GH). Such a "GH" of our Universe would stretch far beyond the cosmic event horizon of light. Dark energy may be owing to mutually interactive gravitational horizons of cousin universes. A sufficient condition for the conjecture is that the dark energy would be increasing with the age of our Universe since accelerated expansion started about 5 Gyr ago, since more and more arrivals of the "GH" of distant cousin universes would interact with the "GH" of our Universe. The history of dark energy variations between then and now would be desirable (http://arXiv.org/html/physics/0001034). In the "GH" conjecture, the neighborhood of cousin universes would likely be boundless in 4D space-time, without beginning or end. The dark energy would keep all universes in continually accelerated expansion to eventual fragmentation. Fragments would crash and merge into bangs, big or small, to form another generation of cousin universes. These scenarios might offer a clue to what was before the big bang.
ERIC Educational Resources Information Center
Scrivener, Susan; Weiss, Michael J.; Sommo, Colleen
2012-01-01
In recent years, there has been unprecedented national focus on the importance of increasing the stubbornly low graduation rates of community college students. Most reforms that have been tried are short-term and address one or only a few barriers to student success. The City University of New York's (CUNY's) Accelerated Study in Associate…
A New {sup 14}C-AMS Facility at UFF- Niteroi, Brazil
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gomes, P. R. S.; Macario, K. D.; Anjos, R. M.
2010-08-04
We report a new Accelerator Mass Spectrometry facility at the Physics Institute of Fluminense Federal University in Brazil, the Nuclear Chronology Laboratory - LACRON. The sample preparation laboratory is ready to perform chemical treatment through graphitization and the acquisition of a Single Stage Accelerator Mass Spectrometry System is in progress. LACRON will be the first independent laboratory to perform the {sup 14}C-AMS technique not only in Brazil but in Latin America.
NASA Astrophysics Data System (ADS)
Salehpour, M.; Håkansson, K.; Possnert, G.; Wacker, L.; Synal, H.-A.
2016-03-01
A range of ion beam analysis activities are ongoing at Uppsala University, including Accelerator Mass Spectrometry (AMS). Various isotopes are used for AMS, but the isotope with the widest variety of applications is radiocarbon. Until recently, only the 5 MV Pelletron tandem accelerator had been used at our site for radiocarbon AMS, ordinarily using 12 MeV 14,13,12C3+ ions. Recently a new radiocarbon AMS system, the Green-MICADAS, developed by the ion physics group at ETH Zurich, was installed. The system has a number of outstanding features which will be described. The system operates at a terminal voltage of 175 kV and uses helium stripper gas, extracting singly charged carbon ions. The low- and high-energy mass spectrometers in the system are stigmatic dipole permanent magnets (0.42 and 0.97 T) requiring neither electrical power nor cooling water. The system measures both the 14C/12C and the 13C/12C ratios on-line. Performance of the system is presented for both standard mg samples as well as μg-sized samples.
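The isotope ratios such a system measures are converted to dates through the standard Stuiver-Polach convention, which uses the Libby mean life of 8033 yr; this relation is general radiocarbon practice, not specific to the MICADAS instrument.

```python
import math

def conventional_age(f_modern):
    """Conventional radiocarbon age in years BP from the measured fraction
    of modern carbon, via the Libby mean life of 8033 yr (the standard
    Stuiver-Polach convention, not specific to any one AMS system)."""
    return -8033.0 * math.log(f_modern)

# Half the modern 14C activity corresponds to one Libby half-life.
print(round(conventional_age(0.5)))   # ~5568 yr BP
```

A sample retaining half its modern ¹⁴C/¹²C ratio thus dates to one Libby half-life, about 5568 yr BP.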
Cosmic Acceleration, Dark Energy, and Fundamental Physics
NASA Astrophysics Data System (ADS)
Turner, Michael S.; Huterer, Dragan
2007-11-01
A web of interlocking observations has established that the expansion of the Universe is speeding up and not slowing, revealing the presence of some form of repulsive gravity. Within the context of general relativity the cause of cosmic acceleration is a highly elastic (p ≈ −ρ), very smooth form of energy called "dark energy" accounting for about 75% of the Universe. The "simplest" explanation for dark energy is the zero-point energy density associated with the quantum vacuum; however, all estimates for its value are many orders of magnitude too large. Other ideas for dark energy include a very light scalar field or a tangled network of topological defects. An alternate explanation invokes gravitational physics beyond general relativity. Observations and experiments underway, and more precise cosmological measurements and laboratory experiments planned for the next decade, will test whether or not dark energy is the quantum energy of the vacuum or something more exotic, and whether or not general relativity can self-consistently explain cosmic acceleration. Dark energy is the most conspicuous example of physics beyond the standard model and perhaps the most profound mystery in all of science.
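The link between a p ≈ −ρ component and repulsive gravity follows from the acceleration (second Friedmann) equation; this is textbook material rather than anything specific to the article above (units with c = 1):

```latex
\frac{\ddot a}{a} = -\frac{4\pi G}{3}\,(\rho + 3p)
\qquad\Longrightarrow\qquad
\ddot a > 0 \;\Longleftrightarrow\; w \equiv \frac{p}{\rho} < -\frac{1}{3} .
```

For w ≈ −1 the source term ρ + 3p ≈ −2ρ is negative, so gravity effectively repels, which is exactly the "repulsive gravity" the abstract describes.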
Breakthrough: Record-Setting Cavity
Ciovati, Gianluigi
2018-02-06
Gianluigi "Gigi" Ciovati, a superconducting radiofrequency scientist, discusses how scientists at the U.S. Department of Energy's Jefferson Lab in Newport News, VA, used ARRA funds to fabricate a niobium cavity for superconducting radiofrequency accelerators that has set a world record for energy efficiency. Jefferson Lab's scientists developed a new, super-hot treatment process that could soon make it possible to produce cavities more quickly and at less cost, benefitting research and healthcare around the world. Accelerators are critical to our efforts to study the structure of matter that builds our visible universe. They also are used to produce medical isotopes and particle beams for diagnosing and eradicating disease. And they offer the potential to power future nuclear power plants that produce little or no radioactive waste.
Development of a repetitive compact torus injector
NASA Astrophysics Data System (ADS)
Onchi, Takumi; McColl, David; Dreval, Mykola; Rohollahi, Akbar; Xiao, Chijin; Hirose, Akira; Zushi, Hideki
2013-10-01
A system for Repetitive Compact Torus Injection (RCTI) has been developed at the University of Saskatchewan. CTI is a promising fuelling technology to directly fuel the core region of tokamak reactors. In addition to fuelling, CTI also has the potential for (a) optimization of the density profile, and thus the bootstrap current, and (b) momentum injection. For steady-state reactor operation, RCTI is necessary. The approach to RCTI is to charge a storage capacitor bank with a large capacitance and quickly charge the CT capacitor bank through a stack of insulated-gate bipolar transistors (IGBTs). When the CT bank is fully charged, the IGBT stack will be turned off to isolate the banks, and the CT formation/acceleration sequence will start. After formation of each CT, the fast bank will be replenished and a new CT will be formed and accelerated. Circuits for the formation and the acceleration in the University of Saskatchewan CT Injector (USCTI) have been modified. Three CT shots at 10 Hz or eight shots at 1.7 Hz have been achieved. This work has been sponsored by the CRC and NSERC, Canada.
On the use of history of mathematics: an introduction to Galileo's study of free fall motion
NASA Astrophysics Data System (ADS)
Ponce Campuzano, Juan Carlos; Matthews, Kelly E.; Adams, Peter
2018-05-01
In this paper, we report on an experimental activity for discussing the concepts of speed, instantaneous speed and acceleration, generally introduced in first year university courses of calculus or physics. Rather than developing the ideas of calculus and using them to explain these basic concepts for the study of motion, we led 82 first year university students through Galileo's experiments designed to investigate the motion of falling bodies, and his geometrical explanation of his results, via simple dynamic geometric applets designed with GeoGebra. Our goal was to enhance the students' development of mathematical thinking. Through a scholarship of teaching and learning study design, we captured data from students before, during and after the activity. Findings suggest that the historical development presented to the students helped to show the growth and evolution of the ideas and made visible authentic ways of thinking mathematically. Importantly, the activity prompted students to question and rethink what they knew about speed and acceleration, and also to appreciate the novel concepts of instantaneous speed and acceleration at which Galileo arrived.
Marginal evidence for cosmic acceleration from Type Ia supernovae
NASA Astrophysics Data System (ADS)
Nielsen, J. T.; Guffanti, A.; Sarkar, S.
2016-10-01
The ‘standard’ model of cosmology is founded on the basis that the expansion rate of the universe is accelerating at present — as was inferred originally from the Hubble diagram of Type Ia supernovae. There exists now a much bigger database of supernovae so we can perform rigorous statistical tests to check whether these ‘standardisable candles’ indeed indicate cosmic acceleration. Taking account of the empirical procedure by which corrections are made to their absolute magnitudes to allow for the varying shape of the light curve and extinction by dust, we find, rather surprisingly, that the data are still quite consistent with a constant rate of expansion.
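The light-curve-shape and dust corrections mentioned above are conventionally applied through a linear (Tripp-style) standardization of the SALT2 fit parameters; the sketch below shows the standard form of this correction, not the authors' exact likelihood:

```latex
\mu = m_B^{*} - M + \alpha\, x_1 - \beta\, c ,
```

where m*_B is the peak apparent magnitude, x₁ the light-curve stretch, c the colour, and α, β, M are fitted globally alongside the cosmological parameters. How these nuisance parameters are treated statistically is precisely what the analysis above re-examines.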
DOT National Transportation Integrated Search
2015-06-01
For several years the Iowa Department of Transportation (DOT), Iowa State University, the Federal Highway Administration, : and several Iowa counties have been working to develop accelerated bridge construction (ABC) concepts, details, and processes....
Gauss-Bonnet chameleon mechanism of dark energy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ito, Yusaku; Nojiri, Shin'ichi
2009-05-15
As a model of the current accelerated expansion of the Universe, we consider a model of the scalar-Einstein-Gauss-Bonnet gravity. This model includes the propagating scalar modes, which might give a large correction to the Newton law. In order to avoid this problem, we propose an extension of the chameleon mechanism where the scalar mode becomes massive due to the coupling with the Gauss-Bonnet term. Since the Gauss-Bonnet invariant does not vanish near the Earth or in the Solar System, even in the vacuum, the scalar mode is massive even in the vacuum and the correction to the Newton law could be small. We also discuss the possibility that the model could describe simultaneously the inflation in the early Universe, in addition to the current accelerated expansion.
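Schematically, the proposed mechanism endows the scalar with an environment-dependent mass through its coupling to the Gauss-Bonnet invariant; the sketch below is a generic chameleon-style effective potential for orientation, not the paper's exact action (the coupling function ξ(φ) is my placeholder notation):

```latex
V_{\rm eff}(\phi) = V(\phi) + \xi(\phi)\,\mathcal{G},
\qquad
m_{\rm eff}^2 = \left.\frac{d^2 V_{\rm eff}}{d\phi^2}\right|_{\phi_{\min}} ,
```

so wherever the Gauss-Bonnet invariant 𝒢 is nonzero, as near the Earth even in vacuum, the minimum of V_eff shifts and the scalar acquires a mass, suppressing corrections to the Newton law.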
NASA Astrophysics Data System (ADS)
Mishra, Priti; Célérier, Marie-Noëlle; Singh, Tejinder P.
2015-01-01
Exact inhomogeneous solutions of Einstein's equations have been used in the literature to build models reproducing the cosmological data without dark energy. However, owing to the degrees of freedom pertaining to these models, it is necessary to break the degeneracy that often arises when distinguishing between them and accelerating universe models. We give an overview of redshift drift in inhomogeneous cosmologies, and explain how it serves this purpose. One class of models which fits the data is the Szekeres Swiss-cheese class, where non-spherically symmetric voids exhibit a typical size of about 400 Mpc. We present our calculation of the redshift drift in this model, and compare it with the results obtained by other authors for alternate scenarios.
NASA Astrophysics Data System (ADS)
Saisut, J.; Kusoljariyakul, K.; Rimjaem, S.; Kangrang, N.; Wichaisirimongkol, P.; Thamboon, P.; Rhodes, M. W.; Thongbai, C.
2011-05-01
The Plasma and Beam Physics Research Facility at Chiang Mai University has established a THz facility to focus on the study of ultra-short electron pulses. Short electron bunches can be generated from a system that consists of a radio-frequency (RF) gun with a thermionic cathode, an alpha magnet as a magnetic bunch compressor, and a linear accelerator as a post-acceleration section. The alpha magnet is a conventional and simple instrument for low-energy electron bunch compression. With the alpha magnet constructed in-house, several hundred femtosecond electron bunches for THz radiation production can be generated from the thermionic RF gun. The construction and performance of the alpha magnet, as well as some experimental results, are presented in this paper.
Future evolution in a backreaction model and the analogous scalar field cosmology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ali, Amna; Majumdar, A. S.
We investigate the future evolution of the universe using the Buchert framework for averaged backreaction in the context of a two-domain partition of the universe. We show that this approach allows for the possibility of the global acceleration vanishing at a finite future time, provided that none of the subdomains accelerate individually. The model at large scales is analogously described in terms of a homogeneous scalar field emerging with a potential that is fixed and free from phenomenological parametrization. The dynamics of this scalar field is explored in the analogous FLRW cosmology. We use observational data from Type Ia Supernovae, Baryon Acoustic Oscillations, and Cosmic Microwave Background to constrain the parameters of the model for a viable cosmology, providing the corresponding likelihood contours.
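The "Buchert framework" referred to above averages Einstein's equations over a spatial domain D; for irrotational dust the averaged Raychaudhuri equation takes the standard form quoted here for orientation (this is the textbook form, not reproduced from the paper itself):

```latex
3\,\frac{\ddot a_D}{a_D} = -4\pi G\,\langle\rho\rangle_D + \mathcal{Q}_D,
\qquad
\mathcal{Q}_D \equiv \frac{2}{3}\left(\langle\theta^2\rangle_D - \langle\theta\rangle_D^2\right) - 2\,\langle\sigma^2\rangle_D ,
```

where θ is the expansion rate and σ the shear. A sufficiently large kinematical backreaction Q_D can accelerate the averaged scale factor a_D even when no subdomain accelerates locally, which is the effect the authors exploit.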
NASA Astrophysics Data System (ADS)
Li, Dongdong; Peng, Xi; Peng, Yulian; Zhang, Liping; Chen, Xingyu; Zhuang, Jingli; Zhao, Fang; Yang, Xiangbo; Deng, Dongmei
2017-12-01
Abstract not available. Project supported by the National Natural Science Foundation of China (Grant Nos. 11374108, 11374107, and 11775083), the Funds from CAS Key Laboratory of Geospace Environment, University of Science and Technology of China, and the Innovation Project of Graduate School of South China Normal University (Grant No. 2016lkxm64).
Inflation and late-time acceleration from a double-well potential with cosmological constant
NASA Astrophysics Data System (ADS)
de Haro, Jaume; Elizalde, Emilio
2016-06-01
A model of a universe without big bang singularity is presented, which displays an early inflationary period ending just before a phase transition to a kination epoch. The model produces enough heavy particles so as to reheat the universe at temperatures in the MeV regime. After the reheating, it smoothly matches the standard Λ CDM scenario.
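A kination epoch is one in which the scalar field's kinetic energy dominates its potential; the resulting equation of state and dilution law are standard results, noted here for context:

```latex
w_\phi = \frac{\tfrac{1}{2}\dot\phi^2 - V(\phi)}{\tfrac{1}{2}\dot\phi^2 + V(\phi)} \;\to\; 1,
\qquad
\rho_\phi \propto a^{-6} ,
```

so the field's energy density redshifts away faster than radiation (which scales as a⁻⁴), allowing the heavy particles produced at the transition to come to dominate and reheat the universe.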
Magnetised Strings in Λ-Dominated Anisotropic Universe
NASA Astrophysics Data System (ADS)
Goswami, G. K.; Yadav, Anil Kumar; Dewangan, R. N.
2016-11-01
In this paper, we have searched for the existence of a Λ-dominated anisotropic universe filled with magnetized strings. The observed acceleration of the universe is explained by introducing a positive cosmological constant Λ into Einstein's field equations, which is mathematically equivalent to dark energy with equation of state (EOS) parameter set equal to -1. The present values of the matter and dark energy parameters (Ωm)0 and (ΩΛ)0 are estimated from high-redshift (0.3 ≤ z ≤ 1.4) SN Ia supernova data of observed apparent magnitude, along with their possible errors, taken from the Union 2.1 compilation. It is found that the best-fit values for (Ωm)0 and (ΩΛ)0 are 0.2920 and 0.7076 respectively, which are in good agreement with recent astrophysical observations from the latest surveys such as WMAP and Planck. Various physical parameters such as the matter and dark energy densities, the present age of the universe and the present value of the deceleration parameter have been obtained on the basis of the values of (Ωm)0 and (ΩΛ)0. We also estimate that the acceleration began in the past at z = 0.6845, i.e. 6.2341 Gyr ago.
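The quoted acceleration-onset redshift can be cross-checked against the flat ΛCDM expectation, where the deceleration parameter q vanishes when the matter density has fallen to twice the Λ density; this is an illustrative check, since the authors' anisotropic string model differs slightly from plain ΛCDM.

```python
# Flat-LambdaCDM cross-check: acceleration starts when q = 0, i.e. when
# Omega_m (1+z)^3 = 2 Omega_Lambda, giving
# z_acc = (2 Omega_Lambda / Omega_m)**(1/3) - 1.
omega_m0, omega_lambda0 = 0.2920, 0.7076   # best-fit values quoted above

z_acc = (2 * omega_lambda0 / omega_m0) ** (1 / 3) - 1
print(f"z_acc ~ {z_acc:.3f}")
```

This gives z_acc ≈ 0.69, close to the paper's 0.6845; the small difference reflects the anisotropy and string terms in the full model.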
NASA Astrophysics Data System (ADS)
Zhou, Qing; Mao, Chong-Feng; Hou, Lin
Industry-university-institute cooperation is an important means of accelerating technical development and achievement for high-tech enterprises. Considering that Zhejiang high-tech enterprises face problems including a low level of cooperation, single-channel distribution, weak secondary R&D ability and evident risk, government should play a guiding role by improving the information service system, raising the level of cooperation, promoting the construction of a scientific intermediary service organization system, and building a better environment for industry-university-institute cooperation.
NASA Astrophysics Data System (ADS)
Odintsov, S. D.; Oikonomou, V. K.
2016-06-01
We present some cosmological models which unify the late- and early-time acceleration eras with the radiation and the matter domination era, and we realize the cosmological models by using the theoretical framework of F(R) gravity. Particularly, the first model unifies the late- and early-time acceleration with the matter domination era, and the second model unifies all the evolution eras of our Universe. The two models are described in the same way at early and late times, and only the intermediate stages of the evolution have some differences. Each cosmological model contains two Type IV singularities which are chosen to occur one at the end of the inflationary era and one at the end of the matter domination era. The cosmological models at early times are approximately identical to the R² inflation model, so these describe a slow-roll inflationary era which ends when the slow-roll parameters become of order one. The inflationary era is followed by the radiation era and after that the matter domination era follows, which lasts until the second Type IV singularity, and then the late-time acceleration era follows. The models have two appealing features: firstly, they produce a nearly scale invariant power spectrum of primordial curvature perturbations and a tensor-to-scalar ratio which are compatible with the most recent observational data and secondly, it seems that the deceleration-acceleration transition is crucially affected by the presence of the second Type IV singularity which occurs at the end of the matter domination era. As we demonstrate, the Hubble horizon at early times shrinks, as expected for an initially accelerating Universe, then during the matter domination era, it expands and finally after the Type IV singularity, the Hubble horizon starts to shrink again, during the late-time acceleration era. Intriguingly enough, the deceleration-acceleration transition occurs after the second Type IV singularity.
In addition, we investigate which F(R) gravity can successfully realize each of the four cosmological epochs.
NASA Astrophysics Data System (ADS)
Tanaka, H.; Sakurai, Y.; Suzuki, M.; Masunaga, S.; Kinashi, Y.; Kashino, G.; Liu, Y.; Mitsumoto, T.; Yajima, S.; Tsutsui, H.; Maruhashi, A.; Ono, K.
2009-06-01
At Kyoto University Research Reactor Institute (KURRI), 275 clinical trials of boron neutron capture therapy (BNCT) had been performed as of March 2006, and the effectiveness of BNCT has been revealed. In order to further develop BNCT, it is desirable to supply accelerator-based epithermal-neutron sources that can be installed near hospitals. We proposed the method of filtering and moderating fast neutrons, which are emitted from the reaction between a beryllium target and 30-MeV protons accelerated by a cyclotron accelerator, using an optimum moderator system composed of iron, lead, aluminum and calcium fluoride. At present, an epithermal-neutron source has been under construction since June 2008. This system consists of a cyclotron accelerator, beam transport system, neutron-yielding target, filter, moderator and irradiation bed. In this article, an overview of this system and the properties of the treatment neutron beam optimized by the MCNPX Monte Carlo neutron transport code are presented. The distribution of biological-effect-weighted dose in a head phantom is compared with that of the Kyoto University Research Reactor (KUR). It is confirmed that for the accelerator, the biological-effect-weighted dose for a deeply situated tumor in the phantom is 18% larger than that for KUR, when the limit dose of the normal brain is 10 Gy-eq. The therapeutic time of the cyclotron-based neutron source is nearly one-quarter of that of KUR. The cyclotron-based epithermal-neutron source is a promising alternative to reactor-based neutron sources for treatments by BNCT.
Tamaki, S; Sakai, M; Yoshihashi, S; Manabe, M; Zushi, N; Murata, I; Hoashi, E; Kato, I; Kuri, S; Oshiro, S; Nagasaki, M; Horiike, H
2015-12-01
Mock-up experiment for development of accelerator based neutron source for Osaka University BNCT project was carried out at Birmingham University, UK. In this paper, spatial distribution of neutron flux intensity was evaluated by foil activation method. Validity of the design code system was confirmed by comparing measured gold foil activities with calculations. As a result, it was found that the epi-thermal neutron beam was well collimated by our neutron moderator assembly. Also, the design accuracy was evaluated to have less than 20% error. Copyright © 2015 Elsevier Ltd. All rights reserved.
Chandra Opens New Line of Investigation on Dark Energy
NASA Astrophysics Data System (ADS)
2004-05-01
Astronomers have detected and probed dark energy by applying a powerful, new method that uses images of galaxy clusters made by NASA's Chandra X-ray Observatory. The results trace the transition of the expansion of the Universe from a decelerating to an accelerating phase several billion years ago, and give intriguing clues about the nature of dark energy and the fate of the Universe. "Dark energy is perhaps the biggest mystery in physics," said Steve Allen of the Institute of Astronomy (IoA) at the University of Cambridge in England, and leader of the study. "As such, it is extremely important to make an independent test of its existence and properties." Allen and his colleagues used Chandra to study 26 clusters of galaxies at distances corresponding to light travel times of between one and eight billion years. These data span the time when the Universe slowed from its original expansion, before speeding up again because of the repulsive effect of dark energy. "We're directly seeing that the expansion of the Universe is accelerating by measuring the distances to these galaxy clusters," said Andy Fabian, also of the IoA, a co-author on the study. The new Chandra results suggest that the dark energy density does not change quickly with time and may even be constant, consistent with the "cosmological constant" concept first introduced by Albert Einstein. If so, the Universe is expected to continue expanding forever, so that in many billions of years only a tiny fraction of the known galaxies will be observable. If the dark energy density is constant, more dramatic fates for the Universe would be avoided. These include the "Big Rip," where dark energy increases until galaxies, stars, planets and eventually atoms are torn apart. The "Big Crunch," where the Universe eventually collapses on itself, would also be ruled out.
Chandra's probe of dark energy relies on the unique ability of X-ray observations to detect and study the hot gas in galaxy clusters. From these data, the ratio of the mass of the hot gas and the mass of the dark matter in a cluster can be determined. The observed values of the gas fraction depend on the assumed distance to the cluster, which in turn depends on the curvature of space and the amount of dark energy in the universe. Because galaxy clusters are so large, they are thought to represent a fair sample of the matter content in the universe. If so, then relative amounts of hot gas and dark matter should be the same for every cluster. Using this assumption, Allen and colleagues adjusted the distance scale to determine which one fit the data best. These distances show that the expansion of the Universe was first decelerating and then began to accelerate about six billion years ago. Chandra's observations agree with supernova results including those from the Hubble Space Telescope (HST), which first showed dark energy's effect on the acceleration of the Universe. Chandra's results are completely independent of the supernova technique - both in wavelength and the objects observed. Such independent verification is a cornerstone of science. In this case it helps to dispel any remaining doubts that the supernova technique is flawed. "Our Chandra method has nothing to do with other techniques, so they're definitely not comparing notes, so to speak," said Robert Schmidt of University of Potsdam in Germany, another coauthor on the study. Better limits on the amount of dark energy and how it varies with time are obtained by combining the X-ray results with data from NASA's Wilkinson Microwave Anisotropy Probe (WMAP), which used observations of the cosmic microwave background radiation to discover evidence for dark energy in the very early Universe.
Using the combined data, Allen and his colleagues found that dark energy makes up about 75% of the Universe, dark matter about 21%, and visible matter about 4%. Allen and his colleagues stress that the uncertainties in the measurements are such that the data are consistent with dark energy having a constant value. The present Chandra data do, however, allow for the possibility that the dark energy density is increasing with time. More detailed studies with Chandra, HST, WMAP and with the future mission Constellation-X should provide much more precise constraints on dark energy. "Until we better understand cosmic acceleration and the nature of the dark energy we cannot hope to understand the destiny of the Universe," said independent commentator Michael Turner, of the University of Chicago. The team conducting the research also included Harald Ebeling of the University of Hawaii and the late Leon van Speybroeck of the Harvard-Smithsonian Center for Astrophysics. These results will appear in an upcoming issue of the Monthly Notices of the Royal Astronomical Society. NASA's Marshall Space Flight Center, Huntsville, Ala., manages the Chandra program for NASA's Office of Space Science, Washington. Northrop Grumman of Redondo Beach, Calif., formerly TRW, Inc., was the prime development contractor for the observatory. The Smithsonian Astrophysical Observatory controls science and flight operations from the Chandra X-ray Center in Cambridge, Mass. Additional information and images are available at: http://chandra.harvard.edu and http://chandra.nasa.gov
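The distance sensitivity that the Chandra method exploits comes from how the two inferred masses scale with the assumed angular-diameter distance d_A; for an X-ray-derived gas mass and a hydrostatic total mass the standard scalings are:

```latex
M_{\rm gas} \propto d_A^{5/2}, \qquad M_{\rm tot} \propto d_A
\qquad\Longrightarrow\qquad
f_{\rm gas} \equiv \frac{M_{\rm gas}}{M_{\rm tot}} \propto d_A^{3/2} ,
```

so requiring the gas fraction f_gas to be the same for every cluster singles out the distance-redshift relation, and hence the dark energy parameters, that best fit the data.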
NASA Astrophysics Data System (ADS)
Demianski, Marek; Piedipalumbo, Ester; Sawant, Disha; Amati, Lorenzo
2017-02-01
Context. Explaining the accelerated expansion of the Universe is one of the fundamental challenges in physics today. Cosmography provides information about the evolution of the universe derived from measured distances, assuming only that the space-time geometry is described by the Friedmann-Lemaître-Robertson-Walker metric, and adopting an approach that effectively uses only Taylor expansions of basic observables. Aims: We perform a high-redshift analysis to constrain the cosmographic expansion up to the fifth order. It is based on the Union2 type Ia supernovae data set, the gamma-ray burst Hubble diagram, a data set of 28 independent measurements of the Hubble parameter, baryon acoustic oscillations measurements from galaxy clustering and the Lyman-α forest in the SDSS-III Baryon Oscillation Spectroscopic Survey (BOSS), and some Gaussian priors on h and ΩM. Methods: We performed a statistical analysis and explored the probability distributions of the cosmographic parameters. By building up their regions of confidence, we maximized our likelihood function using the Markov chain Monte Carlo method. Results: Our high-redshift analysis confirms that the expansion of the Universe currently accelerates; the estimation of the jerk parameter indicates a possible deviation from the standard ΛCDM cosmological model. Moreover, we investigate implications of our results for the reconstruction of the dark energy equation of state (EOS) by comparing the standard technique of cosmography with an alternative approach based on generalized Padé approximations of the same observables. Because these expansions converge better, it is possible to improve the constraints on the cosmographic parameters and also on the dark energy EOS. Conclusions: The estimation of the jerk and the DE parameters indicates at 1σ a possible deviation from the ΛCDM cosmological model.
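Cosmography parametrizes the expansion history through the Taylor coefficients of the scale factor; in the usual notation (deceleration q, jerk j, snap s) and for a flat geometry the luminosity distance begins as below, quoted to second order for orientation (the paper's fifth-order fit also involves the snap and lerk terms):

```latex
q = -\frac{\ddot a\, a}{\dot a^2}, \qquad
j = \frac{\dddot a\, a^2}{\dot a^3}, \qquad
s = \frac{\ddddot a\, a^3}{\dot a^4},
```

```latex
d_L(z) = \frac{c z}{H_0}\left[1 + \frac{1-q_0}{2}\, z
- \frac{1 - q_0 - 3q_0^2 + j_0}{6}\, z^2 + \mathcal{O}(z^3)\right] .
```

Since j₀ first enters at second order, data reaching the high redshifts used above are needed to constrain it, which is why the jerk estimate carries the ΛCDM-deviation signal.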
New holographic dark energy model with constant bulk viscosity in modified f(R,T) gravity theory
NASA Astrophysics Data System (ADS)
Srivastava, Milan; Singh, C. P.
2018-06-01
The aim of this paper is to study a new holographic dark energy (HDE) model in modified f(R,T) gravity theory within the framework of a flat Friedmann-Robertson-Walker model with bulk viscous matter content. It is thought that the negative pressure caused by the bulk viscosity can play the role of a dark energy component and drive the accelerating expansion of the universe; this motivates our study of such phenomena with bulk viscosity. In the specific model f(R,T)=R+λ T, where R is the Ricci scalar, T the trace of the energy-momentum tensor and λ is a constant, we find the solution for non-viscous and viscous new HDE models. We analyze the new HDE model with constant bulk viscosity, ζ =ζ 0= const., to explain the present accelerated expansion of the universe. We classify all possible scenarios (deceleration, acceleration and their transition) with possible positive and negative ranges of λ over the constraint on ζ 0 to analyze the evolution of the universe. We obtain the solutions of the scale factor and deceleration parameter, and discuss the evolution of the universe. We observe future finite-time singularities of type I and III under certain constraints on λ. We also investigate the statefinder and Om diagnostics of the viscous new HDE model to discriminate it from other existing dark energy models. At late times the viscous new HDE model approaches the ΛCDM model. We also discuss the thermodynamics and entropy of the model and find that it satisfies the second law of thermodynamics.
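In Eckart's first-order theory, which bulk-viscous cosmologies of this kind typically assume, the viscosity enters the Friedmann equations as an effective pressure; with the constant coefficient ζ₀ used above this reads:

```latex
p_{\rm eff} = p + \Pi = p - 3\,\zeta_0 H ,
```

where Π = −3ζ₀H is the bulk viscous pressure and H the Hubble rate. For large enough ζ₀ the effective pressure turns negative, which is the mechanism by which the viscosity can drive the accelerated expansion described in the abstract.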
NASA Technical Reports Server (NTRS)
Weekes, Trevor C.
1998-01-01
There are few things more intriguing in high energy astrophysics than the study of the highest energy particles in the universe. Where and how these particles achieve their extreme energies is of interest not only to the astrophysicist but also to the particle physicist. At GeV and TeV energies the problem is manageable since the physics is known and the acceleration processes feasible. But the energy spectrum extends to 10^20 eV, and there the problem of their origin is both more difficult and more interesting; in fact, at these high energies we do not even know what the particles are. The study of the origin and distribution of relativistic particles in the universe has been a challenge for more than 80 years, but it is only in recent years that the technology has become available to really address the question. Although something can be learned from studies of composition and energy spectrum, the origins (and thence the acceleration mechanisms) can only come from the direct study of the neutral particle component (in this respect the highest energy particles are effectively neutral, since they are virtually undeflected). The feasible channels of investigation are therefore the study of the arrival directions of: (1) TeV photons (covered by the following U.S. experiments: STACEE, Whipple/VERITAS, MILAGRO and, to some extent, by EGRET/GLAST); (2) neutrinos of TeV energy and above (AMANDA/KM3); (3) the highest energy cosmic rays (HiRes, Auger). While these studies represent a form of astronomy, they are the astronomy of the extraordinary universe, the universe populated by the most dynamic and physically exciting objects, the universe of the high energy astrophysicist whose cosmic laboratories represent conditions beyond anything that can be duplicated in a terrestrial laboratory.
This extraordinary astronomy may say little about the normal evolution of stars and galaxies but it opens windows into cosmic particle acceleration where new and strange physical processes take place.
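The claim that the highest energy particles are "effectively neutral" can be made quantitative with a Larmor-radius estimate. The sketch below is an illustration with assumed typical values (a 10^20 eV proton in a ~3 microgauss Galactic field), not a calculation from the text:

```python
# Larmor radius of an ultra-relativistic cosmic-ray proton in the Galactic
# magnetic field. For E >> m*c^2 the radius is r_L = E / (q * B * c).
# Assumed values (illustrative): E = 1e20 eV, B = 3 microgauss.

E_eV = 1e20            # particle energy, eV (top of the observed spectrum)
B_T  = 3e-10           # 3 microgauss expressed in tesla
e    = 1.602e-19       # elementary charge, C
c    = 2.998e8         # speed of light, m/s
kpc  = 3.086e19        # one kiloparsec in metres

E_J = E_eV * e                    # energy in joules
r_L_kpc = E_J / (e * B_T * c) / kpc

print(f"Larmor radius: {r_L_kpc:.0f} kpc")
```

The result, a few tens of kiloparsecs, exceeds the thickness of the Galactic disk, so such a particle is barely bent in transit and its arrival direction points back toward its source.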
Viscous cosmology in new holographic dark energy model and the cosmic acceleration
NASA Astrophysics Data System (ADS)
Singh, C. P.; Srivastava, Milan
2018-03-01
In this work, we study a flat Friedmann-Robertson-Walker universe filled with dark matter and viscous new holographic dark energy. We present four possible solutions of the model depending on the choice of the viscous term. We obtain the evolution of cosmological quantities such as the scale factor, deceleration parameter and transition redshift to observe the effect of viscosity on the evolution. We also examine two independent geometrical diagnostics for our model, namely the statefinder and the Om diagnostics. In the first case we study the new holographic dark energy model without viscosity and obtain power-law expansion of the universe, which gives constant deceleration and statefinder parameters. In the limit of the parameter, the model approaches the ΛCDM model. In the new holographic dark energy model with viscosity, the bulk viscous coefficient is assumed to be ζ = ζ0 + ζ1H, where ζ0 and ζ1 are constants, and H is the Hubble parameter. In this model, we obtain all possible solutions with the viscous term and analyze the expansion history of the universe. We draw the evolution graphs of the scale factor and the deceleration parameter. It is observed that the universe transits from deceleration to acceleration at late times for small values of ζ, whereas it accelerates very fast from the beginning for large values of ζ. By illustrating the evolutionary trajectories in the r-s and r-q planes, we find that our model behaves like quintessence for small values of the viscous coefficient and like a Chaplygin gas for large values of the bulk viscous coefficient at the early stage; at late times, however, the model closely resembles ΛCDM cosmology. The Om diagnostic has positive and negative curvature for the phantom and quintessence models, respectively, depending on ζ. Our study shows that bulk viscosity plays a very important role in the expansion history of the universe.
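The mechanism by which bulk viscosity drives a deceleration-to-acceleration transition can be seen in a minimal numerical sketch (not the authors' model or code): for pressureless matter with constant bulk viscosity ζ, the effective pressure is p_eff = -3ζH, and in units with 8πG = 1 the Raychaudhuri equation reduces to dH/dt = (3/2)H(ζ - H), so H relaxes to a de Sitter fixed point H = ζ while the deceleration parameter q = -1 - Ḣ/H² runs from matter-like positive values down to -1:

```python
# Minimal sketch of bulk-viscous acceleration (illustrative, not the paper's
# model): integrate dH/dt = 1.5 * H * (zeta - H) with forward Euler and
# track the deceleration parameter q = -1 - Hdot / H**2.

def evolve(H0, zeta, dt=1e-3, steps=100000):
    """Return the history of q(t) for a constant bulk viscosity zeta."""
    H = H0
    qs = []
    for _ in range(steps):
        Hdot = 1.5 * H * (zeta - H)
        qs.append(-1.0 - Hdot / H**2)   # deceleration parameter
        H += Hdot * dt                  # Euler step
    return qs

# Start with H well above the fixed point (early, matter-dominated era).
q_history = evolve(H0=10.0, zeta=1.0)
print(f"initial q = {q_history[0]:.2f}, late-time q = {q_history[-1]:.3f}")
```

The run starts with q > 0 (deceleration) and asymptotes to q = -1, the de Sitter value, mirroring the late-time approach to ΛCDM described in the abstract.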
All-Optical Quasi-Phase Matching for Laser Electron Acceleration
2016-06-01
TECHNICAL REPORT DTRA-TR-16-65, "All-Optical Quasi-Phase Matching for Laser Electron Acceleration". Distribution Statement A. ... outcomes of the project "All-Optical Quasi-Phase Matching for Laser Electron Acceleration", a project awarded to the Pennsylvania State University by the ... can be used to simultaneously extend the acceleration distance beyond several Rayleigh ranges and to achieve quasi-phase matching between the laser
Research and Development of a High Power-Laser Driven Electron-Accelerator Suitable for Applications
2011-06-12
autocorrelator to measure the temporal duration, an optical imaging system to correct for phase front tilt and a FROG device to measure and optimize the ... Phase II Task Summary ... D.1 Module I: High-Energy Electron Accelerator ... D.2 Module II: High-Energy ... During Phase I of the HRS program, the team from the University of Nebraska, Lincoln (UNL) made use of the unique capabilities of their high-power
... nuclear structure and reaction research, nuclear theory, medium energy nuclear research and accelerator ... structure of baryonic matter in the universe - the matter that makes up stars, planets and human life itself
Rayleigh-Taylor mixing with time-dependent acceleration
NASA Astrophysics Data System (ADS)
Abarzhi, Snezhana
2016-10-01
We extend the momentum model to describe Rayleigh-Taylor (RT) mixing driven by a time-dependent acceleration. The acceleration is a power-law function of time, similarly to astrophysical and plasma fusion applications. In RT flow the dynamics of a fluid parcel is driven by a balance per unit mass of the rates of momentum gain and loss. We find analytical solutions in the cases of balanced and imbalanced gains and losses, and identify their dependence on the acceleration exponent. The existence is shown of two typical regimes of self-similar RT mixing: acceleration-driven Rayleigh-Taylor-type and dissipation-driven Richtmyer-Meshkov-type, with the latter being in general non-universal. Possible scenarios are proposed for transitions from the balanced dynamics to the imbalanced self-similar dynamics. Scaling and correlation properties of RT mixing are studied on the basis of dimensional analysis. Departures of RT dynamics with time-dependent acceleration from the canonical cases of homogeneous turbulence, as well as blast waves with first- and second-kind self-similarity, are outlined. The work is supported by the US National Science Foundation.
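The momentum model referred to above can be sketched schematically. This is our reconstruction of the generic form of such a model, with the buoyancy and drag coefficients μ and C left symbolic; it is not the paper's exact system:

```latex
% Schematic momentum model for the amplitude h and velocity v of the mixing
% layer, with per-unit-mass momentum gain (buoyancy) and loss (drag):
\frac{dh}{dt} = v, \qquad
\frac{dv}{dt} = \mu\, g - C\,\frac{v^{2}}{h}, \qquad
g = G\, t^{a}.

% When gains and losses balance, \mu g \sim C v^{2}/h, the acceleration-driven
% self-similar Rayleigh-Taylor-type solution follows by power counting:
h \propto t^{\,a+2},
```

since h ~ t^{a+2} gives v ~ t^{a+1} and v²/h ~ t^{a}, term-by-term consistent with g ~ t^{a}. For sufficiently steep decaying accelerations this balance fails and the drag term dominates, yielding the dissipation-driven Richtmyer-Meshkov-type sub-regime described in the abstract.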
Rayleigh-Taylor mixing with space-dependent acceleration
NASA Astrophysics Data System (ADS)
Abarzhi, Snezhana
2016-11-01
We extend the momentum model to describe Rayleigh-Taylor (RT) mixing driven by a space-dependent acceleration. The acceleration is a power-law function of the space coordinate, similarly to astrophysical and plasma fusion applications. In RT flow the dynamics of a fluid parcel is driven by a balance per unit mass of the rates of momentum gain and loss. We find analytical solutions in the cases of balanced and imbalanced gains and losses, and identify their dependence on the acceleration exponent. The existence is shown of two typical sub-regimes of self-similar RT mixing: acceleration-driven Rayleigh-Taylor-type mixing and dissipation-driven Richtmyer-Meshkov-type mixing, with the latter being in general non-universal. Possible scenarios are proposed for transitions from the balanced dynamics to the imbalanced self-similar dynamics. Scaling and correlation properties of RT mixing are studied on the basis of dimensional analysis. Departures of RT dynamics with space-dependent acceleration from the canonical cases of homogeneous turbulence, as well as blast waves with first- and second-kind self-similarity, are outlined. The work is supported by the US National Science Foundation.
Relativistic Electrons in Ground-Level Enhanced (GLE) Solar Particle Events
NASA Astrophysics Data System (ADS)
Tylka, Allan J.; Dietrich, William; Novikova, Elena I.
Ground-level enhanced (GLE) solar particle events are one of the most spectacular manifestations of solar activity, with protons accelerated to multi-GeV energies in minutes. Although GLEs have been observed for more than sixty years, the processes by which the particle acceleration takes place remain controversial. Relativistic electrons provide another means of investigating the nature of the particle accelerator, since some processes that can efficiently accelerate protons and ions are less attractive candidates for electron acceleration. We report on observations of relativistic electrons, at ~0.5-5 MeV, during GLEs of 1976-2005, using data from the University of Chicago's Cosmic Ray Nuclei Experiment (CRNE) on IMP-8, whose electron response has recently been calibrated using GEANT-4 simulations (Novikova et al. 2010). In particular, we examine onset times, temporal structure, fluences, and spectra of electrons in GLEs and compare them with comparable quantities for relativistic protons derived from neutron monitors. We discuss the implications of these comparisons for the nature of the particle acceleration process.
The Narodny ion accelerator as an injector for a small cyclotron
NASA Astrophysics Data System (ADS)
Derenchuk, V.
1985-01-01
A 120 keV electrostatic accelerator is currently in use at the University of Manitoba as an ion implanter. It is proposed to use this accelerator (called the Narodny ion accelerator or NIA), upgraded to 200 keV, as an injector for a small light-ion cyclotron. This "minicyclotron" will consist of 6 sectors with four dees operating at 60 kV and variable frequency. The ions will be extracted at about 50 cm radius. The types of ions to be accelerated are H-, H+, D-, 3He2+, 4He2+, 6Li3+, and 7Li3+, with a maximum energy of about 4 MeV for the Li ions and between 2 and 3 MeV for the He ions. A beam current of close to 0.5 mA is anticipated for H+ and D+ ions, and high energy resolution (ΔE/E ~ 10^-3) is expected for all ions. The marriage of these two accelerators will give a very wide range of ion implantation energies (for certain ion species) as well as a source of particles for Rutherford backscatter analysis.
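The quoted energies and extraction radius are mutually consistent with the standard nonrelativistic cyclotron relation T = (qBr)²/(2m). The back-of-envelope check below is our illustration, not from the abstract: the magnetic field is inferred from the quoted ~4 MeV for 7Li3+ at ~50 cm, then used to predict the proton energy at the same radius.

```python
# Nonrelativistic cyclotron extraction energy: T = (q*B*r)**2 / (2*m).
# Illustration only: B is inferred from the abstract's Li figures, not given.

e   = 1.602e-19        # elementary charge, C
amu = 1.661e-27        # atomic mass unit, kg
MeV = 1.602e-13        # one MeV in joules

r = 0.50               # extraction radius, m (about 50 cm, from the abstract)
T_Li = 4.0 * MeV       # quoted maximum energy for Li ions
m_Li, q_Li = 7.016 * amu, 3 * e   # 7Li(3+)

# Field implied by the Li figures: B = sqrt(2*m*T) / (q*r)
B = (2 * m_Li * T_Li) ** 0.5 / (q_Li * r)

# Proton (H+) energy at the same field and extraction radius
m_p, q_p = 1.007 * amu, e
T_p = (q_p * B * r) ** 2 / (2 * m_p) / MeV

print(f"implied field B = {B:.2f} T, proton energy = {T_p:.1f} MeV")
```

The implied field is a modest ~0.5 T, and the resulting few-MeV proton energy sits on the same scale as the 2-3 MeV quoted for the He ions.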
Operation and reactivity measurements of an accelerator driven subcritical TRIGA reactor
NASA Astrophysics Data System (ADS)
O'Kelly, David Sean
Experiments were performed at the Nuclear Engineering Teaching Laboratory (NETL) in 2005 and 2006 in which a 20 MeV linear electron accelerator operating as a photoneutron source was coupled to the TRIGA (Training, Research, Isotope production, General Atomics) Mark II research reactor at the University of Texas at Austin (UT) to simulate the operation and characteristics of a full-scale accelerator driven subcritical system (ADSS). The experimental program provided a relatively low-cost substitute for the higher power and complexity of internationally proposed systems utilizing proton accelerators and spallation neutron sources for an advanced ADSS that may be used for the burning of high-level radioactive waste. Various instrumentation methods that permitted ADSS neutron flux monitoring in high gamma radiation fields were successfully explored and the data was used to evaluate the Stochastic Pulsed Feynman method for reactivity monitoring.
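The reason an external neutron source makes a subcritical core both usable and measurable is subcritical multiplication: the steady neutron population is the source rate amplified by M = 1/(1 - k_eff). The snippet below is a generic illustration with hypothetical k_eff values, not data from the dissertation:

```python
# Subcritical source multiplication: M = 1 / (1 - k_eff) for k_eff < 1.
# Small changes in k_eff near unity produce large, easily measured swings
# in the detector count rate, which is what reactivity monitoring exploits.

def multiplication(k_eff):
    """Return the subcritical multiplication factor M = 1/(1 - k_eff)."""
    if not 0.0 <= k_eff < 1.0:
        raise ValueError("subcritical systems require 0 <= k_eff < 1")
    return 1.0 / (1.0 - k_eff)

# Hypothetical illustrative values of k_eff:
for k in (0.90, 0.95, 0.98):
    print(f"k_eff = {k:.2f}  ->  M = {multiplication(k):.0f}")
```

Moving k_eff from 0.90 to 0.98 quintuples the multiplication, which is why methods such as the Stochastic Pulsed Feynman technique mentioned above can track reactivity from neutron count statistics alone.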
Overview of Light-Ion Beam Therapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chu, William T.
2006-03-16
In 1930, Ernest Orlando Lawrence at the University of California at Berkeley invented the cyclotron. One of his students, M. Stanley Livingston, constructed a 13-cm diameter model that had all the features of early cyclotrons, accelerating protons to 80 keV using less than 1 kV on a semi-circular accelerating electrode, now called the ''dee''. Soon after, Lawrence constructed the first two-dee 27-Inch (69-cm) Cyclotron, which produced protons and deuterons of 4.8 MeV. In 1939, Lawrence constructed the 60-Inch (150-cm) Cyclotron, which accelerated deuterons to 19 MeV. Just before WWII, Lawrence designed a 184-inch cyclotron, but the war prevented the building of this machine. Immediately after the war ended, the Veksler-McMillan principle of phase stability was put forward, which enabled the transformation of conventional cyclotrons to successful synchrocyclotrons. When completed, the 184-Inch Synchrocyclotron produced 340-MeV protons. Following it, more modern synchrocyclotrons were built around the globe, and the synchrocyclotrons in Berkeley and Uppsala, together with the Harvard cyclotron, would perform pioneering work in the treatment of human cancer using accelerated hadrons (protons and light ions). When the 184-Inch Synchrocyclotron was built, Lawrence asked Robert Wilson, one of his former graduate students, to look into the shielding requirements for the new accelerator. Wilson soon realized that the 184-Inch would produce a copious number of protons and other light ions that had enough energy to penetrate the human body, and could be used for treatment of deep-seated diseases. Realizing the advantages of delivering a larger dose in the Bragg peak when placed inside deep-seated tumors, he published in a medical journal a seminal paper on the rationale for using accelerated protons and light ions for treatment of human cancer.
The precise dose localization provided by protons and light ions means lower doses to normal tissues adjacent to the treatment volume compared to those in conventional (photon) treatments. Wilson wrote his personal account of this pioneering work in 1997. In 1954 Cornelius Tobias and John Lawrence at the Radiation Laboratory (the former E.O. Lawrence Berkeley National Laboratory) of the University of California, Berkeley performed the first therapeutic exposure of human patients to hadron (deuteron and helium ion) beams at the 184-Inch Synchrocyclotron. By 1984, or 30 years after the first proton treatment at Berkeley, programs of proton radiation treatment had opened at: University of Uppsala, Sweden, 1957; the Massachusetts General Hospital-Harvard Cyclotron Laboratory (MGH/HCL), USA, 1961; Dubna (1967), Moscow (1969) and St Petersburg (1975) in Russia; Chiba (1979) and Tsukuba (1983) in Japan; and Villigen, Switzerland, 1984. These centers used accelerators originally constructed for nuclear physics research. The experience at these centers has confirmed the efficacy of protons and light ions in increasing the tumor dose relative to normal tissue dose, with significant improvements in local control and patient survival for several tumor sites. M.R. Raju reviewed the early clinical studies. In 1990, the Loma Linda University Medical Center in California ushered in the age of dedicated medical accelerators when it commissioned its proton therapy facility with a 250-MeV synchrotron. Since then there has been a relatively rapid increase in the number of hospital-based proton treatment centers around the world, and by 2006 there are more than a dozen commercially-built facilities in use, five new facilities under construction, and more in planning stages. In the 1950s larger synchrotrons were built in the GeV region at Brookhaven (3-GeV Cosmotron) and at Berkeley (6-GeV Bevatron), and today most of the world's largest accelerators are synchrotrons.
With advances in accelerator design in the early 1970s, synchrotrons at Berkeley and Princeton accelerated ions with atomic numbers between 6 and 18, at energies that permitted the initiation of several biological studies. It is worth noting that when the Bevatron was converted to accelerate light ions, the main push came from biomedical users who wanted to use high-LET radiation for treating human cancer.
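The "dose localization" advantage rests on the proton range in tissue. A rough illustration uses the Bragg-Kleeman rule R = αE^p with commonly quoted approximate fit coefficients for protons in water; the numbers below are textbook approximations, not figures from this review:

```python
# Bragg-Kleeman range-energy rule for protons in water: R = alpha * E**p.
# The coefficients are approximate literature fits (illustrative only).

ALPHA = 0.0022   # cm / MeV**p, approximate fit for protons in water
P     = 1.77     # dimensionless exponent, approximate fit

def proton_range_cm(E_MeV):
    """Approximate range (depth of the Bragg peak) in water, in cm."""
    return ALPHA * E_MeV ** P

# A 250 MeV beam, the Loma Linda synchrotron energy quoted above, can
# reach essentially any deep-seated tumour in the body:
print(f"~{proton_range_cm(250.0):.0f} cm of water")
```

Since nearly all the dose is deposited in the sharp Bragg peak near the end of this range, tuning the beam energy places the peak inside the tumour while tissue beyond it receives almost nothing, which is the rationale Wilson articulated.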