Application of a fast skyline computation algorithm for serendipitous searching problems
NASA Astrophysics Data System (ADS)
Koizumi, Kenichi; Hiraki, Kei; Inaba, Mary
2018-02-01
Skyline computation is a method of extracting interesting entries from a large population with multiple attributes. These entries, called skyline or Pareto-optimal entries, are known to have extreme characteristics that cannot be found by outlier detection methods. Skyline computation is an important task for characterizing large amounts of data and selecting interesting entries with extreme features. When the population changes dynamically, the task of calculating a sequence of skyline sets is called continuous skyline computation. This task is known to be difficult to perform for the following reasons: (1) information on non-skyline entries must be stored, since they may join the skyline in the future; (2) the appearance or disappearance of even a single entry can change the skyline drastically; (3) it is difficult to adopt a geometric acceleration algorithm for skyline computation tasks with high-dimensional datasets. Our new algorithm, called jointed rooted-tree (JR-tree), manages entries using a rooted-tree structure. JR-tree delays extending the tree to deep levels to accelerate tree construction and traversal. In this study, we present the difficulties in extracting entries tagged with a rare label in high-dimensional space and the potential of fast skyline computation in low-latency cell identification technology.
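The dominance relation that defines such Pareto-optimal entries can be sketched in a few lines (a minimal illustration of the general definition, assuming larger attribute values are preferred; this is not the JR-tree algorithm itself):

```python
def dominates(a, b):
    """a dominates b if a is at least as good in every attribute
    and strictly better in at least one (here, larger is better)."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def skyline(points):
    """Naive O(n^2) skyline filter: keep every point that no other point dominates."""
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

pts = [(1, 5), (3, 3), (2, 4), (5, 1), (2, 2)]
# the skyline keeps every point except (2, 2), which (3, 3) dominates
result = skyline(pts)
```

This quadratic filter is the textbook baseline; tree-based structures such as the JR-tree described above exist to avoid comparing every pair of entries.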
Programs for skyline planning.
Ward W. Carson
1975-01-01
This paper describes four computer programs for the logging engineer's use in planning log harvesting by skyline systems. One program prepares terrain profile plots from maps mounted on a digitizer; the other programs prepare load-carrying capability and other information for single and multispan standing skylines and single span running skylines. In general, the...
An analysis of running skyline load path.
Ward W. Carson; Charles N. Mann
1971-01-01
This paper is intended for those who wish to prepare an algorithm to determine the load path of a running skyline. The mathematics of a simplified approach to this running skyline design problem are presented. The approach employs assumptions which reduce the complexity of the problem to the point where it can be solved on desk-top computers of limited capacities. The...
Hardwood silviculture and skyline yarding on steep slopes: economic and environmental impacts
John E. Baumgras; Chris B. LeDoux
1995-01-01
Ameliorating the visual and environmental impact associated with harvesting hardwoods on steep slopes will require the efficient use of skyline yarding along with silvicultural alternatives to clearcutting. In evaluating the effects of these alternatives on harvesting revenue, results of field studies and computer simulations were used to estimate costs and revenue for...
Secure Skyline Queries on Cloud Platform.
Liu, Jinfei; Yang, Juncheng; Xiong, Li; Pei, Jian
2017-04-01
Outsourcing data and computation to a cloud server provides a cost-effective way to support large-scale data storage and query processing. However, due to security and privacy concerns, sensitive data (e.g., medical records) need to be protected from the cloud server and other unauthorized users. One approach is to outsource encrypted data to the cloud server and have the cloud server perform query processing on the encrypted data only. It remains a challenging task to support various queries over encrypted data in a secure and efficient way such that the cloud server does not gain any knowledge about the data, query, or query result. In this paper, we study the problem of secure skyline queries over encrypted data. The skyline query is particularly important for multi-criteria decision making but also presents significant challenges due to its complex computations. We propose a fully secure skyline query protocol on data encrypted using semantically secure encryption. As a key subroutine, we present a new secure dominance protocol, which can also be used as a building block for other queries. Finally, we provide both serial and parallelized implementations and empirically study the protocols in terms of efficiency and scalability under different parameter settings, verifying the feasibility of our proposed solutions.
76 FR 49753 - Privacy Act of 1974; System of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-11
... Defense. DHA 14 System name: Computer/Electronics Accommodations Program for People with Disabilities... with ``Computer/Electronic Accommodations Program.'' System location: Delete entry and replace with ``Computer/Electronic Accommodations Program, Skyline 5, Suite 302, 5111 Leesburg Pike, Falls Church, VA...
Honeybees use the skyline in orientation.
Towne, William F; Ritrovato, Antoinette E; Esposto, Antonina; Brown, Duncan F
2017-07-01
In view-based navigation, animals acquire views of the landscape from various locations and then compare the learned views with current views in order to orient in certain directions or move toward certain destinations. One landscape feature of great potential usefulness in view-based navigation is the skyline, the silhouette of terrestrial objects against the sky, as it is distant, relatively stable and easy to detect. The skyline has been shown to be important in the view-based navigation of ants, but no flying insect has yet been shown definitively to use the skyline in this way. Here, we show that honeybees do indeed orient using the skyline. A feeder was surrounded with an artificial replica of the natural skyline there, and the bees' departures toward the nest were recorded from above with a video camera under overcast skies (to eliminate celestial cues). When the artificial skyline was rotated, the bees' departures were rotated correspondingly, showing that the bees oriented by the artificial skyline alone. We discuss these findings in the context of the likely importance of the skyline in long-range homing in bees, the likely importance of altitude in using the skyline, the likely role of ultraviolet light in detecting the skyline, and what we know about the bees' ability to resolve skyline features. © 2017. Published by The Company of Biologists Ltd.
Skyline: an open source document editor for creating and analyzing targeted proteomics experiments.
MacLean, Brendan; Tomazela, Daniela M; Shulman, Nicholas; Chambers, Matthew; Finney, Gregory L; Frewen, Barbara; Kern, Randall; Tabb, David L; Liebler, Daniel C; MacCoss, Michael J
2010-04-01
Skyline is a Windows client application for targeted proteomics method creation and quantitative data analysis. It is open source and freely available for academic and commercial use. The Skyline user interface simplifies the development of mass spectrometer methods and the analysis of data from targeted proteomics experiments performed using selected reaction monitoring (SRM). Skyline supports using and creating MS/MS spectral libraries from a wide variety of sources to choose SRM filters and verify results based on previously observed ion trap data. Skyline exports transition lists to and imports the native output files from Agilent, Applied Biosystems, Thermo Fisher Scientific and Waters triple quadrupole instruments, seamlessly connecting mass spectrometer output back to the experimental design document. The fast and compact Skyline file format is easily shared, even for experiments requiring many sample injections. A rich array of graphs displays results and provides powerful tools for inspecting data integrity as data are acquired, helping instrument operators to identify problems early. The Skyline dynamic report designer exports tabular data from the Skyline document model for in-depth analysis with common statistical tools. Single-click, self-updating web installation is available at http://proteome.gs.washington.edu/software/skyline. This web site also provides access to instructional videos, a support board, an issues list and a link to the source code project.
A hydraulic assist for a manual skyline lock
Cleveland J. Biller
1977-01-01
A hydraulic locking mechanism was designed to replace the manual skyline lock on a small standing skyline with gravity carriage. It improved the efficiency of the operation by reducing setup and takedown times and reduced the hazard to the crew.
48. VIEW OF SKYLINE DRIVE FROM THE ROCKY PEAK OF ...
48. VIEW OF SKYLINE DRIVE FROM THE ROCKY PEAK OF STONY MAN MOUNTAIN (EL. 4,011). LOOKING NORTHEAST. STONY MAN OVERLOOK VISIBLE IN THE DISTANCE. - Skyline Drive, From Front Royal, VA to Rockfish Gap, VA , Luray, Page County, VA
Conceptions of Height and Verticality in the History of Skyscrapers and Skylines
NASA Astrophysics Data System (ADS)
Maslovskaya, Oksana; Ignatov, Grigoriy
2018-03-01
The main goal of this article is to reveal the significance of height and verticality in the history of skyscrapers and skylines. The objectives are as follows: 1. trace the origin of design concepts related to the skyscraper; 2. discuss the perceived experience of the cultural aspects of skyscrapers and skylines; 3. describe the differences and similarities of the profiles of cities with comparable skylines. The methodology of the study is designed to explore the theory and principles of the skyscraper and skyline development phenomenon and its key features. The skyscraper reveals an assertive creative form of vertical design. Skyscraper construction also relates to ancient cultural symbolism, with the dominant vertical element as a main feature of an ordered space. The historical idea of height reaches back to the earliest civilizations, as in the Tower of Babel. Philosophical approaches, including elements of post-structuralism, have been included in the study of the skyscraper phenomenon. Skyscrapers and their resulting skylines are examined to show the connection of their origins with concepts of height and verticality. From the historical perspective, cities with skyscrapers and a skyline turn out to be an assertive manifestation of common ideas of height and verticality.
Skyline Harvesting in Appalachia
J. N. Kochenderfer; G. W. Wendel
1978-01-01
The URUS, a small standing skyline system, was tested in the Appalachian Mountains of north-central West Virginia. Some problems encountered with this small, mobile system are discussed. From the results of this test and observation of skyline systems used in the western United States, the authors suggest some machine characteristics that would be desirable for use in...
Operational test of the prototype peewee yarder.
Charles N. Mann; Ronald W. Mifflin
1979-01-01
An operational test of a small, prototype running skyline yarder was conducted early in 1978. Test results indicate that this yarder concept promises a low cost, high performance system for harvesting small logs where skyline methods are indicated. Timber harvest by thinning took place on 12 uphill and 2 downhill skyline roads, and clearcut harvesting was performed on...
The SKYTOWER and SKYMOBILE programs for locating and designing skyline harvest units.
R.H. Twito; R.J. McGaughey; S.E. Reutebuch
1988-01-01
PLANS, a software package for integrated timber-harvest planning, uses digital terrain models to provide the topographic data needed to fit harvest and transportation designs to specific terrain. SKYTOWER and SKYMOBILE are integral programs in the PLANS package and are used to design the timber-harvest units for skyline systems. SKYTOWER determines skyline payloads and...
Panorama: A Targeted Proteomics Knowledge Base
2015-01-01
Panorama is a web application for storing, sharing, analyzing, and reusing targeted assays created and refined with Skyline, an increasingly popular Windows client software tool for targeted proteomics experiments. Panorama allows laboratories to store and organize curated results contained in Skyline documents with fine-grained permissions, which facilitates distributed collaboration and secure sharing of published and unpublished data via a web-browser interface. It is fully integrated with the Skyline workflow and supports publishing a document directly to a Panorama server from the Skyline user interface. Panorama captures the complete Skyline document information content in a relational database schema. Curated results published to Panorama can be aggregated and exported as chromatogram libraries. These libraries can be used in Skyline to pick optimal targets in new experiments and to validate peak identification of target peptides. Panorama is open-source and freely available. It is distributed as part of LabKey Server, an open source biomedical research data management system. Laboratories and organizations can set up Panorama locally by downloading and installing the software on their own servers. They can also request freely hosted projects on https://panoramaweb.org, a Panorama server maintained by the Department of Genome Sciences at the University of Washington. PMID:25102069
Registration of Panoramic/Fish-Eye Image Sequence and LiDAR Points Using Skyline Features
Zhu, Ningning; Jia, Yonghong; Ji, Shunping
2018-01-01
We propose utilizing a rigorous registration model and a skyline-based method for automatic registration of LiDAR points and a sequence of panoramic/fish-eye images in a mobile mapping system (MMS). This method can automatically optimize original registration parameters and avoid the use of manual interventions in control point-based registration methods. First, the rigorous registration model between the LiDAR points and the panoramic/fish-eye image was built. Second, skyline pixels from panoramic/fish-eye images and skyline points from the MMS’s LiDAR points were extracted, relying on the difference in the pixel values and the registration model, respectively. Third, a brute force optimization method was used to search for optimal matching parameters between skyline pixels and skyline points. In the experiments, the original registration method and the control point registration method were used to compare the accuracy of our method with a sequence of panoramic/fish-eye images. The result showed: (1) the panoramic/fish-eye image registration model is effective and can achieve high-precision registration of the image and the MMS’s LiDAR points; (2) the skyline-based registration method can automatically optimize the initial attitude parameters, realizing a high-precision registration of a panoramic/fish-eye image and the MMS’s LiDAR points; and (3) the attitude correction values of the sequences of panoramic/fish-eye images are different, and the values must be solved one by one. PMID:29883431
Schilling, Birgit; Rardin, Matthew J; MacLean, Brendan X; Zawadzka, Anna M; Frewen, Barbara E; Cusack, Michael P; Sorensen, Dylan J; Bereman, Michael S; Jing, Enxuan; Wu, Christine C; Verdin, Eric; Kahn, C Ronald; Maccoss, Michael J; Gibson, Bradford W
2012-05-01
Despite advances in metabolic and postmetabolic labeling methods for quantitative proteomics, there remains a need for improved label-free approaches. This need is particularly pressing for workflows that incorporate affinity enrichment at the peptide level, where isobaric chemical labels such as isobaric tags for relative and absolute quantitation and tandem mass tags may prove problematic or where stable isotope labeling with amino acids in cell culture labeling cannot be readily applied. Skyline is a freely available, open source software tool for quantitative data processing and proteomic analysis. We expanded the capabilities of Skyline to process ion intensity chromatograms of peptide analytes from full scan mass spectral data (MS1) acquired during HPLC MS/MS proteomic experiments. Moreover, unlike existing programs, Skyline MS1 filtering can be used with mass spectrometers from four major vendors, which allows results to be compared directly across laboratories. The new quantitative and graphical tools now available in Skyline specifically support interrogation of multiple acquisitions for MS1 filtering, including visual inspection of peak picking and both automated and manual integration, key features often lacking in existing software. In addition, Skyline MS1 filtering displays retention time indicators from underlying MS/MS data contained within the spectral library to ensure proper peak selection. The modular structure of Skyline also provides well defined, customizable data reports and thus allows users to directly connect to existing statistical programs for post hoc data analysis. To demonstrate the utility of the MS1 filtering approach, we have carried out experiments on several MS platforms and have specifically examined the performance of this method to quantify two important post-translational modifications: acetylation and phosphorylation, in peptide-centric affinity workflows of increasing complexity using mouse and human models.
NASA Technical Reports Server (NTRS)
Page, Lance; Shen, C. N.
1991-01-01
This paper describes skyline-based terrain matching, a new method for locating the vantage point of laser range-finding measurements on a global map previously prepared by satellite or aerial mapping. Skylines can be extracted from the range-finding measurements and modelled from the global map, and are represented in parametric, cylindrical form with azimuth angle as the independent variable. The three translational parameters of the vantage point are determined with a three-dimensional matching of these two sets of skylines.
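The matching step described above can be illustrated with a hedged sketch: both the measured skyline and each candidate vantage point's map-derived skyline are represented as elevation angles sampled on a common azimuth grid (the paper's cylindrical, parametric form), and the candidate whose predicted profile best fits the measurement in the least-squares sense is selected. The function name and the scoring rule here are illustrative assumptions, not the authors' exact formulation:

```python
def match_skyline(measured, candidates):
    """Pick the candidate vantage point whose predicted skyline profile
    (elevation angle vs. azimuth, sampled on the same azimuth grid as the
    measurement) minimizes the sum of squared differences.

    measured:   sequence of elevation angles from range-finding data
    candidates: dict mapping a vantage point (x, y, z) to the skyline
                profile predicted from the global map at that point
    """
    best_point, best_err = None, float("inf")
    for point, predicted in candidates.items():
        err = sum((m - p) ** 2 for m, p in zip(measured, predicted))
        if err < best_err:
            best_point, best_err = point, err
    return best_point, best_err
```

In practice the candidate profiles would be generated from the digital map over a grid of the three translational parameters; this sketch only shows the scoring and selection step.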
3. ENVIRONMENT, FROM NORTH, SHOWING RICHMOND SKYLINE, BRIDGE DECK AND ...
3. ENVIRONMENT, FROM NORTH, SHOWING RICHMOND SKYLINE, BRIDGE DECK AND ROADWAY, AND NORTH APPROACH - Fifth Street Viaduct, Spanning Bacon's Quarter Branch Valley on Fifth Street, Richmond, Independent City, VA
2018-05-03
The Tsugaru Iwaki Skyline is a toll road in northern Japan, which partially ascends Mount Iwaki stratovolcano, and is notable for its steep gradient and 69 hairpin turns. The road ascends 806 meters over an average gradient of 8.66%, with some sections going up to 10%. The Tsugaru Iwaki Skyline has been considered one of the most dangerous mountain roads in the world. (Wikipedia) The image was acquired May 26, 2015, and is located at 40.6 degrees north, 140.3 degrees east. https://photojournal.jpl.nasa.gov/catalog/PIA22385
2010-05-01
Skyline Algorithms. 2.2.1 Block-Nested Loops. A simple way to find the skyline is to use the block-nested loops (BNL) algorithm [3], which is the algorithm ... individuals dominated by an NDS member are discarded. After every individual has been compared with the NDS, the NDS is the dataset's skyline. In the best case for BNL ... The sort-filter-skyline (SFS) algorithm [4] is a variation on BNL that first introduces the idea of initially ordering the individuals by a monotonically increasing scoring...
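The BNL idea sketched in this excerpt, keeping a window (the NDS) of non-dominated individuals and comparing each candidate against it, can be written compactly as follows (a simplified illustration assuming larger values are better, not the cited implementation, which processes the data in blocks):

```python
def bnl_skyline(points):
    """Block-nested-loops (BNL) sketch: maintain a window of non-dominated
    points (the NDS); each candidate is compared against the window."""
    def dominates(a, b):
        # all attributes at least as good and the points differ => strict dominance
        return all(x >= y for x, y in zip(a, b)) and a != b

    window = []
    for p in points:
        if any(dominates(w, p) for w in window):
            continue  # p is dominated by an NDS member: discard it
        # p survives: evict window entries that p dominates, then add p
        window = [w for w in window if not dominates(p, w)] + [p]
    return window  # after all comparisons, the NDS is the skyline
```

Pre-sorting the input by a monotone scoring function, as SFS does, guarantees that no window entry is ever evicted, which is the main saving over plain BNL.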
77 FR 15118 - Buy American Exceptions Under the American Recovery and Reinvestment Act of 2009
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-14
... heat pumps for the Skyline Crest Sustainability Upgrade project. FOR FURTHER INFORMATION CONTACT... Skyline Crest Sustainability Upgrade project. The exception was granted by HUD on the basis that the...
Lapierre, Marguerite; Blin, Camille; Lambert, Amaury; Achaz, Guillaume; Rocha, Eduardo P C
2016-07-01
Recent studies have linked demographic changes and epidemiological patterns in bacterial populations using coalescent-based approaches. We identified 26 studies using skyline plots and found that 21 inferred overall population expansion. This surprising result led us to analyze the impact of natural selection, recombination (gene conversion), and sampling biases on demographic inference using skyline plots and site frequency spectra (SFS). Forward simulations based on biologically relevant parameters from Escherichia coli populations showed that theoretical arguments on the detrimental impact of recombination, and especially natural selection, on the reconstructed genealogies cannot be ignored in practice. In fact, both processes systematically lead to spurious interpretations of population expansion in skyline plots (and in SFS for selection). Weak purifying selection, and especially positive selection, had important effects on skyline plots, showing patterns akin to those of population expansions. State-of-the-art techniques to remove recombination further amplified these biases. We simulated three common sampling biases in microbiological research: uniform, clustered, and mixed sampling. Alone, or together with recombination and selection, they further mislead demographic inferences, producing almost any possible skyline shape or SFS. Interestingly, sampling sub-populations also affected skyline plots and SFS, because the coalescent rates of populations and their sub-populations had different distributions. This study suggests that extreme caution is needed to infer demographic changes solely based on reconstructed genealogies. We suggest that the development of novel sampling strategies and the joint analyses of diverse population genetic methods are strictly necessary to estimate demographic changes in populations where selection, recombination, and biased sampling are present. © The Author 2016.
Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.
Block 2. Photograph represents general view taken from the north/west ...
Block 2. Photograph represents general view taken from the north/west region of the May D & F Tower. Photograph shows the main public gathering space for Skyline Park and depicts a light feature and an Information sign - Skyline Park, 1500-1800 Arapaho Street, Denver, Denver County, CO
ERIC Educational Resources Information Center
Skyline Coll., San Bruno, CA.
A joint project was conducted between Toyota Motor Sales and Skyline College (in the San Francisco, California, area) to create an automotive technician training program that would serve the needs of working adults. During the project, a model high technology curriculum suitable for adults was developed, the quality of instruction available for…
The Automatic Recognition of the Abnormal Sky-subtraction Spectra Based on Hadoop
NASA Astrophysics Data System (ADS)
An, An; Pan, Jingchang
2017-10-01
Skylines are superimposed on the target spectrum as a main source of noise. If the spectrum still contains a large number of high-strength skylight residuals after sky-subtraction processing, it will hinder the follow-up analysis of the target spectrum. At the same time, LAMOST can observe a large quantity of spectroscopic data every night, so we need an efficient platform to recognize large numbers of abnormal sky-subtraction spectra quickly. Hadoop, as a distributed parallel data computing platform, can process large amounts of data effectively. In this paper, we first perform continuum normalization and then present a simple and effective method to automatically recognize abnormal sky-subtraction spectra on the Hadoop platform. Experiments show that the Hadoop platform can carry out the recognition with greater speed and efficiency, and that the simple method can effectively recognize abnormal sky-subtraction spectra and find abnormal skyline positions of different residual strengths; it can be applied to the automatic detection of abnormal sky-subtraction in large numbers of spectra.
A Skyline Plugin for Pathway-Centric Data Browsing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Degan, Michael G.; Ryadinskiy, Lillian; Fujimoto, Grant M.
For targeted proteomics to be broadly adopted in biological laboratories as a routine experimental protocol, wet-bench biologists must be able to approach SRM assay design in the same way they approach biological experimental design. Most often, biological hypotheses are envisioned in a set of protein interactions, networks and pathways. We present a plugin for the popular Skyline tool that presents public mass spectrometry data in a pathway-centric view to assist users in browsing available data and determining how to design quantitative experiments. Selected proteins and their underlying mass spectra are imported to Skyline for further assay design (transition selection). The same plugin can be used for hypothesis-driven DIA data analysis, again utilizing the pathway view to help narrow down the set of proteins which will be investigated. The plugin is backed by the PNNL Biodiversity Library, a corpus of 3 million peptides from >100 organisms, and the draft human proteome. Users can upload personal data to the plugin to use the pathway navigation prior to importing their own data into Skyline.
A Skyline Plugin for Pathway-Centric Data Browsing
NASA Astrophysics Data System (ADS)
Degan, Michael G.; Ryadinskiy, Lillian; Fujimoto, Grant M.; Wilkins, Christopher S.; Lichti, Cheryl F.; Payne, Samuel H.
2016-11-01
For targeted proteomics to be broadly adopted in biological laboratories as a routine experimental protocol, wet-bench biologists must be able to approach selected reaction monitoring (SRM) and parallel reaction monitoring (PRM) assay design in the same way they approach biological experimental design. Most often, biological hypotheses are envisioned in a set of protein interactions, networks, and pathways. We present a plugin for the popular Skyline tool that presents public mass spectrometry data in a pathway-centric view to assist users in browsing available data and determining how to design quantitative experiments. Selected proteins and their underlying mass spectra are imported to Skyline for further assay design (transition selection). The same plugin can be used for hypothesis-driven data-independent acquisition (DIA) data analysis, again utilizing the pathway view to help narrow down the set of proteins that will be investigated. The plugin is backed by the Pacific Northwest National Laboratory (PNNL) Biodiversity Library, a corpus of 3 million peptides from >100 organisms, and the draft human proteome. Users can upload personal data to the plugin to use the pathway navigation prior to importing their own data into Skyline.
Spectral Skyline Separation: Extended Landmark Databases and Panoramic Imaging
Differt, Dario; Möller, Ralf
2016-01-01
Evidence from behavioral experiments suggests that insects use the skyline as a cue for visual navigation. However, changes in lighting conditions, over hours, days, or possibly seasons, significantly affect the appearance of the sky and ground objects. One possible solution to this problem is to extract the “skyline” by an illumination-invariant classification of the environment into two classes, ground objects and sky. In a previous study (Insect models of illumination-invariant skyline extraction from UV (ultraviolet) and green channels), we examined the idea of using two different color channels available to many insects (UV and green) to perform this segmentation. We found that for suburban scenes in temperate zones, where the skyline is dominated by trees and artificial objects like houses, a “local” UV segmentation with adaptive thresholds applied to individual images leads to the most reliable classification. Furthermore, a “global” segmentation with fixed thresholds (trained on an image dataset recorded over several days) using UV-only information is only slightly worse than using both the UV and green channels. In this study, we address three issues: First, to extend the limited range of environments covered by the dataset collected in the previous study, we gathered additional data samples of skylines consisting of minerals (stones, sand, earth) as ground objects. We show that, also for mineral-rich environments, UV-only segmentation achieves a quality comparable to multi-spectral (UV and green) segmentation. Second, we collected a wide variety of ground objects to examine their spectral characteristics under different lighting conditions. On the one hand, we found that the special case of diffusely illuminated minerals makes it more difficult to reliably separate ground objects from the sky.
On the other hand, the spectral characteristics of this collection of ground objects align well with the data collected in the skyline databases; the increased variety of ground objects thus strengthens the validity of our findings for novel environments. Third, we collected omnidirectional images of skylines, as often used for visual navigation tasks, using a UV-reflective hyperbolic mirror. We show that “local” separation techniques can be adapted to panoramic images by splitting the image into segments and finding individual thresholds for each segment. In contrast, this is not possible for “global” separation techniques. PMID:27690053
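The “local” per-segment thresholding idea described above can be sketched in a few lines: the panoramic image is split into azimuthal segments and a separate adaptive threshold is chosen for each. The sketch below is purely illustrative (the per-segment mean stands in for the authors' adaptive threshold, and the function name is invented), assuming a single-channel UV intensity image:

```python
import numpy as np

def segment_skyline(uv_image, n_segments=8):
    """Classify each pixel as sky (True) or ground (False) by splitting a
    panoramic UV image into azimuthal segments and applying an adaptive
    threshold (here simply the segment's mean intensity) per segment."""
    h, w = uv_image.shape
    mask = np.zeros((h, w), dtype=bool)
    bounds = np.linspace(0, w, n_segments + 1, dtype=int)
    for lo, hi in zip(bounds[:-1], bounds[1:]):
        segment = uv_image[:, lo:hi]
        threshold = segment.mean()      # per-segment adaptive threshold
        mask[:, lo:hi] = segment > threshold  # sky appears brighter in UV
    return mask
```

Because each segment gets its own threshold, brightness gradients across the panorama do not force a single global compromise, which is the point of the “local” scheme.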
J. E. Baumgras; C. B. LeDoux; J. R. Sherar
1993-01-01
To evaluate the potential for moderating the visual impact and soil disturbance associated with timber harvesting on steep-slope hardwood sites, thinning and shelterwood harvests were conducted with a skyline yarding system. Operations were monitored to document harvesting production, residual stand damage, soil disturbance, and visual quality. Yarding costs for...
Skyline Gathers K-12 Together Under One Roof.
ERIC Educational Resources Information Center
American School Board Journal, 1968
1968-01-01
Skyline School is a flexible and economical elementary and high school design for 400 pupils. The library, a large resource center serving all ages, and the administration offices are accented by landscaped courts. There are two instructional material centers per grade grouping of K-6 and 7-12. Grades 1-6 surround the kindergarten, which has…
NASA Technical Reports Server (NTRS)
Dunham, R. S.
1976-01-01
FORTRAN-coded out-of-core equation solvers that use direct methods to solve symmetric banded systems of simultaneous algebraic equations. Banded, frontal, and column (skyline) solvers were studied, as well as solvers that can partition the working area and thus fit into any available core. Comparison timings are presented for several typical two-dimensional and three-dimensional continuum-type grids of elements with and without midside nodes. Extensive conclusions are also given.
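As a concrete illustration of the column (skyline) storage compared in such solvers, the sketch below packs a symmetric matrix by keeping, for each column, only the entries from the first nonzero row down to the diagonal. This is a minimal Python illustration of the storage scheme only, not the report's out-of-core FORTRAN solvers; the function names are invented for this example:

```python
import numpy as np

def to_skyline(A):
    """Pack a symmetric matrix into column (skyline) form: for each column j,
    keep only the entries from the first nonzero row down to the diagonal.
    Returns the packed columns and, per column, the starting row index."""
    n = A.shape[0]
    columns, starts = [], []
    for j in range(n):
        nz = np.nonzero(A[: j + 1, j])[0]
        start = int(nz[0]) if nz.size else j
        starts.append(start)
        columns.append(A[start : j + 1, j].copy())
    return columns, starts

def get(columns, starts, i, j):
    """Retrieve A[i, j] from skyline storage (symmetry: use the upper triangle)."""
    if i > j:
        i, j = j, i
    return columns[j][i - starts[j]] if i >= starts[j] else 0.0
```

The payoff of the scheme is that a direct factorization fills in entries only inside each column's profile, so the packed columns are exactly the storage the solver needs.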
Parallel-Vector Algorithm For Rapid Structural Analysis
NASA Technical Reports Server (NTRS)
Agarwal, Tarun R.; Nguyen, Duc T.; Storaasli, Olaf O.
1993-01-01
New algorithm developed to overcome deficiency of skyline storage scheme by use of variable-band storage scheme. Exploits both parallel and vector capabilities of modern high-performance computers. Gives engineers and designers opportunity to include more design variables and constraints during optimization of structures. Enables use of more refined finite-element meshes to obtain improved understanding of complex behaviors of aerospace structures leading to better, safer designs. Not only attractive for current supercomputers but also for next generation of shared-memory supercomputers.
ERIC Educational Resources Information Center
Burns, Robert J.
The major purpose of this evaluation report is to scrutinize the Skyline Wide Educational Plan (SWEP) research methods and analytical schemes and to communicate the project's constituency priorities relative to the educational programs and processes of the future. A Delphi technique was used as the primary mechanism for gathering and scrutinizing…
Tree damage from skyline logging in a western larch/Douglas-fir stand
Robert E. Benson; Michael J. Gonsior
1981-01-01
Damage to shelterwood leave trees and to understory trees in shelterwood and clearcut logging units logged with skyline yarders was measured, and related to stand conditions, harvesting specifications, and yarding system-terrain interactions. About 23 percent of the marked leave trees in the shelterwood units were killed in logging, and about 10 percent had moderate to...
Efficiently Selecting the Best Web Services
NASA Astrophysics Data System (ADS)
Goncalves, Marlene; Vidal, Maria-Esther; Regalado, Alfredo; Yacoubi Ayadi, Nadia
Emerging technologies and Linked Data initiatives have motivated the publication of a large number of datasets, and provide the basis for publishing Web services and tools to manage the available data. This wealth of resources opens a world of possibilities to satisfy user requests. However, Web services may have similar functionality but different performance; therefore, it is necessary to identify, among the Web services that satisfy a user request, the ones with the best quality. In this paper we propose a hybrid approach that combines reasoning tasks with ranking techniques, aimed at selecting the Web services that best implement a user request. Web service functionalities are described in terms of input and output attributes annotated with existing ontologies, non-functional properties are represented as Quality of Service (QoS) parameters, and user requests correspond to conjunctive queries whose sub-goals impose restrictions on the functionality and quality of the services to be selected. The ontology annotations are used in different reasoning tasks to infer implicit service properties and to augment the size of the service search space. Furthermore, QoS parameters are considered by a ranking metric to classify the services according to how well they meet a user's non-functional condition. We assume that all the QoS parameters of the non-functional condition are equally important, and apply the Top-k Skyline approach to select the k services that best meet this condition. Our proposal relies on a two-fold solution: a deductive-based engine that performs different reasoning tasks to discover the services that satisfy the requested functionality, and an efficient implementation of the Top-k Skyline approach to compute the top-k services that meet the majority of the QoS constraints.
Our Top-k Skyline solution exploits the properties of the Skyline Frequency metric and identifies the top-k services by analyzing only a subset of the services that meet the non-functional condition. We report on the effects of the proposed reasoning tasks, the quality of the top-k services selected by the ranking metric, and the performance of the proposed ranking techniques. Our results suggest that the number of services can be augmented by up to two orders of magnitude. In addition, our ranking techniques are able to identify services that have the best values in at least half of the QoS parameters, while performance is improved.
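The Skyline Frequency idea can be shown with a brute-force sketch: a service's frequency is the number of attribute subspaces in which it belongs to the skyline, and the top-k services are the k most frequent. This toy version enumerates every subspace, which the paper's implementation explicitly avoids; it assumes lower QoS values are better (e.g., latency, cost), and all names are illustrative:

```python
from itertools import combinations

def dominates(a, b, dims):
    """a dominates b on dims: no worse everywhere, strictly better somewhere
    (lower QoS values assumed better, e.g. latency or cost)."""
    return all(a[d] <= b[d] for d in dims) and any(a[d] < b[d] for d in dims)

def skyline(points, dims):
    """Points not dominated by any other point on the given dimensions."""
    return [p for p in points
            if not any(dominates(q, p, dims) for q in points if q is not p)]

def top_k_by_skyline_frequency(points, k):
    """Rank each point by the number of dimension subspaces in which it
    appears in the skyline, then return the k most frequent."""
    ndim = len(points[0])
    freq = {id(p): 0 for p in points}
    for r in range(1, ndim + 1):
        for dims in combinations(range(ndim), r):
            for p in skyline(points, dims):
                freq[id(p)] += 1
    return sorted(points, key=lambda p: freq[id(p)], reverse=True)[:k]
```

Since the number of subspaces grows exponentially with the number of QoS attributes, the exhaustive loop above is only viable for small dimensionality, which is precisely why pruning strategies like the paper's matter.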
Ait Kaci Azzou, Sadoune; Larribe, Fabrice; Froda, Sorana
2015-01-01
The effective population size over time (demographic history) can be retraced from a sample of contemporary DNA sequences. In this paper, we propose a novel methodology based on importance sampling (IS) for exploring such demographic histories. Our starting point is the generalized skyline plot, with the main difference being that our procedure, the skywis plot, uses a large number of genealogies. The information provided by these genealogies is combined according to the IS weights. Thus, we compute a weighted average of the effective population sizes on specific time intervals (epochs), where the genealogies that agree more with the data are given more weight. We illustrate by a simulation study that the skywis plot correctly reconstructs the recent demographic history under the scenarios most commonly considered in the literature. In particular, our method can capture a change point in the effective population size, and its overall performance is comparable with that of the Bayesian skyline plot. We also introduce the case of serially sampled sequences and illustrate that it is possible to improve the performance of the skywis plot in the case of an exponential expansion of the effective population size. PMID:26300910
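The combination step described above, a weighted average of per-epoch effective population sizes across many genealogies using importance-sampling weights, can be sketched directly. This is illustrative only: the real method derives the per-genealogy estimates from coalescent waiting times, which is omitted here, and the function name is invented:

```python
def skywis_epoch_estimates(genealogies, weights):
    """Combine per-epoch effective-population-size estimates from many
    genealogies into one curve, weighting each genealogy by its
    importance-sampling weight (how well it agrees with the data).
    `genealogies` is a list of per-epoch Ne lists; `weights` the IS weights."""
    total = sum(weights)
    n_epochs = len(genealogies[0])
    return [sum(w * g[e] for g, w in zip(genealogies, weights)) / total
            for e in range(n_epochs)]
```

Genealogies with higher IS weight pull the epoch estimates toward their own Ne values, which is the sense in which "genealogies that agree more with the data are given more weight."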
Camera Geolocation From Mountain Images
2015-09-17
be reliably extracted from query images. However, in real-life scenarios the skyline in a query image may be blurred or invisible, due to occlusions...extracted from multiple mountain ridges is critical to reliably geolocating challenging real-world query images with blurred or invisible mountain skylines...Buddemeier, A. Bissacco, F. Brucher, T. Chua, H. Neven, and J. Yagnik, “Tour the world: building a web-scale landmark recognition engine,” in Proc. of
Mountain Logging Symposium Proceedings Held in West Virginia on Jun 5-7, 1984
1984-06-07
and board" analysis (Lysons and Mann 1967) provided a method to make skyline payload determination feasible using topographic maps or field run... Lysons, Hilton H.; Mann, Charles N. Skyline tension and deflection handbook. Res. Pap. PNW-39. Portland, OR: U.S. Department of Agriculture, Forest...those described by Mifflin and Lysons (1978) and Miyata (1980). The estimated cost for the Clearwater Yarder and a four-man crew was $48.27 per
ERIC Educational Resources Information Center
Dallas Independent School District, TX. Dept. of Research and Evaluation.
This volume consists of a number of appendixes containing data and analyses that were compiled to aid administrators of the Skyline Wide Educational Plan (SWEP) in their efforts to develop a comprehensive secondary school plan for the Dallas-Fort Worth metroplex in the 1970s. Much of the volume is devoted to various facility considerations…
Reliable Execution Based on CPN and Skyline Optimization for Web Service Composition
Chen, Liping; Ha, Weitao; Zhang, Guojun
2013-01-01
With the development of SOA, complex problems can be solved by combining available individual services and ordering them to best suit the user's requirements. Web service composition is widely used in business environments. Given the inherent autonomy and heterogeneity of component web services, it is difficult to predict the behavior of the overall composite service. Therefore, transactional properties and non-functional quality of service (QoS) properties are crucial for selecting the web services to take part in the composition. Transactional properties ensure the reliability of the composite Web service, and QoS properties can identify the best candidate web services from a set of functionally equivalent services. In this paper we define a Colored Petri Net (CPN) model which incorporates the transactional properties of web services in the composition process. To ensure reliable and correct execution, unfolding processes of the CPN are followed. The execution of the transactional composite Web service (TCWS) is formalized by CPN properties. To identify the services with the best QoS properties from the candidate service sets formed in the TCWS-CPN, we use skyline computation to retrieve dominant Web services. This avoids the significant information loss incurred when individual scores are reduced to a single overall similarity. We evaluate our approach experimentally using both real and synthetically generated datasets. PMID:23935431
A new parallel-vector finite element analysis software on distributed-memory computers
NASA Technical Reports Server (NTRS)
Qin, Jiangning; Nguyen, Duc T.
1993-01-01
A new parallel-vector finite element analysis software package MPFEA (Massively Parallel-vector Finite Element Analysis) is developed for large-scale structural analysis on massively parallel computers with distributed memory. MPFEA is designed for parallel generation and assembly of the global finite element stiffness matrices as well as parallel solution of the simultaneous linear equations, since these are often the major time-consuming parts of a finite element analysis. A block-skyline storage scheme, along with vector-unrolling techniques, is used to enhance the vector performance. Communications among processors are carried out concurrently with arithmetic operations to reduce the total execution time. Numerical results on the Intel iPSC/860 computers (such as the Intel Gamma with 128 processors and the Intel Touchstone Delta with 512 processors) are presented, including an aircraft structure and some very large truss structures, to demonstrate the efficiency and accuracy of MPFEA.
NASA Astrophysics Data System (ADS)
Titus, Benjamin M.; Daly, Marymegan
2017-03-01
Specialist and generalist life histories are expected to result in contrasting levels of genetic diversity at the population level, and symbioses are expected to lead to patterns that reflect a shared biogeographic history and co-diversification. We test these assumptions using mtDNA sequencing and a comparative phylogeographic approach for six co-occurring crustacean species that are symbiotic with sea anemones on western Atlantic coral reefs, yet vary in their host specificities: four are host specialists and two are host generalists. We first conducted species discovery analyses to delimit cryptic lineages, followed by classic population genetic diversity analyses for each delimited taxon, and then reconstructed the demographic history for each taxon using traditional summary statistics, Bayesian skyline plots, and approximate Bayesian computation to test for signatures of recent and concerted population expansion. The genetic diversity values recovered here contravene the expectations of the specialist-generalist variation hypothesis and classic population genetics theory; all specialist lineages had greater genetic diversity than generalists. Demography suggests recent population expansions in all taxa, although Bayesian skyline plots and approximate Bayesian computation suggest the timing and magnitude of these events were idiosyncratic. These results do not meet the a priori expectation of concordance among symbiotic taxa and suggest that intrinsic aspects of species biology may contribute more to phylogeographic history than extrinsic forces that shape whole communities. The recovery of two cryptic specialist lineages adds an additional layer of biodiversity to this symbiosis and contributes to an emerging pattern of cryptic speciation in the specialist taxa. Our results underscore how the evolutionary processes acting on marine systems differ from the terrestrial processes that often drive theory.
Finally, we continue to highlight the Florida Reef Tract as an important biodiversity hotspot.
Nasso, Sara; Goetze, Sandra; Martens, Lennart
2015-09-04
Selected reaction monitoring (SRM) MS is a highly selective and sensitive technique to quantify protein abundances in complex biological samples. To enhance the pace of large SRM studies, a validated, robust method to fully automate absolute quantification and substitute for interactive evaluation would be valuable. To address this demand, we present Ariadne, a Matlab software package. To quantify monitored targets, Ariadne exploits metadata imported from the transition lists, and targets can be filtered according to mProphet output. Signal processing and statistical learning approaches are combined to compute peptide quantifications. To robustly estimate absolute abundances, the external calibration curve method is applied, ensuring linearity over the measured dynamic range. Ariadne was benchmarked against mProphet and Skyline by comparing its quantification performance on three different dilution series, featuring either noisy/smooth traces without background or smooth traces with complex background. Results, evaluated as efficiency, linearity, accuracy, and precision of quantification, showed that Ariadne's performance is independent of data smoothness and of the presence of complex background, that Ariadne outperforms mProphet on the noisier data set, and that it improves Skyline's accuracy and precision 2-fold for the lowest-abundance dilution with complex background. Remarkably, Ariadne could statistically distinguish all the different abundances from each other, discriminating dilutions as low as 0.1 and 0.2 fmol. These results suggest that Ariadne offers reliable and automated analysis of large-scale SRM differential expression studies.
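The external calibration curve method mentioned above reduces, at its core, to fitting a straight line to a dilution series of known amounts and inverting it to quantify unknowns. A minimal sketch under that assumption (function names are invented; Ariadne's actual pipeline adds signal processing and statistical learning on top of this):

```python
import numpy as np

def external_calibration(known_amounts, responses):
    """Fit a straight line (least squares) through a dilution series of
    known analyte amounts vs. measured signal; returns slope and intercept."""
    slope, intercept = np.polyfit(known_amounts, responses, 1)
    return slope, intercept

def quantify(signal, slope, intercept):
    """Invert the calibration line to turn a measured signal into an
    absolute abundance estimate."""
    return (signal - intercept) / slope
```

Linearity over the measured dynamic range is what licenses the inversion step: outside the linear range, the fitted line no longer maps signal back to amount reliably.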
Using a multifrontal sparse solver in a high performance, finite element code
NASA Technical Reports Server (NTRS)
King, Scott D.; Lucas, Robert; Raefsky, Arthur
1990-01-01
We consider the performance of the finite element method on a vector supercomputer. The computationally intensive parts of the finite element method are typically the individual element forms and the solution of the global stiffness matrix, both of which are vectorized in high-performance codes. To further increase throughput, new algorithms are needed. We compare a multifrontal sparse solver to a traditional skyline solver in a finite element code on a vector supercomputer. The multifrontal solver uses the Multiple-Minimum Degree reordering heuristic to reduce the number of operations required to factor a sparse matrix, and full-matrix computational kernels (e.g., BLAS3) to enhance vector performance. The net result is an order-of-magnitude reduction in run time for a finite element application on one processor of a Cray X-MP.
DISPAQ: Distributed Profitable-Area Query from Big Taxi Trip Data.
Putri, Fadhilah Kurnia; Song, Giltae; Kwon, Joonho; Rao, Praveen
2017-09-25
One of the crucial problems for taxi drivers is to efficiently locate passengers in order to increase profits. The rapid advancement and ubiquitous penetration of Internet of Things (IoT) technology into transportation industries enables us to provide taxi drivers with locations that have more potential passengers (more profitable areas) by analyzing and querying taxi trip data. In this paper, we propose a query processing system, called Distributed Profitable-Area Query (DISPAQ), which efficiently identifies profitable areas by exploiting the Apache Software Foundation's Spark framework and a MongoDB database. DISPAQ first maintains a profitable-area query index (PQ-index) by extracting area summaries and route summaries from raw taxi trip data. It then identifies candidate profitable areas by searching the PQ-index during query processing. Then, it exploits a Z-Skyline algorithm, which is an extension of skyline processing with a Z-order space-filling curve, to quickly refine the candidate profitable areas. To improve the performance of distributed query processing, we also propose a local Z-Skyline optimization, which reduces the number of dominance tests by distributing killer profitable areas to each cluster node. Through extensive evaluation with real datasets, we demonstrate that our DISPAQ system provides a scalable and efficient solution for processing profitable-area queries over huge amounts of taxi trip data.
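The Z-order space-filling curve underlying the Z-Skyline step maps 2-D grid cells to 1-D keys by interleaving coordinate bits, so that blocks of nearby cells get contiguous key ranges that can be pruned together. A minimal Morton-encoding sketch (hypothetical function name; DISPAQ's distributed implementation is far more involved):

```python
def z_order(x, y, bits=8):
    """Interleave the bits of two non-negative grid coordinates into a single
    Morton (Z-order) key, so that nearby cells tend to get nearby keys."""
    key = 0
    for i in range(bits):
        key |= ((x >> i) & 1) << (2 * i)      # x supplies the even bit positions
        key |= ((y >> i) & 1) << (2 * i + 1)  # y supplies the odd bit positions
    return key
```

Sorting candidate areas by this key lets a skyline pass compare whole key blocks at once instead of testing every pair of areas for dominance.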
Comparison of two matrix data structures for advanced CSM testbed applications
NASA Technical Reports Server (NTRS)
Regelbrugge, M. E.; Brogan, F. A.; Nour-Omid, B.; Rankin, C. C.; Wright, M. A.
1989-01-01
The first section describes data storage schemes presently used by the Computational Structural Mechanics (CSM) testbed sparse matrix facilities and similar skyline (profile) matrix facilities. The second section contains a discussion of certain features required for the implementation of particular advanced CSM algorithms, and how these features might be incorporated into the data storage schemes described previously. The third section presents recommendations, based on the discussions of the prior sections, for directing future CSM testbed development to provide necessary matrix facilities for advanced algorithm implementation and use. The objective is to lend insight into the matrix structures discussed and to help explain the process of evaluating alternative matrix data structures and utilities for subsequent use in the CSM testbed.
Progress made in understanding Mount Rainier's hazards
Sisson, T.W.; Vallance, J.W.; Pringle, P.T.
2001-01-01
At 4392 m, glacier-clad Mount Rainier dominates the skyline of the southern Puget Sound region and is the centerpiece of Mount Rainier National Park. About 2.5 million people of the greater Seattle-Tacoma metropolitan area can see Mount Rainier on clear days, and 150,000 live in areas swept by lahars and floods that emanated from the volcano during the last 6,000 years (Figure 1). These lahars include the voluminous Osceola Mudflow that floors the lowlands south of Seattle and east of Tacoma, and which was generated by massive volcano flank-collapse. Mount Rainier's last eruption was a light dusting of ash in 1894; minor pumice last erupted between 1820 and 1854; and the most recent large eruptions we know of were about 1100 and 2300 years ago, according to reports from the U.S. Geological Survey.
Rapid Assessment of Contaminants and Interferences in Mass Spectrometry Data Using Skyline
NASA Astrophysics Data System (ADS)
Rardin, Matthew J.
2018-04-01
Proper sample preparation in proteomic workflows is essential to the success of modern mass spectrometry experiments. Complex workflows often require reagents which are incompatible with MS analysis (e.g., detergents), necessitating a variety of sample cleanup procedures. Efforts to understand and mitigate sample contamination are a continual source of disruption with respect to both time and resources. To improve the ability to rapidly assess sample contamination from a diverse array of sources, I developed a molecular library in Skyline for rapid extraction of contaminant precursor signals using MS1 filtering. This contaminant template library is easily managed and can be modified for a diverse array of mass spectrometry sample preparation workflows. Utilization of this template allows rapid assessment of sample integrity and indicates potential sources of contamination.
NASA Astrophysics Data System (ADS)
Stockdale, James; Ineson, Philip
2016-04-01
Modelled predictions of the response of terrestrial systems to climate change are highly variable, yet the response of net ecosystem exchange (NEE) is a vital ecosystem behaviour to understand because of its inherent feedback to the carbon cycle. The establishment and subsequent monitoring of replicated experimental manipulations are a direct way to reveal these responses, yet are difficult to achieve as they are typically resource-heavy and labour-intensive. We actively manipulated the temperature at three agricultural grasslands in southern England and deployed novel 'SkyLine' systems, recently developed at the University of York, to continuously monitor GHG fluxes. Each 'SkyLine' is a low-cost, fully autonomous technology, yet produces fluxes at near-continuous temporal frequency across a wide spatial area. The results produced by 'SkyLine' reveal the detailed response of each system to increased temperature over diurnal and seasonal timescales. Unexpected differences in NEE are shown between superficially similar ecosystems which, upon investigation, suggest that interactions between a variety of environmental variables are key and that knowledge of pre-existing environmental conditions helps to predict a system's response to future climate. For example, the prevailing hydrological conditions at each site appear to affect its response to changing temperature. The high-frequency data shown here, combined with the fully replicated experimental design, reveal complex interactions which must be understood to improve predictions of ecosystem response to a changing climate.
Schmerberg, Claire M; Liang, Zhidan; Li, Lingjun
2015-01-21
Food consumption is an important behavior that is regulated by an intricate array of neuropeptides (NPs). Although many feeding-related NPs have been identified in mammals, precise mechanisms are unclear and difficult to study in mammals, as current methods are not highly multiplexed and require extensive a priori knowledge about analytes. New advances in data-independent acquisition (DIA) MS/MS and the open-source quantification software Skyline have opened up the possibility to identify hundreds of compounds and quantify them from a single DIA MS/MS run. An untargeted DIA MS(E) quantification method using Skyline software for multiplexed, discovery-driven quantification was developed and found to produce linear calibration curves for peptides at physiologically relevant concentrations using a protein digest as internal standard. By using this method, preliminary relative quantification of the crab Cancer borealis neuropeptidome (<2 kDa, 137 peptides from 18 families) was possible in microdialysates from 8 replicate feeding experiments. Of these NPs, 55 were detected with an average mass error below 10 ppm. The time-resolved profiles of relative concentration changes for 6 are shown, and there is great potential for the use of this method in future experiments to aid in correlation of NP changes with behavior. This work presents an unbiased approach to winnowing candidate NPs related to a behavior of interest in a functionally relevant manner, and demonstrates the success of such a UPLC-MS(E) quantification method using the open source software Skyline.
8. Engineering Drawing of Panama Gun Mount by U.S. Engineering ...
8. Engineering Drawing of Panama Gun Mount by U.S. Engineering Office, San Francisco, California - Fort Funston, Panama Mounts for 155mm Guns, Skyline Boulevard & Great Highway, San Francisco, San Francisco County, CA
Photogrammetric mobile satellite service prediction
NASA Technical Reports Server (NTRS)
Akturan, Riza; Vogel, Wolfhard J.
1994-01-01
Photographic images of the sky were taken with a camera through a fisheye lens with a 180 deg field-of-view. The images of rural, suburban, and urban scenes were analyzed on a computer to derive quantitative information about the elevation angles at which the sky becomes visible. Such knowledge is needed by designers of mobile and personal satellite communications systems and is desired by customers of these systems. The 90th percentile elevation angle of the skyline was found to be 10 deg, 17 deg, and 51 deg in the three environments. At 8 deg, 75 percent, 75 percent, and 35 percent of the sky was visible, respectively. The elevation autocorrelation fell to zero with a 72 deg lag in the rural and urban environment and a 40 deg lag in the suburb. Mean estimation errors are below 4 deg.
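The skyline statistics reported above (percentile elevation angles and sky-visibility fractions) are straightforward to compute once the skyline elevation has been sampled around the horizon. A small sketch under that assumption, with invented function names:

```python
import numpy as np

def skyline_percentile(elevations_deg, pct=90):
    """Given skyline elevation angles sampled around the horizon (one per
    azimuth), return the requested percentile elevation, e.g. the angle
    below which the skyline lies in 90% of directions."""
    return float(np.percentile(elevations_deg, pct))

def sky_visible_fraction(elevations_deg, angle_deg):
    """Fraction of azimuths whose skyline lies below a given elevation,
    i.e. the fraction of directions where the sky is visible at that angle."""
    e = np.asarray(elevations_deg)
    return float((e < angle_deg).mean())
```

With real fisheye-derived samples, these two functions reproduce the kind of figures quoted in the abstract (a 90th-percentile skyline elevation, and the share of sky visible at a given elevation).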
The exhibit is a 10'x10' skyline truss which will be used to highlight the activities of the U.S.-German Bilateral Working Group in the area of brownfields revitalization. The U.S. product, Sustainable Management Approaches and Revitalization Tools - electronic (SMARTe) will be d...
Implementation of precast concrete deck system NUDECK (2nd generation).
DOT National Transportation Integrated Search
2013-12-01
The first generation of precast concrete deck system, NUDECK, developed by the University of Nebraska-Lincoln (UNL) for the Nebraska Department of Roads (NDOR), was implemented on the Skyline Bridge, Omaha, NE in 2004. The project was highly successful ...
2009-03-15
STS119-S-025 (15 March 2009) --- The setting sun paints the clouds over NASA's Kennedy Space Center in Florida before the launch of Space Shuttle Discovery on the STS-119 mission. Liftoff is scheduled for 7:43 p.m. (EDT) on March 15, 2009.
4. A river level view of the Broad Street bridge ...
4. A river level view of the Broad Street bridge and Columbus skyline from the railroad truss north of the bridge. - Broad Street Bridge, Spanning Scioto River at U.S. Route 40 (Broad Street), Columbus, Franklin County, OH
Deadly Everest Avalanche Site Spotted by NASA Spacecraft
2014-04-28
On Friday, April 26, 2014, an avalanche on Mount Everest killed at least 13 Sherpa guides. NASA Terra spacecraft looked toward the northeast, with Mount Everest center, and Lhotse, the fourth-highest mountain on Earth, on the skyline to right center.
101. Catalog H-History 1, C.C.C., 34 Landscaping, Negative No. 1340 ...
101. Catalog H-History 1, C.C.C., 34 Landscaping, Negative No. 1340 (Photographer and date unknown) BANK BLENDING WORK BY CCC. - Skyline Drive, From Front Royal, VA to Rockfish Gap, VA , Luray, Page County, VA
98. Catalog H-History 1, C.C.C., 19 Tree Planting, Negative No. ...
98. Catalog H-History 1, C.C.C., 19 Tree Planting, Negative No. P 474c (Photographer and date unknown) TRANSPLANTING TREE. - Skyline Drive, From Front Royal, VA to Rockfish Gap, VA , Luray, Page County, VA
Evaluating the constructability of NUDECK precast concrete deck panels for Kearney Bypass Project.
DOT National Transportation Integrated Search
2015-02-01
The first generation of precast concrete deck system, NUDECK, was implemented on the Skyline Bridge, Omaha, NE in 2004. The second generation of the NUDECK system was developed to further simplify the system and improve its constructability and durab...
66. BIG MEADOWS. VIEW OF PARKING AREA AT THE GATED ...
66. BIG MEADOWS. VIEW OF PARKING AREA AT THE GATED ENTRANCE TO RAPIDAN FIRE ROAD, THE ACCESS ROAD TO CAMP HOOVER. LOOKING SOUTH, MILE 51.3. - Skyline Drive, From Front Royal, VA to Rockfish Gap, VA , Luray, Page County, VA
2. VIEW OF PARK SIGNAGE AT FRONT ROYAL. SIGN SAYS: ...
2. VIEW OF PARK SIGNAGE AT FRONT ROYAL. SIGN SAYS: "NORTH ENTRANCE SHENANDOAH NATIONAL PARK." LOCATED ON EXIT SIDE OF ROAD. LOOKING SOUTHWEST, MILE 0.0. - Skyline Drive, From Front Royal, VA to Rockfish Gap, VA , Luray, Page County, VA
100. Catalog H-History 1, C.C.C., 34 Landscaping, Negative No. P ...
100. Catalog H-History 1, C.C.C., 34 Landscaping, Negative No. P 733c (Photographer and date unknown) SLOPE MAINTENANCE WORK BY CCC. - Skyline Drive, From Front Royal, VA to Rockfish Gap, VA , Luray, Page County, VA
99. Catalog H-History 1, C.C.C., 23 Guard Rail Construction, Negative ...
99. Catalog H-History 1, C.C.C., 23 Guard Rail Construction, Negative No. P455e (Photographer and date unknown) GUARD RAIL INSTALLATION. - Skyline Drive, From Front Royal, VA to Rockfish Gap, VA , Luray, Page County, VA
3D exploitation of large urban photo archives
NASA Astrophysics Data System (ADS)
Cho, Peter; Snavely, Noah; Anderson, Ross
2010-04-01
Recent work in computer vision has demonstrated the potential to automatically recover camera and scene geometry from large collections of uncooperatively-collected photos. At the same time, aerial ladar and Geographic Information System (GIS) data are becoming more readily accessible. In this paper, we present a system for fusing these data sources in order to transfer 3D and GIS information into outdoor urban imagery. Applying this system to 1000+ pictures shot of the lower Manhattan skyline and the Statue of Liberty, we present two proof-of-concept examples of geometry-based photo enhancement which are difficult to perform via conventional image processing: feature annotation and image-based querying. In these examples, high-level knowledge projects from 3D world-space into georegistered 2D image planes and/or propagates between different photos. Such automatic capabilities lay the groundwork for future real-time labeling of imagery shot in complex city environments by mobile smart phones.
10. Detail of map showing Battery Davis and Panama Gun ...
10. Detail of map showing Battery Davis and Panama Gun Mounts at right, by U.S. Engineering Office, San Francisco, California, August 5, 1934. - Fort Funston, Panama Mounts for 155mm Guns, Skyline Boulevard & Great Highway, San Francisco, San Francisco County, CA
5. VIEW LOOKING NORTHEAST INTO CENTRAL COURTYARD OF TECHWOOD DORMITORY, ...
5. VIEW LOOKING NORTHEAST INTO CENTRAL COURTYARD OF TECHWOOD DORMITORY, SHOWING WEST FRONT OF CENTER WING AND PART OF SOUTH SIDE OF NORTH WING. MIDTOWN SKYLINE VISIBLE IN BACKGROUND. - Techwood Homes, McDaniel Dormitory, 581-587 Techwood Drive, Atlanta, Fulton County, GA
Rulon B. Gardner
1980-01-01
Larch-fir stands in northwest Montana were experimentally logged to determine the influence of increasingly intensive levels of utilization upon rates of yarding production, under three different silvicultural prescriptions. Variables influencing rate of production were also identified.
75 FR 5289 - Defense Health Board (DHB) Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-02
... DEPARTMENT OF DEFENSE Office of the Secretary Defense Health Board (DHB) Meeting AGENCY... announces that the Defense Health Board (DHB or Board) will meet on March 1-2, 2010, to address and.... Feeks, Executive Secretary, Defense Health Board, Five Skyline Place, 5111 Leesburg Pike, Suite 810...
3 CFR 8410 - Proclamation 8410 of September 3, 2009. National Days of Prayer and Remembrance, 2009
Code of Federal Regulations, 2010 CFR
2010-01-01
... struck the skyline of New York City, the structure of the Pentagon, and the grass of Pennsylvania. In the... world. They have left the safety of home so that our Nation might be more secure. They have endured...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-07
... commercial and noncommercial vegetation management and road system modifications and maintenance. DATES... stands and old forest habitat; (2) improve watershed conditions and reduce road- related impacts to... commercial timber harvest on about 3,265 acres utilizing tractor/off-road jammer (1,124 acres), skyline (926...
Economics of hardwood silviculture using skyline and conventional logging
John E. Baumgras; Gary W. Miller; Chris B. LeDoux
1995-01-01
Managing Appalachian hardwood forests to satisfy the growing and diverse demands on this resource will require alternatives to traditional silvicultural methods and harvesting systems. Determining the relative economic efficiency of these alternative methods and systems with respect to harvest cash flows is essential. The effects of silvicultural methods and roundwood...
Block 3. Central view of Block 3 observed from the ...
Block 3. Central view of Block 3 observed from the west to the east. This photograph reveals the alignment of trees within the central path of the park. In addition, this photograph exposes broken bricks aligning tree beds - Skyline Park, 1500-1800 Arapaho Street, Denver, Denver County, CO
SIMYAR: a cable-yarding simulation model.
R.J. McGaughey; R.H. Twito
1987-01-01
A skyline-logging simulation model designed to help planners evaluate potential yarding options and alternative harvest plans is presented. The model, called SIMYAR, uses information about the timber stand, yarding equipment, and unit geometry to estimate yarding cost and productivity for a particular operation. The costs of felling, bucking, loading, and hauling are...
Balloon logging with the inverted skyline
NASA Technical Reports Server (NTRS)
Mosher, C. F.
1975-01-01
There is a gap in aerial logging techniques that has to be filled. A simple, safe, sizeable system must be developed before aerial logging becomes effective and accepted in the logging industry. This paper presents such a system, designed on simple principles with realistic costs and ecological benefits.
97. Catalog B, Higher Plants, 200 2 American Chestnut Tree, ...
97. Catalog B, Higher Plants, 200 2 American Chestnut Tree, Negative No. 6032 (Photographer and date unknown) THIS GHOST FOREST OF BLIGHTED CHESTNUTS ONCE STOOD APPROXIMATELY AT THE LOCATION OF THE BYRD VISITOR CENTER. - Skyline Drive, From Front Royal, VA to Rockfish Gap, VA , Luray, Page County, VA
ERIC Educational Resources Information Center
Fedore, Heidi
2005-01-01
In 2002, with pressure on students and educators mounting regarding performance on standardized tests, the author, who is an assistant principal at Skyline High School in Issaquah, Washington, and some staff members decided to have a little fun in the midst of the preparation for the state's high-stakes test, the Washington Assessment of Student…
PHOTOGRAPH NUMBERS 40, 39, 38 FORM A 189 DEGREE PANORAMA ...
PHOTOGRAPH NUMBERS 40, 39, 38 FORM A 189 DEGREE PANORAMA FROM LEFT TO RIGHT. PHOTOGRAPH NUMBER 38 LOOKING NORTHEAST TO SKYLINE FROM ROOF OF POLSON BUILDING; PHOTOGRAPH NUMBER 39 VIEW NORTH; PHOTOGRAPH NUMBER 40 VIEW NORTHWEST. - Alaskan Way Viaduct and Battery Street Tunnel, Seattle, King County, WA
The trustworthy digital camera: Restoring credibility to the photographic image
NASA Technical Reports Server (NTRS)
Friedman, Gary L.
1994-01-01
The increasing sophistication of computers has made digital manipulation of photographic images, as well as other digitally recorded artifacts such as audio and video, incredibly easy to perform and increasingly difficult to detect. Today, every picture appearing in newspapers and magazines has been digitally altered to some degree, with the severity varying from the trivial (cleaning up 'noise' and removing distracting backgrounds) to the point of deception (articles of clothing removed, heads attached to other people's bodies, and the complete rearrangement of city skylines). As the power, flexibility, and ubiquity of image-altering computers continue to increase, the well-known adage that 'the photograph doesn't lie' will continue to become an anachronism. A solution to this problem comes from a concept called digital signatures, which incorporates modern cryptographic techniques to authenticate electronic mail messages. 'Authenticate' in this case means one can be sure that the message has not been altered and that the sender's identity has not been forged. The technique can serve not only to authenticate images, but also to help the photographer retain and enforce copyright protection when the concept of an 'electronic original' is no longer meaningful.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-17
...) Multicolor Inc.; (7) Novelty Handicrafts Co., Ltd.; (8) Pacific Imports; (9) Papillon Ribbon & Bow (Canada... Lion Ribbon Company, Inc., for the following companies: (1) Apex Ribbon; (2) Apex Trimmings; (3) FinerRibbon.com ; (4) Hsien Chan Enterprise Co., Ltd.; (5) Hubschercorp; (6) Intercontinental Skyline; (7...
Nutrient losses from timber harvesting in a larch/Douglas-fir forest
Nellie M. Stark
1979-01-01
Nutrient levels as a result of experimental clearcutting, shelterwood cutting, and group selection cutting - each with three levels of harvesting intensity - were studied in a larch/Douglas-fir forest in northwest Montana, experimentally logged with a skyline system. None of the treatments altered nutrient levels in an intermittent stream, nor were excessive amounts of...
Trends in streamflow and suspended sediment after logging, North Fork Caspar Creek
Jack Lewis; Elizabeth T. Keppeler
2007-01-01
Streamflow and suspended sediment were intensively monitored at fourteen gaging stations before and after logging a second-growth redwood (Sequoia sempervirens) forest. About 50 percent of the watershed was harvested, primarily by clear-cutting with skyline-cable systems. New road construction and tractor skidding were restricted to gently-sloping...
102. Catalog H-History 1, C.C.C., 34 Landscaping, Negative No. 6040a ...
102. Catalog H-History 1, C.C.C., 34 Landscaping, Negative No. 6040a (Photographer and date unknown) BEAUTIFICATION PROGRAM STARTED AS SOON AS GRADING ALONG THE DRIVE WAS COMPLETED. CCC CAMP 3 SHOWN PLANTING LAUREL. - Skyline Drive, From Front Royal, VA to Rockfish Gap, VA , Luray, Page County, VA
Smith Assists in Superstorm Sandy Relief Efforts | Poster
By Cathy McClintock, Guest Writer It should have been routine by now for a 30-year volunteer firefighter/ emergency medical technician from Thurmont, Md., but it wasn’t. That first night, as Ross Smith, IT security, looked across the Hudson River from Jersey City, N.J., he saw an unusually dark New York skyline.
Building high-quality assay libraries for targeted analysis of SWATH MS data.
Schubert, Olga T; Gillet, Ludovic C; Collins, Ben C; Navarro, Pedro; Rosenberger, George; Wolski, Witold E; Lam, Henry; Amodei, Dario; Mallick, Parag; MacLean, Brendan; Aebersold, Ruedi
2015-03-01
Targeted proteomics by selected/multiple reaction monitoring (S/MRM) or, on a larger scale, by SWATH (sequential window acquisition of all theoretical spectra) MS (mass spectrometry) typically relies on spectral reference libraries for peptide identification. Quality and coverage of these libraries are therefore of crucial importance for the performance of the methods. Here we present a detailed protocol that has been successfully used to build high-quality, extensive reference libraries supporting targeted proteomics by SWATH MS. We describe each step of the process, including data acquisition by discovery proteomics, assertion of peptide-spectrum matches (PSMs), generation of consensus spectra and compilation of MS coordinates that uniquely define each targeted peptide. Crucial steps such as false discovery rate (FDR) control, retention time normalization and handling of post-translationally modified peptides are detailed. Finally, we show how to use the library to extract SWATH data with the open-source software Skyline. The protocol takes 2-3 d to complete, depending on the extent of the library and the computational resources available.
Bayesian inference of a historical bottleneck in a heavily exploited marine mammal.
Hoffman, J I; Grant, S M; Forcada, J; Phillips, C D
2011-10-01
Emerging Bayesian analytical approaches offer increasingly sophisticated means of reconstructing historical population dynamics from genetic data, but have been little applied to scenarios involving demographic bottlenecks. Consequently, we analysed a large mitochondrial and microsatellite dataset from the Antarctic fur seal Arctocephalus gazella, a species subjected to one of the most extreme examples of uncontrolled exploitation in history when it was reduced to the brink of extinction by the sealing industry during the late eighteenth and nineteenth centuries. Classical bottleneck tests, which exploit the fact that rare alleles are rapidly lost during demographic reduction, yielded ambiguous results. In contrast, a strong signal of recent demographic decline was detected using both Bayesian skyline plots and Approximate Bayesian Computation, the latter also allowing derivation of posterior parameter estimates that were remarkably consistent with historical observations. This was achieved using only contemporary samples, further emphasizing the potential of Bayesian approaches to address important problems in conservation and evolutionary biology. © 2011 Blackwell Publishing Ltd.
Model for Evaluating the Cost Consequences of Deferring New System Acquisition Through Upgrades
1999-07-01
Analysis & Evaluation The Pentagon Washington, DC 20301 Attn: Mr. Eric Coulter, Director Projection Forces Division, Room 2E314 Lt Col Kathleen Conley...1034 Office of the Air National Guard ANG/AQM 5109 Leesburg Pike Skyline VI, Suite 302A Falls Church, VA 22041-3201 Attn: Col Brent Marler 1 Lt Col
Installation and use of epoxy-grouted rock anchors for skyline logging in southeast Alaska.
W.L. Schroeder; D.N. Swanston
1992-01-01
Field tests of the load-carrying capacity of epoxy-grouted rock anchors in poor-quality bedrock on Wrangell Island in southeast Alaska demonstrated the effectiveness of rock anchors as substitutes for stump anchors for logging-system guylines. Ultimate capacity depends mainly on rock hardness or strength and the length of the embedded anchor.
An earth anchor system: installation and design guide.
R.L. Copstead; D.D. Studier
1990-01-01
A system for anchoring the guylines and skylines of cable yarding equipment is presented. A description of three types of tipping plate anchors is given. Descriptions of the installation equipment and methods specific to each type are given. Procedures for determining the correct number of anchors to install are included, as are guidelines for installing the anchors so...
Production and cost of a live skyline cable yarder tested in Appalachia
Edward L. Fisher; Harry G. Gibson; Cleveland J. Biller
1980-01-01
Logging systems that are profitable and environmentally acceptable are needed in Appalachian hardwood forests. Small, mobile cable yarders show promise in meeting both economic and environmental objectives. One such yarder, the Ecologger, was tested on the Jefferson National Forest near Marion, Virginia. Production rates and costs are presented for the system along...
103. Catalog H-History 1, C.C.C., 58 Landscaping, Negative No. 870 ...
103. Catalog H-History 1, C.C.C., 58 Landscaping, Negative No. 870 10 ca. 1936 PROPAGATION AND PLANTING. ROOTED PLANTS TRANSPLANTED FROM HOT BEDS TO CANS TO SHADED BEDS IN PREPARATION FOR PLANTING ON ROAD SLOPES. NURSERY AT NORTH ENTRANCE. - Skyline Drive, From Front Royal, VA to Rockfish Gap, VA , Luray, Page County, VA
Cost and production analysis of the Bitterroot Miniyarder on an Appalachian hardwood site
John E. Baumgras; Penn A. Peters; Penn A. Peters
1985-01-01
An 18-horsepower skyline yarder was studied on a steep slope clearcut, yarding small hardwood trees uphill for fuelwood. Yarding cycle characteristics sampled include: total cycle time including delays, 5.20 minutes; yarding distance, 208 feet (350 feet maximum); turn volume, 11.6 cubic feet (24 cubic feet maximum); pieces per turn, 2.3. Cost analysis shows yarding...
Gary D. Falk
1981-01-01
A systematic procedure for predicting the payload capability of running, live, and standing skylines is presented. Three hand-held calculator programs are used to predict payload capability that includes the effect of partial suspension. The programs allow for predictions for downhill yarding and for yarding away from the yarder. The equations and basic principles...
A second look at cable logging in the Appalachians
Harry G. Gibson; Cleveland J. Biller
1975-01-01
Cable logging, once used extensively in the Appalachians, is being re-examined to see if smaller, more mobile systems can help solve some of the timber-management problems on steep slopes. A small Austrian skyline was tested in West Virginia to determine its feasibility for harvesting eastern hardwoods. The short-term test included both selection and clearcut harvesting...
Gods of the City? Reflecting on City Building Games as an Early Introduction to Urban Systems
ERIC Educational Resources Information Center
Bereitschaft, Bradley
2016-01-01
For millions of gamers and students alike, city building games (CBGs) like SimCity and the more recent Cities: Skylines present a compelling initial introduction to the world of urban planning and development. As such, these games have great potential to shape players' understanding and expectations of real urban patterns and processes. In this…
DNA breaks and end resection measured genome-wide by end sequencing | Center for Cancer Research
About the Cover The cover depicts a ribbon of DNA portrayed as a city skyline. The central gap in the landscape localizes to the precise site of the DNA break. The features surrounding the break denote the processing of DNA-end structures (end-resection) emanating from the break location. Cover artwork by Ethan Tyler, NIH. Abstract
ERIC Educational Resources Information Center
Yeager, Susan Cadavid
2017-01-01
This case study examined the implementation of a baccalaureate degree at Skyline Community College--one of the 15 California community colleges authorized to offer baccalaureate degrees established as part of a pilot program enacted by the California Legislature via Senate Bill 850 (2014). The study explored the policies and procedures in place at…
NASA Astrophysics Data System (ADS)
MacDonald, B.; Finot, M.; Heiken, B.; Trowbridge, T.; Ackler, H.; Leonard, L.; Johnson, E.; Chang, B.; Keating, T.
2009-08-01
Skyline Solar Inc. has developed a novel silicon-based PV system to simultaneously reduce energy cost and improve scalability of solar energy. The system achieves high gain through a combination of high capacity factor and optical concentration. The design approach drives innovation not only into the details of the system hardware, but also into manufacturing and deployment-related costs and bottlenecks. The result of this philosophy is a modular PV system whose manufacturing strategy relies only on currently existing silicon solar cell, module, reflector and aluminum parts supply chains, as well as turnkey PV module production lines and metal fabrication industries that already exist at enormous scale. Furthermore, with a high gain system design, the generating capacity of all components is multiplied, leading to a rapidly scalable system. The product design and commercialization strategy cooperate synergistically to promise dramatically lower LCOE with substantially lower risk relative to materials-intensive innovations. In this paper, we will present the key design aspects of Skyline's system, including aspects of the optical, mechanical and thermal components, revealing the ease of scalability, low cost and high performance. Additionally, we will present performance and reliability results on modules and the system, using ASTM and UL/IEC methodologies.
Upscaling of greenhouse gas emissions in upland forestry following clearfell
NASA Astrophysics Data System (ADS)
Toet, Sylvia; Keane, Ben; Yamulki, Sirwan; Blei, Emanuel; Gibson-Poole, Simon; Xenakis, Georgios; Perks, Mike; Morison, James; Ineson, Phil
2016-04-01
Data on greenhouse gas (GHG) emissions caused by forest management activities are limited. Management such as clearfelling may, however, have major impacts on the GHG balance of forests through effects of soil disturbance, increased water table, and brash and root inputs. Besides carbon dioxide (CO2), the biogenic GHGs nitrous oxide (N2O) and methane (CH4) may also contribute to GHG emissions from managed forests. Accurate flux estimates of all three GHGs are therefore necessary, but, since GHG emissions usually show large spatial and temporal variability, in particular CH4 and N2O fluxes, high-frequency GHG flux measurements and better understanding of their controls are central to improve process-based flux models and GHG budgets at multiple scales. In this study, we determined CO2, CH4 and N2O emissions following felling in a mature Sitka spruce (Picea sitchensis) stand in an upland forest in northern England. High-frequency measurements were made along a transect using a novel, automated GHG chamber flux system ('SkyLine') developed at the University of York. The replicated, linear experiment aimed (1) to quantify GHG emissions from three main topographical features at the clearfell site, i.e. the ridges on which trees had been planted, the hollows in between and the drainage ditches, and (2) to determine the effects of the green-needle component of the discarded brash. We also measured abiotic soil and climatic factors alongside the 'SkyLine' GHG flux measurements to identify drivers of the observed GHG emissions. All three topographic features were overall sources of GHG emissions (in CO2 equivalents), and, although drainage ditches are often not included in studies, GHG emissions per unit area were highest from ditches, followed by ridges and lowest in hollows. The CO2 emissions were most important in the GHG balance of ridges and hollows, but CH4 emissions were very high from the drainage ditches, contributing to over 50% of their overall net GHG emissions. 
Ridges usually emitted N2O, whilst N2O emissions from hollows and ditches were very low. As much as 25% of the total GHG flux resulted from large intermittent emissions from the ditches following rainfall. Addition of green needles from the brash immediately increased soil respiration and reduced CH4 emission in comparison to controls. To upscale our high-frequency 'SkyLine' GHG flux measurements at the different topographic features to the field scale, we collected high resolution imagery from unmanned aerial vehicle (UAV) flights. We will compare results using this upscaling technique to GHG emissions simultaneously measured by eddy covariance with the 'SkyLine' system in the predominant footprint. This detailed knowledge of the spatial and temporal distribution of GHG emissions in an upland forest after felling and their drivers, and development of robust upscaling techniques can provide important tools to improve GHG flux models and to design appropriate management practices in upland forestry to mitigate GHG emissions following clearfell.
Predicting bunching costs for the Radio Horse 9 winch
Chris B. LeDoux; Bruce W. Kling; Patrice A. Harou; Patrice A. Harou
1987-01-01
Data from field studies and a prebunching cost simulator have been assembled and converted into a general equation that can be used to estimate the prebunching cost of the Radio Horse 9 winch. The methods can be used to estimate prebunching cost for bunching under the skyline corridor for swinging with cable systems, for bunching to skid trail edge to be picked up by a...
A topographic index to quantify the effect of mesoscale landform on site productivity
W. Henry McNab
1992-01-01
Landform is related to environmental factors that affect site productivity in mountainous areas. I devised a simple index of landform and tested this index as a predictor of site index in the Blue Ridge physiographic province. The landform index is the mean of eight slope gradients from plot center to skyline. A preliminary test indicated that the index was...
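The landform index described above — the mean of eight slope gradients from plot center to skyline — can be sketched as follows. Expressing each gradient as the tangent of the measured skyline elevation angle is an assumption here; the original index may use percent slope, which differs only by a factor of 100.

```python
import math

def landform_index(horizon_elev_deg):
    """Mean slope gradient (rise/run, i.e. tan of the elevation angle)
    from plot center to the skyline, measured in eight compass
    directions (N, NE, E, SE, S, SW, W, NW)."""
    assert len(horizon_elev_deg) == 8
    gradients = [math.tan(math.radians(a)) for a in horizon_elev_deg]
    return sum(gradients) / len(gradients)

# A sheltered cove plot (high surrounding skyline) scores higher than
# an exposed ridge plot (low skyline in most directions).
cove = landform_index([30, 28, 25, 32, 27, 29, 31, 26])
ridge = landform_index([5, 3, 4, 6, 2, 5, 4, 3])
```

The angle values above are hypothetical; in the field they would be read with a clinometer from plot center toward the visible skyline in each direction.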
76 FR 76684 - Idaho: Tentative Approval of State Underground Storage Tank Program
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-08
.... Skyline, Suite B, Idaho Falls, ID 83402 from 10 a.m. to 12 p.m. and 1 p.m. to 4 p.m.; and 6. IDEQ Lewiston... ENVIRONMENTAL PROTECTION AGENCY 40 CFR Part 281 [EPA-R10-UST-2011-0896; FRL-9502-6] Idaho...). ACTION: Proposed rule. SUMMARY: The State of Idaho has applied for final approval of its Underground...
104. Catalog H-History 1, C.C.C., 73 Picnic Furniture Construction, Negative ...
104. Catalog H-History 1, C.C.C., 73 Picnic Furniture Construction, Negative No. 8821 ca. 1936 WOOD UTILIZATION. COMPLETED RUSTIC BENCH MADE BY CCC ENROLLEES AT CAMP NP-3 FOR USE AT PARKING OVERLOOKS AND PICNIC GROUNDS. NOTE SAW IN BACKGROUND USED FOR HALVING CHESTNUT. - Skyline Drive, From Front Royal, VA to Rockfish Gap, VA , Luray, Page County, VA
Cycle-time equation for the Koller K300 cable yarder operating on steep slopes in the Northeast
Neil K. Huyler; Chris B. LeDoux
1997-01-01
Describes a delay-free cycle-time equation for the Koller K300 skyline yarder operating on steep slopes in the Northeast. Using the equation, the average delay-free cycle time was 5.72 minutes, which means that about 420 cubic feet of material can be produced per hour. The important variables in the equation were slope yarding distance, lateral yarding distance,...
Maclean, Brendan; Tomazela, Daniela M; Abbatiello, Susan E; Zhang, Shucha; Whiteaker, Jeffrey R; Paulovich, Amanda G; Carr, Steven A; Maccoss, Michael J
2010-12-15
Proteomics experiments based on Selected Reaction Monitoring (SRM, also referred to as Multiple Reaction Monitoring or MRM) are being used to target large numbers of protein candidates in complex mixtures. At present, instrument parameters are often optimized for each peptide, a time- and resource-intensive process. Large SRM experiments are greatly facilitated by the ability to predict MS instrument parameters that work well with the broad diversity of peptides they target. For this reason, we investigated the impact of using simple linear equations to predict the collision energy (CE) on peptide signal intensity and compared it with the empirical optimization of the CE for each peptide and transition individually. Using optimized linear equations, the difference between predicted and empirically derived CE values was found to be an average gain of only 7.8% of total peak area. We also found that existing commonly used linear equations fall short of their potential and should be recalculated for each charge state and when introducing new instrument platforms. We provide a fully automated pipeline for calculating these equations and individually optimizing the CE of each transition on SRM instruments from Agilent, Applied Biosystems, Thermo-Scientific and Waters in the open-source Skyline software tool (http://proteome.gs.washington.edu/software/skyline).
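The charge-state-specific linear CE equations the abstract describes have the form CE = a * (m/z) + b, with separate coefficients per precursor charge. A minimal sketch, assuming placeholder coefficient values (not Skyline's shipped defaults for any particular instrument):

```python
# Illustrative coefficients (a, b) per precursor charge state for
# CE = a * (m/z) + b. Real values are instrument-specific and, per the
# abstract, should be recalculated for each charge state and platform.
CE_COEFFS = {
    2: (0.034, 3.314),
    3: (0.044, 3.314),
}

def predict_ce(mz, charge):
    """Predict collision energy (eV) for a precursor of given m/z and charge."""
    a, b = CE_COEFFS[charge]
    return a * mz + b
```

The paper's point is that such predicted values come within a few percent of per-transition empirical optimization, making them a practical default for large SRM target lists.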
Implementation of statistical process control for proteomic experiments via LC MS/MS.
Bereman, Michael S; Johnson, Richard; Bollinger, James; Boss, Yuval; Shulman, Nick; MacLean, Brendan; Hoofnagle, Andrew N; MacCoss, Michael J
2014-04-01
Statistical process control (SPC) is a robust set of tools that aids in the visualization, detection, and identification of assignable causes of variation in any process that creates products, services, or information. A tool has been developed termed Statistical Process Control in Proteomics (SProCoP) which implements aspects of SPC (e.g., control charts and Pareto analysis) into the Skyline proteomics software. It monitors five quality control metrics in a shotgun or targeted proteomic workflow. None of these metrics require peptide identification. The source code, written in the R statistical language, runs directly from the Skyline interface, which supports the use of raw data files from several of the mass spectrometry vendors. It provides real time evaluation of the chromatographic performance (e.g., retention time reproducibility, peak asymmetry, and resolution), and mass spectrometric performance (targeted peptide ion intensity and mass measurement accuracy for high resolving power instruments) via control charts. Thresholds are experiment- and instrument-specific and are determined empirically from user-defined quality control standards that enable the separation of random noise and systematic error. Finally, Pareto analysis provides a summary of performance metrics and guides the user to metrics with high variance. The utility of these charts to evaluate proteomic experiments is illustrated in two case studies.
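The experiment-specific thresholds described above — determined empirically from user-defined QC standards — amount to Shewhart-style control limits. A minimal sketch in Python (SProCoP itself is written in R; the metric and values below are illustrative):

```python
import statistics

def control_limits(qc_values, k=3):
    """Shewhart-style limits from baseline QC runs: mean +/- k * stdev.
    Points outside the limits suggest systematic error rather than
    random noise."""
    mu = statistics.mean(qc_values)
    sigma = statistics.stdev(qc_values)
    return mu - k * sigma, mu + k * sigma

def out_of_control(x, limits):
    lo, hi = limits
    return x < lo or x > hi

# Hypothetical QC metric: retention time (min) of a standard peptide
# across baseline runs, used to set the chart limits.
baseline = [20.1, 20.3, 19.9, 20.2, 20.0, 20.1, 20.2, 19.8]
limits = control_limits(baseline)
```

In SProCoP the same idea is applied per metric (retention time, peak asymmetry, intensity, mass accuracy), with Pareto analysis then ranking which metrics contribute the most variance.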
NASA Technical Reports Server (NTRS)
Raju, I. S.; Newman, J. C., Jr.
1993-01-01
A computer program, surf3d, that uses the 3D finite-element method to calculate the stress-intensity factors for surface, corner, and embedded cracks in finite-thickness plates with and without circular holes, was developed. The cracks are assumed to be either elliptic or part-elliptic in shape. The program uses eight-noded hexahedral elements to model the solid, with a skyline storage scheme and solver. The stress-intensity factors are evaluated using the force method, the crack-opening displacement method, and the 3D virtual crack closure method. The manual describes the input to and output of the surf3d program, demonstrates its use, and describes the calculation of the stress-intensity factors. Several examples with sample data files are included. To facilitate modeling of the user's crack configuration and loading, a companion preprocessor program called gensurf was also developed. gensurf is a three-dimensional mesh generator that requires minimal input and builds a complete data file for surf3d. The program surf3d is operational on Unix machines such as the CRAY Y-MP, CRAY-2, and Convex C-220.
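Skyline (profile) storage, as used by surf3d's solver, keeps only each column of a symmetric matrix from its first nonzero entry down to the diagonal, so the bands of zeros above the "skyline" are never stored or factored. A minimal Python sketch of the storage scheme (surf3d itself is Fortran-era code; this only illustrates the data layout):

```python
def to_skyline(A):
    """Store symmetric matrix A column by column: for column j, keep
    entries from the first nonzero row (the skyline) down to the
    diagonal, plus that starting row index."""
    n = len(A)
    vals, first_row = [], []
    for j in range(n):
        i0 = next(i for i in range(j + 1) if A[i][j] != 0 or i == j)
        first_row.append(i0)
        vals.append([A[i][j] for i in range(i0, j + 1)])
    return vals, first_row

def get(vals, first_row, i, j):
    """Recover A[i][j] from skyline storage."""
    if i > j:
        i, j = j, i          # symmetry: only the upper triangle is stored
    if i < first_row[j]:
        return 0.0           # above the skyline: implicitly zero
    return vals[j][i - first_row[j]]

# Tiny symmetric stiffness-like matrix for illustration.
A = [[4, 1, 0],
     [1, 5, 2],
     [0, 2, 6]]
vals, fr = to_skyline(A)
```

A production skyline solver would factor the stored profile in place (e.g. Cholesky), exploiting the fact that factorization introduces no fill-in above the skyline.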
John E. Baumgras; Chris B. LeDoux
1986-01-01
Cable yarding can reduce the environmental impact of timber harvesting on steep slopes by increasing road spacing and reducing soil disturbance. To determine the cost of harvesting forest biomass with a small cable yarder, a 13.4 kW (18 hp) skyline yarder was tested on two southern Appalachian sites. At both sites, fuelwood was harvested from the boles of hardwood...
2. A panoramic view of the historical district as seen ...
2. A panoramic view of the historical district as seen from the top of the Waterford Towers. This picture shows the Town Street bridge in the foreground, the Broad Street bridge in the background, Central High School on the left and the Columbus skyline on the right (facing north), and Bicentennial Park just below. - Broad Street Bridge, Spanning Scioto River at U.S. Route 40 (Broad Street), Columbus, Franklin County, OH
User-Driven Geolocation of Untagged Desert Imagery Using Digital Elevation Models (Open Access)
2013-09-12
IEEE International Conference on, pages 3677–3680. IEEE, 2011. [13] W. Zhang and J. Kosecka. Image based localization in urban environments. In 3D ...non- urban environments such as deserts. Our system generates synthetic skyline views from a DEM and extracts stable concavity-based features from these...fine as 100m2. 1. Introduction Automatic geolocation of imagery has many exciting use cases. For example, such a tool could semantically orga- nize
Alter, S. Elizabeth; Newsome, Seth D.; Palumbi, Stephen R.
2012-01-01
Commercial whaling decimated many whale populations, including the eastern Pacific gray whale, but little is known about how population dynamics or ecology differed prior to these removals. Of particular interest is the possibility of a large population decline prior to whaling, as such a decline could explain the ∼5-fold difference between genetic estimates of prior abundance and estimates based on historical records. We analyzed genetic (mitochondrial control region) and isotopic information from modern and prehistoric gray whales using serial coalescent simulations and Bayesian skyline analyses to test for a pre-whaling decline and to examine prehistoric genetic diversity, population dynamics and ecology. Simulations demonstrate that significant genetic differences observed between ancient and modern samples could be caused by a large, recent population bottleneck, roughly concurrent with commercial whaling. Stable isotopes show minimal differences between modern and ancient gray whale foraging ecology. Using rejection-based Approximate Bayesian Computation, we estimate the size of the population bottleneck at its minimum abundance and the pre-bottleneck abundance. Our results agree with previous genetic studies suggesting the historical size of the eastern gray whale population was roughly three to five times its current size. PMID:22590499
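Rejection-based Approximate Bayesian Computation, as used above to estimate the bottleneck size, follows a simple recipe that can be sketched as follows. The simulator, prior, and tolerance here are toy stand-ins, not the study's coalescent model.

```python
# ABC rejection sampling: draw a parameter from the prior, simulate a summary
# statistic, and keep the draw only if the statistic is close to the observed
# value. The accepted draws approximate the posterior.
import random

def abc_rejection(observed, simulate, prior, eps, n_draws=10_000, seed=1):
    random.seed(seed)
    accepted = []
    for _ in range(n_draws):
        theta = prior()
        if abs(simulate(theta) - observed) <= eps:
            accepted.append(theta)
    return accepted  # an approximate posterior sample

# Toy example: infer a population size whose "genetic diversity" statistic
# is size/1000 plus noise; the observed diversity is 5.0.
post = abc_rejection(
    observed=5.0,
    simulate=lambda n: n / 1000 + random.gauss(0, 0.2),
    prior=lambda: random.uniform(1000, 10000),
    eps=0.3,
)
# accepted draws cluster near a population size of 5000
```

Real analyses replace the toy simulator with a coalescent simulator and compare several summary statistics at once.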
Lustration: Transitional Justice in Poland and Its Continuous Struggle to Make Means With the Past
2008-06-01
Warsaw, just as the secret police did over its citizens. The skyline of Warsaw, dominated by this building, offers a daily reminder of life under the...the communist regime (especially acts of collaboration with the secret police) and in turn disqualifying members of these groups from holding high...Ministry of Interior for their name to be vetted through the Secret Police files of the former regime.3 A similar approach was adopted in Poland, but due
Ancient Chinese Astronomy - An Overview
NASA Astrophysics Data System (ADS)
Shi, Yunli
Documentary and archaeological evidence testifies to the early origin and continuous development of ancient Chinese astronomy to meet both the ideological and practical needs of a society largely based on agriculture. There was a long period when the beginning of the year, month, and season was determined by direct observation of celestial phenomena, including their alignments with respect to the local skyline. As the need for more exact study arose, new instruments for more exact observation were invented and the system of calendrical astronomy became entirely mathematized.
The connection between landscapes and the solar ephemeris in honeybees.
Towne, William F; Moscrip, Heather
2008-12-01
Honeybees connect the sun's daily pattern of azimuthal movement to some aspect of the landscape around their nests. In the present study, we ask what aspect of the landscape is used in this context: the entire landscape panorama, or only the sectors seen along familiar flight routes. Previous studies of the solar ephemeris memory in bees have generally used bees that had experience flying a specific route, usually along a treeline, to a feeder. When such bees were moved to a differently oriented treeline on overcast days, the bees oriented their communicative dances as if they were still at the first treeline, based on a memory of the sun's course in relation to some aspect of the site, possibly the familiar route along the treeline or possibly the entire landscape or skyline panorama. Our results show that bees lacking specific flight-route training can nonetheless recall the sun's compass bearing relative to novel flight routes in their natal landscape. Specifically, we moved a hive from one landscape to a differently oriented twin landscape, and only after transplantation under overcast skies did we move a feeder away from the hive. These bees nonetheless danced accurately by memory of the sun's course in relation to their natal landscape. The bees' knowledge of the relationship between the sun and landscape, therefore, is not limited to familiar flight routes and so may encompass, at least functionally, the entire panorama. Further evidence suggests that the skyline in particular may be the bees' preferred reference in this context.
Paraskevis, Dimitrios; Paraschiv, Simona; Sypsa, Vana; Nikolopoulos, Georgios; Tsiara, Chryssa; Magiorkinis, Gkikas; Psichogiou, Mina; Flampouris, Andreas; Mardarescu, Mariana; Niculescu, Iulia; Batan, Ionelia; Malliori, Meni; Otelea, Dan; Hatzakis, Angelos
2015-10-01
A significant increase in HIV-1 diagnoses was reported among Injecting Drug Users (IDUs) in the Athens (17-fold) and Bucharest (9-fold) metropolitan areas starting in 2011. Molecular analyses were conducted on HIV-1 sequences from IDUs comprising 51% and 20% of the diagnosed cases among IDUs during 2011-2013 for Greece and Romania, respectively. Phylodynamic analyses were performed using the newly developed birth-death serial skyline model, which allows estimation of important epidemiological parameters, as implemented in the BEAST programme. Most infections (>90%) occurred within four and three local IDU transmission networks in Athens and Bucharest, respectively. For all Romanian clusters, the viral strains originated from local circulating strains, whereas in Athens, the local strains seeded only two of the four sub-outbreaks. Birth-death skyline plots suggest a more explosive nature for the sub-outbreaks in Bucharest than in Athens. In Athens, two sub-outbreaks had been controlled (Re<1.0) by 2013 and two appeared to be endemic (Re∼1). In Bucharest, one outbreak continued to expand (Re>1.0) and two had been controlled (Re<1.0). The lead times were shorter for the outbreak in Athens than in Bucharest. Enhanced molecular surveillance proved useful for gaining information about the origin, causal pathways, dispersal patterns and transmission dynamics of the outbreaks that can be of value in a public health setting. Copyright © 2015 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Ashwood, Christopher; Lin, Chi-Hung; Thaysen-Andersen, Morten; Packer, Nicolle H.
2018-03-01
Profiling cellular protein glycosylation is challenging due to the presence of highly similar glycan structures that play diverse roles in cellular physiology. As the anomericity and the exact linkage type of a single glycosidic bond can influence glycan function, there is a demand for improved and automated methods to confirm detailed structural features and to discriminate between structurally similar isomers, overcoming a significant bottleneck in the analysis of data generated by glycomics experiments. We used porous graphitized carbon-LC-ESI-MS/MS to separate and detect released N- and O-glycan isomers from mammalian model glycoproteins using negative mode resonance activation CID-MS/MS. By interrogating similar fragment spectra from closely related glycan isomers that differ only in arm position and sialyl linkage, product fragment ions for discrimination between these features were discovered. Using the Skyline software, at least two diagnostic fragment ions of high specificity were validated for automated discrimination of sialylation and arm position in N-glycan structures, and sialylation in O-glycan structures, complementing existing structural diagnostic ions. These diagnostic ions were shown to be useful for isomer discrimination using both linear and 3D ion trap mass spectrometers when analyzing complex glycan mixtures from cell lysates. Skyline was found to serve as a useful tool for automated assessment of glycan isomer discrimination. This platform-independent workflow can potentially be extended to automate the characterization and quantitation of other challenging glycan isomers.
Jung, Daewui; Li, Qi; Kong, Ling-Feng; Ni, Gang; Nakano, Tomoyuki; Matsukuma, Akihiko; Kim, Sanghee; Park, Chungoo; Lee, Hyuk Je; Park, Joong-Ki
2015-01-01
The present-day genetic structure of a species reflects both historical demography and patterns of contemporary gene flow among populations. To understand precisely how these factors shape the current population structure of the northwestern (NW) Pacific marine gastropod Thais clavigera, we determined the partial nucleotide sequences of the mitochondrial COI gene for 602 individuals sampled from 29 localities spanning almost the whole distribution of T. clavigera in the NW Pacific Ocean (~3,700 km). Results from population genetic and demographic analyses (AMOVA, ΦST-statistics, haplotype networks, Tajima's D, Fu's FS, mismatch distribution, and Bayesian skyline plots) revealed a lack of genealogical branches or geographical clusters, and a high level of genetic (haplotype) diversity within each studied population. Nevertheless, low but significant genetic structuring was detected among some geographical populations separated by the Changjiang River, suggesting the presence of geographical barriers to larval dispersal around this region. Several lines of evidence, including significantly negative Tajima's D and Fu's FS statistics, the unimodally shaped mismatch distribution, and Bayesian skyline plots, suggest a population expansion at marine isotope stage 11 (MIS 11; 400 ka), the longest and warmest interglacial interval of the Pleistocene epoch. The lack of genetic structure among the great majority of the NW Pacific T. clavigera populations may be attributable to high gene flow through current-driven long-distance dispersal during this species' prolonged planktonic larval phase. PMID:26171966
Ramus, Claire; Hovasse, Agnès; Marcellin, Marlène; Hesse, Anne-Marie; Mouton-Barbosa, Emmanuelle; Bouyssié, David; Vaca, Sebastian; Carapito, Christine; Chaoui, Karima; Bruley, Christophe; Garin, Jérôme; Cianférani, Sarah; Ferro, Myriam; Van Dorssaeler, Alain; Burlet-Schiltz, Odile; Schaeffer, Christine; Couté, Yohann; Gonzalez de Peredo, Anne
2016-01-30
Proteomic workflows based on nanoLC-MS/MS data-dependent-acquisition analysis have progressed tremendously in recent years. High-resolution, fast-sequencing instruments have enabled the use of label-free quantitative methods, based either on spectral counting or on MS signal analysis, which appear to be an attractive way to analyze differential protein expression in complex biological samples. However, the computational processing of the data for label-free quantification still remains a challenge. Here, we used a proteomic standard composed of an equimolar mixture of 48 human proteins (Sigma UPS1) spiked at different concentrations into a background of yeast cell lysate to benchmark several label-free quantitative workflows involving different software packages developed in recent years. This experimental design allowed their performance to be finely assessed in terms of sensitivity and false discovery rate, by measuring the numbers of true and false positives (UPS1 and yeast background proteins, respectively, found as differential). The spiked standard dataset has been deposited to the ProteomeXchange repository with the identifier PXD001819 and can be used to benchmark other label-free workflows, adjust software parameter settings, improve algorithms for extraction of the quantitative metrics from raw MS data, or evaluate downstream statistical methods. Bioinformatic pipelines for label-free quantitative analysis must be objectively evaluated in their ability to detect variant proteins with good sensitivity and a low false discovery rate in large-scale proteomic studies. This can be done through the use of complex spiked samples, for which the "ground truth" of variant proteins is known, allowing a statistical evaluation of the performance of the data processing workflow. 
We provide here such a controlled standard dataset and use it to evaluate the performance of several label-free bioinformatics tools (including MaxQuant, Skyline, MFPaQ, IRMa-hEIDI and Scaffold) in different workflows, for the detection of variant proteins with different absolute expression levels and fold change values. The dataset presented here can be useful for tuning software tool parameters, testing new algorithms for label-free quantitative analysis, or evaluating downstream statistical methods. Copyright © 2015 Elsevier B.V. All rights reserved.
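The benchmark's evaluation logic can be sketched as set arithmetic over protein identifiers: with the spiked UPS1 proteins as the known variants ("ground truth"), any yeast-background protein reported as differential is a false positive. The identifiers below are illustrative placeholders, not names from the actual dataset.

```python
# Sensitivity and false discovery rate of a differential-protein list,
# evaluated against a spiked ground truth.
def evaluate(reported, ups1, background):
    tp = reported & ups1            # spiked proteins correctly found
    fp = reported & background      # background proteins wrongly flagged
    sensitivity = len(tp) / len(ups1)
    fdr = len(fp) / len(reported) if reported else 0.0
    return sensitivity, fdr

ups1 = {f"UPS{i}" for i in range(1, 49)}        # 48 spiked human proteins
background = {f"YEAST{i}" for i in range(500)}  # yeast lysate background
reported = {f"UPS{i}" for i in range(1, 41)} | {"YEAST7", "YEAST42"}

sens, fdr = evaluate(reported, ups1, background)
# sens == 40/48, fdr == 2/42
```

Running each software pipeline through the same function makes their sensitivity/FDR trade-offs directly comparable.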
Computational simulation of progressive fracture in fiber composites
NASA Technical Reports Server (NTRS)
Chamis, C. C.
1986-01-01
Computational methods for simulating and predicting progressive fracture in fiber composite structures are presented. These methods are integrated into a computer code of modular form. The modules include composite mechanics, finite element analysis, and fracture criteria. The code is used to computationally simulate progressive fracture in composite laminates with and without defects. The simulation tracks the fracture progression in terms of modes initiating fracture, damage growth, and imminent global (catastrophic) laminate fracture.
The full proteomics analysis of a small tumor sample (similar in mass to a few grains of rice) produces well over 500 megabytes of unprocessed "raw" data when analyzed on a mass spectrometer (MS). Thus, for every proteomics experiment there is a vast amount of raw data that must be analyzed and interrogated in order to extract biological information. Moreover, the raw data output from different MS vendors are generally in different formats inhibiting the ability of labs to productively work together.
Space Shuttle Discovery DC Fly-Over
2012-04-17
Space shuttle Discovery, mounted atop a NASA 747 Shuttle Carrier Aircraft (SCA), flies over the Washington skyline as seen from a NASA T-38 aircraft, Tuesday, April 17, 2012. Discovery, the first orbiter retired from NASA’s shuttle fleet, completed 39 missions, spent 365 days in space, orbited the Earth 5,830 times, and traveled 148,221,675 miles. NASA will transfer Discovery to the National Air and Space Museum to begin its new mission to commemorate past achievements in space and to educate and inspire future generations of explorers. Photo Credit: (NASA/Robert Markowitz)
Structural behavior of composites with progressive fracture
NASA Technical Reports Server (NTRS)
Minnetyan, L.; Murthy, P. L. N.; Chamis, C. C.
1989-01-01
The objective of the study is to unify several computational tools developed for the prediction of progressive damage and fracture with efforts for the prediction of the overall response of damaged composite structures. In particular, a computational finite element model for the damaged structure is developed using a computer program as a byproduct of the analysis of progressive damage and fracture. Thus, a single computational investigation can predict progressive fracture and the resulting variation in structural properties of angleplied composites.
Unique patellofemoral alignment in a patient with a symptomatic bipartite patella.
Ishikawa, Masakazu; Adachi, Nobuo; Deie, Masataka; Nakamae, Atsuo; Nakasa, Tomoyuki; Kamei, Goki; Takazawa, Kobun; Ochi, Mitsuo
2016-01-01
A symptomatic bipartite patella is rarely seen in athletic adolescents or young adults in daily clinical practice. To date, only a limited number of studies have focused on patellofemoral alignment. The current study revealed a unique patellofemoral alignment in patients with a symptomatic bipartite patella. Twelve patients with 12 symptomatic bipartite patellae who underwent arthroscopic vastus lateralis release (VLR) were investigated (10 males and 2 females; age 15.7 ± 4.4 years). The radiographic data of the contralateral intact and affected knees were reviewed retrospectively. From the lateral- and skyline-view images, the following parameters were measured: the congruence angle (CA), the lateral patellofemoral angle (LPA), and the Caton-Deschamps index (CDI). As an additional parameter, the bipartite fragment angle (BFA) was evaluated against the main part of the patella in the skyline view. Compared with the contralateral side, the affected patellae were significantly medialized and laterally tilted (CA: P=0.019; LPA: P=0.016), although there was no significant difference in CDI (P=0.877). This patellar malalignment changed significantly after VLR (CA: P=0.001; LPA: P=0.003), and the patellar height was significantly lower than in the preoperative condition (P=0.016). In addition, the BFA shifted to a significantly higher angle after the operation (P=0.001). Patients with symptomatic bipartite patellae presented significantly medialized and laterally tilted patellae compared with the contralateral intact side. This malalignment was corrected by VLR, and the alignment of the bipartite fragment also changed significantly. Level IV, case series. Copyright © 2015 Elsevier B.V. All rights reserved.
Extract useful knowledge from agro-hydrological simulations data for decision making
NASA Astrophysics Data System (ADS)
Gascuel-odoux, C.; Bouadi, T.; Cordier, M.; Quiniou, R.
2013-12-01
In recent years, models have been developed and used to test the effect of scenarios and help stakeholders in decision making. Agro-hydrological models have guided agricultural water management by testing the effect of landscape structure and farming system changes on water quantity and quality. Such models generate a large amount of data, but few of these data are stored, and they are often not customized for stakeholders, so that a great amount of information is lost from the simulation process or never transformed into a usable format. A first approach, already published (Trepos et al., 2012), was developed to identify object-oriented tree patterns, representing surface flow and pollutant pathways from plot to plot, involved in water pollution by herbicides. A simulation model (Gascuel-odoux et al., 2009) predicted the herbicide transfer rate, defined as the proportion of applied herbicide that reaches water courses. The predictions were used as a set of learning examples for symbolic learning techniques to induce rules based on qualitative and quantitative attributes and explain two extreme classes of transfer rate. Two automatic symbolic learning techniques were used: the inductive logic programming approach to induce spatial tree patterns, and an attribute-value method to induce aggregated attributes of the trees. A visualization interface allows the users to identify rules explaining contamination and mitigation measures improving the current situation. A second approach has been developed more recently to analyse the simulated data directly (Bouadi et al., submitted). A data warehouse called N-Catch has been built to store and manage simulation data from the agro-hydrological model TNT2 (Beaujouan et al., 2002). Forty-four key simulated output variables are stored per plot at a daily time step over a 50 km² area, i.e., 8 GB of storage. 
After identifying the set of multilevel dimensions integrating hierarchical structures and relationships among related dimension levels, N-Catch was designed using the open-source Business Intelligence Platform Pentaho. We show how to use online analytical processing (OLAP) to access and exploit, intuitively and quickly, the multidimensional and aggregated data from the N-Catch data warehouse. We illustrate how the data warehouse can be used to explore spatio-temporal dimensions efficiently and to discover new knowledge at multiple levels of simulation. The OLAP tool can be used to synthesize environmental information and to understand nitrogen emissions into water bodies by generating comparative and personalized views of historical data. This data warehouse is currently being extended with data mining and information retrieval methods such as Skyline queries to perform advanced analyses (Bouadi et al., 2012). Bouadi et al. N-Catch: A Data Warehouse for Multilevel Analysis of Simulated Nitrogen Data from an Agro-hydrological Model. Submitted. Bouadi, T., Cordier, M., and Quiniou, R. (2012). Incremental computation of skyline queries with dynamic preferences. In DEXA (1), pages 219-233. Trepos et al. 2012. Mining simulation data by rule induction to determine critical source areas of stream water pollution by herbicides. Computers and Electronics in Agriculture 86, 75-88.
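The Skyline queries mentioned above rest on Pareto dominance, which a minimal sketch makes concrete. This is the generic block-nested-loops idea, not the incremental algorithm of Bouadi et al. (2012), and the plot tuples are invented for illustration.

```python
# Skyline (Pareto) query: keep the tuples not dominated by any other, where
# "dominates" means at least as good on every attribute and strictly better
# on at least one (lower is better here).
def dominates(a, b):
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def skyline(points):
    return [p for p in points if not any(dominates(q, p) for q in points)]

# e.g. plots described by (nitrogen emission, cost of mitigation)
plots = [(3, 9), (5, 4), (4, 9), (6, 2), (7, 7)]
print(skyline(plots))  # [(3, 9), (5, 4), (6, 2)]
```

A user browsing the warehouse would thus see only the plots offering a genuine trade-off between the two criteria, however the preferences are weighted.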
Ichikawa, Shota; Kamishima, Tamotsu; Sutherland, Kenneth; Fukae, Jun; Katayama, Kou; Aoki, Yuko; Okubo, Takanobu; Okino, Taichi; Kaneda, Takahiko; Takagi, Satoshi; Tanimura, Kazuhide
2017-10-01
We have developed a refined computer-based method to detect joint space narrowing (JSN) progression with the joint space narrowing progression index (JSNPI) by superimposing sequential hand radiographs. The purpose of this study is to assess the validity of this computer-based method using images obtained from multiple institutions in rheumatoid arthritis (RA) patients. Sequential hand radiographs of 42 patients (37 females and 5 males) with RA from two institutions were analyzed by the computer-based method and by visual scoring systems as a standard of reference. A JSNPI above the smallest detectable difference (SDD) defined JSN progression at the joint level. The sensitivity and specificity of the computer-based method for JSN progression were calculated using the SDD and a receiver operating characteristic (ROC) curve. Out of 314 metacarpophalangeal joints, 34 joints progressed based on the SDD, while 11 joints widened. Twenty-one joints progressed in the computer-based method, 11 joints in the scoring systems, and 13 joints in both methods. Based on the SDD, we found lower sensitivity and higher specificity, at 54.2 and 92.8%, respectively. At the most discriminant cutoff point according to the ROC curve, the sensitivity and specificity were 70.8 and 81.7%, respectively. The proposed computer-based method provides quantitative measurement of JSN progression from sequential hand radiographs and may be a useful tool in the follow-up assessment of joint damage in RA patients.
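The SDD-based decision rule and the sensitivity/specificity calculation it feeds can be sketched as follows. All numeric values are illustrative placeholders, not the study's measurements.

```python
# A joint "progresses" when its narrowing index exceeds the smallest
# detectable difference (SDD); sensitivity and specificity then follow from
# comparison with a reference scoring.
def classify(jsnpi_values, sdd):
    """True where the JSNPI exceeds the SDD, i.e. detectable progression."""
    return [v > sdd for v in jsnpi_values]

def sens_spec(predicted, reference):
    """Sensitivity and specificity of predictions against a reference."""
    tp = sum(p and r for p, r in zip(predicted, reference))
    tn = sum(not p and not r for p, r in zip(predicted, reference))
    return tp / sum(reference), tn / (len(reference) - sum(reference))

predicted = classify([0.12, 0.51, 0.05, 0.44], sdd=0.30)
reference = [False, True, True, True]   # visual scoring as standard of reference
print(sens_spec(predicted, reference))  # sensitivity 2/3, specificity 1.0
```

Sweeping the threshold instead of fixing it at the SDD traces out the ROC curve used to pick the most discriminant cutoff.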
Viking Lander imaging investigation: Picture catalog of primary mission experiment data record
NASA Technical Reports Server (NTRS)
Tucker, R. B.
1978-01-01
All the images returned by the two Viking Landers during the primary phase of the Viking Mission are presented. Listings of supplemental information which described the conditions under which the images were acquired are included together with skyline drawings which show where the images are positioned in the field of view of the cameras. Subsets of the images are listed in a variety of sequences to aid in locating images of interest. The format and organization of the digital magnetic tape storage of the images are described. The mission and the camera system are briefly described.
Shuttle Enterprise Flight to New York
2012-04-27
Space shuttle Enterprise, mounted atop a NASA 747 Shuttle Carrier Aircraft (SCA), is seen as it flies over the Manhattan Skyline with Freedom Tower in the background, Friday, April 27, 2012, in New York. Enterprise was the first shuttle orbiter built for NASA performing test flights in the atmosphere and was incapable of spaceflight. Originally housed at the Smithsonian's Steven F. Udvar-Hazy Center, Enterprise will be demated from the SCA and placed on a barge that will eventually be moved by tugboat up the Hudson River to the Intrepid Sea, Air & Space Museum in June. Photo Credit: (NASA/Robert Markowitz)
Shuttle Enterprise Flight to New York
2012-04-27
Space shuttle Enterprise, mounted atop a NASA 747 Shuttle Carrier Aircraft (SCA), is seen as it flies near the Statue of Liberty and the Manhattan skyline, Friday, April 27, 2012, in New York. Enterprise was the first shuttle orbiter built for NASA performing test flights in the atmosphere and was incapable of spaceflight. Originally housed at the Smithsonian's Steven F. Udvar-Hazy Center, Enterprise will be demated from the SCA and placed on a barge that will eventually be moved by tugboat up the Hudson River to the Intrepid Sea, Air & Space Museum in June. Photo Credit: (NASA/Robert Markowitz)
Shao, Yuhao; Yin, Xiaoxi; Kang, Dian; Shen, Boyu; Zhu, Zhangpei; Li, Xinuo; Li, Haofeng; Xie, Lin; Wang, Guangji; Liang, Yan
2017-08-01
Liquid chromatography mass spectrometry based methods provide powerful tools for protein analysis. Cytochromes P450 (CYPs), the most important drug-metabolizing enzymes, always exhibit sex-dependent expression patterns and metabolic activities. To date, analysis of CYPs based on mass spectrometry still faces critical technical challenges due to the complexity and diversity of CYP isoforms and the lack of corresponding standards. The aim of the present work was to develop a label-free qualitative and quantitative strategy for endogenous proteins and to apply it to the study of gender differences in CYPs in rat liver microsomes (RLMs). Initially, trypsin-digested RLM specimens were analyzed by nanoLC-LTQ-Orbitrap MS/MS. Skyline, an open-source and freely available software package for targeted proteomics research, was then used to automatically screen the main CYP isoforms in RLMs under a series of criteria, and a total of 40 and 39 CYP isoforms were identified in male and female RLMs, respectively. More importantly, a robust quantitative method in tandem mass spectrometry-multiple reaction monitoring mode (MS/MS-MRM) was built and optimized with the help of Skyline, and successfully applied to the study of gender differences of CYPs in RLMs. In this process, a simple and accurate approach named "Standard Curve Slope" (SCS) was established, based on the difference between the standard curve slopes of CYPs in female and male RLMs, in order to assess the gender difference of CYPs in RLMs. The methodology and approach developed here could be widely used in protein regulation studies during drug pharmacological mechanism research. Copyright © 2017 Elsevier B.V. All rights reserved.
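The "Standard Curve Slope" idea can be sketched as a comparison of least-squares slopes of MRM signal versus spiked amount, fitted separately for each sex. The signal values below are invented for illustration and do not come from the study.

```python
# Fit a standard curve (signal vs. amount) per group and compare slopes:
# the slope ratio serves as a relative measure of enzyme abundance.
def slope(xs, ys):
    """Ordinary least-squares slope of y on x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

amounts = [1, 2, 4, 8]                 # relative spiked amount
male_signal = [2.1, 4.0, 8.2, 15.9]    # MRM peak areas (arbitrary units)
female_signal = [1.0, 2.1, 3.9, 8.1]

ratio = slope(amounts, male_signal) / slope(amounts, female_signal)
print(ratio)  # ~2: the isoform appears about 2-fold higher in male RLMs here
```

Because both curves are measured on the same instrument and matrix, the ratio cancels much of the systematic response bias that would affect absolute quantitation.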
Godoy, Bibiane A; Gomes-Gouvêa, Michele S; Zagonel-Oliveira, Marcelo; Alvarado-Mora, Mónica V; Salzano, Francisco M; Pinho, João R R; Fagundes, Nelson J R
2016-09-01
Native American populations present the highest prevalence of Hepatitis B Virus (HBV) infection in the Americas, which may be associated with severe disease outcomes. Ten HBV genotypes (A–J) have been described, displaying a remarkable geographic structure that most likely reflects historic patterns of human migration. In this study, we characterize the HBV strains circulating in a historical sample of Native South Americans in order to reconstruct the historical viral dynamics in this population. The sample consisted of 1070 individuals belonging to 38 populations, collected between 1965 and 1997. The presence of HBV DNA was checked by quantitative real-time PCR, and HBV genotypes and subgenotypes were determined through sequencing and phylogenetic analysis of a fragment including parts of the HBsAg and Pol coding regions (S/Pol). A Bayesian Skyline Plot analysis was performed to compare the viral population dynamics of HBV/A1 strains found in Native Americans and in the general Brazilian population. A total of 109 individuals were positive for HBV DNA (~10%), and 70 samples were successfully sequenced and genotyped. Subgenotype A1 (HBV/A1), related to African populations and the African slave trade, was the most prevalent (66–94%). The Skyline Plot analysis showed a marked population expansion of HBV/A1 in Native Americans occurring more recently (1945–1965) than in the general Brazilian population. Our results suggest that the historic processes that contributed to the formation of the HBV/A1 strains circulating among Native Americans are related to more recent migratory waves towards the Amazon basin, which generated different viral dynamics in this region.
Abedini, Yasamin; Zamani, Bibi Eshrat; Kheradmand, Ali; Rajabizadeh, Ghodratollah
2012-01-01
Addiction to computer (video) games in adolescents and its relationship with educational progress has recently attracted the attention of rearing and education experts as well as organizations and institutes involved in physical and mental health. The current research attempted to propose a structural model of the relationships between parenting styles, mothers' occupation status, and addiction to computer games, self-control, and educational progress of secondary school students. Using multistage cluster random sampling, 500 female and male secondary school students in Kerman (Iran) were selected and studied. The research tools included self-control, parenting styles, and addiction to computer games questionnaires and a self-made questionnaire containing demographic details. The data was analyzed using exploratory factor analysis, Cronbach's alpha coefficient and route analysis (in LISREL). We found self-control to have a linking role in the relationship between four parenting styles and educational progress. Mothers' occupation status was directly and significantly correlated with addiction to computer games. Although four parenting styles directly and significantly affected addiction to computer games, the findings did not support the linking role of addiction to computer games in the relationship between four parenting styles and educational progress. In agreement with previous studies, the current research reflected the impact of four parenting styles on self-control, addiction to computer games, and educational progress of students. Among the parenting styles, authoritative style can affect the severity of addiction to computer games through self-control development. It can thus indirectly influence the educational progress of students. Parents are recommended to use authoritative parenting style to help both self-management and psychological health of their children. 
The employed mothers are also recommended to have more supervision and control on the degree and type of computer games selected by their children.
Abedini, Yasamin; Zamani, Bibi Eshrat; Kheradmand, Ali; Rajabizadeh, Ghodratollah
2012-01-01
Background Addiction to computer (video) games in adolescents and its relationship with educational progress has recently attracted the attention of rearing and education experts as well as organizations and institutes involved in physical and mental health. The current research attempted to propose a structural model of the relationships between parenting styles, mothers’ occupation status, and addiction to computer games, self-control, and educational progress of secondary school students. Methods Using multistage cluster random sampling, 500 female and male secondary school students in Kerman (Iran) were selected and studied. The research tools included self-control, parenting styles, and addiction to computer games questionnaires and a self-made questionnaire containing demographic details. The data was analyzed using exploratory factor analysis, Cronbach’s alpha coefficient and route analysis (in LISREL). Findings We found self-control to have a linking role in the relationship between four parenting styles and educational progress. Mothers’ occupation status was directly and significantly correlated with addiction to computer games. Although four parenting styles directly and significantly affected addiction to computer games, the findings did not support the linking role of addiction to computer games in the relationship between four parenting styles and educational progress. Conclusion In agreement with previous studies, the current research reflected the impact of four parenting styles on self-control, addiction to computer games, and educational progress of students. Among the parenting styles, authoritative style can affect the severity of addiction to computer games through self-control development. It can thus indirectly influence the educational progress of students. Parents are recommended to use authoritative parenting style to help both self-management and psychological health of their children. 
Employed mothers are also advised to exercise closer supervision and control over the amount and type of computer games their children select. PMID:24494143
Beyond context to the skyline: thinking in 3D.
Hoagwood, Kimberly; Olin, Serene; Cleek, Andrew
2013-01-01
Sweeping and profound structural, regulatory, and fiscal changes are rapidly reshaping the contours of health and mental health practice. The community-based practice contexts described in the excellent review by Garland and colleagues are being fundamentally altered with different business models, regional networks, accountability standards, and incentive structures. If community-based mental health services are to remain viable, the two-dimensional and flat research and practice paradigm has to be replaced with three-dimensional thinking. Failure to take seriously the changes that are happening to the larger healthcare context and respond actively through significant system redesign will lead to the demise of specialty mental health services.
NASA Technical Reports Server (NTRS)
Jones, K. L.; Henshaw, M.; Mcmenomy, C.; Robles, A.; Scribner, P. C.; Wall, S. D.; Wilson, J. W.
1981-01-01
Images returned by the two Viking landers during the extended and continuation automatic phases of the Viking Mission are presented. Information describing the conditions under which the images were acquired is included with skyline drawings showing the images positioned in the field of view of the cameras. Subsets of the images are listed in a variety of sequences to aid in locating images of interest. The format and organization of the digital magnetic tape storage of the images are described. A brief description of the mission and the camera system is also included.
NASA Technical Reports Server (NTRS)
Jones, K. L.; Henshaw, M.; Mcmenomy, C.; Robles, A.; Scribner, P. C.; Wall, S. D.; Wilson, J. W.
1981-01-01
All images returned by Viking Lander 1 during the extended and continuation automatic phases of the Viking Mission are presented. Listings of supplemental information which describe the conditions under which the images were acquired are included together with skyline drawings which show where the images are positioned in the field of view of the cameras. Subsets of the images are listed in a variety of sequences to aid in locating images of interest. The format and organization of the digital magnetic tape storage of the images are described as well as the mission and the camera system.
2017-01-01
Computational modeling has been applied to simulate the heterogeneity of cancer behavior. The development of Cervical Cancer (CC) is a process in which the cell acquires dynamic behavior from non-deleterious and deleterious mutations, exhibiting chromosomal alterations as a manifestation of this dynamic. To further determine the progression of chromosomal alterations in precursor lesions and CC, we introduce a computational model to study the dynamics of deleterious and non-deleterious mutations as an outcome of tumor progression. The analysis of chromosomal alterations mediated by our model reveals that multiple deleterious mutations are more frequent in precursor lesions than in CC. Cells with lethal deleterious mutations would be eliminated, which would mitigate cancer progression; on the other hand, cells with non-deleterious mutations would become dominant, which could predispose them to cancer progression. The study of somatic alterations through computer simulations of cancer progression provides a feasible pathway for insights into the transformation of cell mechanisms in humans. During cancer progression, tumors may acquire new phenotype traits, such as the ability to invade and metastasize or to become clinically important when they develop drug resistance. Non-deleterious chromosomal alterations contribute to this progression. PMID:28723940
Damage progression in Composite Structures
NASA Technical Reports Server (NTRS)
Minnetyan, Levon
1996-01-01
A computational simulation tool is used to evaluate the various stages of damage progression in composite materials during Iosipescu shear testing. Unidirectional composite specimens with either the major or minor material axis in the load direction are considered. Damage progression characteristics are described for each specimen using two types of boundary conditions. A procedure is outlined regarding the use of computational simulation in composites testing. Iosipescu shear testing using the V-notched beam specimen is a convenient method to measure both shear strength and shear stiffness simultaneously. The evaluation of composite test response can be made more productive and informative via computational simulation of progressive damage and fracture. Computational simulation performs a complete evaluation of laminated composite fracture via assessment of ply and subply level damage/fracture processes.
Progressive fracture of fiber composites
NASA Technical Reports Server (NTRS)
Irvin, T. B.; Ginty, C. A.
1983-01-01
Refined models and procedures are described for determining progressive composite fracture in graphite/epoxy angleplied laminates. Lewis Research Center capabilities are utilized including the Real Time Ultrasonic C Scan (RUSCAN) experimental facility and the Composite Durability Structural Analysis (CODSTRAN) computer code. The CODSTRAN computer code is used to predict the fracture progression based on composite mechanics, finite element stress analysis, and fracture criteria modules. The RUSCAN facility, CODSTRAN computer code, and scanning electron microscope are used to determine durability and identify failure mechanisms in graphite/epoxy composites.
ERIC Educational Resources Information Center
Worrell, Jamie; Duffy, Mary Lou; Brady, Michael P.; Dukes, Charles; Gonzalez-DeHass, Alyssa
2016-01-01
Many schools use computer-based testing to measure students' progress for end-of-the-year and statewide assessments. There is little research to support whether computer-based testing accurately reflects student progress, particularly among students with learning, performance, and generalization difficulties. This article summarizes an…
Graphical Man/Machine Communications
Progress is reported concerning the use of computer-controlled graphical displays in the areas of radiation diffusion and hydrodynamics, general ... ventricular dynamics. Progress is continuing on the use of computer graphics in architecture. Some progress in halftone graphics is reported, with no basic ... developments presented. Colored halftone perspective pictures are being used to represent multivariable situations. Nonlinear waveform processing is ...
Optical correction of refractive error for preventing and treating eye symptoms in computer users.
Heus, Pauline; Verbeek, Jos H; Tikka, Christina
2018-04-10
Computer users frequently complain about problems with seeing and functioning of the eyes. Asthenopia is a term generally used to describe symptoms related to (prolonged) use of the eyes like ocular fatigue, headache, pain or aching around the eyes, and burning and itchiness of the eyelids. The prevalence of asthenopia during or after work on a computer ranges from 46.3% to 68.5%. Uncorrected or under-corrected refractive error can contribute to the development of asthenopia. A refractive error is an error in the focusing of light by the eye and can lead to reduced visual acuity. There are various possibilities for optical correction of refractive errors including eyeglasses, contact lenses and refractive surgery. To examine the evidence on the effectiveness, safety and applicability of optical correction of refractive error for reducing and preventing eye symptoms in computer users. We searched the Cochrane Central Register of Controlled Trials (CENTRAL); PubMed; Embase; Web of Science; and OSH update, all to 20 December 2017. Additionally, we searched trial registries and checked references of included studies. We included randomised controlled trials (RCTs) and quasi-randomised trials of interventions evaluating optical correction for computer workers with refractive error for preventing or treating asthenopia and their effect on health related quality of life. Two authors independently assessed study eligibility and risk of bias, and extracted data. Where appropriate, we combined studies in a meta-analysis. We included eight studies with 381 participants. Three were parallel group RCTs, three were cross-over RCTs and two were quasi-randomised cross-over trials. All studies evaluated eyeglasses, there were no studies that evaluated contact lenses or surgery. Seven studies evaluated computer glasses with at least one focal area for the distance of the computer screen with or without additional focal areas in presbyopic persons. 
Six studies compared computer glasses to other types of glasses; one study compared them to an ergonomic workplace assessment. The eighth study compared optimal correction of refractive error with the actual spectacle correction in use. Two studies evaluated computer glasses in persons with asthenopia, but in the others the glasses were offered to all workers regardless of symptoms. The risk of bias was unclear in five studies, high in two and low in one. Asthenopia was measured as eyestrain or a summary score of symptoms, but there were no studies on health-related quality of life. Adverse events were measured as headache, nausea or dizziness. Median asthenopia scores at baseline were about 30% of the maximum possible score.
Progressive computer glasses versus monofocal glasses: One study found no considerable difference in asthenopia between various progressive computer glasses and monofocal computer glasses after one-year follow-up (mean difference (MD) in change scores 0.23, 95% confidence interval (CI) -5.0 to 5.4 on a 100 mm VAS scale; low-quality evidence). For headache, the results were in favour of progressive glasses.
Progressive computer glasses with an intermediate focus in the upper part of the glasses versus other glasses: In two studies, progressive computer glasses with an intermediate focus led to a small decrease in asthenopia symptoms (SMD -0.49, 95% CI -0.75 to -0.23; low-quality evidence) but not in headache score in the short term, compared to general-purpose progressive glasses. There were similar small decreases in dizziness. At medium-term follow-up, in one study the effect size was not statistically significant (SMD -0.64, 95% CI -1.40 to 0.12). The study did not assess adverse events. Another study found no considerable difference in asthenopia between progressive computer glasses and monofocal computer glasses after one-year follow-up (MD in change scores 1.44, 95% CI -6.95 to 9.83 on a 100 mm VAS scale; very low quality evidence). For headache, the results were inconsistent.
Progressive computer glasses with far-distance focus in the upper part of the glasses versus other glasses: One study found no considerable difference in the number of persons with asthenopia between progressive computer glasses with far-distance focus and bifocal computer glasses after four weeks' follow-up (OR 1.00, 95% CI 0.40 to 2.50; very low quality evidence). The number of persons with headache, nausea and dizziness was also not different between groups. Another study found no considerable difference in asthenopia between progressive computer glasses with far-distance focus and monofocal computer glasses after one-year follow-up (MD in change scores -1.79, 95% CI -11.60 to 8.02 on a 100 mm VAS scale; very low quality evidence). The effects on headaches were inconsistent. One study found no difference between progressive far-distance-focus computer glasses and trifocal glasses in effect on eyestrain severity (MD -0.50, 95% CI -1.07 to 0.07; very low quality evidence) or on eyestrain frequency (MD -0.75, 95% CI -1.61 to 0.11; very low quality evidence).
Progressive computer glasses versus ergonomic assessment with habitual (computer) glasses: One study found that computer glasses optimised for individual needs reduced the asthenopia sum score more than an ergonomic assessment with habitual (computer) glasses (MD -8.9, 95% CI -16.47 to -1.33, scale 0 to 140; very low quality evidence), but there was no effect on the frequency of eyestrain (OR 1.08, 95% CI 0.38 to 3.11; very low quality evidence). We rated the quality of the evidence as low or very low due to risk of bias in the included studies, inconsistency in the results and imprecision. There is low to very low quality evidence that providing computer users with progressive computer glasses does not lead to a considerable decrease in problems with the eyes or headaches compared to other computer glasses.
Progressive computer glasses might be slightly better than progressive glasses for daily use in the short term, but not in the intermediate term, and there are no data on long-term follow-up. The quality of the evidence is low or very low, and therefore we are uncertain about this conclusion. Larger studies with several hundred participants are needed, with proper randomisation, validated outcome measurement methods, and follow-up of at least one year, to improve the quality of the evidence.
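The standardised mean differences (SMDs) quoted above divide the between-group difference in change scores by a pooled standard deviation. A minimal sketch of that statistic (Cohen's d with a pooled SD); the group means, SDs and sizes below are hypothetical illustrations, not figures from the included trials:

```python
import math

def standardized_mean_difference(mean_a, sd_a, n_a, mean_b, sd_b, n_b):
    """Cohen's d: difference in group means divided by the pooled SD."""
    pooled_sd = math.sqrt(((n_a - 1) * sd_a ** 2 + (n_b - 1) * sd_b ** 2)
                          / (n_a + n_b - 2))
    return (mean_a - mean_b) / pooled_sd

# Hypothetical asthenopia change scores (100 mm VAS): intervention vs. comparator
d = standardized_mean_difference(28.0, 12.0, 40, 34.0, 13.0, 42)
print(round(d, 2))  # -0.48
```

A negative d here favours the intervention group, matching the sign convention of the SMDs reported in the review.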
ERIC Educational Resources Information Center
Nelson, Peter M.; Van Norman, Ethan R.; Klingbeil, Dave A.; Parker, David C.
2017-01-01
Although extensive research exists on the use of curriculum-based measures for progress monitoring, little is known about using computer adaptive tests (CATs) for progress-monitoring purposes. The purpose of this study was to evaluate the impact of the frequency of data collection on individual and group growth estimates using a CAT. Data were…
Phylodynamics of classical swine fever virus with emphasis on Ecuadorian strains.
Garrido Haro, A D; Barrera Valle, M; Acosta, A; J Flores, F
2018-06-01
Classical swine fever virus (CSFV) is a Pestivirus of the Flaviviridae family that affects pigs worldwide and is endemic in several Latin American countries. However, there are still some countries in the region, including Ecuador, for which CSFV molecular information is lacking. To better understand the epidemiology of CSFV in the Americas, sequences from CSFVs from Ecuador were generated and a phylodynamic analysis of the virus was performed. Sequences for the full-length glycoprotein E2 gene of twenty field isolates were obtained and, along with sequences from strains previously described in the Americas and from the most representative strains worldwide, were used to analyse the phylodynamics of the virus. Bayesian methods were used to test several molecular clock and demographic models. A calibrated ultrametric tree and a Bayesian skyline were constructed, and codons under positive selection involved in immune escape were detected. The best model according to Bayes factors was the strict molecular clock with a Bayesian skyline, which shows that CSFV has an evolutionary rate of 3.2 × 10⁻⁴ substitutions per site per year. The model estimates the origin of CSFV in the mid-1500s. There is a strong spatial structure for CSFV in the Americas, indicating that the virus is moving mainly through neighbouring countries. The genetic diversity of CSFV has increased constantly since its appearance, with a slight decrease in the mid-twentieth century, which coincides with eradication campaigns in North America. Even though there is no evidence of strong directional evolution of the E2 gene in CSFV, codons 713, 761, 762 and 975 appear to be positively selected and could be related to virulence or pathogenesis. These results reveal how CSFV has spread and evolved since it first appeared in the Americas and provide important information for attaining the goal of eradicating this virus in Latin America. © 2018 Blackwell Verlag GmbH.
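The clock rate reported above translates directly into an expected pairwise genetic distance under a strict molecular clock. A small sketch of that arithmetic; the ~1119 nt length assumed for the E2 gene and the 100-year divergence are illustrative values, not figures from the study:

```python
# Under a strict molecular clock, two lineages separated for t years each
# accumulate rate * t substitutions per site, so the expected pairwise
# distance is 2 * rate * t.
RATE = 3.2e-4  # substitutions/site/year, the value reported in the abstract

def expected_distance(rate, years_since_divergence):
    return 2 * rate * years_since_divergence

# Hypothetical pair of strains that diverged 100 years ago, over an E2 gene
# of roughly 1119 nt (an assumed length for illustration)
sites = 1119
subs = expected_distance(RATE, 100) * sites
print(round(subs, 1))  # ~71.6 expected substitutions across the gene
```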
Putting humans in the loop: Using crowdsourced snow information to inform water management
NASA Astrophysics Data System (ADS)
Fedorov, Roman; Giuliani, Matteo; Castelletti, Andrea; Fraternali, Piero
2016-04-01
The unprecedented availability of user-generated data on the Web due to the advent of online services, social networks, and crowdsourcing is opening new opportunities for enhancing real-time monitoring and modeling of environmental systems based on data that are public, low-cost, and spatio-temporally dense, possibly improving our ability to make better decisions. In this work, we contribute a novel crowdsourcing procedure for computing virtual snow indexes from public web images, either produced by users or generated by touristic webcams, based on a complex architecture designed for automatically crawling content from multiple web data sources. The procedure retains only geo-tagged images containing a mountain skyline, identifies the visible peaks in each image using a public online digital terrain model, and classifies the mountain image pixels as snow or no-snow. This operation yields a snow mask per image, from which it is possible to extract time series of virtual snow indexes representing a proxy of the snow-covered area. The value of the obtained virtual snow indexes is estimated in a real-world water management problem. We consider the snow-dominated catchment of Lake Como, a regulated lake in Northern Italy, where snowmelt represents the most important contribution to seasonal lake storage, and we used the virtual snow indexes for informing the daily operation of the lake's dam. Numerical results show that such information is effective in extending the anticipation capacity of the lake operations, ultimately improving the system performance.
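The per-image output of the pipeline described above is a snow mask over the mountain region, from which a virtual snow index is extracted. A minimal sketch of one plausible definition, the snow-covered fraction of mountain pixels; this definition is an assumption for illustration, not necessarily the authors' exact formula:

```python
def virtual_snow_index(snow_mask, mountain_mask):
    """Fraction of mountain pixels classified as snow in one image.

    Both masks are 2-D lists of booleans with the same shape;
    mountain_mask marks the pixels below the detected skyline.
    """
    mountain_pixels = snow_pixels = 0
    for snow_row, mountain_row in zip(snow_mask, mountain_mask):
        for is_snow, is_mountain in zip(snow_row, mountain_row):
            mountain_pixels += is_mountain
            snow_pixels += is_snow and is_mountain
    return snow_pixels / mountain_pixels if mountain_pixels else float("nan")

# Toy 4x4 image: the lower two rows are mountain, half of them snow-covered
mountain_mask = [[False] * 4, [False] * 4, [True] * 4, [True] * 4]
snow_mask = [[False] * 4, [False] * 4,
             [True, True, False, False], [True, True, False, False]]
print(virtual_snow_index(snow_mask, mountain_mask))  # 0.5
```

A time series of this index over a webcam's images would then serve as the proxy of snow-covered area that the abstract describes.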
Schönberg, Anna; Theunert, Christoph; Li, Mingkun; Stoneking, Mark; Nasidze, Ivan
2011-09-01
To investigate the demographic history of human populations from the Caucasus and surrounding regions, we used high-throughput sequencing to generate 147 complete mtDNA genome sequences from random samples of individuals from three groups from the Caucasus (Armenians, Azeri and Georgians), and one group each from Iran and Turkey. Overall diversity is very high, with 144 different sequences that fall into 97 different haplogroups found among the 147 individuals. Bayesian skyline plots (BSPs) of population size change through time show a population expansion around 40-50 kya, followed by a constant population size, and then another expansion around 15-18 kya for the groups from the Caucasus and Iran. The BSP for Turkey differs the most from the others, with an increase from 35 to 50 kya followed by a prolonged period of constant population size, and no indication of a second period of growth. An approximate Bayesian computation approach was used to estimate divergence times between each pair of populations; the oldest divergence times were between Turkey and the other four groups from the South Caucasus and Iran (~400-600 generations), while the divergence time of the three Caucasus groups from each other was comparable to their divergence time from Iran (average of ~360 generations). These results illustrate the value of random sampling of complete mtDNA genome sequences that can be obtained with high-throughput sequencing platforms.
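The divergence times above are reported in generations; converting them to calendar years requires an assumed generation time. A trivial sketch, assuming ~29 years per human generation (a commonly used value that is not stated in the abstract):

```python
GENERATION_TIME_YEARS = 29  # assumed human generation time; not from the abstract

def generations_to_years(generations):
    """Convert a divergence time in generations to calendar years."""
    return generations * GENERATION_TIME_YEARS

# Oldest reported divergence: ~400-600 generations between Turkey and the rest
print(generations_to_years(400), generations_to_years(600))  # 11600 17400
```

Under this assumption, the ~400-600 generation split corresponds to roughly 12-17 thousand years ago, on the order of the younger expansion seen in the Bayesian skyline plots.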
Distributed Computing Environment for Mine Warfare Command
1993-06-01
based system to a decentralized network of personal computers over the past several years. This thesis analyzes the progress of the evolution as of May of 1992. The building blocks of a...
Toward high-resolution computational design of helical membrane protein structure and function
Barth, Patrick; Senes, Alessandro
2016-01-01
The computational design of α-helical membrane proteins is still in its infancy but has made important progress. De novo design has produced stable, specific and active minimalistic oligomeric systems. Computational re-engineering can improve stability and modulate the function of natural membrane proteins. Currently, the major hurdle for the field is not computational, but the experimental characterization of the designs. The emergence of new structural methods for membrane proteins will accelerate progress. PMID:27273630
2016 Institutional Computing Progress Report for w14_firetec
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, Judith W.; Linn, Rodman
2016-07-14
This is a computing progress report for w14_firetec. FIRETEC simulations will explore the prescribed fire ignition methods to achieve burning objectives (understory reduction and ecosystem health) but at the same time minimize the risk of escaped fire.
78 FR 25482 - Notice of Revised Determination on Reconsideration
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-01
...PSCI-PROGRESSIVE SOFTWARE COMPUTING, QUALITY TESTING SERVICES, INC., RAILROAD CONSTRUCTION CO. OF SOUTH JERSEY, INC., ..., LP, ..., ANDERSON CONSTRUCTION SERVICES, BAKER PETROLITE, BAKERCORP, BELL-FAST FIRE PROTECTION INC., BOLTTECH INC., ...
How to Build a Quantum Computer
NASA Astrophysics Data System (ADS)
Sanders, Barry C.
2017-11-01
Quantum computer technology is progressing rapidly with dozens of qubits and hundreds of quantum logic gates now possible. Although current quantum computer technology is distant from being able to solve computational problems beyond the reach of non-quantum computers, experiments have progressed well beyond simply demonstrating the requisite components. We can now operate small quantum logic processors with connected networks of qubits and quantum logic gates, which is a great stride towards functioning quantum computers. This book aims to be accessible to a broad audience with basic knowledge of computers, electronics and physics. The goal is to convey key notions relevant to building quantum computers and to present state-of-the-art quantum-computer research in various media such as trapped ions, superconducting circuits, photonics and beyond.
Progressive Damage and Fracture in Composites Under Dynamic Loading
NASA Technical Reports Server (NTRS)
Minnetyan, Levon
1994-01-01
A computational simulation tool is used to evaluate the various stages of damage progression in composite materials during Iosipescu shear testing. Unidirectional composite specimens with either the major or minor material axis in the load direction are considered. Damage progression characteristics are described for each specimen using two types of boundary conditions. A procedure is outlined regarding the use of computational simulation in the testing of composite materials.
Progressive Fracture of Fiber Composite Build-Up Structures
NASA Technical Reports Server (NTRS)
Gotsis, Pascal K.; Chamis, C. C.; Minnetyan, Levon
1997-01-01
Damage progression and fracture of built-up composite structures are evaluated by using computational simulation. The objective is to examine the behavior and response of a stiffened composite (0/±45/90)s6 laminate panel by simulating damage initiation, growth, accumulation, progression and propagation to structural collapse. An integrated computer code, CODSTRAN, was augmented for the simulation of the progressive damage and fracture of built-up composite structures under mechanical loading. Results show that damage initiation and progression have a significant effect on the structural response. The influence of the type of loading on damage initiation, propagation and final fracture of the built-up composite panel is investigated.
Progressive Fracture of Fiber Composite Build-Up Structures
NASA Technical Reports Server (NTRS)
Minnetyan, Levon; Gotsis, Pascal K.; Chamis, C. C.
1997-01-01
Damage progression and fracture of built-up composite structures are evaluated by using computational simulation. The objective is to examine the behavior and response of a stiffened composite (0/±45/90)s6 laminate panel by simulating damage initiation, growth, accumulation, progression and propagation to structural collapse. An integrated computer code, CODSTRAN, was augmented for the simulation of the progressive damage and fracture of built-up composite structures under mechanical loading. Results show that damage initiation and progression have a significant effect on the structural response. The influence of the type of loading on damage initiation, propagation and final fracture of the built-up composite panel is investigated.
A second level of the Saint Petersburg skyline
NASA Astrophysics Data System (ADS)
Krasnopolsky, Andrey; Bolotin, Sergey
2018-03-01
The article considers the history of residential development in Saint Petersburg and identifies the corresponding landmark dates. Recent changes in the height range of residential development are noted, and the influence of this factor on the formation of the city's silhouette is assessed. Reasons for these changes are identified, and the attractiveness of high-rise residential complexes for living is evaluated. Conclusions are drawn on tendencies in further housing construction in terms of building height. It is noted that multi-storied buildings can be located in the periphery of the city, taking into account the specific visual characteristics of the construction site and the silhouette of the erected buildings; in the central districts, strict height regulations are needed.
Interpreting megalithic tomb orientation and siting within broader cultural contexts
NASA Astrophysics Data System (ADS)
Prendergast, Frank
2016-02-01
This paper assesses the measured axial orientations and siting of Irish passage tombs. The distribution of monuments with passages/entrances directed at related tombs/cairns is shown. Where this phenomenon occurs, the targeted structure is invariably located at a higher elevation on the skyline and this could suggest a symbolic and hierarchical relationship in their relative siting in the landscape. Additional analysis of astronomical declinations at a national scale has identified tombs with an axial alignment towards the rising and setting positions of the Sun at the winter and summer solstices. A criteria-based framework is developed which potentially allows for these types of data to be more meaningfully considered and culturally interpreted within broader archaeological and social anthropological contexts.
Progress on the Fabric for Frontier Experiments Project at Fermilab
NASA Astrophysics Data System (ADS)
Box, Dennis; Boyd, Joseph; Dykstra, Dave; Garzoglio, Gabriele; Herner, Kenneth; Kirby, Michael; Kreymer, Arthur; Levshina, Tanya; Mhashilkar, Parag; Sharma, Neha
2015-12-01
The FabrIc for Frontier Experiments (FIFE) project is an ambitious, major-impact initiative within the Fermilab Scientific Computing Division designed to lead the computing model for Fermilab experiments. FIFE is a collaborative effort between experimenters and computing professionals to design and develop integrated computing models for experiments of varying needs and infrastructure. The major focus of the FIFE project is the development, deployment, and integration of Open Science Grid solutions for high-throughput computing, data management, database access, and collaboration within experiments. To accomplish this goal, FIFE has developed workflows that utilize Open Science Grid sites along with dedicated and commercial cloud resources. The FIFE project has made significant progress integrating several services into experiment computing operations, including new job submission services, software and reference data distribution through CVMFS repositories, a flexible data transfer client, and access to opportunistic resources on the Open Science Grid. The progress with current experiments and plans for expansion with additional projects will be discussed. FIFE has taken a leading role in the definition of the computing model for Fermilab experiments, aided in the design of computing for experiments beyond Fermilab, and will continue to define the future direction of high-throughput computing for future physics experiments worldwide.
Durkin, Kevin; Conti-Ramsden, Gina
2012-01-01
Computer use draws on linguistic abilities. Using this medium thus presents challenges for young people with Specific Language Impairment (SLI) and raises questions of whether computer-based tasks are appropriate for them. We consider theoretical arguments predicting impaired performance and negative outcomes relative to peers without SLI versus the possibility of positive gains. We examine the relationship between frequency of computer use (for leisure and educational purposes) and educational achievement; in particular examination performance at the end of compulsory education and level of educational progress two years later. Participants were 49 young people with SLI and 56 typically developing (TD) young people. At around age 17, the two groups did not differ in frequency of educational computer use or leisure computer use. There were no associations between computer use and educational outcomes in the TD group. In the SLI group, after PIQ was controlled for, educational computer use at around 17 years of age contributed substantially to the prediction of educational progress at 19 years. The findings suggest that educational uses of computers are conducive to educational progress in young people with SLI. PMID:23300610
Biomolecular computing systems: principles, progress and potential.
Benenson, Yaakov
2012-06-12
The task of information processing, or computation, can be performed by natural and man-made 'devices'. Man-made computers are made from silicon chips, whereas natural 'computers', such as the brain, use cells and molecules. Computation also occurs on a much smaller scale in regulatory and signalling pathways in individual cells and even within single biomolecules. Indeed, much of what we recognize as life results from the remarkable capacity of biological building blocks to compute in highly sophisticated ways. Rational design and engineering of biological computing systems can greatly enhance our ability to study and to control biological systems. Potential applications include tissue engineering and regeneration and medical treatments. This Review introduces key concepts and discusses recent progress that has been made in biomolecular computing.
NASA Astrophysics Data System (ADS)
Saghafian, Amirreza; Pitsch, Heinz
2012-11-01
A compressible flamelet/progress variable approach (CFPV) has been devised for high-speed flows. Temperature is computed from the transported total energy and tabulated species mass fractions, and the source term of the progress variable is rescaled with pressure and temperature. Combustion is thus modeled by three additional scalar equations and a chemistry table that is computed in a pre-processing step. Three-dimensional direct numerical simulation (DNS) databases of a reacting supersonic turbulent mixing layer with detailed chemistry are analyzed to assess the underlying assumptions of CFPV. Large eddy simulations (LES) of the same configuration using the CFPV method have been performed and compared with the DNS results. The LES computations are based on presumed subgrid PDFs of the mixture fraction and progress variable, a beta function and a delta function respectively, which are assessed using the DNS databases. The flamelet equation budget is also computed to verify the validity of the CFPV method for high-speed flows.
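In the CFPV approach, a tabulated progress-variable source term is rescaled to the local pressure and temperature instead of re-evaluating the chemistry at run time. A hedged sketch of that idea; the quadratic pressure factor and Arrhenius-type temperature factor below are generic placeholders, not the specific rescaling expressions derived in the paper:

```python
import math

def rescale_progress_source(omega_ref, p, t, p_ref, t_ref, t_activation=15000.0):
    """Illustrative rescaling of a tabulated progress-variable source term.

    omega_ref is looked up from the flamelet table at (p_ref, t_ref); the
    factors below (a quadratic pressure dependence and an Arrhenius-type
    temperature dependence with an assumed activation temperature) are
    generic placeholders, not the CFPV paper's exact expressions.
    """
    pressure_factor = (p / p_ref) ** 2
    temperature_factor = math.exp(-t_activation * (1.0 / t - 1.0 / t_ref))
    return omega_ref * pressure_factor * temperature_factor

# At the table conditions, the source term is returned unchanged
print(rescale_progress_source(10.0, 101325.0, 1500.0, 101325.0, 1500.0))  # 10.0
```

The point of such a rescaling is that a single pre-computed chemistry table can be reused across the wide pressure and temperature variations of a compressible flow.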
Recent progress of quantum annealing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Suzuki, Sei
2015-03-10
We review the recent progress of quantum annealing. Quantum annealing was proposed as a method to solve generic optimization problems. Recently a Canadian company has drawn a great deal of attention, as it has commercialized a quantum computer based on quantum annealing. Although the performance of quantum annealing is not sufficiently understood, it is likely that quantum annealing will be a practical method both on a conventional computer and on a quantum computer.
Scientific Discovery through Advanced Computing in Plasma Science
NASA Astrophysics Data System (ADS)
Tang, William
2005-03-01
Advanced computing is generally recognized to be an increasingly vital tool for accelerating progress in scientific research during the 21st Century. For example, the Department of Energy's ``Scientific Discovery through Advanced Computing'' (SciDAC) Program was motivated in large measure by the fact that formidable scientific challenges in its research portfolio could best be addressed by utilizing the combination of the rapid advances in super-computing technology together with the emergence of effective new algorithms and computational methodologies. The imperative is to translate such progress into corresponding increases in the performance of the scientific codes used to model complex physical systems such as those encountered in high temperature plasma research. If properly validated against experimental measurements and analytic benchmarks, these codes can provide reliable predictive capability for the behavior of a broad range of complex natural and engineered systems. This talk reviews recent progress and future directions for advanced simulations with some illustrative examples taken from the plasma science applications area. Significant recent progress has been made in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics, giving increasingly good agreement between experimental observations and computational modeling. This was made possible by the combination of access to powerful new computational resources together with innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning a huge range in time and space scales. In particular, the plasma science community has made excellent progress in developing advanced codes for which computer run-time and problem size scale well with the number of processors on massively parallel machines (MPP's). 
A good example is the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPP's to produce three-dimensional, general geometry, nonlinear particle simulations which have accelerated progress in understanding the nature of plasma turbulence in magnetically-confined high temperature plasmas. These calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In general, results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. The associated scientific excitement should serve to stimulate improved cross-cutting collaborations with other fields and also to help attract bright young talent to the computational science area.
Progressive Fracture of Fiber Composite Builtup Structures
NASA Technical Reports Server (NTRS)
Gotsis, Pascal K.; Chamis, Christos C.; Minnetyan, Levon
1996-01-01
The damage progression and fracture of builtup composite structures were evaluated using computational simulation to examine the behavior and response of a stiffened composite (0 +/- 45/90)(sub s6) laminate panel subjected to a bending load. The damage initiation, growth, accumulation, progression, and propagation to structural collapse were simulated. An integrated computer code (CODSTRAN) was augmented for the simulation of the progressive damage and fracture of builtup composite structures under mechanical loading. Results showed that damage initiation and progression have a significant effect on the structural response. Also investigated was the influence of different types of bending load on the damage initiation, propagation, and final fracture of the builtup composite panel.
Computer Technology Standards of Learning for Virginia's Public Schools
ERIC Educational Resources Information Center
Virginia Department of Education, 2005
2005-01-01
The Computer/Technology Standards of Learning identify and define the progressive development of essential knowledge and skills necessary for students to access, evaluate, use, and create information using technology. They provide a framework for technology literacy and demonstrate a progression from physical manipulation skills for the use of…
Synthetic Analog and Digital Circuits for Cellular Computation and Memory
Purcell, Oliver; Lu, Timothy K.
2014-01-01
Biological computation is a major area of focus in synthetic biology because it has the potential to enable a wide range of applications. Synthetic biologists have applied engineering concepts to biological systems in order to construct progressively more complex gene circuits capable of processing information in living cells. Here, we review the current state of computational genetic circuits and describe artificial gene circuits that perform digital and analog computation. We then discuss recent progress in designing gene circuits that exhibit memory, and how memory and computation have been integrated to yield more complex systems that can both process and record information. Finally, we suggest new directions for engineering biological circuits capable of computation. PMID:24794536
Computer Graphics Research Laboratory Quarterly Progress Report Number 49, July-September 1993
1993-11-22
Contents include: Texture Sampling and Strength Guided Motion (Jeffry S. Nimeroff); Radiosity (Min-Zhi Shao); Blended Shape Primitives (Douglas DeCarlo). Topics covered include texture placement, extensions of radiosity rendering, and a discussion of blended shape primitives and their applications in computer vision and computer... Radiosity: an improved version of the radiosity renderer is included; this version uses a fast over-relaxation progressive refinement algorithm.
Progress on the FabrIc for Frontier Experiments project at Fermilab
Box, Dennis; Boyd, Joseph; Dykstra, Dave; ...
2015-12-23
The FabrIc for Frontier Experiments (FIFE) project is an ambitious, major-impact initiative within the Fermilab Scientific Computing Division designed to lead the computing model for Fermilab experiments. FIFE is a collaborative effort between experimenters and computing professionals to design and develop integrated computing models for experiments of varying needs and infrastructure. The major focus of the FIFE project is the development, deployment, and integration of Open Science Grid solutions for high-throughput computing, data management, database access, and collaboration within experiments. To accomplish this goal, FIFE has developed workflows that utilize Open Science Grid sites along with dedicated and commercial cloud resources. The FIFE project has made significant progress integrating several services into experiment computing operations, including new job submission services, software and reference data distribution through CVMFS repositories, a flexible data transfer client, and access to opportunistic resources on the Open Science Grid. The progress with current experiments and plans for expansion to additional projects are discussed. FIFE has taken a leading role in defining the computing model for Fermilab experiments, aided in the design of computing for experiments beyond Fermilab, and will continue to define the future direction of high-throughput computing for physics experiments worldwide.
ERIC Educational Resources Information Center
Darabi, Aubteen; Nelson, David W.; Meeker, Richard; Liang, Xinya; Boulware, Wilma
2010-01-01
In a diagnostic problem solving operation of a computer-simulated chemical plant, chemical engineering students were randomly assigned to two groups: one studying product-oriented worked examples, the other practicing conventional problem solving. Effects of these instructional strategies on the progression of learners' mental models were examined…
Progress in computational toxicology.
Ekins, Sean
2014-01-01
Computational methods have been widely applied to toxicology across the pharmaceutical, consumer product and environmental fields over the past decade; this progress in computational toxicology is reviewed here. A literature review was performed on computational models for hepatotoxicity (e.g. for drug-induced liver injury (DILI)), cardiotoxicity, renal toxicity and genotoxicity. In addition, various publications that use machine learning methods have been highlighted. Several computational toxicology model datasets from past publications were used to compare Bayesian and support vector machine (SVM) learning methods. The increasing amounts of data for defined toxicology endpoints have enabled machine learning models that are increasingly used for predictions. It is shown that across many different models, Bayesian and SVM methods perform similarly based on cross-validation data. Considerable progress has been made in computational toxicology over the past decade, in both model development and the availability of larger-scale or 'big data' models. Future efforts in toxicology data generation will likely provide hundreds of thousands of compounds that are readily accessible for machine learning models. These models will cover relevant chemistry space for pharmaceutical, consumer product and environmental applications. Copyright © 2013 Elsevier Inc. All rights reserved.
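The Bayesian-versus-SVM comparison rests on k-fold cross validation. A minimal, library-free sketch of that protocol follows, with a toy nearest-centroid learner standing in for the Bayesian and SVM models of the review (in practice one would plug in library estimators; all names here are illustrative):

```python
import random

def k_fold_splits(n, k, seed=0):
    """Yield shuffled (train, test) index splits for k-fold cross validation."""
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    for i in range(k):
        train = [j for f in folds[:i] + folds[i + 1:] for j in f]
        yield train, folds[i]

def cross_val_accuracy(fit, predict, X, y, k=5):
    """Mean held-out accuracy over k folds: the protocol used to compare
    one learning method against another on the same dataset."""
    scores = []
    for train, test in k_fold_splits(len(X), k):
        model = fit([X[i] for i in train], [y[i] for i in train])
        hits = sum(predict(model, X[i]) == y[i] for i in test)
        scores.append(hits / len(test))
    return sum(scores) / k

def fit_centroid(X, y):
    """Toy nearest-centroid learner: one mean vector per class label."""
    sums, counts = {}, {}
    for xi, yi in zip(X, y):
        acc = sums.setdefault(yi, [0.0] * len(xi))
        for d, v in enumerate(xi):
            acc[d] += v
        counts[yi] = counts.get(yi, 0) + 1
    return {c: [v / counts[c] for v in acc] for c, acc in sums.items()}

def predict_centroid(model, x):
    """Predict the class whose centroid is nearest in squared distance."""
    return min(model, key=lambda c: sum((a - b) ** 2
                                        for a, b in zip(model[c], x)))
```

On well-separated synthetic 'endpoint' data even this toy learner scores perfectly, which is exactly why cross-validated comparisons must be run on realistic datasets.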
Barth, Patrick; Senes, Alessandro
2016-06-07
The computational design of α-helical membrane proteins is still in its infancy but has already made great progress. De novo design allows stable, specific and active minimal oligomeric systems to be obtained. Computational reengineering can improve the stability and function of naturally occurring membrane proteins. Currently, the major hurdle for the field is the experimental characterization of the designs. The emergence of new structural methods for membrane proteins will accelerate progress.
Progress in Computational Electron-Molecule Collisions
NASA Astrophysics Data System (ADS)
Rescigno, Tn
1997-10-01
The past few years have witnessed tremendous progress in the development of sophisticated ab initio methods for treating collisions of slow electrons with isolated small molecules. Researchers in this area have benefited greatly from advances in computer technology; indeed, the advent of parallel computers has made it possible to carry out calculations at a level of sophistication inconceivable a decade ago. But bigger and faster computers are only part of the picture. Even with today's computers, the practical need to study electron collisions with the kinds of complex molecules and fragments encountered in real-world plasma processing environments is taxing present methods beyond their current capabilities. Since extrapolation of existing methods to handle increasingly larger targets will ultimately fail as it would require computational resources beyond any imagined, continued progress must also be linked to new theoretical developments. Some of the techniques recently introduced to address these problems will be discussed and illustrated with examples of electron-molecule collision calculations we have carried out on some fairly complex target gases encountered in processing plasmas. Electron-molecule scattering continues to pose many formidable theoretical and computational challenges. I will touch on some of the outstanding open questions.
Synthetic analog and digital circuits for cellular computation and memory.
Purcell, Oliver; Lu, Timothy K
2014-10-01
Biological computation is a major area of focus in synthetic biology because it has the potential to enable a wide range of applications. Synthetic biologists have applied engineering concepts to biological systems in order to construct progressively more complex gene circuits capable of processing information in living cells. Here, we review the current state of computational genetic circuits and describe artificial gene circuits that perform digital and analog computation. We then discuss recent progress in designing gene networks that exhibit memory, and how memory and computation have been integrated to yield more complex systems that can both process and record information. Finally, we suggest new directions for engineering biological circuits capable of computation. Copyright © 2014 The Authors. Published by Elsevier Ltd.. All rights reserved.
Experimental evaluation of certification trails using abstract data type validation
NASA Technical Reports Server (NTRS)
Wilson, Dwight S.; Sullivan, Gregory F.; Masson, Gerald M.
1993-01-01
Certification trails are a recently introduced and promising approach to fault detection and fault tolerance. Recent experimental work reveals many cases in which a certification-trail approach allows for significantly faster program execution time than a basic time-redundancy approach. Algorithms for answer validation of abstract data types allow a certification-trail approach to be used for a wide variety of problems. An attempt to assess the performance of algorithms utilizing certification trails on abstract data types is reported. Specifically, this method was applied to the following problems: heapsort, Huffman tree, shortest path, and skyline. Previous results used certification trails specific to a particular problem and implementation. The present approach allows certification trails to be localized to 'data structure modules,' making the use of this technique transparent to the user of such modules.
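The answer-validation idea can be made concrete for sorting, one of the problem classes mentioned: the first execution emits a permutation as its certification trail, and a second, much cheaper pass validates the answer in O(n). This is a hedged illustration of the general technique, not the paper's implementation.

```python
def sort_with_trail(xs):
    """Sort and emit a certification trail: for each output slot, the
    input position its value came from."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    return [xs[i] for i in order], order

def check_sorted_with_trail(xs, result, trail):
    """O(n) answer validation: the trail must be a permutation that maps
    the input onto a non-decreasing output."""
    n = len(xs)
    if len(result) != n or len(trail) != n:
        return False
    seen = [False] * n
    for src in trail:
        if not 0 <= src < n or seen[src]:
            return False                    # not a permutation
        seen[src] = True
    for k in range(n):
        if result[k] != xs[trail[k]]:
            return False                    # output not drawn from input
        if k and result[k - 1] > result[k]:
            return False                    # output not sorted
    return True
```

A second, independently coded run only needs the checker, so a fault in either run is caught when validation fails.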
VizieR Online Data Catalog: Investigating Tully-Fisher relation with KMOS3D (Ubler+,
NASA Astrophysics Data System (ADS)
Ubler, H.; Forster Schreiber, N. M.; Genzel, R.; Wisnioski, E.; Wuyts, S.; Lang, P.; Naab, T.; Burkert, A.; van Dokkum, P. G.; Tacconi, L. J.; Wilman, D. J.; Fossati, M.; Mendel, J. T.; Beifiori, A.; Belli, S.; Bender, R.; Brammer, G. B.; Chan, J.; Davies, R.; Fabricius, M.; Galametz, A.; Lutz, D.; Momcheva, I. G.; Nelson, E. J.; Saglia, R. P.; Seitz, S.; Tadaki, K.
2018-02-01
This work is based on the first 3 yr of observations of the KMOS3D multiyear near-infrared (near-IR) IFS survey of more than 600 mass-selected star-forming galaxies (SFGs) at 0.6<~z<~2.6 with the K-band Multi Object Spectrograph (KMOS; Sharples+ 2013Msngr.151...21S) on the Very Large Telescope. The KMOS3D survey and data reduction are described in detail by Wisnioski et al. 2015ApJ...799..209W. The results presented in this paper build on the KMOS3D sample as of 2016 January, with 536 observed galaxies. Of these, 316 are detected in, and have spatially resolved, Hα emission free from skyline contamination, from which two-dimensional velocity and dispersion maps are produced. (1 data file).
Closer Look: Majestic Mountains and Frozen Plains
2015-09-17
Just 15 minutes after its closest approach to Pluto on July 14, 2015, NASA's New Horizons spacecraft looked back toward the sun and captured a near-sunset view of the rugged, icy mountains and flat ice plains extending to Pluto's horizon. The smooth expanse of the informally named Sputnik Planum (right) is flanked to the west (left) by rugged mountains up to 11,000 feet (3,500 meters) high, including the informally named Norgay Montes in the foreground and Hillary Montes on the skyline. The backlighting highlights more than a dozen layers of haze in Pluto's tenuous but distended atmosphere. The image was taken from a distance of 11,000 miles (18,000 kilometers) to Pluto; the scene is 230 miles (380 kilometers) across. http://photojournal.jpl.nasa.gov/catalog/PIA19947
Preparing to Test for Deep Space
2015-07-15
A structural steel section is lifted into place atop the B-2 Test Stand at NASA’s Stennis Space Center as part of modification work to prepare for testing the core stage of NASA’s new Space Launch System. The section is part of the Main Propulsion Test Article (MPTA) framework, which will support the SLS core stage for testing. The existing framework was installed on the stand in the late 1970s to test the shuttle MPTA. However, that framework had to be repositioned and modified to accommodate the larger SLS stage. About 1 million pounds of structural steel has been added, extending the framework about 100 feet higher and providing a new look to the Stennis skyline. Stennis will test the actual flight core stage for the first uncrewed SLS mission, Exploration Mission-1.
Method and system for benchmarking computers
Gustafson, John L.
1993-09-14
A testing system and method for benchmarking computer systems. The system includes a store containing a scalable set of tasks to be performed to produce a solution in ever-increasing degrees of resolution as a larger number of the tasks are performed. A timing and control module allots to each computer a fixed benchmarking interval in which to perform the stored tasks. Means are provided for determining, after completion of the benchmarking interval, the degree of progress through the scalable set of tasks and for producing a benchmarking rating relating to the degree of progress for each computer.
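The patented scheme amounts to fixed-time benchmarking: every machine gets the same time budget, and the rating is how far through a scalable set of tasks it progresses. A minimal sketch under that reading (function names and the example task are illustrative assumptions):

```python
import time

def fixed_time_benchmark(task, interval=0.1):
    """Run ever-larger instances of a scalable task; the rating is the
    largest problem size finished inside the fixed benchmarking interval,
    so a faster machine earns a higher rating on the same workload."""
    deadline = time.perf_counter() + interval
    size, rating = 1, 0
    while True:
        task(size)                      # solve at the current resolution
        if time.perf_counter() > deadline:
            break                       # this instance overran the budget
        rating = size
        size *= 2                       # scale the workload up

    return rating

# Example scalable task: midpoint quadrature of x on [0, 1], whose
# resolution (and cost) grows with n while the answer stays comparable.
def quadrature(n):
    return sum((i + 0.5) / n * (1.0 / n) for i in range(n))
```

Because the problem scales rather than the clock, the rating remains meaningful across machines of very different speeds, which is the patent's stated goal.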
High-speed multiple sequence alignment on a reconfigurable platform.
Oliver, Tim; Schmidt, Bertil; Maskell, Douglas; Nathan, Darran; Clemens, Ralf
2006-01-01
Progressive alignment is a widely used approach to computing multiple sequence alignments (MSAs). However, aligning several hundred sequences with popular progressive alignment tools requires hours on sequential computers. Due to the rapid growth of sequence databases, biologists have to compute MSAs in a far shorter time. In this paper we present a new approach to MSA on reconfigurable hardware platforms to gain high performance at low cost. We have constructed a linear systolic array to perform pairwise sequence distance computations using dynamic programming. This results in an implementation with significant runtime savings on a standard FPGA.
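The pairwise distance stage that the systolic array accelerates is an ordinary dynamic-programming recurrence. A sequential sketch follows, using plain edit distance as a stand-in for the paper's scoring scheme (an assumption for illustration):

```python
def edit_distance(a, b):
    """O(|a|*|b|) dynamic-programming distance; the anti-diagonals of
    this table are what a systolic array evaluates in parallel, one
    cell per processing element."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                  # deletion
                           cur[j - 1] + 1,               # insertion
                           prev[j - 1] + (ca != cb)))    # substitution
        prev = cur
    return prev[-1]

def distance_matrix(seqs):
    """All-against-all distances that fix the order in which sequences
    are merged during progressive alignment."""
    n = len(seqs)
    return [[edit_distance(seqs[i], seqs[j]) for j in range(n)]
            for i in range(n)]
```

The all-against-all matrix is quadratic in the number of sequences, which is precisely why offloading it to hardware pays off for hundreds of sequences.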
Turchetto, Caroline; Fagundes, Nelson J R; Segatto, Ana L A; Kuhlemeier, Cris; Solís Neffa, Viviana G; Speranza, Pablo R; Bonatto, Sandro L; Freitas, Loreta B
2014-02-01
Understanding the spatiotemporal distribution of genetic variation and the ways in which this distribution is connected to the ecological context of natural populations is fundamental for understanding the nature and mode of intraspecific and, ultimately, interspecific differentiation. The Petunia axillaris complex is endemic to the grasslands of southern South America and includes three subspecies: P. a. axillaris, P. a. parodii and P. a. subandina. These subspecies are traditionally delimited based on both geography and floral morphology, although the latter is highly variable. Here, we determined the patterns of genetic (nuclear and cpDNA), morphological and ecological (bioclimatic) variation of a large number of P. axillaris populations and found that they are mostly coincident with subspecies delimitation. The nuclear data suggest that the subspecies are likely independent evolutionary units, and their morphological differences may be associated with local adaptations to diverse climatic and/or edaphic conditions and population isolation. The demographic dynamics over time estimated by skyline plot analyses showed different patterns for each subspecies in the last 100 000 years, which is compatible with a divergence time between 35 000 and 107 000 years ago between P. a. axillaris and P. a. parodii, as estimated with the IMa program. Coalescent simulation tests using Approximate Bayesian Computation do not support previous suggestions of extensive gene flow between P. a. axillaris and P. a. parodii in their contact zone. © 2013 John Wiley & Sons Ltd.
Phillips, C D; Hoffman, J I; George, J C; Suydam, R S; Huebinger, R M; Patton, J C; Bickham, J W
2013-01-01
Patterns of genetic variation observed within species reflect evolutionary histories that include signatures of past demography. Understanding the demographic component of species' history is fundamental to informed management because changes in effective population size affect response to environmental change and evolvability, the strength of genetic drift, and maintenance of genetic variability. Species experiencing anthropogenic population reductions provide valuable case studies for understanding the genetic response to demographic change because historic changes in the census size are often well documented. A classic example is the bowhead whale, Balaena mysticetus, which experienced dramatic population depletion due to commercial whaling in the late 19th and early 20th centuries. Consequently, we analyzed a large multi-marker dataset of bowhead whales using a variety of analytical methods, including extended Bayesian skyline analysis and approximate Bayesian computation, to characterize genetic signatures of both ancient and contemporary demographic histories. No genetic signature of recent population depletion was recovered through any analysis incorporating realistic mutation assumptions, probably due to the combined influences of long generation time, short bottleneck duration, and the magnitude of population depletion. In contrast, a robust signal of population expansion was detected around 70,000 years ago, followed by a population decline around 15,000 years ago. The timing of these events coincides to a historic glacial period and the onset of warming at the end of the last glacial maximum, respectively. By implication, climate driven long-term variation in Arctic Ocean productivity, rather than recent anthropogenic disturbance, appears to have been the primary driver of historic bowhead whale demography. PMID:23403722
Rosvold, Jørgen; Røed, Knut H; Hufthammer, Anne Karin; Andersen, Reidar; Stenøien, Hans K
2012-09-26
Red deer (Cervus elaphus) have been an important human resource for millennia, experiencing intensive human influence through habitat alterations, hunting and translocation of animals. In this study we investigate a time series of ancient and contemporary DNA from Norwegian red deer spanning about 7,000 years. Our main aim was to investigate how increasing agricultural land use, hunting pressure and possibly human mediated translocation of animals have affected the genetic diversity on a long-term scale. We obtained mtDNA (D-loop) sequences from 73 ancient specimens. These show higher genetic diversity in ancient compared to extant samples, with the highest diversity preceding the onset of agricultural intensification in the Early Iron Age. Using standard diversity indices, Bayesian skyline plot and approximate Bayesian computation, we detected a population reduction which was more prolonged than, but not as severe as, historic documents indicate. There are signs of substantial changes in haplotype frequencies primarily due to loss of haplotypes through genetic drift. There is no indication of human mediated translocations into the Norwegian population. All the Norwegian sequences show a western European origin, from which the Norwegian lineage diverged approximately 15,000 years ago. Our results provide direct insight into the effects of increasing habitat fragmentation and human hunting pressure on genetic diversity and structure of red deer populations. They also shed light on the northward post-glacial colonisation process of red deer in Europe and suggest increased precision in inferring past demographic events when including both ancient and contemporary DNA.
Computational chemistry in pharmaceutical research: at the crossroads.
Bajorath, Jürgen
2012-01-01
Computational approaches are an integral part of pharmaceutical research. However, there are many unsolved key questions that limit scientific progress in the still-evolving computational field and its impact on drug discovery. Importantly, a number of these questions are not new but date back many years; hence, it might be difficult to conclusively answer them in the foreseeable future. Moreover, the computational field as a whole is characterized by a high degree of heterogeneity, and so, unfortunately, is the quality of its scientific output. In light of this situation, it is proposed that changes in scientific standards and culture should be seriously considered now in order to lay a foundation for future progress in computational research.
ERIC Educational Resources Information Center
Van Norman, Ethan R.; Nelson, Peter M.; Parker, David C.
2017-01-01
Computer adaptive tests (CATs) hold promise to monitor student progress within multitiered systems of support. However, the relationship between how long and how often data are collected and the technical adequacy of growth estimates from CATs has not been explored. Given CAT administration times, it is important to identify optimal data…
1920x1080 pixel color camera with progressive scan at 50 to 60 frames per second
NASA Astrophysics Data System (ADS)
Glenn, William E.; Marcinka, John W.
1998-09-01
For over a decade, the broadcast industry, the film industry and the computer industry have had a long-range objective to originate high-definition images with progressive scan, which produces images with better vertical resolution and far fewer artifacts than interlaced scan. Computers almost universally use progressive scan. The broadcast industry has resisted switching from interlace to progressive because no cameras were available in that format with the 1920 x 1080 resolution that had obtained international acceptance for high-definition program production. The camera described in this paper produces an output in that format derived from two 1920 x 1080 CCD sensors produced by Eastman Kodak.
Computational Aeroelastic Modeling of Airframes and TurboMachinery: Progress and Challenges
NASA Technical Reports Server (NTRS)
Bartels, R. E.; Sayma, A. I.
2006-01-01
Computational analyses such as computational fluid dynamics and computational structural dynamics have made major advances toward maturity as engineering tools. Computational aeroelasticity is the integration of these disciplines. As computational aeroelasticity matures it too finds an increasing role in the design and analysis of aerospace vehicles. This paper presents a survey of the current state of computational aeroelasticity with a discussion of recent research, success and continuing challenges in its progressive integration into multidisciplinary aerospace design. This paper approaches computational aeroelasticity from the perspective of the two main areas of application: airframe and turbomachinery design. An overview will be presented of the different prediction methods used for each field of application. Differing levels of nonlinear modeling will be discussed with insight into accuracy versus complexity and computational requirements. Subjects will include current advanced methods (linear and nonlinear), nonlinear flow models, use of order reduction techniques and future trends in incorporating structural nonlinearity. Examples in which computational aeroelasticity is currently being integrated into the design of airframes and turbomachinery will be presented.
Curiosity Self-Portrait at Murray Buttes.
2016-10-03
This self-portrait of NASA's Curiosity Mars rover shows the vehicle at the "Quela" drilling location in the "Murray Buttes" area on lower Mount Sharp. Key features on the skyline of this panorama are the dark mesa called "M12" to the left of the rover's mast and pale, upper Mount Sharp to the right of the mast. The top of M12 stands about 23 feet (7 meters) above the base of the sloping piles of rocks just behind Curiosity. The scene combines approximately 60 images taken by the Mars Hand Lens Imager (MAHLI) camera at the end of the rover's robotic arm. Most of the component images were taken on Sept. 17, 2016, during the 1,463rd Martian day, or sol, of Curiosity's work on Mars. Two component images of the drill-hole area in front of the rover were taken on Sol 1466 (Sept. 20) to show the hole created by collecting a drilled sample at Quela on Sol 1464 (Sept. 18). The skyline sweeps from west on the left to south-southwest on the right, with the rover's mast at northeast. The rover's location when it recorded this scene was where it ended a drive on Sol 1455, mapped at http://mars.nasa.gov/msl/multimedia/images/?ImageID=8029. The view does not include the rover's arm or the MAHLI camera itself, except in the miniature scene reflected upside down in the parabolic mirror at the top of the mast. That mirror is part of Curiosity's Chemistry and Camera (ChemCam) instrument. MAHLI appears in the center of the mirror. Wrist motions and turret rotations on the arm allowed MAHLI to acquire the mosaic's component images. The arm was positioned out of the shot in the images, or portions of images, that were used in this mosaic. This process was used previously in acquiring and assembling Curiosity self-portraits taken at other sample-collection sites, including "Rocknest" (PIA16468), "Windjana" (PIA18390), "Buckskin" (PIA19808) and "Gobabeb" (PIA20316). For scale, the rover's wheels are 20 inches (50 centimeters) in diameter and about 16 inches (40 centimeters) wide.
http://photojournal.jpl.nasa.gov/catalog/PIA20844
Ritchie, Andrew M; Lo, Nathan; Ho, Simon Y W
2017-05-01
In Bayesian phylogenetic analyses of genetic data, prior probability distributions need to be specified for the model parameters, including the tree. When Bayesian methods are used for molecular dating, available tree priors include those designed for species-level data, such as the pure-birth and birth-death priors, and coalescent-based priors designed for population-level data. However, molecular dating methods are frequently applied to data sets that include multiple individuals across multiple species. Such data sets violate the assumptions of both the speciation and coalescent-based tree priors, making it unclear which should be chosen and whether this choice can affect the estimation of node times. To investigate this problem, we used a simulation approach to produce data sets with different proportions of within- and between-species sampling under the multispecies coalescent model. These data sets were then analyzed under pure-birth, birth-death, constant-size coalescent, and skyline coalescent tree priors. We also explored the ability of Bayesian model testing to select the best-performing priors. We confirmed the applicability of our results to empirical data sets from cetaceans, phocids, and coregonid whitefish. Estimates of node times were generally robust to the choice of tree prior, but some combinations of tree priors and sampling schemes led to large differences in the age estimates. In particular, the pure-birth tree prior frequently led to inaccurate estimates for data sets containing a mixture of inter- and intraspecific sampling, whereas the birth-death and skyline coalescent priors produced stable results across all scenarios. Model testing provided an adequate means of rejecting inappropriate tree priors. Our results suggest that tree priors do not strongly affect Bayesian molecular dating results in most cases, even when severely misspecified. 
However, the choice of tree prior can be significant for the accuracy of dating results in the case of data sets with mixed inter- and intraspecies sampling. [Bayesian phylogenetic methods; model testing; molecular dating; node time; tree prior.]. © The authors 2016. Published by Oxford University Press, on behalf of the Society of Systematic Biologists. All rights reserved. For permissions, please e-mail: journals.permission@oup.com.
Progressive Visual Analytics: User-Driven Visual Exploration of In-Progress Analytics.
Stolper, Charles D; Perer, Adam; Gotz, David
2014-12-01
As datasets grow and analytic algorithms become more complex, the typical workflow of analysts launching an analytic, waiting for it to complete, inspecting the results, and then re-launching the computation with adjusted parameters is not realistic for many real-world tasks. This paper presents an alternative workflow, progressive visual analytics, which enables an analyst to inspect partial results of an algorithm as they become available and interact with the algorithm to prioritize subspaces of interest. Progressive visual analytics depends on adapting analytical algorithms to produce meaningful partial results and enable analyst intervention without sacrificing computational speed. The paradigm also depends on adapting information visualization techniques to incorporate the constantly refining results without overwhelming analysts, and on providing interactions to support an analyst directing the analytic. The contributions of this paper include: a description of the progressive visual analytics paradigm; design goals for both the algorithms and visualizations in progressive visual analytics systems; an example progressive visual analytics system (Progressive Insights) for analyzing common patterns in a collection of event sequences; and an evaluation of Progressive Insights and the progressive visual analytics paradigm by clinical researchers analyzing electronic medical records.
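The key adaptation, producing meaningful partial results, can be sketched as a generator that yields a refining estimate after each chunk; the analyst-facing loop can stop or redirect it between yields. A minimal illustration (not the Progressive Insights implementation):

```python
def progressive_mean(stream, chunk=100):
    """Adapt a one-shot aggregate into a progressive analytic: yield a
    refined estimate after every chunk so a front end can render
    partial results, and the analyst can interrupt, at any point."""
    total, count, buf = 0.0, 0, []
    for x in stream:
        buf.append(x)
        if len(buf) == chunk:
            total += sum(buf)
            count += len(buf)
            buf.clear()
            yield total / count      # meaningful partial result
    if buf:                          # flush the final, partial chunk
        total += sum(buf)
        count += len(buf)
        yield total / count
```

A consumer such as `for estimate in progressive_mean(data): update_view(estimate)` can simply `break` out of the loop, which is the interaction hook the paradigm requires.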
Obtaining lower bounds from the progressive hedging algorithm for stochastic mixed-integer programs
Gade, Dinakar; Hackebeil, Gabriel; Ryan, Sarah M.; ...
2016-04-02
We present a method for computing lower bounds in the progressive hedging algorithm (PHA) for two-stage and multi-stage stochastic mixed-integer programs. Computing lower bounds in the PHA allows one to assess the quality of the solutions generated by the algorithm contemporaneously. The lower bounds can be computed in any iteration of the algorithm by using dual prices that are calculated during execution of the standard PHA. Finally, we report computational results on stochastic unit commitment and stochastic server location problem instances, and explore the relationship between key PHA parameters and the quality of the resulting lower bounds.
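The bound rests on a simple Lagrangian fact: any scenario weights w_s satisfying sum_s p_s * w_s = 0 make the decomposed per-scenario minimizations a valid lower bound on the true two-stage optimum. A minimal sketch on a hypothetical two-scenario toy problem (not the paper's unit commitment or server location instances), with a discrete first-stage variable enumerated by brute force:

```python
# Toy two-stage problem: choose first-stage x (cost C per unit), then pay
# a shortfall penalty Q per unit of unmet scenario demand d_s.
scenarios = [(0.5, 3), (0.5, 7)]      # (probability, demand) -- hypothetical data
C, Q, XMAX = 1.0, 2.0, 10             # first-stage cost, penalty, x in {0..XMAX}

def scen_cost(x, d):
    return C * x + Q * max(d - x, 0)

def true_optimum():
    """Expected cost of the best non-anticipative x, by enumeration."""
    return min(sum(p * scen_cost(x, d) for p, d in scenarios)
               for x in range(XMAX + 1))

def lower_bound(w):
    """PHA-style bound: given weights w_s with sum_s p_s*w_s = 0, relax
    non-anticipativity and solve each scenario subproblem with the extra
    linear term w_s * x_s; the probability-weighted sum is a valid LB."""
    lb = 0.0
    for (p, d), ws in zip(scenarios, w):
        lb += p * min(scen_cost(x, d) + ws * x for x in range(XMAX + 1))
    return lb

opt = true_optimum()
w = [0.0, 0.0]                        # zero weights give the wait-and-see bound
print(opt, lower_bound(w))
```

In the actual PHA, the weights are the dual prices updated each iteration (w_s += rho * (x_s - x_bar)), so the bound can be evaluated contemporaneously with essentially no extra work.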
John G. Michopoulos; John Hermanson; Athanasios Iliopoulos
2014-01-01
The research areas of multiaxial robotic testing and design optimization have recently been utilized for data-driven constitutive characterization of anisotropic material systems. This effort has been enabled both by progress in the areas of computers and information in engineering and by progress in computational automation. Although our…
ERIC Educational Resources Information Center
HANKIN, EDWARD K.; AND OTHERS
This technical progress report covers the first three months of a project to develop computer-assisted prevocational reading and arithmetic courses for disadvantaged youths and adults. During the first month of operation, project personnel concentrated on such administrative matters as training staff and preparing facilities. An arithmetic program…
Research, Development and Validation of the Daily Demand Computer Schedule 360/50. Final Report.
ERIC Educational Resources Information Center
Ovard, Glen F.; Rowley, Vernon C.
A study was designed to further the research, development and validation of the Daily Demand Computer Schedule (DDCS), a system by which students can be rescheduled daily to facilitate their individual continuous progress through the curriculum. It will allow teachers to regroup students as needed based upon that progress, and will make time a…
ERIC Educational Resources Information Center
Forster, Natalie; Souvignier, Elmar
2011-01-01
The purpose of this study was to examine the technical adequacy of a computer-based assessment instrument which is based on hierarchical models of text comprehension for monitoring student reading progress following the Curriculum-Based Measurement (CBM) approach. At intervals of two weeks, 120 third-grade students finished eight CBM tests. To…
10 CFR 961.11 - Text of the contract.
Code of Federal Regulations, 2014 CFR
2014-01-01
... program including information on cost projections, project plans and progress reports. 5. (a) Beginning on...-type documents or computer software (including computer programs, computer software data bases, and computer software documentation). Examples of technical data include research and engineering data...
10 CFR 961.11 - Text of the contract.
Code of Federal Regulations, 2013 CFR
2013-01-01
... program including information on cost projections, project plans and progress reports. 5. (a) Beginning on...-type documents or computer software (including computer programs, computer software data bases, and computer software documentation). Examples of technical data include research and engineering data...
NASA Technical Reports Server (NTRS)
Afjeh, Abdollah A.; Reed, John A.
2003-01-01
The following reports are presented on this project:A first year progress report on: Development of a Dynamically Configurable,Object-Oriented Framework for Distributed, Multi-modal Computational Aerospace Systems Simulation; A second year progress report on: Development of a Dynamically Configurable, Object-Oriented Framework for Distributed, Multi-modal Computational Aerospace Systems Simulation; An Extensible, Interchangeable and Sharable Database Model for Improving Multidisciplinary Aircraft Design; Interactive, Secure Web-enabled Aircraft Engine Simulation Using XML Databinding Integration; and Improving the Aircraft Design Process Using Web-based Modeling and Simulation.
Results of the First National Assessment of Computer Competence (The Printout).
ERIC Educational Resources Information Center
Balajthy, Ernest
1988-01-01
Discusses the findings of the National Assessment of Educational Progress 1985-86 survey of American students' computer competence, focusing on findings of interest to reading teachers who use computers. (MM)
Advanced computations in plasma physics
NASA Astrophysics Data System (ADS)
Tang, W. M.
2002-05-01
Scientific simulation in tandem with theory and experiment is an essential tool for understanding complex plasma behavior. In this paper we review recent progress and future directions for advanced simulations in magnetically confined plasmas with illustrative examples chosen from magnetic confinement research areas such as microturbulence, magnetohydrodynamics, magnetic reconnection, and others. Significant recent progress has been made in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics, giving increasingly good agreement between experimental observations and computational modeling. This was made possible by innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales together with access to powerful new computational resources. In particular, the fusion energy science community has made excellent progress in developing advanced codes for which computer run-time and problem size scale well with the number of processors on massively parallel machines (MPP's). A good example is the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPP's to produce three-dimensional, general geometry, nonlinear particle simulations which have accelerated progress in understanding the nature of turbulence self-regulation by zonal flows. It should be emphasized that these calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In general, results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. 
The associated scientific excitement should serve to stimulate improved cross-cutting collaborations with other fields and also to help attract bright young talent to plasma science.
Domanskyi, Sergii; Nicholatos, Justin W; Schilling, Joshua E; Privman, Vladimir; Libert, Sergiy
2017-11-01
Apoptosis is essential for numerous processes, such as development, resistance to infections, and suppression of tumorigenesis. Here, we investigate the influence of the nutrient sensing and longevity-assuring enzyme SIRT6 on the dynamics of apoptosis triggered by serum starvation. Specifically, we characterize the progression of apoptosis in wild type and SIRT6 deficient mouse embryonic fibroblasts using time-lapse flow cytometry and computational modelling based on rate-equations and cell distribution analysis. We find that SIRT6 deficient cells resist apoptosis by delaying its initiation. Interestingly, once apoptosis is initiated, the rate of its progression is higher in SIRT6 null cells compared to identically cultured wild type cells. However, SIRT6 null cells succumb to apoptosis more slowly, not only in response to nutrient deprivation but also in response to other stresses. Our data suggest that SIRT6 plays a role in several distinct steps of apoptosis. Overall, we demonstrate the utility of our computational model to describe stages of apoptosis progression and the integrity of the cellular membrane. Such measurements will be useful in a broad range of biological applications.
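The study's two observations — delayed initiation but faster progression in SIRT6-null cells — can be captured by a deliberately minimal delayed-exponential rate-equation sketch. This is not the authors' fitted model, and the parameter values below are hypothetical, chosen only to illustrate the qualitative behavior:

```python
import math

def live_fraction(t, delay, rate):
    """Minimal delayed-exponential rate-equation model of apoptosis:
    no cell death before `delay`, then first-order decay at `rate`."""
    return 1.0 if t < delay else math.exp(-rate * (t - delay))

# Hypothetical parameter sets: SIRT6-null cells initiate apoptosis later
# (longer delay) but progress faster (higher rate) than wild type.
wt   = dict(delay=2.0, rate=0.5)
null = dict(delay=4.0, rate=0.9)
for t in (1, 3, 5, 8):
    print(t, round(live_fraction(t, **wt), 3), round(live_fraction(t, **null), 3))
```

Early on the null population stays fully viable while the wild type is already declining; once past its delay, the null curve drops more steeply — the same crossover pattern the abstract describes.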
CAGI: Computer Aided Grid Interface. A work in progress
NASA Technical Reports Server (NTRS)
Soni, Bharat K.; Yu, Tzu-Yi; Vaughn, David
1992-01-01
Progress realized in the development of a Computer Aided Grid Interface (CAGI) software system is presented, integrating CAD/CAM geometric system output and/or Initial Graphics Exchange Specification (IGES) files, geometry manipulations associated with grid generation, and robust grid generation methodologies. CAGI is being developed in a modular fashion and will offer a fast, efficient and economical response to geometry/grid preparation, allowing basic geometry to be upgraded step by step, interactively and under permanent visual control, while minimizing the differences between the actual hardware surface descriptions and the corresponding numerical analog. The computer code GENIE is used as a basis. The Non-Uniform Rational B-Splines (NURBS) representation of sculptured surfaces is utilized for surface grid redistribution. The computer-aided analysis system PATRAN is adapted as a CAD/CAM system. The progress realized in NURBS surface grid generation, the development of the IGES transformer, and geometry adaptation using PATRAN is presented, along with their applicability to grid generation associated with rocket propulsion applications.
How controllers compensate for the lack of flight progress strips.
DOT National Transportation Integrated Search
1996-02-01
The role of the Flight Progress Strip, currently used to display important flight data, has been debated because of long range plans to automate the air traffic control (ATC) human-computer interface. Currently, the Flight Progress Strip is viewed by ...
Modelling UV irradiances on arbitrarily oriented surfaces: effects of sky obstructions
NASA Astrophysics Data System (ADS)
Hess, M.; Koepke, P.
2008-02-01
A method is presented to calculate UV irradiances on inclined surfaces that additionally takes into account the influence of sky obstructions caused by obstacles such as mountains, houses, trees, or umbrellas. Thus the method allows calculating the impact of UV radiation on biological systems, such as the human skin or eye, in any natural or artificial environment. The method, a combination of radiation models, is explained and the correctness of its results is demonstrated. The effect of a natural skyline is shown for an Alpine ski area, where the UV irradiance even on a horizontal surface may increase by more than 10% due to reflection from snow. In contrast, in a street canyon the irradiance on a horizontal surface is reduced to 30% in shadow and to about 75% for a position in the sun.
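The paper combines full radiation models, but one simplified ingredient of any sky-obstruction calculation is easy to sketch: the isotropic sky-view factor, which reduces the diffuse component according to how much of the sky dome a sampled horizon (skyline) profile blocks. This is a generic illustration under an isotropic-sky assumption, not the authors' method; the canyon geometry is hypothetical:

```python
import math

def sky_view_factor(horizon_elev_deg):
    """Isotropic sky-view factor from a sampled horizon profile:
    horizon_elev_deg[i] is the obstruction elevation angle (degrees)
    at the i-th azimuth sample; SVF is the mean of cos^2(elevation)."""
    return sum(math.cos(math.radians(e)) ** 2
               for e in horizon_elev_deg) / len(horizon_elev_deg)

flat   = [0.0] * 360                       # completely open horizon -> SVF = 1
canyon = [60.0 if 45 <= a < 135 or 225 <= a < 315 else 0.0
          for a in range(360)]             # hypothetical walls on two sides
print(sky_view_factor(flat), sky_view_factor(canyon))
```

Multiplying the unobstructed diffuse irradiance by such a factor (plus handling the direct beam and reflections separately) is the kind of reduction that produces the street-canyon numbers quoted above.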
Teaching 1H NMR Spectrometry Using Computer Modeling.
ERIC Educational Resources Information Center
Habata, Yoichi; Akabori, Sadatoshi
2001-01-01
Molecular modeling by computer is used to display stereochemistry, molecular orbitals, structure of transition states, and progress of reactions. Describes new ideas for teaching 1H NMR spectroscopy using computer modeling. (Contains 12 references.) (ASK)
NASA Astrophysics Data System (ADS)
Wei, Tzu-Chieh; Huang, Ching-Yu
2017-09-01
Recent progress in the characterization of gapped quantum phases has also triggered the search for a universal resource for quantum computation in symmetric gapped phases. Prior works in one dimension suggest that it is a feature more common than previously thought, in that nontrivial one-dimensional symmetry-protected topological (SPT) phases provide quantum computational power characterized by the algebraic structure defining these phases. Progress in two and higher dimensions so far has been limited to special fixed points. Here we provide two families of two-dimensional Z2 symmetric wave functions such that there exists a finite region of the parameter in the SPT phases that supports universal quantum computation. The quantum computational power appears to lose its universality at the boundary between the SPT and the symmetry-breaking phases.
76 FR 13984 - Cloud Computing Forum & Workshop III
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-15
... DEPARTMENT OF COMMERCE National Institute of Standards and Technology Cloud Computing Forum... public workshop. SUMMARY: NIST announces the Cloud Computing Forum & Workshop III to be held on April 7... provide information on the NIST strategic and tactical Cloud Computing program, including progress on the...
Ozgul, Betul Memis; Orhan, Kaan; Oz, Firdevs Tulga
2015-09-01
We investigated inhibition of lesion progression in artificial enamel lesions. Lesions were created on primary and permanent anterior teeth (n = 10 each) and were divided randomly into two groups with two windows: Group 1 (window A: resin infiltration; window B: negative control) and Group 2 (window A: resin infiltration + fluoride varnish; window B: fluoride varnish). After pH cycling, micro-computed tomography was used to analyze progression of lesion depth and changes in mineral density. Resin infiltration and resin infiltration + fluoride varnish significantly inhibited progression of lesion depth in primary teeth (P < 0.05). Inhibition of lesion depth progression in permanent teeth was significantly greater after treatment with resin infiltration + fluoride varnish than in the negative control (P < 0.05). Change in mineral density was smaller in the resin infiltration and resin infiltration + fluoride varnish groups; however, the difference was not significant for either group (P > 0.05). Resin infiltration is a promising method of inhibiting progression of caries lesions.
NASA Astrophysics Data System (ADS)
MacDonald, Christopher L.; Bhattacharya, Nirupama; Sprouse, Brian P.; Silva, Gabriel A.
2015-09-01
Computing numerical solutions to fractional differential equations can be computationally intensive due to the effect of non-local derivatives in which all previous time points contribute to the current iteration. In general, numerical approaches that depend on truncating part of the system history, while efficient, can suffer from high degrees of error and inaccuracy. Here we present an adaptive time step memory method for smooth functions applied to the Grünwald-Letnikov fractional diffusion derivative. This method is computationally efficient and results in smaller errors during numerical simulations. Sampled points along the system's history at progressively longer intervals are assumed to reflect the values of neighboring time points. By including progressively fewer points backward in time, a temporally 'weighted' history is computed that includes contributions from the entire past of the system, maintaining accuracy, but with fewer points actually calculated, greatly improving computational efficiency.
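For context, the baseline the adaptive-memory method improves on is the plain full-history Grünwald-Letnikov sum, where every past point contributes at every step (O(n) work per step). A minimal sketch, using the standard recurrence for the GL binomial coefficients:

```python
def gl_fractional_derivative(f_hist, alpha, h):
    """Full-history Grünwald-Letnikov approximation of the order-alpha
    derivative at the latest time point. f_hist[k] = f(t_k) on an equally
    spaced grid with step h; every past point contributes, which is the
    per-step cost the adaptive-memory method reduces."""
    n = len(f_hist) - 1
    coeff, total = 1.0, f_hist[n]          # c_0 = 1
    for k in range(1, n + 1):
        coeff *= 1.0 - (alpha + 1.0) / k   # c_k = c_{k-1} * (1 - (alpha+1)/k)
        total += coeff * f_hist[n - k]
    return total / h ** alpha

# Sanity check: alpha = 1 reduces to an ordinary backward difference,
# so f(t) = t should give a derivative of 1.
h = 0.01
hist = [t * h for t in range(101)]
print(gl_fractional_derivative(hist, 1.0, h))
```

The adaptive method keeps this same sum but samples the distant past at progressively longer intervals, so the weighted history still spans the entire past with far fewer evaluated terms.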
Application of Interface Technology in Progressive Failure Analysis of Composite Panels
NASA Technical Reports Server (NTRS)
Sleight, D. W.; Lotts, C. G.
2002-01-01
A progressive failure analysis capability using interface technology is presented. The capability has been implemented in the COMET-AR finite element analysis code developed at the NASA Langley Research Center and is demonstrated on composite panels. The composite panels are analyzed for damage initiation and propagation from initial loading to final failure using a progressive failure analysis capability that includes both geometric and material nonlinearities. Progressive failure analyses are performed on conventional models and interface technology models of the composite panels. Analytical results and the computational effort of the analyses are compared for the conventional models and interface technology models. The analytical results predicted with the interface technology models are in good correlation with the analytical results using the conventional models, while significantly reducing the computational effort.
1984-12-01
…business contractor which is receiving 100% flexible progress payments as computed by the Progress Payment Model and approved by Headquarters. … The present … EPA clauses or indemnification. A request for increased progress payments was motivated by the new flexible progress payments model. Both requests … capital investment. … 40. The flexible progress payment model is: (a) too complex to administer; (b) too beneficial to the contractor; (c) …
Learning about Tasks Computers Can Perform. ERIC Digest.
ERIC Educational Resources Information Center
Brosnan, Patricia A.
Knowing what different kinds of computer equipment can do is the first step in choosing the computer that is right for you. This digest describes a developmental progression of computer capabilities. First, the three basic software programs (word processing, spreadsheets, and database programs) are discussed using examples. Next, an explanation of…
The Development of Sociocultural Competence with the Help of Computer Technology
ERIC Educational Resources Information Center
Rakhimova, Alina E.; Yashina, Marianna E.; Mukhamadiarova, Albina F.; Sharipova, Astrid V.
2017-01-01
The article describes the process of developing sociocultural knowledge and competences using computer technologies. On the whole, the development of modern computer technologies allows teachers to broaden trainees' sociocultural outlook and trace their progress online. Observation of modern computer technologies and estimation…
Research in progress at the Institute for Computer Applications in Science and Engineering
NASA Technical Reports Server (NTRS)
1987-01-01
This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period April 1, 1987 through October 1, 1987.
Correction of Patellofemoral Malalignment With Patellofemoral Arthroplasty.
Valoroso, Marco; Saffarini, Mo; La Barbera, Giuseppe; Toanen, Cécile; Hannink, Gerjon; Nover, Luca; Dejour, David H
2017-12-01
The goal of patellofemoral arthroplasty (PFA) is to replace damaged cartilage and correct underlying deformities to reduce pain and prevent maltracking. We aimed to determine how PFA modifies patellar height, tilt, and tibial tuberosity-trochlear groove (TT-TG) distance. The hypothesis was that PFA would correct trochlear dysplasia or extensor mechanism malalignment. The authors prospectively studied a series of 16 patients (13 women and 3 men) aged 64.9 ± 16.3 years (range 41-86 years) who received PFA. All knees were assessed preoperatively and 6 months postoperatively using frontal, lateral, and "skyline" x-rays, and computed tomography scans to calculate patellar tilt, patellar height, and TT-TG distance. The interobserver agreement was excellent for all parameters (intraclass correlation coefficient >0.95). Preoperatively, the median patellar tilt without quadriceps contraction (QC) was 17.5° (range 5.3°-33.4°) and with QC was 19.8° (range 0°-52.0°). The median Caton-Deschamps index was 0.91 (range 0.80-1.22) and TT-TG distance was 14.5 mm (range 4.0-22.0 mm). Postoperatively, the median patellar tilt without QC was 0.3° (range -15.3° to 9.5°) and with QC was 6.1° (range -11.5° to 13.3°). The median Caton-Deschamps index was 1.11 (range 0.81-1.20) and TT-TG distance was 10.1 mm (range 1.8-13.8 mm). The present study demonstrates that beyond replacing arthritic cartilage, trochlear-cutting PFA improves patellofemoral congruence by correcting trochlear dysplasia and standardizing radiological measurements such as patellar tilt and TT-TG distance. The addition of lateral patellar facetectomy improves patellar tracking by reducing patellar tilt. Copyright © 2017 Elsevier Inc. All rights reserved.
González-Wevar, C A; Saucède, T; Morley, S A; Chown, S L; Poulin, E
2013-10-01
Quaternary glaciations in Antarctica drastically modified geographical ranges and population sizes of marine benthic invertebrates and thus affected the amount and distribution of intraspecific genetic variation. Here, we present new genetic information in the Antarctic limpet Nacella concinna, a dominant Antarctic benthic species along shallow ice-free rocky ecosystems. We examined the patterns of genetic diversity and structure in this broadcast spawner along maritime Antarctica and from the peri-Antarctic island of South Georgia. Genetic analyses showed that N. concinna represents a single panmictic unit in maritime Antarctic. Low levels of genetic diversity characterized this population; its median-joining haplotype network revealed a typical star-like topology with a short genealogy and a dominant haplotype broadly distributed. As previously reported with nuclear markers, we detected significant genetic differentiation between South Georgia Island and maritime Antarctica populations. Higher levels of genetic diversity, a more expanded genealogy and the presence of more private haplotypes support the hypothesis of glacial persistence in this peri-Antarctic island. Bayesian Skyline plot and mismatch distribution analyses recognized an older demographic history in South Georgia. Approximate Bayesian computations did not support the persistence of N. concinna along maritime Antarctica during the last glacial period, but indicated the resilience of the species in peri-Antarctic refugia (South Georgia Island). We proposed a model of Quaternary Biogeography for Antarctic marine benthic invertebrates with shallow and narrow bathymetric ranges including (i) extinction of maritime Antarctic populations during glacial periods; (ii) persistence of populations in peri-Antarctic refugia; and (iii) recolonization of maritime Antarctica following the deglaciation process. © 2013 John Wiley & Sons Ltd.
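One of the demographic diagnostics mentioned above, the mismatch distribution, is simply the histogram of pairwise nucleotide differences among sampled sequences; a recent expansion from a star-like genealogy yields a unimodal distribution. A minimal sketch on a hypothetical toy alignment (not the N. concinna data):

```python
from itertools import combinations
from collections import Counter

def mismatch_distribution(seqs):
    """Histogram of pairwise differences among equal-length aligned
    sequences. A star-like genealogy after a recent expansion gives a
    unimodal distribution centred near the expansion age."""
    dist = Counter()
    for a, b in combinations(seqs, 2):
        dist[sum(x != y for x, y in zip(a, b))] += 1
    return dict(dist)

# Toy alignment: one dominant haplotype plus single-step derivatives,
# mimicking the star-like pattern described for maritime Antarctica.
seqs = ["AAAAA", "AAAAA", "AAAAA", "AAAAT", "AAAGA"]
print(mismatch_distribution(seqs))
```

Most pairwise comparisons fall at zero or one difference, the signature of a short genealogy dominated by a single widespread haplotype.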
Lanier, Hayley C; Gunderson, Aren M; Weksler, Marcelo; Fedorov, Vadim B; Olson, Link E
2015-01-01
Recent studies suggest that alpine and arctic organisms may have distinctly different phylogeographic histories from temperate or tropical taxa, with recent range contraction into interglacial refugia as opposed to post-glacial expansion out of refugia. We use a combination of phylogeographic inference, demographic reconstructions, and hierarchical Approximate Bayesian Computation to test for phylodemographic concordance among five species of alpine-adapted small mammals in eastern Beringia. These species (Collared Pikas, Hoary Marmots, Brown Lemmings, Arctic Ground Squirrels, and Singing Voles) vary in specificity to alpine and boreal-tundra habitat but share commonalities (e.g., cold tolerance and nunatak survival) that might result in concordant responses to Pleistocene glaciations. All five species contain a similar phylogeographic disjunction separating eastern and Beringian lineages, which we show to be the result of simultaneous divergence. Genetic diversity is similar within each haplogroup for each species, and there is no support for a post-Pleistocene population expansion in eastern lineages relative to those from Beringia. Bayesian skyline plots for four of the five species do not support Pleistocene population contraction. Brown Lemmings show evidence of late Quaternary demographic expansion without subsequent population decline. The Wrangell-St. Elias region of eastern Alaska appears to be an important zone of recent secondary contact for nearctic alpine mammals. Despite differences in natural history and ecology, similar phylogeographic histories are supported for all species, suggesting that these, and likely other, alpine- and arctic-adapted taxa are already experiencing population and/or range declines that are likely to synergistically accelerate in the face of rapid climate change. 
Climate change may therefore be acting as a double-edged sword that erodes genetic diversity within populations but promotes divergence and the generation of biodiversity.
Progress in Earth System Modeling since the ENIAC Calculation
NASA Astrophysics Data System (ADS)
Fung, I.
2009-05-01
The success of the first numerical weather prediction experiment on the ENIAC computer in 1950 was hinged on the expansion of the meteorological observing network, which led to theoretical advances in atmospheric dynamics and subsequently the implementation of the simplified equations on the computer. This paper briefly reviews the progress in Earth System Modeling and climate observations, and suggests a strategy to sustain and expand the observations needed to advance climate science and prediction.
NASA Astrophysics Data System (ADS)
Fiorini, Rodolfo A.; Dacquino, Gianfranco
2005-03-01
GEOGINE (GEOmetrical enGINE), a state-of-the-art OMG (Ontological Model Generator) based on n-D Tensor Invariants for n-Dimensional shape/texture optimal synthetic representation, description and learning, was presented at previous conferences. Improved computational algorithms based on the computational invariant theory of finite groups in Euclidean space and a demo application are presented. Progressive automatic model generation is discussed. GEOGINE can be used as an efficient computational kernel for fast, reliable application development and delivery, mainly in the advanced biomedical engineering, biometric, intelligent computing, target recognition, content image retrieval, and data mining technological areas. Ontology can be regarded as a logical theory accounting for the intended meaning of a formal dictionary, i.e., its ontological commitment to a particular conceptualization of the world object. According to this approach, "n-D Tensor Calculus" can be considered a "Formal Language" to reliably compute optimized "n-Dimensional Tensor Invariants" as specific object "invariant parameter and attribute words" for automated n-Dimensional shape/texture optimal synthetic object description by incremental model generation. The class of those "invariant parameter and attribute words" can be thought of as a specific "Formal Vocabulary" learned from a "Generalized Formal Dictionary" of the "Computational Tensor Invariants" language. Even object chromatic attributes can be effectively and reliably computed from object geometric parameters into robust colour shape invariant characteristics. As a matter of fact, any highly sophisticated application needing effective, robust object geometric/colour invariant attribute capture and parameterization features for reliable automated object learning and discrimination can benefit greatly from the performance of the GEOGINE progressive automated model generation computational kernel.
Main operational advantages over previous, similar approaches are: 1) Progressive Automated Invariant Model Generation, 2) Invariant Minimal Complete Description Set for computational efficiency, 3) Arbitrary Model Precision for robust object description and identification.
Advanced Computation in Plasma Physics
NASA Astrophysics Data System (ADS)
Tang, William
2001-10-01
Scientific simulation in tandem with theory and experiment is an essential tool for understanding complex plasma behavior. This talk will review recent progress and future directions for advanced simulations in magnetically-confined plasmas with illustrative examples chosen from areas such as microturbulence, magnetohydrodynamics, magnetic reconnection, and others. Significant recent progress has been made in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics, giving increasingly good agreement between experimental observations and computational modeling. This was made possible by innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales together with access to powerful new computational resources. In particular, the fusion energy science community has made excellent progress in developing advanced codes for which computer run-time and problem size scale well with the number of processors on massively parallel machines (MPPs). A good example is the effective usage of the full power of multi-teraflop MPPs to produce 3-dimensional, general geometry, nonlinear particle simulations which have accelerated progress in understanding the nature of turbulence self-regulation by zonal flows. It should be emphasized that these calculations, which typically utilized billions of particles for tens of thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In general, results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments.
The associated scientific excitement should serve to stimulate improved cross-cutting collaborations with other fields and also to help attract bright young talent to plasma science.
Influence of indoor and outdoor activities on progression of myopia during puberty.
Öner, Veysi; Bulut, Asker; Oruç, Yavuz; Özgür, Gökhan
2016-02-01
The purpose of this study was to investigate whether time spent on indoor and outdoor activities or other possible risk factors including age, gender, parental history, and initial refraction were associated with progression of myopia during puberty. Fifty eyes of 50 myopic children aged 9-14 years were enrolled in the study. The parents were interviewed to determine the amounts of time in hours per day spent on reading and writing, using a computer, watching TV, and outdoor activities (i.e., sports, games, or being outdoors with no activities) on an average day. The annual myopia progression rate (diopters per year) was calculated for each subject and was used in the statistical analyses. The mean initial age of the subjects was 10.9 ± 1.5 (ranging from 9 to 14) years. The mean follow-up period was 33.3 ± 10.3 (ranging from 17 to 55) months. There was a significant increase in the mean myopia value of the subjects after the follow-up period (p < 0.001). The mean daily time spent on reading and writing and the initial refraction value were independently associated with the annual myopic progression rate. On the other hand, age, gender, parental myopia, and the mean daily times spent on computer use, watching TV, and outdoor activities had no correlation with the annual myopia progression rate. The present study showed that myopia progression during puberty was associated with time spent on reading and writing and with initial refraction value. However, myopia progression was not associated with parental myopia, age, gender, or daily time spent on computer use, watching TV, or outdoor activities.
Global Static Indexing for Real-Time Exploration of Very Large Regular Grids
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pascucci, V; Frank, R
2001-07-23
In this paper we introduce a new indexing scheme for progressive traversal and visualization of large regular grids. We demonstrate the potential of our approach by providing a tool that displays at interactive rates planar slices of scalar field data with very modest computing resources. We obtain unprecedented results both in terms of absolute performance and, more importantly, in terms of scalability. On a laptop computer we provide real-time interaction with a 2048³ grid (8 giga-nodes) using only 20 MB of memory. On an SGI Onyx we slice interactively an 8192³ grid (1/2 tera-nodes) using only 60 MB of memory. The scheme relies simply on the determination of an appropriate reordering of the rectilinear grid data and a progressive construction of the output slice. The reordering minimizes the amount of I/O performed during the out-of-core computation. The progressive and asynchronous computation of the output provides flexible quality/speed tradeoffs and a time-critical and interruptible user interface.
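The reordering at the heart of such schemes belongs to the family of space-filling-curve indices. As a sketch of the idea (a plain Morton/Z-order code; the paper's index is a hierarchical variant, which this does not reproduce), interleaving the bits of the three grid coordinates makes spatially nearby nodes land near each other on disk, reducing out-of-core I/O:

```python
def z_order_index(i, j, k, bits=10):
    """Interleave the bits of three grid coordinates (Morton/Z-order).

    Nearby grid nodes map to nearby index values, improving locality
    of out-of-core access when the grid is stored in this order.
    """
    idx = 0
    for b in range(bits):
        idx |= ((i >> b) & 1) << (3 * b)        # bit b of i -> position 3b
        idx |= ((j >> b) & 1) << (3 * b + 1)    # bit b of j -> position 3b+1
        idx |= ((k >> b) & 1) << (3 * b + 2)    # bit b of k -> position 3b+2
    return idx
```

Storing a 2048³ grid in this order lets a slicer stream only the blocks intersecting the slice plane instead of whole axis-aligned planes.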
ERIC Educational Resources Information Center
Fahy, Patrick J.
Computer-assisted learning (CAL) can be used for adults functioning at any academic or grade level. In adult basic education (ABE), CAL can promote greater learning effectiveness and faster progress, concurrent learning and experience with computer literacy skills, privacy, and motivation. Adults who face barriers (financial, geographic, personal,…
Research on the Intensity Analysis and Result Visualization of Construction Land in Urban Planning
NASA Astrophysics Data System (ADS)
Cui, J.; Dong, B.; Li, J.; Li, L.
2017-09-01
As a fundamental task in urban planning, the intensity analysis of construction land involves much repetitive data processing that is prone to errors and loss of data precision, and current practice lacks efficient methods and tools for visualizing the analysis results. In this research a portable tool is developed using the Model Builder technique embedded in ArcGIS to provide automatic data processing and rapid result visualization for this work. A series of basic modules provided by ArcGIS are linked together to form a complete data processing chain in the tool. Once the required data are imported, the analysis results and related maps and graphs, including the intensity values, zoning map, and skyline analysis map, are produced automatically. Finally, the tool is installation-free and can be distributed quickly between planning teams.
Modelling UV irradiances on arbitrarily oriented surfaces: effects of sky obstructions
NASA Astrophysics Data System (ADS)
Hess, M.; Koepke, P.
2008-07-01
A method is presented to calculate UV irradiances on inclined surfaces that additionally takes into account the influence of sky obstructions caused by obstacles such as mountains, houses, trees, or umbrellas. With this method it is thus possible to calculate the impact of UV radiation on biological systems, such as the human skin or eye, in any natural or artificial environment. The method, which consists of a combination of radiation models, is explained here and the accuracy of its results is demonstrated. The effect of a natural skyline is shown for an Alpine ski area, where the UV irradiance even on a horizontal surface may increase by more than 10% due to reflection from snow. In contrast, in a street canyon the irradiance on a horizontal surface is reduced to 30% in shadow and to about 75% for a position in the sun.
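The qualitative behavior described above (diffuse irradiance scaled down by the unobstructed fraction of the sky, the direct beam kept or cut depending on whether the sun clears the skyline, and a reflected term that grows with ground albedo) can be caricatured in a few lines. This is a deliberately crude sketch, not the paper's radiation model; all names and the albedo treatment are assumptions:

```python
def obstructed_uv_irradiance(direct, diffuse, sky_view_factor,
                             sun_visible, albedo=0.0):
    """Crude estimate of UV irradiance on a horizontal surface with
    sky obstructions (simplified illustration, not the paper's model).

    direct, diffuse: unobstructed direct and diffuse irradiance (W/m^2)
    sky_view_factor: fraction of the sky hemisphere left unobstructed
    sun_visible: whether the solar disk is above the local skyline
    albedo: reflectance of ground/obstacles (e.g. high for snow)
    """
    e = diffuse * sky_view_factor
    if sun_visible:
        e += direct
    # reflection contributed by the blocked fraction of the view
    e += (direct + diffuse) * (1.0 - sky_view_factor) * albedo
    return e
```

With a high snow albedo the reflected term can push the total above the unobstructed value, matching the >10% increase reported for the ski area.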
Pluto Majestic Mountains, Frozen Plains and Foggy Hazes
2015-09-17
Just 15 minutes after its closest approach to Pluto on July 14, 2015, NASA's New Horizons spacecraft looked back toward the sun and captured this near-sunset view of the rugged, icy mountains and flat ice plains extending to Pluto's horizon. The smooth expanse of the informally named icy plain Sputnik Planum (right) is flanked to the west (left) by rugged mountains up to 11,000 feet (3,500 meters) high, including the informally named Norgay Montes in the foreground and Hillary Montes on the skyline. To the right, east of Sputnik, rougher terrain is cut by apparent glaciers. The backlighting highlights more than a dozen layers of haze in Pluto's tenuous but distended atmosphere. The image was taken from a distance of 11,000 miles (18,000 kilometers) to Pluto; the scene is 780 miles (1,250 kilometers) wide. http://photojournal.jpl.nasa.gov/catalog/PIA19948
High Rise Building: The Mega Sculpture Made Of Steel, Concrete and Glass
NASA Astrophysics Data System (ADS)
Stefańska, Alicja; Załuski, Daniel
2017-10-01
The high-rise building has transformed from merely providing additional floor space into functioning as a mega sculpture in the city. This shift away from purely economic efficiency is expected to grow in the future. Based on literature studies, analysis of planning documents, and case studies, it was examined whether gaining the maximum amount of usable area is the only driving factor, or whether the need to create an image for the city provides a supplementary reason. The results showed that forming high-rise buildings as three-dimensional sculptures is influenced not only by aesthetics but also by marketing. Visual distinction in the city skyline is economically beneficial for investors, yielding not only functionality but art that enriches the cultural landscape. Organizing architectural competitions and public debates and following the latest art trends is therefore possible thanks to the large budgets of such projects.
Eddhif, Balkis; Guignard, Nadia; Batonneau, Yann; Clarhaut, Jonathan; Papot, Sébastien; Geffroy-Rodier, Claude; Poinot, Pauline
2018-04-01
The data presented here are related to the research paper entitled "Study of a Novel Agent for TCA Precipitated Proteins Washing - Comprehensive Insights into the Role of Ethanol/HCl on Molten Globule State by Multi-Spectroscopic Analyses" (Eddhif et al., submitted for publication) [1]. The suitability of ethanol/HCl for washing TCA-precipitated proteins was first investigated on standard solutions of HSA, cellulase, ribonuclease, and lysozyme. Recoveries were assessed by one-dimensional gel electrophoresis, Bradford assays, and UPLC-HRMS. The mechanism that triggers protein conformational changes at each purification stage was then investigated by Raman spectroscopy and spectrofluorometry. Finally, the efficiency of the method was evaluated on three different complex samples (mouse liver, river biofilm, loamy soil surface). Protein profiling was assessed by gel electrophoresis and by UPLC-HRMS.
Yang, J; Chen, C S; Chen, S H; Ding, P; Fan, Z Y; Lu, Y W; Yu, L P; Lin, H D
2016-06-10
Amji's salamander (Hynobius amjiensis) is a critically endangered species (IUCN Red List) endemic to mainland China. In the present study, five haplotypes were genotyped for the mtDNA cyt b gene in 45 specimens from three populations. Relatively low levels of haplotype diversity (h = 0.524) and nucleotide diversity (π = 0.00532) were detected. Analyses of the phylogenetic structure of H. amjiensis showed no evidence of major geographic partitions or substantial barriers to historical gene flow throughout the species' range. Two major phylogenetic haplotype groups were revealed, estimated to have diverged about 1.262 million years ago. Mismatch distribution analysis, neutrality tests, and Bayesian skyline plots revealed no evidence of dramatic changes in the effective population size. According to the SAMOVA and STRUCTURE analyses, H. amjiensis should be regarded as two different management units.
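The haplotype diversity h reported above is conventionally Nei's gene diversity, computed from haplotype frequencies with a small-sample correction. A minimal sketch (the haplotype counts below are illustrative, not the study's data):

```python
def haplotype_diversity(counts):
    """Nei's haplotype (gene) diversity:
        h = n/(n-1) * (1 - sum(p_i^2))
    where p_i are the haplotype frequencies among n sampled sequences.
    """
    n = sum(counts)
    sum_p2 = sum((c / n) ** 2 for c in counts)
    return n / (n - 1) * (1.0 - sum_p2)

# Example: 4 sequences split evenly between 2 haplotypes
h = haplotype_diversity([2, 2])  # 4/3 * (1 - 0.5) = 0.667
```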
Web-Based Mathematics Progress Monitoring in Second Grade
ERIC Educational Resources Information Center
Salaschek, Martin; Souvignier, Elmar
2014-01-01
We examined a web-based mathematics progress monitoring tool for second graders. The tool monitors the learning progress of two competences, number sense and computation. A total of 414 students from 19 classrooms in Germany were checked every 3 weeks from fall to spring. Correlational analyses indicate that alternate-form reliability was adequate…
ERIC Educational Resources Information Center
Gorski, Paul C.
2009-01-01
In the United States, where technological progress is portrayed as humanistic progress, computer technologies often are hailed as the great equalizers. Even within progressive education movements, such as multicultural education, the conversation about instructional technology tends to center more on this or that wonderful Web site or piece of…
77 FR 65417 - Proposal Review Panel for Computing Communication Foundations; Notice of Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-26
...: To assess the progress of the EIC Award, ``Collaborative Research: Computational Behavioral Science... NATIONAL SCIENCE FOUNDATION Proposal Review Panel for Computing Communication Foundations; Notice... National Science Foundation announces the following meeting: Name: Site Visit, Proposal Panel Review for...
ERM TLB Teaching-Learning Behavior News
ERIC Educational Resources Information Center
LeBold, William K., Ed.
1978-01-01
Describes a graduate electrical engineering mini-course, computer graphics gaming and simulation, classroom management and student progress records, student reaction to instruction, and computer graphics in undergraduate education. (SL)
Using Data Warehouses to extract knowledge from Agro-Hydrological simulations
NASA Astrophysics Data System (ADS)
Bouadi, Tassadit; Gascuel-Odoux, Chantal; Cordier, Marie-Odile; Quiniou, René; Moreau, Pierre
2013-04-01
In recent years, simulation models have been used more and more in hydrology to test the effect of scenarios and help stakeholders in decision making. Agro-hydrological models have guided agricultural water management by testing the effect of landscape structure and farming system changes on water and chemical emissions into rivers. Such models generate a large amount of data, while only a few outputs, such as daily concentrations at the outlet of the catchment or annual budgets regarding soil, water, and atmosphere emissions, are stored and analyzed. Thus, a great amount of information is lost from the simulation process. This is due to the large volumes of simulated data, but also to the difficulties in analyzing the data and transforming them into usable information. In this talk we illustrate a data warehouse built to store and manage simulation data coming from the agro-hydrological model TNT (Topography-based Nitrogen Transfer and Transformations (Beaujouan et al., 2002)). This model simulates the transfer and transformation of nitrogen in agricultural catchments. TNT was used over 10 years on the Yar catchment (western France), a 50 km2 area with a detailed data set that faces an environmental issue (coastal eutrophication). Forty-four key simulated output variables are stored at a daily time step, i.e., 8 GB of storage, which allows users to explore N emissions in space and time; to quantify all the processes of transfer and transformation with respect to the cropping systems, their location within the catchment, and the emissions to water and atmosphere; and finally to gain new knowledge and support specific, detailed decisions in space and time. We present the dimensional modeling process of the Nitrogen in Catchment data warehouse (i.e., the snowflake model).
After identifying the set of multileveled dimensions with complex hierarchical structures and relationships among related dimension levels, we chose the snowflake model to design our agri-environmental data warehouse. The snowflake schema is required for flexibly querying complex dimension relationships. We designed the Nitrogen in Catchment data warehouse using the open-source business intelligence platform Pentaho version 3.5. We use online analytical processing (OLAP) to access and exploit, intuitively and quickly, the multidimensional and aggregated data from the warehouse. We illustrate how the data warehouse can be efficiently used to explore spatio-temporal dimensions, discover new knowledge, and enrich the exploitation of simulations. We show how the OLAP tool can provide the user with the ability to synthesize environmental information and understand nitrate emissions in surface water by using comparative, personalized views on historical data. To perform advanced analyses that aim to find meaningful patterns and relationships in the data, the Nitrogen in Catchment data warehouse should be extended with data mining or information retrieval methods such as skyline queries (Bouadi et al., 2012). (Beaujouan et al., 2002) Beaujouan, V., Durand, P., Ruiz, L., Aurousseau, P., and Cotteret, G. (2002). A hydrological model dedicated to topography-based simulation of nitrogen transfer and transformation: rationale and application to the geomorphology-denitrification relationship. Hydrological Processes, pages 493-507. (Bouadi et al., 2012) Bouadi, T., Cordier, M., and Quiniou, R. (2012). Incremental computation of skyline queries with dynamic preferences. In DEXA (1), pages 219-233.
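A skyline query of the kind cited above returns the Pareto-optimal entries of a dataset: those not dominated in every dimension by any other entry. A minimal block-nested-loops sketch (not the incremental algorithm of Bouadi et al., which maintains the skyline under dynamic preferences):

```python
def dominates(a, b):
    """a dominates b if a is <= b in every dimension and strictly < in
    at least one (assuming smaller values are preferred)."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def skyline(points):
    """Block-nested-loops skyline: keep the points that no other
    point dominates."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q is not p)]

# Example: minimize both coordinates
pts = [(1, 3), (2, 2), (3, 1), (2, 3), (3, 3)]
front = skyline(pts)  # (2, 3) and (3, 3) are dominated
```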
Ohkubo, Hirotsugu; Nakagawa, Hiroaki; Niimi, Akio
2018-01-01
Idiopathic pulmonary fibrosis (IPF) is the most common type of progressive idiopathic interstitial pneumonia in adults. Computer-based image analysis methods for chest computed tomography (CT) in patients with IPF include the mean CT value of the whole lungs, density histogram analysis, the density mask technique, and texture classification methods. Most of these methods offer good assessment of pulmonary function, disease progression, and mortality. Each method has merits that can be used in clinical practice. One of the texture classification methods is reported to be superior to visual CT scoring by radiologists for correlation with pulmonary function and prediction of mortality. In this mini review, we summarize the current literature on computer-based CT image analysis of IPF and discuss its limitations and several future directions. Copyright © 2017 The Japanese Respiratory Society. Published by Elsevier B.V. All rights reserved.
Probabilistic Assessment of Fracture Progression in Composite Structures
NASA Technical Reports Server (NTRS)
Chamis, Christos C.; Minnetyan, Levon; Mauget, Bertrand; Huang, Dade; Addi, Frank
1999-01-01
This report describes methods and corresponding computer codes that are used to evaluate progressive damage and fracture and to perform probabilistic assessment in built-up composite structures. Structural response is assessed probabilistically during progressive fracture. The effects of design variable uncertainties on structural fracture progression are quantified. The fast probability integrator (FPI) is used to assess the response scatter in the composite structure at damage initiation. The sensitivity of the damage response to design variables is computed. The methods are general purpose and are applicable to stitched and unstitched composites in all types of structures and fracture processes, from damage initiation to unstable propagation and global structural collapse. The methods are demonstrated for a polymer matrix composite stiffened panel subjected to pressure. The results indicated that composite constituent properties, fabrication parameters, and their respective uncertainties have a significant effect on structural durability and reliability. Design implications with regard to damage progression, damage tolerance, and reliability of composite structures are examined.
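The quantity such a probabilistic assessment estimates is the probability that a limit state (e.g. strength minus load) goes negative under uncertain design variables. The FPI computes this efficiently; the brute-force Monte Carlo stand-in below illustrates the same quantity (all names and distributions are illustrative assumptions, not from the report):

```python
import random

def failure_probability(limit_state, samplers, n=100_000, seed=0):
    """Monte Carlo estimate of P(limit_state(x) < 0).

    samplers: one callable per design variable; each draws a sample
              from that variable's distribution given a shared rng.
    limit_state: g(x) evaluated on a list of sampled values;
                 failure is defined as g < 0.
    """
    rng = random.Random(seed)
    failures = sum(
        1 for _ in range(n)
        if limit_state([s(rng) for s in samplers]) < 0
    )
    return failures / n

# Example: panel strength ~ N(12, 1) against a fixed load of 10
p_fail = failure_probability(lambda x: x[0] - 10.0,
                             [lambda rng: rng.gauss(12.0, 1.0)])
```

Fast probability integration replaces this sampling loop with an approximation around the most probable failure point, which is what makes sensitivity computation cheap.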
Biomedical Computing Technology Information Center: introduction and report of early progress
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maskewitz, B.F.; Henne, R.L.; McClain, W.J.
1976-01-01
In July 1975, the Biomedical Computing Technology Information Center (BCTIC) was established by the Division of Biomedical and Environmental Research of the U.S. Energy Research and Development Administration (ERDA) at the Oak Ridge National Laboratory. BCTIC collects, organizes, evaluates, and disseminates information on computing technology pertinent to biomedicine, providing needed routes of communication between installations and serving as a clearinghouse for the exchange of biomedical computing software, data, and interface designs. This paper presents BCTIC's functions and early progress to the MUMPS Users' Group in order to stimulate further discussion and cooperation between the two organizations. (BCTIC services are available to its sponsors and their contractors and to any individual/group willing to participate in mutual exchange.) 1 figure.
Asia Federation Report on International Symposium on Grid Computing 2009 (ISGC 2009)
NASA Astrophysics Data System (ADS)
Grey, Francois
This report provides an overview of developments in the Asia-Pacific region, based on presentations made at the International Symposium on Grid Computing 2009 (ISGC 09), held 21-23 April. This document contains 14 sections, including a progress report on general Asia-EU Grid activities as well as progress reports by representatives of 13 Asian countries presented at ISGC 09. In alphabetical order, these are: Australia, China, India, Indonesia, Japan, Malaysia, Pakistan, Philippines, Singapore, South Korea, Taiwan, Thailand and Vietnam.
ERIC Educational Resources Information Center
West, Patti; Rutstein, Daisy Wise; Mislevy, Robert J.; Liu, Junhui; Choi, Younyoung; Levy, Roy; Crawford, Aaron; DiCerbo, Kristen E.; Chappel, Kristina; Behrens, John T.
2010-01-01
A major issue in the study of learning progressions (LPs) is linking student performance on assessment tasks to the progressions. This report describes the challenges faced in making this linkage using Bayesian networks to model LPs in the field of computer networking. The ideas are illustrated with exemplar Bayesian networks built on Cisco…
Technology and Mathematics Education: A Survey of Recent Developments and Important Problems.
ERIC Educational Resources Information Center
Fey, James T.
1989-01-01
Provided is an overview and analysis of recent progress in applying electronic information technology to creation of new environments for intellectual work in mathematics. Describes the impact of numerical computation, graphic computation, symbolic computation, multiple representation of information, programing and information, and artificial…
3-D Perspective View, Kamchatka Peninsula, Russia
NASA Technical Reports Server (NTRS)
2000-01-01
This perspective view shows the western side of the volcanically active Kamchatka Peninsula in eastern Russia. The image was generated using the first data collected during the Shuttle Radar Topography Mission (SRTM). In the foreground is the Sea of Okhotsk. Inland from the coast, vegetated floodplains and low relief hills rise toward snow capped peaks. The topographic effects on snow and vegetation distribution are very clear in this near-horizontal view. Forming the skyline is the Sredinnyy Khrebet, the volcanic mountain range that makes up the spine of the peninsula. High resolution SRTM topographic data will be used by geologists to study how volcanoes form and to understand the hazards posed by future eruptions.
This image was generated using topographic data from SRTM and an enhanced true-color image from the Landsat 7 satellite. This image contains about 2,400 meters (7,880 feet) of total relief. The topographic expression was enhanced by adding artificial shading as calculated from the SRTM elevation model. The Landsat data was provided by the United States Geological Survey's Earth Resources Observation Systems (EROS) Data Center, Sioux Falls, South Dakota. SRTM, launched on February 11, 2000, used the same radar instrument that comprised the Spaceborne Imaging Radar-C/X-Band Synthetic Aperture Radar (SIR-C/X-SAR) that flew twice on the Space Shuttle Endeavour in 1994. To collect the 3-D SRTM data, engineers added a 60-meter-long (200-foot) mast, installed additional C-band and X-band antennas, and improved tracking and navigation devices. SRTM collected three-dimensional measurements of nearly 80 percent of the Earth's surface. SRTM is a cooperative project between NASA, the National Imagery and Mapping Agency (NIMA) of the U.S. Department of Defense, and the German and Italian space agencies. It is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Earth Science Enterprise, Washington, D.C.
Size: 33.3 km (20.6 miles) wide x 136 km (84 miles) coast to skyline
Location: 58.3 deg. North lat., 160 deg. East long.
Orientation: Easterly view, 2 degrees down from horizontal
Original Data Resolution: 30 meters (99 feet)
Vertical Exaggeration: 3 times
Date Acquired: February 12, 2000 (SRTM); August 1, 1999 (Landsat)
Image: NASA/JPL/NIMA
Graeco-Roman Astro-Architecture: The Temples of Pompeii
NASA Astrophysics Data System (ADS)
Tiede, Vance R.
2014-01-01
Roman architect Marcus Vitruvius Pollio (ca. 75-15 BC) wrote, “[O]ne who professes himself as an architect should be…acquainted with astronomy and the theory of the heavens…. From astronomy we find the east, west, south, and north, as well as the theory of the heavens, the Equinox, Solstice and courses of the Stars.” (De Architectura Libri Decem I:i:3,10). In order to investigate the role of astronomy in temple orientation, the author conducted a preliminary GIS DEM/satellite imaging survey of 11 temples at Pompeii, Italy (N 40d 45', E 14d 29'). The GIS survey measured the true azimuth and horizon altitude of each temple's major axis and was field-checked by a ground-truth survey with theodolite and GPS, 5-18 April 2013. The resulting 3D vector data was analyzed with Program STONEHENGE (Hawkins 1983, 328) to identify the local skyline declinations aligned with the temple major axes. Analysis suggests that the major axes of the temples of Apollo, Jupiter and Venus are equally as likely to have been oriented to Pompeii's urban grid, itself oriented NW-SE on Mt. Vesuvius' slope and hydraulic gradient to optimize urban sewer/street drainage (cf. Hodge 1992). However, the remaining nine temples appear to be oriented to astronomical targets on the local horizon associated with Graeco-Roman calendrics and mythology.
TEMPLE / DATE / MAJOR AXIS ASTRO-TARGET (skyline declination in degrees):
Public Lares / AD 50 / Cross-Quarter 7 Nov/3 Feb Sun Set, Last Gleam (-16.5)
Vespasian / AD 69-79 / Cross-Quarter 7 Nov/3 Feb Sun Set, LG (-16.2)
Fortuna Augusta / AD 1 / Winter Solstice Sun Set, LG (-22.9)
Aesculapius / 100 BC / Perseus Rise (β Persei-Algol = +33.0) & Midsummer Moon Major Standstill Set, LG (-28.1)
Isis / 100 BC / Midwinter Moon Major Standstill Rise, Tangent (+28.5) & Equinox Sun Set, Tangent (-0.3)
Jupiter / 150 BC / θ Scorpionis-Sargas Rise (-38.0)
Apollo / 550 BC (rebuilt 70 BC) / α Columbae-Phact Rise (-37.1)
Venus / 150 BC (rebuilt 70 BC) / α Columbae-Phact Rise (-37.7)
Ceres / 250 BC / Midsummer Moon Major Standstill Set, LG (-27.9)
Dionysus / 250 BC / Equinox Sun Set, LG (+0.3)
Doric / 550 BC / β Orionis-Rigel Rise (-14.6)
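The skyline declinations listed above follow from the standard transformation between horizon and equatorial coordinates: given a temple axis azimuth, the local horizon altitude along that azimuth, and the site latitude, the declination of the sky point that rises or sets there is sin δ = sin φ sin a + cos φ cos a cos A. A minimal sketch:

```python
import math

def skyline_declination(azimuth_deg, altitude_deg, latitude_deg):
    """Declination (degrees) of the celestial point that crosses the
    local skyline at a given azimuth (degrees east of north) and
    horizon altitude, as seen from a given latitude:

        sin(dec) = sin(lat)*sin(alt) + cos(lat)*cos(alt)*cos(az)
    """
    az, alt, lat = map(math.radians,
                       (azimuth_deg, altitude_deg, latitude_deg))
    sin_dec = (math.sin(lat) * math.sin(alt)
               + math.cos(lat) * math.cos(alt) * math.cos(az))
    return math.degrees(math.asin(sin_dec))

# Due east on a flat horizon lies on the celestial equator (dec ~ 0),
# evaluated here for Pompeii's latitude (N 40d 45')
dec = skyline_declination(90.0, 0.0, 40.75)
```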
Computational approach for deriving cancer progression roadmaps from static sample data
Yao, Jin; Yang, Le; Chen, Runpu; Nowak, Norma J.
2017-01-01
As with any biological process, cancer development is inherently dynamic. While major efforts continue to catalog the genomic events associated with human cancer, it remains difficult to interpret and extrapolate the accumulating data to provide insights into the dynamic aspects of the disease. Here, we present a computational strategy that enables the construction of a cancer progression model using static tumor sample data. The developed approach overcame many technical limitations of existing methods. Application of the approach to breast cancer data revealed a linear, branching model with two distinct trajectories for malignant progression. The validity of the constructed model was demonstrated in 27 independent breast cancer data sets, and through visualization of the data in the context of disease progression we were able to identify a number of potentially key molecular events in the advance of breast cancer to malignancy. PMID:28108658
Computational Simulation of Composite Structural Fatigue
NASA Technical Reports Server (NTRS)
Minnetyan, Levon; Chamis, Christos C. (Technical Monitor)
2005-01-01
Progressive damage and fracture of composite structures subjected to monotonically increasing static, tension-tension cyclic, pressurization, and flexural cyclic loading are evaluated via computational simulation. Constituent material properties, stress and strain limits are scaled up to the structure level to evaluate the overall damage and fracture propagation for composites. Damage initiation, growth, accumulation, and propagation to fracture due to monotonically increasing static and cyclic loads are included in the simulations. Results show the number of cycles to failure at different temperatures and the damage progression sequence during different degradation stages. A procedure is outlined for use of computational simulation data in the assessment of damage tolerance, determination of sensitive parameters affecting fracture, and interpretation of results with insight for design decisions.
Computational Simulation of Composite Structural Fatigue
NASA Technical Reports Server (NTRS)
Minnetyan, Levon
2004-01-01
Progressive damage and fracture of composite structures subjected to monotonically increasing static, tension-tension cyclic, pressurization, and flexural cyclic loading are evaluated via computational simulation. Constituent material properties, stress and strain limits are scaled up to the structure level to evaluate the overall damage and fracture propagation for composites. Damage initiation, growth, accumulation, and propagation to fracture due to monotonically increasing static and cyclic loads are included in the simulations. Results show the number of cycles to failure at different temperatures and the damage progression sequence during different degradation stages. A procedure is outlined for use of computational simulation data in the assessment of damage tolerance, determination of sensitive parameters affecting fracture, and interpretation of results with insight for design decisions.
InPRO: Automated Indoor Construction Progress Monitoring Using Unmanned Aerial Vehicles
NASA Astrophysics Data System (ADS)
Hamledari, Hesam
In this research, an envisioned intelligent robotic solution, entitled "InPRO", that employs a series of unmanned aerial vehicles (UAVs) for automated indoor data collection and inspection is presented. InPRO consists of four stages: 1) automated path planning; 2) autonomous UAV-based indoor inspection; 3) automated computer vision-based assessment of progress; and 4) automated updating of 4D building information models (BIM). The work presented in this thesis addresses the third stage of InPRO. A series of computer vision-based methods that automate the assessment of construction progress using images captured at indoor sites are introduced. The proposed methods employ computer vision and machine learning techniques to detect the components of under-construction indoor partitions. In particular, framing (studs), insulation, electrical outlets, and different states of drywall sheets (installing, plastering, and painting) are automatically detected in digital images. High accuracy rates, real-time performance, and operation without a priori information are indicators of the methods' promising performance.
Efficient Predictions of Excited State for Nanomaterials Using Aces 3 and 4
2017-12-20
Excited states of nanomaterials are predicted by first-principles methods in the software package ACES using large parallel computers, scaling toward the exascale. Subject terms: computer modeling, excited states, optical properties, structure, stability, activation barriers, first-principles methods, parallel computing. Progress with new density functional methods is also reported.
Gendered Patterns in Computing Work in the Late 1990s.
ERIC Educational Resources Information Center
Panteli, Niki; Stack, Janet; Ramsay, Harvie
2001-01-01
Data on information technology employment in Britain and interviews in four companies depicted experiences of women in computing. Gender disparities in numbers and distribution, salaries, division of labor, and career progression were found. Masculine values in computing culture, gender differences in working style, and attitudes toward computers…
Summary of research in progress at ICASE
NASA Technical Reports Server (NTRS)
1993-01-01
This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period October 1, 1992 through March 31, 1993.
For operation of the Computer Software Management and Information Center (COSMIC)
NASA Technical Reports Server (NTRS)
Carmon, J. L.
1983-01-01
Progress report on current status of computer software management and information center (COSMIC) includes the following areas: inventory, evaluation and publication, marketing, customer service, maintenance and support, and budget summary.
On-board computer progress in development of A 310 flight testing program
NASA Technical Reports Server (NTRS)
Reau, P.
1981-01-01
Onboard computer progress in the development of an Airbus A 310 flight testing program is described. Minicomputers were installed onboard three A 310 airplanes in 1979 in order to: (1) assure flight safety by exercising a limit check of a given set of parameters; (2) improve the efficiency of flight tests and allow cost reduction; and (3) perform test analysis on an external basis by utilizing onboard flight tapes. The following program considerations are discussed: (1) conclusions based on simulation of an onboard computer system; (2) brief descriptions of A 310 airborne computer equipment, specifically the onboard universal calculator (CUB) consisting of a ROLM 1666 system and a visualization system using an AFIGRAF CRT; (3) the ground system and flight information inputs; and (4) specifications and execution priorities for temporary and permanent programs.
Lee, Jeong Woo; Kim, Ho Gak; Lee, Dong Wook; Han, Jimin; Kwon, Hyuk Yong; Seo, Chang Jin; Oh, Ji Hye; Lee, Joo Hyoung; Jung, Jin Tae; Kwon, Joong Goo; Kim, Eun Young
2016-05-23
Smoking and alcohol intake are two well-known risk factors for chronic pancreatitis. However, there are few studies examining the association between smoking and changes in computed tomography (CT) findings in chronic pancreatitis. The authors evaluated associations between smoking, drinking, and the progression of calcification on CT in chronic pancreatitis. In this retrospective study, 59 patients with chronic pancreatitis who had undergone initial and follow-up CT between January 2002 and September 2010 were included. Progression of calcification among CT findings was compared according to the amount of alcohol intake and smoking. The median duration of follow-up was 51.6 months (range, 17.1 to 112.7 months). On initial CT, pancreatic calcification was present in 35 patients (59.3%). On follow-up CT, progression of calcification was observed in 37 patients (62.7%). Progression of calcification was more common in smokers according to the multivariate analysis (odds ratio [OR], 9.987; p=0.006). The amount of smoking was a significant predictor of progression of calcification in the multivariate analysis (OR, 6.051 in less than 1 pack per day smokers; OR, 36.562 in more than 1 pack per day smokers; p=0.008). Continued smoking accelerates pancreatic calcification, and the amount of smoking is associated with the progression of calcification in chronic pancreatitis.
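The odds ratios reported above come from a standard 2×2 contingency table. As a quick illustration of the arithmetic (the counts below are made up for the example, not the study's data):

```python
def odds_ratio(exposed_cases, exposed_controls,
               unexposed_cases, unexposed_controls):
    """Odds ratio from a 2x2 table: OR = (a/b) / (c/d) = a*d / (b*c),
    where a = exposed with outcome, b = exposed without,
    c = unexposed with outcome, d = unexposed without."""
    return ((exposed_cases * unexposed_controls)
            / (exposed_controls * unexposed_cases))

# Example: 20/10 smokers with/without progression vs 5/15 non-smokers
or_smoking = odds_ratio(20, 10, 5, 15)  # (20*15)/(10*5) = 6.0
```

Multivariate odds ratios like those in the abstract additionally adjust for covariates via logistic regression, which this raw-table calculation does not do.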
Proceedings: Computer Science and Data Systems Technical Symposium, volume 1
NASA Technical Reports Server (NTRS)
Larsen, Ronald L.; Wallgren, Kenneth
1985-01-01
Progress reports and technical updates of programs being performed by NASA centers are covered. Presentations in viewgraph form are included for topics in three categories: computer science, data systems and space station applications.
QUARTERLY TECHNICAL PROGRESS REPORT, JULY, AUGUST, SEPTEMBER 1967.
Contents: Circuit research program; Hardware systems research; Computer system software research; Illinois pattern recognition computer: ILLIAC II... service, use, and program development; IBM 7094/1401 service, use, and program development; Problem specifications; General laboratory information.
ERIC Educational Resources Information Center
Balajthy, Ernest
Results of the 1985-86 National Assessment of Educational Progress (NAEP) survey of American students' knowledge of computers suggest that American schools have a long way to go before computers can be said to have made a significant impact. The survey covered the 3rd, 7th, and 11th grade levels and assessed competence in knowledge of computers,…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Box, D.; Boyd, J.; Di Benedetto, V.
2016-01-01
The FabrIc for Frontier Experiments (FIFE) project is an initiative within the Fermilab Scientific Computing Division designed to steer the computing model for non-LHC Fermilab experiments across multiple physics areas. FIFE is a collaborative effort between experimenters and computing professionals to design and develop integrated computing models for experiments of varying size, needs, and infrastructure. The major focus of the FIFE project is the development, deployment, and integration of solutions for high throughput computing, data management, database access and collaboration management within an experiment. To accomplish this goal, FIFE has developed workflows that utilize Open Science Grid compute sites along with dedicated and commercial cloud resources. The FIFE project has made significant progress integrating into experiment computing operations several services including a common job submission service, software and reference data distribution through CVMFS repositories, flexible and robust data transfer clients, and access to opportunistic resources on the Open Science Grid. The progress with current experiments and plans for expansion with additional projects will be discussed. FIFE has taken the leading role in defining the computing model for Fermilab experiments, aided in the design of experiments beyond those hosted at Fermilab, and will continue to define the future direction of high throughput computing for future physics experiments worldwide.
Discovery and Development of ATP-Competitive mTOR Inhibitors Using Computational Approaches.
Luo, Yao; Wang, Ling
2017-11-16
The mammalian target of rapamycin (mTOR) is a central controller of cell growth, proliferation, metabolism, and angiogenesis. This protein is an attractive target for new anticancer drug development. Significant progress has been made in hit discovery, lead optimization, drug candidate development, and determination of the three-dimensional (3D) structure of mTOR. Computational methods have been applied to accelerate the discovery and development of mTOR inhibitors, helping to model the structure of mTOR, screen compound databases, uncover structure-activity relationships (SAR), optimize hits, mine privileged fragments, and design focused libraries. Computational approaches have also been applied to study protein-ligand interaction mechanisms and in natural product-driven drug discovery. Herein, we survey the most recent progress on the application of computational approaches to advance the discovery and development of compounds targeting mTOR. Future directions in the discovery of new mTOR inhibitors using computational methods are also discussed. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
NASA Technical Reports Server (NTRS)
Mital, Subodh K.; Murthy, Pappu L. N.; Chamis, Christos C.
1994-01-01
A computational simulation procedure is presented for nonlinear analyses which incorporates microstress redistribution due to progressive fracture in ceramic matrix composites. This procedure facilitates an accurate simulation of the stress-strain behavior of ceramic matrix composites up to failure. The nonlinearity in the material behavior is accounted for at the constituent (fiber/matrix/interphase) level. This computational procedure is a part of recent upgrades to CEMCAN (Ceramic Matrix Composite Analyzer) computer code. The fiber substructuring technique in CEMCAN is used to monitor the damage initiation and progression as the load increases. The room-temperature tensile stress-strain curves for SiC fiber reinforced reaction-bonded silicon nitride (RBSN) matrix unidirectional and angle-ply laminates are simulated and compared with experimentally observed stress-strain behavior. Comparison between the predicted stress/strain behavior and experimental stress/strain curves is good. Collectively the results demonstrate that CEMCAN computer code provides the user with an effective computational tool to simulate the behavior of ceramic matrix composites.
Design for progressive fracture in composite shell structures
NASA Technical Reports Server (NTRS)
Minnetyan, Levon; Murthy, Pappu L. N.
1992-01-01
The load carrying capability and structural behavior of composite shell structures and stiffened curved panels are investigated to provide accurate early design loads. An integrated computer code is utilized for the computational simulation of composite structural degradation under practical loading for realistic design. Damage initiation, growth, accumulation, and propagation to structural fracture are included in the simulation. Progressive fracture investigations providing design insight for several classes of composite shells are presented. Results demonstrate the significance of local defects, interfacial regions, and stress concentrations on the structural durability of composite shells.
ERIC Educational Resources Information Center
Johnson, James Nathaniel
2013-01-01
Ubiquitous computing is a near reality in both the private and public arena. Business and personal spaces are seeing a proliferation of mobile computing devices and pervasive computing technologies. This phenomenon is creating a unique set of challenges for organizational IT professionals, specifically in the numerous spillover effects of having…
Realizing the Promise of Visualization in the Theory of Computing
ERIC Educational Resources Information Center
Cogliati, Joshua J.; Goosey, Frances W.; Grinder, Michael T.; Pascoe, Bradley A.; Ross, Rockford J.; Williams, Cheston J.
2005-01-01
Progress on a hypertextbook on the theory of computing is presented. The hypertextbook is a novel teaching and learning resource built around web technologies that incorporates text, sound, pictures, illustrations, slide shows, video clips, and--most importantly--active learning models of the key concepts of the theory of computing into an…
THE DEVELOPMENT AND PRESENTATION OF FOUR COLLEGE COURSES BY COMPUTER TELEPROCESSING. FINAL REPORT.
ERIC Educational Resources Information Center
MITZEL, HAROLD E.
This is a final report on the development and presentation of four college courses by computer teleprocessing from April 1964 to June 1967. It outlines the progress made toward the preparation, development, and evaluation of materials for computer presentation of courses in audiology, management accounting, engineering economics, and modern…
Models and techniques for evaluating the effectiveness of aircraft computing systems
NASA Technical Reports Server (NTRS)
Meyer, J. F.
1978-01-01
Progress in the development of system models and techniques for the formulation and evaluation of aircraft computer system effectiveness is reported. Topics covered include: analysis of functional dependence; a prototype software package, METAPHOR, developed to aid the evaluation of performability; and a comprehensive performability modeling and evaluation exercise involving the SIFT computer.
Computer-Based Simulations for Maintenance Training: Current ARI Research. Technical Report 544.
ERIC Educational Resources Information Center
Knerr, Bruce W.; And Others
Three research efforts that used computer-based simulations for maintenance training were in progress when this report was written: Game-Based Learning, which investigated the use of computer-based games to train electronics diagnostic skills; Human Performance in Fault Diagnosis Tasks, which evaluated the use of context-free tasks to train…
Rajendran, Barani Kumar; Deng, Chu-Xia
2017-01-01
Breast cancer is the second most frequently occurring form of cancer and is also the second most lethal cancer in women worldwide. A genetic mutation is one of the key factors that alter multiple cellular regulatory pathways and drive breast cancer initiation and progression, yet the nature of these cancer drivers remains elusive. In this article, we have reviewed various computational perspectives and algorithms for exploring breast cancer driver mutation genes. Using both frequency based and mutational exclusivity based approaches, we identified 195 driver genes and shortlisted 63 of them as candidate drivers for breast cancer using various computational approaches. Finally, we conducted network and pathway analysis to explore their functions in breast tumorigenesis including tumor initiation, progression, and metastasis. PMID:28477017
Design for inadvertent damage in composite laminates
NASA Technical Reports Server (NTRS)
Singhal, Surendra N.; Chamis, Christos C.
1992-01-01
Simplified predictive methods and models to computationally simulate durability and damage in polymer matrix composite materials/structures are described. The models include (1) progressive fracture, (2) progressively damaged structural behavior, (3) progressive fracture in aggressive environments, (4) stress concentrations, and (5) impact resistance. Several examples are included to illustrate applications of the models and to identify significant parameters and sensitivities. Comparisons with limited experimental data are made.
NASA Technical Reports Server (NTRS)
Gassaway, J. D.; Mahmood, Q.; Trotter, J. D.
1980-01-01
Quarterly report describes progress in three programs: dc sputtering machine for aluminum and aluminum alloys; two-dimensional computer modeling of MOS transistors; and development of computer techniques for calculating redistribution diffusion of dopants in silicon-on-sapphire films.
Reproducible research in vadose zone sciences
USDA-ARS?s Scientific Manuscript database
A significant portion of present-day soil and Earth science research is computational, involving complex data analysis pipelines, advanced mathematical and statistical models, and sophisticated computer codes. Opportunities for scientific progress are greatly diminished if reproducing and building o...
Bai, Xiao-ping; Zhang, Xi-wei
2013-01-01
Selecting construction schemes of the building engineering project is a complex multiobjective optimization decision process, in which many indexes need to be selected to find the optimum scheme. Aiming at this problem, this paper selects cost, progress, quality, and safety as the four first-order evaluation indexes, uses the quantitative method for the cost index, uses integrated qualitative and quantitative methodologies for progress, quality, and safety indexes, and integrates engineering economics, reliability theories, and information entropy theory to present a new evaluation method for building construction project. Combined with a practical case, this paper also presents detailed computing processes and steps, including selecting all order indexes, establishing the index matrix, computing score values of all order indexes, computing the synthesis score, sorting all selected schemes, and making analysis and decision. Presented method can offer valuable references for risk computing of building construction projects.
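The scoring steps described (establishing the index matrix, entropy-based weighting, computing the synthesis score, and sorting the schemes) can be sketched as follows. This is a hedged minimal illustration of the information-entropy weighting technique the abstract names, with hypothetical scheme names and values, not the paper's implementation:

```python
import math

# Hypothetical index matrix: rows = candidate construction schemes,
# columns = first-order indexes (cost, progress, quality, safety),
# each entry already normalized to (0, 1].
scores = [
    [0.90, 0.70, 0.80, 0.85],  # scheme A
    [0.75, 0.85, 0.90, 0.70],  # scheme B
    [0.80, 0.80, 0.75, 0.90],  # scheme C
]

def entropy_weights(matrix):
    """Information-entropy weights: indexes whose values vary more
    across schemes carry more weight in the synthesis score."""
    n, m = len(matrix), len(matrix[0])
    raw = []
    for j in range(m):
        col = [row[j] for row in matrix]
        total = sum(col)
        p = [v / total for v in col]
        # Entropy of the column, normalized to [0, 1] by ln(n).
        e = -sum(pi * math.log(pi) for pi in p if pi > 0) / math.log(n)
        raw.append(1.0 - e)  # degree of divergence
    s = sum(raw)
    return [w / s for w in raw]

w = entropy_weights(scores)
# Synthesis score per scheme, then sort to rank the schemes.
synthesis = [sum(wj * row[j] for j, wj in enumerate(w)) for row in scores]
ranking = sorted(range(len(scores)), key=lambda i: synthesis[i], reverse=True)
```

The ranking step corresponds to the paper's "sorting all selected schemes"; qualitative indexes would first have to be converted to the same normalized scale.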
Natural History of Ground-Glass Lesions Among Patients With Previous Lung Cancer.
Shewale, Jitesh B; Nelson, David B; Rice, David C; Sepesi, Boris; Hofstetter, Wayne L; Mehran, Reza J; Vaporciyan, Ara A; Walsh, Garrett L; Swisher, Stephen G; Roth, Jack A; Antonoff, Mara B
2018-06-01
Among patients with previous lung cancer, the malignant potential of subsequent ground-glass opacities (GGOs) on computed tomography remains unknown, with a lack of consensus regarding surveillance and intervention. This study sought to describe the natural history of GGO in patients with a history of lung cancer. A retrospective review was performed of 210 patients with a history of lung cancer and ensuing computed tomography evidence of pure or mixed GGOs between 2007 and 2013. Computed tomography reports were reviewed to determine the fate of the GGOs, by classifying all lesions as stable, resolved, or progressive over the course of the study. Multivariable analysis was performed to identify predictors of GGO progression and resolution. The mean follow-up time was 13 months. During this period, 55 (26%) patients' GGOs were stable, 131 (62%) resolved, and 24 (11%) progressed. Of the 24 GGOs that progressed, three were subsequently diagnosed as adenocarcinoma. Patients of black race (odds ratio [OR], 0.26) and other races besides white (OR, 0.89) had smaller odds of GGO resolution (p = 0.033), whereas patients with previous lung squamous cell carcinoma (OR, 5.16) or small cell carcinoma (OR, 5.36) were more likely to experience GGO resolution (p < 0.001). On multivariable analysis, only a history of adenocarcinoma was an independent predictor of GGO progression (OR, 6.9; p = 0.011). Among patients with a history of lung cancer, prior adenocarcinoma emerged as a predictor of GGO progression, whereas a history of squamous cell carcinoma or small cell carcinoma and white race were identified as predictors of GGO resolution. Copyright © 2018 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.
GENOA-PFA: Progressive Fracture in Composites Simulated Computationally
NASA Technical Reports Server (NTRS)
Murthy, Pappu L. N.
2000-01-01
GENOA-PFA is a commercial version of the Composite Durability Structural Analysis (CODSTRAN) computer program that simulates the progression of damage ultimately leading to fracture in polymer-matrix-composite (PMC) material structures under various loading and environmental conditions. GENOA-PFA offers several capabilities not available in other programs developed for this purpose, making it preferable for use in analyzing the durability and damage tolerance of complex PMC structures in which the fiber reinforcements occur in two- and three-dimensional weaves and braids. GENOA-PFA implements a progressive-fracture methodology based on the idea that a structure fails when flaws that may initially be small (even microscopic) grow and/or coalesce to a critical dimension where the structure no longer has an adequate safety margin to avoid catastrophic global fracture. Damage is considered to progress through five stages: (1) initiation, (2) growth, (3) accumulation (coalescence of propagating flaws), (4) stable propagation (up to the critical dimension), and (5) unstable or very rapid propagation (beyond the critical dimension) to catastrophic failure. The computational simulation of progressive failure involves formal procedures for identifying the five different stages of damage and for relating the amount of damage at each stage to the overall behavior of the deteriorating structure. In GENOA-PFA, mathematical modeling of the composite physical behavior involves an integration of simulations at multiple, hierarchical scales ranging from the macroscopic (lamina, laminate, and structure) to the microscopic (fiber, matrix, and fiber/matrix interface), as shown in the figure. The code includes algorithms to simulate the progression of damage from various source defects, including (1) through-the-thickness cracks and (2) voids with edge, pocket, internal, or mixed-mode delaminations.
Mathematics and statistics research progress report, period ending June 30, 1983
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beauchamp, J. J.; Denson, M. V.; Heath, M. T.
1983-08-01
This report is the twenty-sixth in the series of progress reports of Mathematics and Statistics Research of the Computer Sciences organization, Union Carbide Corporation Nuclear Division. Part A records research progress in analysis of large data sets, applied analysis, biometrics research, computational statistics, materials science applications, numerical linear algebra, and risk analysis. Collaboration and consulting with others throughout the Oak Ridge Department of Energy complex are recorded in Part B. Included are sections on biological sciences, energy, engineering, environmental sciences, health and safety, and safeguards. Part C summarizes the various educational activities in which the staff was engaged. Part D lists the presentations of research results, and Part E records the staff's other professional activities during the report period.
Damage Progression in Bolted Composites
NASA Technical Reports Server (NTRS)
Minnetyan, Levon; Chamis, Christos C.; Gotsis, Pascal K.
1998-01-01
Structural durability, damage tolerance, and progressive fracture characteristics of bolted graphite/epoxy composite laminates are evaluated via computational simulation. Constituent material properties and stress and strain limits are scaled up to the structure level to evaluate the overall damage and fracture propagation for bolted composites. Single and double bolted composite specimens with various widths and bolt spacings are evaluated. The effect of bolt spacing is investigated with regard to the structural durability of a bolted joint. Damage initiation, growth, accumulation, and propagation to fracture are included in the simulations. Results show the damage progression sequence and structural fracture resistance during different degradation stages. A procedure is outlined for the use of computational simulation data in the assessment of damage tolerance, determination of sensitive parameters affecting fracture, and interpretation of experimental results with insight for design decisions.
NASA Technical Reports Server (NTRS)
Bibb, Karen L.; Prabhu, Ramadas K.
2004-01-01
In support of the Columbia Accident Investigation, inviscid computations of the aerodynamic characteristics for various Shuttle Orbiter damage scenarios were performed using the FELISA unstructured CFD solver. Computed delta aerodynamics were compared with the reconstructed delta aerodynamics in order to postulate a progression of damage through the flight trajectory. By performing computations at hypervelocity flight and CF4 tunnel conditions, a bridge was provided between wind tunnel testing in Langley's 20-Inch CF4 facility and the flight environment experienced by Columbia during re-entry. The rapid modeling capability of the unstructured methodology allowed the computational effort to keep pace with the wind tunnel and, at times, guide the wind tunnel efforts. These computations provided a detailed view of the flowfield characteristics and the contribution of orbiter components (such as the vertical tail and wing) to aerodynamic forces and moments that were unavailable from wind tunnel testing. The damage scenarios are grouped into three categories. Initially, single and multiple missing full RCC panels were analyzed to determine the effect of damage location and magnitude on the aerodynamics. Next is a series of cases with progressive damage, increasing in severity, in the region of RCC panel 9. The final group is a set of wing leading edge and windward surface deformations that model possible structural deformation of the wing skin due to internal heating of the wing structure. By matching the aerodynamics from selected damage scenarios to the reconstructed flight aerodynamics, a progression of damage that is consistent with the flight data, debris forensics, and wind tunnel data is postulated.
Convergence properties of simple genetic algorithms
NASA Technical Reports Server (NTRS)
Bethke, A. D.; Zeigler, B. P.; Strauss, D. M.
1974-01-01
The essential parameters determining the behaviour of genetic algorithms were investigated. Computer runs were made while systematically varying the parameter values. Results based on the progress curves obtained from these runs are presented along with results based on the variability of the population as the run progresses.
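A simple genetic algorithm of the kind studied can be sketched as follows; the returned list is a best-fitness progress curve like those analyzed in the report. This is a minimal illustration with an arbitrary "one-max" fitness function and hypothetical parameter values, not the authors' program:

```python
import random

def run_ga(pop_size=20, genome_len=16, mutation_rate=0.02,
           generations=40, seed=0):
    """Simple GA maximizing the number of 1-bits ("one-max").
    Returns the best-fitness progress curve, one value per generation."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]
    curve = []
    for _ in range(generations):
        fits = [sum(g) for g in pop]
        curve.append(max(fits))
        # Fitness-proportional (roulette-wheel) parent selection.
        parents = rng.choices(pop, weights=fits, k=pop_size)
        nxt = []
        for i in range(0, pop_size, 2):
            a, b = parents[i], parents[i + 1]
            cut = rng.randrange(1, genome_len)  # one-point crossover
            for child in (a[:cut] + b[cut:], b[:cut] + a[cut:]):
                # Bit-flip mutation applied gene by gene.
                nxt.append([bit ^ 1 if rng.random() < mutation_rate else bit
                            for bit in child])
        pop = nxt
    return curve
```

Systematically varying `pop_size`, `mutation_rate`, and the crossover scheme while comparing the resulting progress curves mirrors the parameter study the abstract describes.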
Modeling Geometry and Progressive Failure of Material Interfaces in Plain Weave Composites
NASA Technical Reports Server (NTRS)
Hsu, Su-Yuen; Cheng, Ron-Bin
2010-01-01
A procedure combining a geometrically nonlinear, explicit-dynamics contact analysis, computer aided design techniques, and elasticity-based mesh adjustment is proposed to efficiently generate realistic finite element models for meso-mechanical analysis of progressive failure in textile composites. In the procedure, the geometry of fiber tows is obtained by imposing a fictitious expansion on the tows. Meshes resulting from the procedure are conformal with the computed tow-tow and tow-matrix interfaces but are incongruent at the interfaces. The mesh interfaces are treated as cohesive contact surfaces not only to resolve the incongruence but also to simulate progressive failure. The method is employed to simulate debonding at the material interfaces in a ceramic-matrix plain weave composite with matrix porosity and in a polymeric matrix plain weave composite without matrix porosity, both subject to uniaxial cyclic loading. The numerical results indicate progression of the interfacial damage during every loading and reverse loading event in a constant strain amplitude cyclic process. However, the composites show different patterns of damage advancement.
Proceedings: Computer Science and Data Systems Technical Symposium, volume 2
NASA Technical Reports Server (NTRS)
Larsen, Ronald L.; Wallgren, Kenneth
1985-01-01
Progress reports and technical updates of programs being performed by NASA centers are covered. Presentations in viewgraph form, along with abstracts, are included for topics in three categories: computer science, data systems, and space station applications.
Faster, Better, Cheaper: A Decade of PC Progress.
ERIC Educational Resources Information Center
Crawford, Walt
1997-01-01
Reviews the development of personal computers and how computer components have changed in price and value. Highlights include disk drives; keyboards; displays; memory; color graphics; modems; CPU (central processing unit); storage; direct mail vendors; and future possibilities. (LRW)
Yue, Tao; Jia, Xinghua; Petrosino, Jennifer; Sun, Leming; Fan, Zhen; Fine, Jesse; Davis, Rebecca; Galster, Scott; Kuret, Jeff; Scharre, Douglas W.; Zhang, Mingjun
2017-01-01
With the increasing prevalence of Alzheimer’s disease (AD), significant efforts have been directed toward developing novel diagnostics and biomarkers that can enhance AD detection and management. AD affects the cognition, behavior, function, and physiology of patients through mechanisms that are still being elucidated. Current AD diagnosis is contingent on evaluating which symptoms and signs a patient does or does not display. Concerns have been raised that AD diagnosis may be affected by how those measurements are analyzed. Unbiased means of diagnosing AD using computational algorithms that integrate multidisciplinary inputs, ranging from nanoscale biomarkers to cognitive assessments, and that capture both biochemical and physical changes may overcome these limitations, which stem from a limited understanding of the dynamic progression of the disease coupled with its multiple symptoms across multiple scales. We show that nanoscale physical properties of protein aggregates from the cerebral spinal fluid and blood of patients are altered during AD pathogenesis and that these properties can be used as a new class of “physical biomarkers.” Using a computational algorithm developed to integrate these biomarkers and cognitive assessments, we demonstrate an approach to impartially diagnose AD and predict its progression. Real-time diagnostic updates of progression could be made on the basis of the changes in the physical biomarkers and the cognitive assessment scores of patients over time. Additionally, the Nyquist-Shannon sampling theorem was used to determine the minimum number of necessary patient checkups to effectively predict disease progression. This integrated computational approach can generate patient-specific, personalized signatures for AD diagnosis and prognosis. PMID:28782028
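The use of the Nyquist-Shannon theorem to set a checkup schedule can be illustrated with a simple calculation. This is a sketch under stated assumptions; the function names and the one-cycle-per-year figure are illustrative, not taken from the study:

```python
import math

def min_checkup_rate(max_freq_per_year):
    """Nyquist criterion: to reconstruct a time-varying signal, it must
    be sampled at more than twice its highest frequency component."""
    return 2.0 * max_freq_per_year

def min_checkups(max_freq_per_year, years):
    """Minimum number of checkups over a follow-up window of `years`,
    rounded up to a whole number of visits."""
    return math.ceil(min_checkup_rate(max_freq_per_year) * years)

# Hypothetical example: if the fastest-changing tracked biomarker
# completes at most one oscillation per year, two checkups per year
# suffice, i.e. six visits over a three-year follow-up.
n = min_checkups(max_freq_per_year=1.0, years=3)
```

The practical question, which the paper addresses with patient data, is estimating the highest frequency component actually present in the biomarker and cognitive-score trajectories.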
Raman, Fabio; Scribner, Elizabeth; Saut, Olivier; Wenger, Cornelia; Colin, Thierry; Fathallah-Shaykh, Hassan M.
2016-01-01
Glioblastoma multiforme is a malignant brain tumor with poor prognosis and high morbidity due to its invasiveness. Hypoxia-driven motility and concentration-driven motility are two mechanisms of glioblastoma multiforme invasion in the brain. The use of anti-angiogenic drugs has uncovered new progression patterns of glioblastoma multiforme associated with significant differences in overall survival. Here, we apply a mathematical model of glioblastoma multiforme growth and invasion in humans and design computational trials using agents that target angiogenesis, tumor replication rates, or motility. The findings link highly-dispersive, moderately-dispersive, and hypoxia-driven tumors to the patterns observed in glioblastoma multiforme treated by anti-angiogenesis, consisting of progression by Expanding FLAIR, Expanding FLAIR + Necrosis, and Expanding Necrosis, respectively. Furthermore, replication rate-reducing strategies (e.g. Tumor Treating Fields) appear to be effective in highly-dispersive and moderately-dispersive tumors but not in hypoxia-driven tumors. The latter may respond to motility-reducing agents. In a population computational trial, with all three phenotypes, a correlation was observed between the efficacy of the rate-reducing agent and the prolongation of overall survival times. This research highlights the potential applications of computational trials and supports new hypotheses on glioblastoma multiforme phenotypes and treatment options. PMID:26756205
Jedynak, Bruno M.; Liu, Bo; Lang, Andrew; Gel, Yulia; Prince, Jerry L.
2014-01-01
Understanding the time-dependent changes of biomarkers related to Alzheimer’s disease (AD) is a key to assessing disease progression and to measuring the outcomes of disease-modifying therapies. In this paper, we validate an Alzheimer’s disease progression score model which uses multiple biomarkers to quantify the AD progression of subjects following three assumptions: (1) there is a unique disease progression for all subjects, (2) each subject has a different age of onset and rate of progression, and (3) each biomarker is sigmoidal as a function of disease progression. Fitting the parameters of this model is a challenging problem which we approach using an alternating least squares optimization algorithm. In order to validate this optimization scheme under realistic conditions, we use the Alzheimer’s Disease Neuroimaging Initiative (ADNI) cohort. With the help of Monte Carlo simulations, we show that most of the global parameters of the model are tightly estimated, thus enabling an ordering of the biomarkers that fit the model well, ordered as: the Rey auditory verbal learning test with 30 minutes delay, the sum of the two lateral hippocampal volumes divided by the intra-cranial volume, followed by (the clinical dementia rating sum of boxes score and the mini mental state examination score) in no particular order and lastly the Alzheimer’s disease assessment scale-cognitive subscale. PMID:25444605
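The model's assumptions translate directly into code: assumption 2 maps each subject's age to a common progression score through a subject-specific onset age and rate, and assumption 3 makes each biomarker a sigmoid of that score. A minimal sketch follows; all parameter names and values are hypothetical, not fitted ADNI values:

```python
import math

def progression_score(age, onset_age, rate):
    """Assumption 2: each subject has its own age of onset and rate of
    progression, mapping chronological age to a shared score scale."""
    return rate * (age - onset_age)

def biomarker(s, low, high, slope, midpoint):
    """Assumption 3: each biomarker is sigmoidal as a function of the
    disease progression score s, rising from `low` to `high`."""
    return low + (high - low) / (1.0 + math.exp(-slope * (s - midpoint)))

# Hypothetical subject: onset at age 70, unit rate, observed at 74.
s = progression_score(age=74.0, onset_age=70.0, rate=1.0)
# Hypothetical biomarker spanning 0-30 (e.g. a cognitive scale).
v = biomarker(s, low=0.0, high=30.0, slope=0.8, midpoint=5.0)
```

Fitting alternates between the per-subject parameters (`onset_age`, `rate`) and the per-biomarker sigmoid parameters, which is the alternating least squares scheme the abstract describes.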
Lee, Jeong Woo; Kim, Ho Gak; Lee, Dong Wook; Han, Jimin; Kwon, Hyuk Yong; Seo, Chang Jin; Oh, Ji Hye; Lee, Joo Hyoung; Jung, Jin Tae; Kwon, Joong Goo; Kim, Eun Young
2016-01-01
Background/Aims Smoking and alcohol intake are two well-known risk factors for chronic pancreatitis. However, there are few studies examining the association between smoking and changes in computed tomography (CT) findings in chronic pancreatitis. The authors evaluated associations between smoking, drinking and the progression of calcification on CT in chronic pancreatitis. Methods In this retrospective study, 59 patients with chronic pancreatitis who had undergone initial and follow-up CT between January 2002 and September 2010 were included. Progression of calcification among CT findings was compared according to the amount of alcohol intake and smoking. Results The median duration of follow-up was 51.6 months (range, 17.1 to 112.7 months). At initial CT findings, there was pancreatic calcification in 35 patients (59.3%). In the follow-up CT, progression of calcification was observed in 37 patients (62.7%). Progression of calcification was more common in smokers according to the multivariate analysis (odds ratio [OR], 9.987; p=0.006). The amount of smoking was a significant predictor for progression of calcification in the multivariate analysis (OR, 6.051 in less than 1 pack per day smokers; OR, 36.562 in more than 1 pack per day smokers; p=0.008). Conclusions Continued smoking accelerates pancreatic calcification, and the amount of smoking is associated with the progression of calcification in chronic pancreatitis. PMID:26601825
ERIC Educational Resources Information Center
Olusi, F. I.; Asokhia, M. O.; Longe, B. O.
2009-01-01
The importance of studying what affects adult learners and the use of computers is motivated by the fact that technological innovations are being churned out in geometrical progression in the 21st century. Not to be computer literate is to be in the realm of darkness. Despite the popularity of computer training, some problems still inhibit adults in…
dV/dt - Accelerating the Rate of Progress towards Extreme Scale Collaborative Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Livny, Miron
This report introduces publications presenting the results of a project that aimed to design a computational framework enabling computational experimentation at scale while supporting the model of “submit locally, compute globally”. The project focuses on estimating application resource needs, finding the appropriate computing resources, acquiring those resources, deploying the applications and data on the resources, and managing applications and resources during execution.
STIC: Photonic Quantum Computation through Cavity Assisted Interaction
2007-12-28
Report for the grant “Photonic Quantum Computation through Cavity Assisted Interaction” from DTO, Luming Duan; results published in PRA and available as quant-ph/06060791. Related work: B. Wang, L.-M. Duan, PRA 72 (in press, 2005), on a single-photon source; cavity-assisted interaction [Duan, Wang, Kimble, PRA 05], with H. Jeff Kimble. Stated goal: “Investigate more efficient methods for combating noise in photonic quantum computation”; partial progress reported.
The River Basin Model: Computer Output. Water Pollution Control Research Series.
ERIC Educational Resources Information Center
Envirometrics, Inc., Washington, DC.
This research report is part of the Water Pollution Control Research Series which describes the results and progress in the control and abatement of pollution in our nation's waters. The River Basin Model described is a computer-assisted decision-making tool in which a number of computer programs simulate major processes related to water use that…
ERIC Educational Resources Information Center
Palumbo, Debra L; Palumbo, David B.
1993-01-01
Computer-based problem-solving software exposure was compared to Lego TC LOGO instruction. Thirty fifth-graders received either Lego LOGO instruction, which couples Lego building-block activities with LOGO computer programming, or instruction with various problem-solving computer programs. Although both groups showed significant progress, the Lego…
Dwindling Numbers of Female Computer Students: What Are We Missing?
ERIC Educational Resources Information Center
Saulsberry, Donna
2012-01-01
There is common agreement among researchers that women are under-represented in both 2-year and 4-year collegiate computer study programs. This leads to women being under-represented in the computer industry which may be limiting the progress of technology developments that will benefit mankind. It may also be depriving women of the opportunity to…
NASA Astrophysics Data System (ADS)
Tuccimei, P.; Giordano, G.; Tedeschi, M.
2006-03-01
The Colli Albani is the quiescent volcano that dominates the southwestern skyline of Roma (Italy). The last eruption occurred during the Holocene, from the eccentric Albano maar, along its western slope. The volcano is presently affected by cyclic seismic swarms, ground uplift and diffuse CO2 degassing. The degassing has caused several deadly incidents during recent years and constitutes a major civil protection concern, as the volcano slopes are densely inhabited. Nevertheless, the volcano does not have a permanent monitoring network, and the background and anomalous CO2 levels, as well as the relationship between the gas release and the seismic and ground-deformation activity at the Colli Albani, are still to be defined. The aim of this work is to define the historical record of CO2 release. Evidence of periodic deep CO2 release during the last 2000 years in the area of the Colli Albani volcano (Roma, Italy) is offered by speleothem studies. A Roman-age stone mine, now used for mushroom cultivation, is decorated with actively growing speleothems characterised by depositional hiatuses. Different levels of four stalactites, separated by depositional unconformities, and several samples from a single depositional cycle belonging to a stalagmite have been dated by the U/Th method and analysed for their O and C isotopic composition. Eight cycles of deposition have been identified from 90-110 A.D. to 1350-1370 A.D., some of which are recognised across different speleothems. The age gap dividing different growth layers is on the order of one to a few hundred years, giving a temporal scale for the periodic interruption of speleothem deposition.
O and C isotopic analyses performed on the samples collected from a single cycle (the oldest) have shown that the composition of the mother solutions was initially mainly meteoric and that a progressive increase in the input of a deep component rich in CO2 (up to a proportion of 20-30%) occurred just before the interruption of speleothem deposition. This could be due to a progressive increase in the acidity of the water solutions that caused undersaturation of the fluids. If we extrapolate this mechanism to the other cycles of deposition, which are characterised by analogous isotopic compositions, we can hypothesise periods of deposition interrupted by episodes of CO2 release, which at the Colli Albani volcano are often recorded in coincidence with earthquakes. We have therefore correlated the hiatuses with some of the largest historical earthquakes affecting the city of Rome.
NASA Astrophysics Data System (ADS)
Marhadi, Kun Saptohartyadi
Structural optimization for damage tolerance under various unforeseen damage scenarios is computationally challenging. It couples non-linear progressive failure analysis with sampling-based stochastic analysis of random damage. The goal of this research was to understand the relationship between alternate load paths available in a structure and its damage tolerance, and to use this information to develop computationally efficient methods for designing damage tolerant structures. Progressive failure of a redundant truss structure subjected to small random variability was investigated to identify features that correlate with robustness and predictability of the structure's progressive failure. The identified features were used to develop numerical surrogate measures that permit computationally efficient deterministic optimization to achieve robustness and predictability of progressive failure. Analysis of damage tolerance on designs with robust progressive failure indicated that robustness and predictability of progressive failure do not guarantee damage tolerance. Damage tolerance requires a structure to redistribute its load to alternate load paths. In order to investigate the load distribution characteristics that lead to damage tolerance in structures, designs with varying degrees of damage tolerance were generated using brute-force stochastic optimization. A method based on principal component analysis was used to describe load distributions (alternate load paths) in the structures. Results indicate that a structure that can develop alternate paths is not necessarily damage tolerant. The alternate load paths must have a required minimum load capability. Robustness analysis of damage tolerant optimum designs indicates that designs are tailored to the specified damage. A design optimized under one damage specification can be sensitive to other damage cases not considered.
The effectiveness of existing load path definitions and characterizations was investigated for continuum structures. A load path definition using a relative compliance change measure (the U* field) was demonstrated to be the most useful measure of load path. This measure provides quantitative information on load path trajectories and qualitative information on the effectiveness of the load path. The use of the U* description of load paths in optimizing structures for effective load paths was investigated.
[The laboratory of tomorrow. Particular reference to hematology].
Cazal, P
1985-01-01
A serious prediction can only be an extrapolation of recent developments. To be exact, the development has to continue in the same direction, which is only a probability. Probable development of hematological technology: Progress in methods. Development of new labelling methods: radio-elements, antibodies. Monoclonal antibodies. Progress in equipment: Cell counters and their adaptation to routine hemograms is a certainty. From analyzers: a promise that will perhaps become reality. Coagulometers: progress still to be made. Hemagglutination detectors and their application to grouping: good achievements, but the market is too limited. Computerization and automation: What form will the computerizing take? What will the computer do? Who will the computer control? What should the automatic analyzers be? Two current levels. Relationships between the automatic analysers and the computer: rapidity, fidelity and, above all, reliability. Memory: large capacity and easy access. Disadvantages: conservatism and technical dependency. How can they be avoided? Development of the environment: Laboratory input: outside supplies, electricity, reagents, consumables. Samples and their identification. Output: distribution of results and communication problems. Centralization or decentralization? What will tomorrow's laboratory be? Three hypotheses: optimistic, pessimistic, and balanced.
Transport theory and fluid dynamics
NASA Astrophysics Data System (ADS)
Greenberg, W.; Zweifel, P. F.
We report progress in various areas of applied mathematics relevant to transport theory under the subjects: abstract transport theory, explicit transport models and computation, and fluid dynamics. We present a brief review of progress during the past year and personnel supported, and we indicate the direction of our future research.
Shaikhouni, Ammar; Elder, J Bradley
2012-11-01
At the turn of the twentieth century, the only computational device used in neurosurgical procedures was the brain of the surgeon. Today, most neurosurgical procedures rely at least in part on the use of a computer to help perform surgeries accurately and safely. The techniques that revolutionized neurosurgery were mostly developed after the 1950s. Just before that era, the transistor was invented in the late 1940s, and the integrated circuit was invented in the late 1950s. During this time, the first automated, programmable computational machines were introduced. The rapid progress in the field of neurosurgery not only occurred hand in hand with the development of modern computers, but one also can state that modern neurosurgery would not exist without computers. The focus of this article is the impact modern computers have had on the practice of neurosurgery. Neuroimaging, neuronavigation, and neuromodulation are examples of tools in the armamentarium of the modern neurosurgeon that owe each step in their evolution to progress made in computer technology. Advances in computer technology central to innovations in these fields are highlighted, with particular attention to neuroimaging. Developments over the last 10 years in areas of sensors and robotics that promise to transform the practice of neurosurgery further are discussed. Potential impacts of advances in computers related to neurosurgery in developing countries and underserved regions are also discussed. As this article illustrates, the computer, with its underlying and related technologies, is central to advances in neurosurgery over the last half century. Copyright © 2012 Elsevier Inc. All rights reserved.
Progressive damage, fracture predictions and post mortem correlations for fiber composites
NASA Technical Reports Server (NTRS)
1985-01-01
Lewis Research Center is involved in the development of computational mechanics methods for predicting the structural behavior and response of composite structures. In conjunction with the analytical methods development, experimental programs including post failure examination are conducted to study various factors affecting composite fracture such as laminate thickness effects, ply configuration, and notch sensitivity. Results indicate that the analytical capabilities incorporated in the CODSTRAN computer code are effective in predicting the progressive damage and fracture of composite structures. In addition, the results being generated are establishing a data base which will aid in the characterization of composite fracture.
Mathematics and Statistics Research Department progress report, period ending June 30, 1982
DOE Office of Scientific and Technical Information (OSTI.GOV)
Denson, M.V.; Funderlic, R.E.; Gosslee, D.G.
1982-08-01
This report is the twenty-fifth in the series of progress reports of the Mathematics and Statistics Research Department of the Computer Sciences Division, Union Carbide Corporation Nuclear Division (UCC-ND). Part A records research progress in analysis of large data sets, biometrics research, computational statistics, materials science applications, moving boundary problems, numerical linear algebra, and risk analysis. Collaboration and consulting with others throughout the UCC-ND complex are recorded in Part B. Included are sections on biology, chemistry, energy, engineering, environmental sciences, health and safety, materials science, safeguards, surveys, and the waste storage program. Part C summarizes the various educational activities in which the staff was engaged. Part D lists the presentations of research results, and Part E records the staff's other professional activities during the report period.
Certification trails and software design for testability
NASA Technical Reports Server (NTRS)
Sullivan, Gregory F.; Wilson, Dwight S.; Masson, Gerald M.
1993-01-01
Design techniques which may be applied to make program testing easier were investigated. Methods for modifying a program to generate additional data which we refer to as a certification trail are presented. This additional data is designed to allow the program output to be checked more quickly and effectively. Certification trails were described primarily from a theoretical perspective. A comprehensive attempt to assess experimentally the performance and overall value of the certification trail method is reported. The method was applied to nine fundamental, well-known algorithms for the following problems: convex hull, sorting, Huffman tree, shortest path, closest pair, line segment intersection, longest increasing subsequence, skyline, and Voronoi diagram. Run-time performance data for each of these problems is given, and selected problems are described in more detail. Our results indicate that there are many cases in which certification trails allow for significantly faster overall program execution time than a 2-version programming approach, and also give further evidence of the breadth of applicability of this method.
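The certification-trail idea can be illustrated with the sorting problem from the list above: alongside the sorted output, the program emits the permutation it applied, which lets a checker validate the result in linear time without re-running the sort. A minimal Python sketch (not the authors' implementation; the function names are ours):

```python
def sort_with_trail(xs):
    """Sort xs and emit a certification trail: the permutation mapping
    output positions back to input positions."""
    perm = sorted(range(len(xs)), key=lambda i: xs[i])
    return [xs[i] for i in perm], perm

def check_with_trail(xs, out, perm):
    """Verify the sorted output in linear time using the trail."""
    n = len(xs)
    if len(out) != n or len(perm) != n:
        return False
    seen = [False] * n
    for i in perm:
        if not (0 <= i < n) or seen[i]:
            return False  # trail is not a permutation of 0..n-1
        seen[i] = True
    if any(out[k] != xs[perm[k]] for k in range(n)):
        return False  # output is not the claimed rearrangement of the input
    # finally, the output must be non-decreasing
    return all(out[k] <= out[k + 1] for k in range(n - 1))
```

The checker never sorts; it only scans the trail and the output once, which is the source of the speedup over 2-version programming.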
Kawahara, Rebeca; Bollinger, James G.; Rivera, César; Ribeiro, Ana Carolina P.; Brandão, Thaís Bianca; Paes Leme, Adriana F.; MacCoss, Michael J.
2015-01-01
Head and neck cancers, including oral squamous cell carcinoma (OSCC), are the sixth most common malignancy in the world and are characterized by poor prognosis and a low survival rate. Saliva is an oral fluid in intimate contact with OSCC. Besides being non-invasive, simple, and rapid to collect, saliva is a potential source of biomarkers. In this study, we built an SRM assay that targets fourteen OSCC candidate biomarker proteins, which were evaluated in a set of clinically derived saliva samples. Using the Skyline software package, we demonstrated a statistically significant higher abundance of the C1R, LCN2, SLPI, FAM49B, TAGLN2, CFB, C3, C4B, LRG1, and SERPINA1 candidate biomarkers in the saliva of OSCC patients. Furthermore, our study also demonstrated that CFB, C3, C4B, SERPINA1 and LRG1 are associated with the risk of developing OSCC. Overall, this study successfully used targeted proteomics to measure in saliva a panel of biomarker candidates for OSCC. PMID:26552850
A multi-center study benchmarks software tools for label-free proteome quantification
Gillet, Ludovic C; Bernhardt, Oliver M.; MacLean, Brendan; Röst, Hannes L.; Tate, Stephen A.; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I.; Aebersold, Ruedi; Tenzer, Stefan
2016-01-01
The consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from SWATH-MS (sequential window acquisition of all theoretical fragment ion spectra), a method that uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test datasets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation windows setups. For consistent evaluation we developed LFQbench, an R-package to calculate metrics of precision and accuracy in label-free quantitative MS, and report the identification performance, robustness and specificity of each software tool. Our reference datasets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics. PMID:27701404
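The two core LFQbench metrics can be sketched in miniature: in a hybrid-proteome benchmark the expected log2 ratio between the two conditions is known, so accuracy can be summarized as the median deviation of measured log2 ratios from that expectation, and precision as their spread. A hedged Python sketch (the actual R package computes considerably more; the function names here are ours):

```python
import math
import statistics

def log2_ratios(quant_a, quant_b):
    # Per-protein log2 ratio between the two hybrid-sample conditions.
    return [math.log2(a / b) for a, b in zip(quant_a, quant_b)]

def accuracy_deviation(ratios, expected_log2):
    # Accuracy: median deviation of measured log2 ratios from the
    # expected (known) composition ratio for one species.
    return statistics.median(r - expected_log2 for r in ratios)

def precision_spread(ratios):
    # Precision: dispersion of log2 ratios for a species whose true
    # ratio is constant across proteins.
    return statistics.stdev(ratios)
```

For a species spiked at a 2:1 ratio, `expected_log2` is 1.0; a well-behaved tool yields a deviation near zero and a small spread.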
Using iRT, a normalized retention time for more targeted measurement of peptides
Escher, Claudia; Reiter, Lukas; MacLean, Brendan; Ossola, Reto; Herzog, Franz; Chilton, John; MacCoss, Michael J.; Rinner, Oliver
2014-01-01
Multiple reaction monitoring (MRM) has recently become the method of choice for targeted quantitative measurement of proteins using mass spectrometry. The method, however, is limited in the number of peptides that can be measured in one run. This number can be markedly increased by scheduling the acquisition if the accurate retention time (RT) of each peptide is known. Here we present iRT, an empirically derived dimensionless peptide-specific value that allows for highly accurate RT prediction. The iRT of a peptide is a fixed number relative to a standard set of reference iRT-peptides that can be transferred across laboratories and chromatographic systems. We show that iRT facilitates the setup of multiplexed experiments with acquisition windows more than 4 times smaller compared to in silico RT predictions resulting in improved quantification accuracy. iRTs can be determined by any laboratory and shared transparently. The iRT concept has been implemented in Skyline, the most widely used software for MRM experiments. PMID:22577012
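The calibration behind iRT can be sketched as a linear fit: observed retention times of the reference peptides are regressed against their fixed iRT values, and the fitted line then maps any peptide's observed RT onto the dimensionless iRT scale. A minimal Python sketch (the reference iRT and RT numbers below are illustrative, not the published kit values, and Skyline's implementation differs in detail):

```python
def fit_line(xs, ys):
    # Ordinary least squares: ys ≈ slope * xs + intercept.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

def rt_to_irt(rt, slope, intercept):
    # Map an observed retention time onto the dimensionless iRT scale.
    return (rt - intercept) / slope

# Calibrate against reference peptides of known iRT (illustrative values).
ref_irt = [0.0, 25.0, 50.0, 75.0, 100.0]
observed_rt = [5.1, 12.4, 19.8, 27.2, 34.5]  # minutes on this LC setup
slope, intercept = fit_line(ref_irt, observed_rt)
```

Once the line is fitted on a given chromatographic system, `rt_to_irt` converts any peptide's measured RT into its transferable iRT, which is what allows scheduled acquisition windows to be set much more tightly.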
Emerging Coxsackievirus A6 Causing Hand, Foot and Mouth Disease, Vietnam
Anh, Nguyen To; Nhu, Le Nguyen Truc; Van, Hoang Minh Tu; Hong, Nguyen Thi Thu; Thanh, Tran Tan; Hang, Vu Thi Ty; Ny, Nguyen Thi Han; Nguyet, Lam Anh; Phuong, Tran Thi Lan; Nhan, Le Nguyen Thanh; Hung, Nguyen Thanh; Khanh, Truong Huu; Tuan, Ha Manh; Viet, Ho Lu; Nam, Nguyen Tran; Viet, Do Chau; Qui, Phan Tu; Wills, Bridget; Sabanathan, Sarawathy; Chau, Nguyen Van Vinh; Thwaites, Louise; Rogier van Doorn, H.; Thwaites, Guy; Rabaa, Maia A.
2018-01-01
Hand, foot and mouth disease (HFMD) is a major public health issue in Asia and has global pandemic potential. Coxsackievirus A6 (CV-A6) was detected in 514/2,230 (23%) of HFMD patients admitted to 3 major hospitals in southern Vietnam during 2011–2015. Of these patients, 93 (18%) had severe HFMD. Phylogenetic analysis of 98 genome sequences revealed they belonged to cluster A and had been circulating in Vietnam for 2 years before emergence. CV-A6 movement among localities within Vietnam occurred frequently, whereas viral movement across international borders appeared rare. Skyline plots identified fluctuations in the relative genetic diversity of CV-A6 corresponding to large CV-A6–associated HFMD outbreaks worldwide. These data show that CV-A6 is an emerging pathogen and emphasize the necessity of active surveillance and understanding the mechanisms that shape the pathogen evolution and emergence, which is essential for development and implementation of intervention strategies. PMID:29553326
Streck, André Felipe; Homeier, Timo; Foerster, Tessa; Truyen, Uwe
2013-09-01
To estimate the impact of porcine parvovirus (PPV) vaccines on the emergence of new phenotypes, the population dynamic history of the virus was calculated using the Bayesian Markov chain Monte Carlo method with a Bayesian skyline coalescent model. Additionally, an in vitro model was performed with consecutive passages of the 'Challenge' strain (a virulent field strain) and NADL2 strain (a vaccine strain) in a PK-15 cell line supplemented with polyclonal antibodies raised against the vaccine strain. A decrease in genetic diversity was observed in the presence of antibodies in vitro or after vaccination (as estimated by the in silico model). We hypothesized that the antibodies induced a selective pressure that may reduce the incidence of neutral selection, which should play a major role in the emergence of new mutations. In this scenario, vaccine failures and non-vaccinated populations (e.g. wild boars) may have an important impact in the emergence of new phenotypes.
1986-07-01
Table-of-contents excerpt: Computer-Aided Operation Management System; Functions of an Off-Line Computer-Aided Operation Management System; Applications; System Comparisons; Distribution. Figures include: Hardware Components; Basic Functions of a Computer-Aided Operation Management System; Plant Visits; Computer-Aided Operation Management Systems Reviewed for Analysis of Basic Functions; Progress of Software System Installation.
User Interfaces for Patient-Centered Communication of Health Status and Care Progress
ERIC Educational Resources Information Center
Wilcox-Patterson, Lauren
2013-01-01
The recent trend toward patients participating in their own healthcare has opened up numerous opportunities for computing research. This dissertation focuses on how technology can foster this participation, through user interfaces to effectively communicate personal health status and care progress to hospital patients. I first characterize the…
Rock mechanics. Practical use in civil engineering
DOE Office of Scientific and Technical Information (OSTI.GOV)
Murakami, S.
1985-01-01
Because of recent developments in computer technology, systematic analysis of the stability and behavior of rock is gradually progressing under the discipline of rock mechanics. Although its progress still lags behind that of engineering geology, the book aims to contribute to the systematization of the subject. Examples of design are given.
The Impact of the Measures of Academic Progress (MAP) Program on Student Reading Achievement
ERIC Educational Resources Information Center
Cordray, David S.; Pion, Georgine M.; Brandt, Chris; Molefe, Ayrin
2013-01-01
One of the most widely used commercially available systems incorporating benchmark assessment and training in differentiated instruction is the Northwest Evaluation Association's (NWEA) Measures of Academic Progress (MAP) program. The MAP program involves two components: (1) computer-adaptive assessments administered to students three to four…
Nonlinear Real-Time Optical Signal Processing
1990-09-01
pattern recognition. Additional work concerns the relationship of parallel computation paradigms to optical computing and halftone screen techniques for implementing general nonlinear functions. Research progress reported includes: Vol. 23, No. 8, pp. 34-57, 1986; and 2.4 Nonlinear Optical Processing with Halftones: Degradation and Compensation Models. This paper is concerned with
What Does CALL Have to Offer Computer Science and What Does Computer Science Have to Offer CALL?
ERIC Educational Resources Information Center
Cushion, Steve
2006-01-01
We will argue that CALL can usefully be viewed as a subset of computer software engineering and can profit from adopting some of the recent progress in software development theory. The unified modelling language has become the industry standard modelling technique and the accompanying unified process is rapidly gaining acceptance. The manner in…
Wusor II: A Computer Aided Instruction Program with Student Modelling Capabilities. AI Memo 417.
ERIC Educational Resources Information Center
Carr, Brian
Wusor II is the second intelligent computer aided instruction (ICAI) program that has been developed to monitor the progress of, and offer suggestions to, students playing Wumpus, a computer game designed to teach logical thinking and problem solving. From the earlier efforts with Wusor I, it was possible to produce a rule-based expert which…
Nuclear Physics Laboratory 1979 annual report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adelberger, E.G.
1979-07-01
Research progress is reported in the following areas: astrophysics and cosmology, fundamental symmetries, nuclear structure, radiative capture, medium energy physics, heavy ion reactions, research by users and visitors, accelerator and ion source development, instrumentation and experimental techniques, and computers and computing. Publications are listed. (WHK)
Progress in Unsteady Turbopump Flow Simulations
NASA Technical Reports Server (NTRS)
Kiris, Cetin C.; Chan, William; Kwak, Dochan; Williams, Robert
2002-01-01
This viewgraph presentation discusses unsteady flow simulations for a turbopump intended for a reusable launch vehicle (RLV). The simulation process makes use of computational grids and parallel processing. The architecture of the parallel computers used is discussed, as is the scripting of turbopump simulations.
AN INTELLIGENT REPRODUCTIVE AND DEVELOPMENTAL TESTING PARADIGM FOR THE 21ST CENTURY
Addressing the chemical evaluation bottleneck that currently exists can only be achieved through progressive changes to the current testing paradigm. The primary resources for addressing these issues lie in computational toxicology, a field enriched by recent advances in computer...
Scientific computations section monthly report, November 1993
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buckner, M.R.
1993-12-30
This progress report from the Savannah River Technology Center contains abstracts from papers from the computational modeling, applied statistics, applied physics, experimental thermal hydraulics, and packaging and transportation groups. Specific topics covered include: engineering modeling and process simulation, criticality methods and analysis, plutonium disposition.
2011-11-01
elastic range, and with some simple forms of progressing damage. However, a general physics-based methodology to assess the initial and lifetime... damage evolution in the RVE for all possible load histories. Microstructural data on initial configuration and damage progression in CMCs were... the damaged elements will have changed; hence, a progressive damage model. The crack opening for each crack type in each element is stored as a
Computations of turbulent lean premixed combustion using conditional moment closure
NASA Astrophysics Data System (ADS)
Amzin, Shokri; Swaminathan, Nedunchezhian
2013-12-01
Conditional Moment Closure (CMC) is a suitable method for predicting scalars such as carbon monoxide with slow chemical time scales in turbulent combustion. Although this method has been successfully applied to non-premixed combustion, its application to lean premixed combustion is rare. In this study the CMC method is used to compute piloted lean premixed combustion in a distributed combustion regime. The conditional scalar dissipation rate of the conditioning scalar, the progress variable, is closed using an algebraic model and turbulence is modelled using the standard k-ɛ model. The conditional mean reaction rate is closed using a first order CMC closure with the GRI-3.0 chemical mechanism to represent the chemical kinetics of methane oxidation. The PDF of the progress variable is obtained using a presumed shape with the Beta function. The computed results are compared with the experimental measurements and earlier computations using the transported PDF approach. The results show reasonable agreement with the experimental measurements and are consistent with the transported PDF computations. When the compounded effects of shear-turbulence and flame are strong, second order closures may be required for the CMC.
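The presumed-shape step described above can be sketched as follows: given the mean and variance of the progress variable from the turbulence model, the two Beta shape parameters follow from the method of moments. A minimal Python sketch (not the authors' solver; valid only when the variance is below its maximum of c̄(1 − c̄)):

```python
import math

def beta_shape_params(c_mean, c_var):
    # Method-of-moments Beta fit to the progress-variable statistics;
    # requires 0 < c_var < c_mean * (1 - c_mean).
    g = c_mean * (1.0 - c_mean) / c_var - 1.0
    if g <= 0.0:
        raise ValueError("variance too large for a Beta distribution")
    return c_mean * g, (1.0 - c_mean) * g

def beta_pdf(c, a, b):
    # Presumed PDF P(c) on (0, 1) with shape parameters a, b.
    norm = math.gamma(a + b) / (math.gamma(a) * math.gamma(b))
    return norm * c ** (a - 1.0) * (1.0 - c) ** (b - 1.0)
```

The fitted Beta reproduces the prescribed mean a/(a+b) and variance ab/((a+b)²(a+b+1)) exactly, which is why the presumed-shape approach needs only two moments from the flow solver.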
Computational Hemodynamics Involving Artificial Devices
NASA Technical Reports Server (NTRS)
Kwak, Dochan; Kiris, Cetin; Feiereisen, William (Technical Monitor)
2001-01-01
This paper reports the progress being made toward developing a complete blood flow simulation capability in humans, especially in the presence of artificial devices such as valves and ventricular assist devices. Device modeling poses unique challenges different from computing the blood flow in natural hearts and arteries. Many elements are needed, such as flow solvers, geometry modeling including flexible walls, moving-boundary procedures, and physiological characterization of blood. As a first step, computational technology developed for aerospace applications was extended in the recent past to the analysis and development of mechanical devices. The blood flow in these devices is practically incompressible and Newtonian, and thus various incompressible Navier-Stokes solution procedures can be selected depending on the choice of formulations, variables and numerical schemes. Two primitive-variable formulations used are discussed, as well as the overset grid approach to handle complex moving geometry. This procedure has been applied to several artificial devices. Among these, recent progress made in developing the DeBakey axial-flow blood pump will be presented from a computational point of view. Computational and clinical issues will be discussed in detail, as well as additional work needed.
Saxena, Rohit; Vashist, Praveen; Tandon, Radhika; Pandey, Ravindra M; Bhardawaj, Amit; Gupta, Vivek; Menon, Vimala
2017-01-01
To evaluate the incidence and progression of myopia, and the factors associated with progression of myopia, in school-going children in Delhi. Prospective longitudinal study of 10,000 school children aged 5 to 15 years screened after an interval of 1 year to identify new myopes (spherical equivalent ≤ -0.5 D) and progression of myopia in previously diagnosed myopic children. Association between risk factors and progression was analyzed using adjusted odds ratios. Of the 9,616 children re-screened (97.3% coverage), the annual incidence of myopia was 3.4% with a mean dioptric change of -1.09 ± 0.55. There was a significantly higher incidence of myopia in younger children compared to older children (P = 0.012) and among girls compared to boys (P = 0.002). Progression was observed in 49.2% of children, with a mean dioptric change of -0.27 ± 0.42 diopters. The demographic and behavioral risk factors were analyzed for children with progression (n = 629) and adjusted odds ratio values were estimated. Hours of reading-writing per week (P < 0.001), use of computers/video games (P < 0.001) and watching television (P = 0.048) were significant risk factors for progression of myopia. Outdoor activities/time spent outdoors > 2 hours in a day were protective, with an inverse association with progression of myopia (P < 0.001). Myopia is an important health issue in India and is associated with long hours of reading and screen time with use of computers and video games. An annual eye vision screening should be conducted, and outdoor activities should be promoted, to prevent the increase of myopia among school children.
Koraishy, Farrukh M; Hooks-Anderson, Denise; Salas, Joanne; Scherrer, Jeffrey F
2017-08-01
Late nephrology referral is associated with adverse outcomes, especially among minorities. Research on the association of the rate of chronic kidney disease (CKD) progression with nephrology referral in white versus black patients is lacking. To compute the odds of nephrology referral in primary care and their associations with race and the rate of CKD progression. Electronic health record data were obtained from 2170 patients in primary care clinics in the Saint Louis metropolitan area with at least two estimated glomerular filtration rate (eGFR) values over a 7-year observation period. Fast CKD progression was defined as a decline in eGFR of ≥5 ml/min/1.73 m2/year. Logistic regression models were computed to measure the associations between eGFR progression, race, and nephrology referral before and after adjusting for potential confounding factors. Nephrology referrals were significantly more prevalent among fast compared to slow progressors (5.6 versus 2.0%, P < 0.0001); however, a majority of fast progressors were not referred. Fast CKD progression and black race were associated with increased odds of nephrology referral (OR = 2.74; 95% CI: 1.60-4.72 and OR = 2.42; 95% CI: 1.28-4.56, respectively). The interaction of race and eGFR progression in nephrology referral was found to be non-significant. Nephrology referrals are more common in fast CKD progression, but referrals are underutilized. Nephrology referral is more common among blacks, but its association with rate of decline does not differ by race. Further studies are required to investigate the benefit of early referral of patients at risk of fast CKD progression. © The Author 2017. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
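As a minimal illustration of the comparison underlying these figures, the following Python sketch computes an unadjusted odds ratio from a 2x2 table of referral by progression status. The counts are invented for illustration; the study's reported ORs are adjusted via logistic regression, which this sketch does not reproduce:

```python
# Unadjusted odds ratio from a 2x2 table (illustrative counts, not study data).

def odds_ratio(a, b, c, d):
    """OR for a 2x2 table:
       a = exposed with event,    b = exposed without event,
       c = unexposed with event,  d = unexposed without event."""
    return (a / b) / (c / d)

# Hypothetical: 14 of 250 fast progressors referred; 38 of 1920 slow progressors.
referred_fast, not_referred_fast = 14, 236
referred_slow, not_referred_slow = 38, 1882

or_fast_vs_slow = odds_ratio(referred_fast, not_referred_fast,
                             referred_slow, not_referred_slow)
print(round(or_fast_vs_slow, 2))  # -> 2.94 for these invented counts
```

Adjusted odds ratios of the kind the abstract reports would come from a multivariable logistic regression including race and the confounders, not from this raw table.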
Online Assistants in Children's Hypermedia Software
ERIC Educational Resources Information Center
Garcia, Penny Ann
2002-01-01
The classroom teacher's comfort and familiarity with computers and software influences student-computer use in the classroom. Teachers remain mired in repetitive introduction of basic software mechanics and rarely progress with students to advanced concepts or complex applications. An Online Assistant (OLA) was developed to accompany the…
Precision Learning Assessment: An Alternative to Traditional Assessment Techniques.
ERIC Educational Resources Information Center
Caltagirone, Paul J.; Glover, Christopher E.
1985-01-01
A continuous and curriculum-based assessment method, Precision Learning Assessment (PLA), which integrates precision teaching and norm-referenced techniques, was applied to a math computation curriculum for 214 third graders. The resulting districtwide learning curves defining average annual progress through the computation curriculum provided…
Team Production of Learner-Controlled Courseware: A Progress Report.
ERIC Educational Resources Information Center
Bunderson, C. Victor
A project being conducted by the MITRE Corporation and Brigham Young University (BYU) is developing hardware, software, and courseware for the TICCIT (Time Shared, Interactive, Computer Controlled Information Television) computer-assisted instructional system. Four instructional teams at BYU, each having an instructional psychologist, subject…
Transportation Research and Analysis Computing Center (TRACC) Year 6 Quarter 4 Progress Report
DOT National Transportation Integrated Search
2013-03-01
Argonne National Laboratory initiated a FY2006-FY2009 multi-year program with the US Department of Transportation (USDOT) on October 1, 2006, to establish the Transportation Research and Analysis Computing Center (TRACC). As part of the TRACC project...
Environmental models are products of the computer architecture and software tools available at the time of development. Scientifically sound algorithms may persist in their original state even as system architectures and software development approaches evolve and progress. Dating...
Burger, Gerhard A.; Danen, Erik H. J.; Beltman, Joost B.
2017-01-01
Epithelial–mesenchymal transition (EMT), the process by which epithelial cells can convert into motile mesenchymal cells, plays an important role in development and wound healing but is also involved in cancer progression. It is increasingly recognized that EMT is a dynamic process involving multiple intermediate or “hybrid” phenotypes rather than an “all-or-none” process. However, the role of EMT in various cancer hallmarks, including metastasis, is debated. Given the complexity of EMT regulation, computational modeling has proven to be an invaluable tool for cancer research, i.e., to resolve apparent conflicts in experimental data and to guide experiments by generating testable hypotheses. In this review, we provide an overview of computational modeling efforts that have been applied to regulation of EMT in the context of cancer progression and its associated tumor characteristics. Moreover, we identify possibilities to bridge different modeling approaches and point out outstanding questions in which computational modeling can contribute to advance our understanding of pathological EMT. PMID:28824874
Progressive Fracture of Composite Structures
NASA Technical Reports Server (NTRS)
Chamis, Christos C.; Minnetyan, Levon
2008-01-01
A new approach is described for evaluating fracture in composite structures. This approach is independent of classical fracture mechanics parameters like fracture toughness. It relies on computational simulation and is programmed in a stand-alone integrated computer code. It is multiscale and multifunctional because it includes composite mechanics for the composite behavior and finite element analysis for predicting the structural response. It contains seven modules: layered composite mechanics (micro, macro, laminate), finite element analysis, an updating scheme, local fracture, global fracture, stress-based failure modes, and fracture progression. The computer code is called CODSTRAN (COmposite Durability STRuctural ANalysis). It is used in the present paper to evaluate the global fracture of four composite shell problems and one composite built-up structure. Results show that global fracture of the composite shells and the built-up composite structure is enhanced when internal pressure is combined with shear loads.
Specialized computer architectures for computational aerodynamics
NASA Technical Reports Server (NTRS)
Stevenson, D. K.
1978-01-01
In recent years, computational fluid dynamics has made significant progress in modelling aerodynamic phenomena. Currently, one of the major barriers to further development is the compute-intensive nature of the numerical formulations and the relatively high cost of performing these computations on commercially available general-purpose computers, high in terms of both dollar expenditure and elapsed time. Today's computing technology will support a program designed to create specialized computing facilities dedicated to the important problems of computational aerodynamics. One of the still unresolved questions is the organization of the computing components in such a facility. The characteristics of fluid dynamic problems which will have significant impact on the choice of computer architecture for a specialized facility are reviewed.
Progress in Fully Automated Abdominal CT Interpretation
Summers, Ronald M.
2016-01-01
OBJECTIVE Automated analysis of abdominal CT has advanced markedly over just the last few years. Fully automated assessments of organs, lymph nodes, adipose tissue, muscle, bowel, spine, and tumors are some examples where tremendous progress has been made. Computer-aided detection of lesions has also improved dramatically. CONCLUSION This article reviews this progress and provides insights into what is in store in the near future for automated analysis of abdominal CT, ultimately leading to fully automated interpretation. PMID:27101207
Generalizations of polylogarithms for Feynman integrals
NASA Astrophysics Data System (ADS)
Bogner, Christian
2016-10-01
In this talk, we discuss recent progress in the application of generalizations of polylogarithms in the symbolic computation of multi-loop integrals. We briefly review the Maple program MPL which supports a certain approach for the computation of Feynman integrals in terms of multiple polylogarithms. Furthermore we discuss elliptic generalizations of polylogarithms which have shown to be useful in the computation of the massive two-loop sunrise integral.
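The abstract does not define the functions involved; for reference, the multiple polylogarithms that programs such as MPL manipulate are commonly defined by the nested sum (a standard definition, not quoted from the talk):

```latex
\mathrm{Li}_{n_1,\dots,n_k}(x_1,\dots,x_k)
  \;=\; \sum_{i_1 > i_2 > \cdots > i_k \ge 1}
        \frac{x_1^{\,i_1}\, x_2^{\,i_2} \cdots x_k^{\,i_k}}
             {i_1^{\,n_1}\, i_2^{\,n_2} \cdots i_k^{\,n_k}},
```

which reduces to the classical polylogarithm $\mathrm{Li}_n(x)$ for $k = 1$. The elliptic generalizations mentioned in the talk extend this class further to cover integrals, such as the massive two-loop sunrise, that cannot be expressed in multiple polylogarithms alone.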
Progress in Machine Learning Studies for the CMS Computing Infrastructure
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bonacorsi, Daniele; Kuznetsov, Valentin; Magini, Nicolo
Here, computing systems for LHC experiments have developed together with Grids worldwide. While a complete description of the original Grid-based infrastructure and services for LHC experiments and its recent evolutions can be found elsewhere, it is worth mentioning here the scale of the computing resources needed to fulfill the needs of LHC experiments in Run-1 and Run-2 so far.
Progress in Machine Learning Studies for the CMS Computing Infrastructure
Bonacorsi, Daniele; Kuznetsov, Valentin; Magini, Nicolo; ...
2017-12-06
Here, computing systems for LHC experiments have developed together with Grids worldwide. While a complete description of the original Grid-based infrastructure and services for LHC experiments and its recent evolutions can be found elsewhere, it is worth mentioning here the scale of the computing resources needed to fulfill the needs of LHC experiments in Run-1 and Run-2 so far.
A forward view on reliable computers for flight control
NASA Technical Reports Server (NTRS)
Goldberg, J.; Wensley, J. H.
1976-01-01
The requirements for fault-tolerant computers for flight control of commercial aircraft are examined; it is concluded that the reliability requirements far exceed those typically quoted for space missions. Examination of circuit technology and alternative computer architectures indicates that the desired reliability can be achieved with several different computer structures, though there are obvious advantages to those that are more economic, more reliable, and, very importantly, more certifiable as to fault tolerance. Progress in this field is expected to bring about better computer systems that are more rigorously designed and analyzed even though computational requirements are expected to increase significantly.
Private quantum computation: an introduction to blind quantum computing and related protocols
NASA Astrophysics Data System (ADS)
Fitzsimons, Joseph F.
2017-06-01
Quantum technologies hold the promise of not only faster algorithmic processing of data, via quantum computation, but also of more secure communications, in the form of quantum cryptography. In recent years, a number of protocols have emerged which seek to marry these concepts for the purpose of securing computation rather than communication. These protocols address the task of securely delegating quantum computation to an untrusted device while maintaining the privacy, and in some instances the integrity, of the computation. We present a review of the progress to date in this emerging area.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1991-12-31
This report details progress made in setting up a laboratory for optical microscopy of genes. The apparatus including a fluorescence microscope, a scanning optical microscope, various spectrometers, and supporting computers is described. Results in developing photon and exciton tips, and in preparing samples are presented. (GHH)
Pewaukee School District, Wisconsin. Case Study: Measures of Academic Progress
ERIC Educational Resources Information Center
Northwest Evaluation Association, 2015
2015-01-01
For more than a decade, Pewaukee School District Superintendent JoAnn Sternke has watched her district get better and better at its mission: opening the door to each student's future. The Wisconsin district began using Measures of Academic Progress® (MAP®) computer adaptive interim assessments from Northwest Evaluation Association™ (NWEA™) in 2004…
Progress Monitoring in Grade 5 Science for Low Achievers
ERIC Educational Resources Information Center
Vannest, Kimberly J.; Parker, Richard; Dyer, Nicole
2011-01-01
This article presents procedures and results from a 2-year project developing science key vocabulary (KV) short tests suitable for progress monitoring Grade 5 science in Texas public schools using computer-generated, -administered, and -scored assessments. KV items included KV definitions and important usages in a multiple-choice cloze format. A…
Implementation of a 3D mixing layer code on parallel computers
NASA Technical Reports Server (NTRS)
Roe, K.; Thakur, R.; Dang, T.; Bogucz, E.
1995-01-01
This paper summarizes our progress and experience in developing a computational fluid dynamics code on parallel computers to simulate three-dimensional spatially-developing mixing layers. In this initial study, the three-dimensional time-dependent Euler equations are solved using a finite-volume explicit time-marching algorithm. The code was first programmed in Fortran 77 for sequential computers and then converted for use on parallel computers using the conventional message-passing technique, although we have not been able to compile the code with the present version of HPF compilers.
Chacón, Matías; Eleta, Martín; Espindola, Adriel Rodríguez; Roca, Enrique; Méndez, Guillermo; Rojo, Sandra; Pupareli, Carmen
2015-01-01
Imatinib is the standard first-line therapy for advanced gastrointestinal stromal tumor. (18)F-fluorodeoxyglucose PET computed tomography (FDG PET/CT) shows a faster response than computed tomography in nonpretreated patients. After disease progression on imatinib 400 mg, 16 patients were exposed to 800 mg. Tumor response was evaluated by FDG PET/CT on days 7 and 37. Primary objective was to correlate early metabolic response (EMR) with progression-free survival (PFS). EMR by FDG PET/CT scan was not predictive of PFS. Median PFS in these patients was 3 months. Overall survival was influenced by gastric primary site (p = 0.05). The assessment of EMR by FDG PET/CT in patients with advanced gastrointestinal stromal tumor exposed to imatinib 800 mg was not predictive of PFS or overall survival.
HIV-1 Strategies of Immune Evasion
NASA Astrophysics Data System (ADS)
Castiglione, F.; Bernaschi, M.
We simulate the progression of the HIV-1 infection in untreated host organisms. The phenotype features of the virus are represented by the replication rate, the probability of activating the transcription, the mutation rate and the capacity to stimulate an immune response (the so-called immunogenicity). It is very difficult to study in-vivo or in-vitro how these characteristics of the virus influence the evolution of the disease. Therefore we resorted to simulations based on a computer model validated in previous studies. We observe, by means of computer experiments, that the virus continuously evolves under the selective pressure of an immune response whose effectiveness downgrades along with the disease progression. The results of the simulations show that immunogenicity is the most important factor in determining the rate of disease progression but, by itself, it is not sufficient to drive the disease to a conclusion in all cases.
ERIC Educational Resources Information Center
Wheeler, David L.
1988-01-01
Scientists feel that progress in artificial intelligence and the availability of thousands of experimental results make this the right time to build and test theories on how people think and learn, using the computer to model minds. (MSE)
Effect of ram semen extenders and supplements on computer assisted sperm analysis parameters
USDA-ARS?s Scientific Manuscript database
A study evaluated the effects of ram semen extender and extender supplementation on computer assisted sperm analysis (CASA) parameters positively correlated with progressive motility. Semen collected from 5 rams was distributed across treatment combinations consisting of either TRIS citrate (T) or ...
A Mixed-Methods Exploration of an Environment for Learning Computer Programming
ERIC Educational Resources Information Center
Mather, Richard
2015-01-01
A mixed-methods approach is evaluated for exploring collaborative behaviour, acceptance and progress surrounding an interactive technology for learning computer programming. A review of literature reveals a compelling case for using mixed-methods approaches when evaluating technology-enhanced-learning environments. Here, ethnographic approaches…
Discontinuously Stiffened Composite Panel under Compressive Loading
NASA Technical Reports Server (NTRS)
Minnetyan, Levon; Rivers, James M.; Chamis, Christos C.; Murthy, Pappu L. N.
1995-01-01
The design of composite structures requires an evaluation of their safety and durability under service loads and possible overload conditions. This paper presents a computational tool that has been developed to examine the response of stiffened composite panels via the simulation of damage initiation, growth, accumulation, progression, and propagation to structural fracture or collapse. The structural durability of a composite panel with a discontinuous stiffener is investigated under compressive loading induced by the gradual displacement of an end support. Results indicate damage initiation and progression to have significant effects on structural behavior under loading. Utilization of an integrated computer code for structural durability assessment is demonstrated.
Health-adjusted premium subsidies in the Netherlands.
van de Ven, Wynand P M M; van Vliet, René C J A; Lamers, Leida M
2004-01-01
The Dutch government has decided to proceed with managed competition in health care. In this paper we report on progress made with health-based risk adjustment, a key issue in managed competition. In 2004 both Diagnostic Cost Groups (DCGs) computed from hospital diagnoses only and Pharmacy-based Cost Groups (PCGs) computed from out-patient prescription drugs are used to set the premium subsidies for competing risk-bearing sickness funds. These health-based risk adjusters appear to be effective and complementary. Risk selection is not a major problem in the Netherlands. Despite the progress made, we are still faced with a full research agenda for risk adjustment in the coming years.
An attempt at the computer-aided management of HIV infection
NASA Astrophysics Data System (ADS)
Ida, A.; Oharu, Y.; Sankey, O.
2007-07-01
The immune system is a complex and diverse system in the human body, and the HIV virus disrupts and destroys it through an extremely complicated but surprisingly logical process. The purpose of this paper is to present a method for the computer-aided management of the HIV infection process by means of a mathematical model describing the dynamics of the host-pathogen interaction with HIV-1. Treatments for AIDS must be changed to more efficient ones in accordance with disease progression and the status of the immune system. The level of progression and this status are represented by parameters governed by our mathematical model. The model is shown to be numerically stable and uniquely solvable. With this knowledge, our mathematical model for HIV disease progression is formulated and physiological interpretations are provided. The results of our numerical simulations are visualized, and they agree with medical understanding from the point of view of antiretroviral therapy. We expect that our approach will help address practical clinical issues and will be applied to the computer-aided management of antiretroviral therapies.
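The authors' model is not spelled out in the abstract. As a point of reference, the standard basic three-compartment HIV dynamics model (target cells T, infected cells I, free virus V) is often the starting point for this kind of simulation. The Python sketch below integrates it with a simple Euler step; all parameter values are illustrative assumptions, not taken from the paper:

```python
# Standard basic HIV dynamics model (not the authors' model):
#   dT/dt = lam - d*T - beta*T*V     (target cell supply, death, infection)
#   dI/dt = beta*T*V - delta*I       (infected cell gain and death)
#   dV/dt = p*I - c*V                (virion production and clearance)
# Integrated with a forward Euler step; all parameters are illustrative.

def simulate(days=200, dt=0.01):
    lam, d, beta = 10.0, 0.01, 2e-5
    delta, p, c = 0.5, 100.0, 3.0
    T, I, V = 1000.0, 0.0, 1e-3  # initial conditions (illustrative units)
    for _ in range(int(days / dt)):
        dT = lam - d * T - beta * T * V
        dI = beta * T * V - delta * I
        dV = p * I - c * V
        T += dt * dT
        I += dt * dI
        V += dt * dV
    return T, I, V

T, I, V = simulate()
print(T > 0 and V > 0)  # populations remain positive over the run
```

With these parameters the basic reproduction number beta*lam*p/(d*delta*c) exceeds 1, so the infection establishes itself and the system settles toward a chronic set point, which is the regime in which treatment-switching decisions of the kind the paper discusses would be made.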
Computational wear simulation of patellofemoral articular cartilage during in vitro testing.
Li, Lingmin; Patil, Shantanu; Steklov, Nick; Bae, Won; Temple-Wong, Michele; D'Lima, Darryl D; Sah, Robert L; Fregly, Benjamin J
2011-05-17
Though changes in normal joint motions and loads (e.g., following anterior cruciate ligament injury) contribute to the development of knee osteoarthritis, the precise mechanism by which these changes induce osteoarthritis remains unknown. As a first step toward identifying this mechanism, this study evaluates computational wear simulations of a patellofemoral joint specimen wear tested on a knee simulator machine. A multibody dynamic model of the specimen mounted in the simulator machine was constructed in commercial computer-aided engineering software. A custom elastic foundation contact model was used to calculate contact pressures and wear on the femoral and patellar articular surfaces using geometry created from laser scan and MR data. Two different wear simulation approaches were investigated--one that wore the surface geometries gradually over a sequence of 10 one-cycle dynamic simulations (termed the "progressive" approach), and one that wore the surface geometries abruptly using results from a single one-cycle dynamic simulation (termed the "non-progressive" approach). The progressive approach with laser scan geometry reproduced the experimentally measured wear depths and areas for both the femur and patella. The less costly non-progressive approach predicted deeper wear depths, especially on the patella, but had little influence on predicted wear areas. Use of MR data for creating the articular and subchondral bone geometry altered wear depth and area predictions by at most 13%. These results suggest that MR-derived geometry may be sufficient for simulating articular cartilage wear in vivo and that a progressive simulation approach may be needed for the patella and tibia since both remain in continuous contact with the femur. Copyright © 2011 Elsevier Ltd. All rights reserved.
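A one-point toy model makes the difference between the two update schemes concrete. The sketch below pairs an Archard-type wear law (wear increment proportional to contact pressure) with an elastic-foundation pressure that relaxes as material is removed; the constants and names are invented for illustration and do not come from the study's simulations:

```python
# Toy single-contact-point comparison of the "progressive" vs
# "non-progressive" wear updates described in the abstract.
# Elastic foundation: pressure = K_FOUND * penetration.
# Archard-type law: wear depth increment = K_WEAR * pressure per cycle.
# All constants are illustrative.

K_WEAR = 1e-3       # wear depth per unit pressure per cycle
K_FOUND = 50.0      # foundation stiffness (pressure per unit penetration)
PENETRATION0 = 0.2  # initial overclosure of the undeformed surfaces

def non_progressive(cycles):
    # pressure from the unworn geometry, scaled over all cycles at once
    p = K_FOUND * PENETRATION0
    return K_WEAR * p * cycles

def progressive(cycles):
    # re-evaluate pressure after each cycle as wear relieves the penetration
    depth = 0.0
    for _ in range(cycles):
        p = K_FOUND * max(PENETRATION0 - depth, 0.0)
        depth += K_WEAR * p
    return depth

print(non_progressive(10), progressive(10))
```

Consistent with the abstract, the single-evaluation ("non-progressive") estimate overshoots the depth obtained by re-evaluating the pressure each cycle, because it never accounts for the pressure relief that wear itself produces.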
NASA Technical Reports Server (NTRS)
Johannes, J. D.
1974-01-01
Techniques, methods, and system requirements are reported for an onboard computerized communications system that provides on-line computing capability during manned space exploration. Communications between man and computer take place by sequential execution of each discrete step of a procedure, by interactive progression through a tree-type structure to initiate tasks or by interactive optimization of a task requiring man to furnish a set of parameters. Effective communication between astronaut and computer utilizes structured vocabulary techniques and a word recognition system.
Research on Influence of Cloud Environment on Traditional Network Security
NASA Astrophysics Data System (ADS)
Ming, Xiaobo; Guo, Jinhua
2018-02-01
Cloud computing is a symbol of the progress of modern information networks. It provides great convenience to Internet users, but it also brings them considerable risk. One of the main reasons Internet users choose cloud computing is its strong network security performance, which is also the cornerstone of cloud computing applications. This paper briefly explores the impact of the cloud environment on traditional network security and puts forward corresponding solutions.
Weidling, Patrick; Jaschinski, Wolfgang
2015-01-01
When presbyopic employees wear general-purpose progressive lenses, they have clear vision only with a lower gaze inclination to the computer monitor, provided the head assumes a comfortable inclination. Therefore, in the present intervention field study the monitor position was lowered, also with the aim of reducing musculoskeletal symptoms. A comparison group comprised users of lenses that do not restrict the field of clear vision. The lower monitor positions led the participants to lower their head inclination, which was linearly associated with a significant reduction in musculoskeletal symptoms. However, for progressive lenses a lower head inclination means a lower zone of clear vision, so clear vision of the complete monitor was not achieved; rather, the monitor should have been placed even lower. The procedures of this study may be useful for optimizing the individual monitor position depending on the comfortable head and gaze inclination and the vertical zone of clear vision of progressive lenses. For users of general-purpose progressive lenses, it is suggested that low monitor positions allow for clear vision at the monitor and for a physiologically favourable head inclination. Employees may improve their workplace using a flyer providing ergonomic-optometric information.
Jung, Jae-Joon; Razavian, Mahmoud; Kim, Hye-Yeong; Ye, Yunpeng; Golestani, Reza; Toczek, Jakub; Zhang, Jiasheng; Sadeghi, Mehran M
2016-09-13
Calcific aortic valve disease (CAVD) is the most common cause of aortic stenosis. Currently, there is no non-invasive medical therapy for CAVD. Matrix metalloproteinases (MMPs) are upregulated in CAVD and play a role in its pathogenesis. Here, we evaluated the effect of doxycycline, a nonselective MMP inhibitor on CAVD progression in the mouse. Apolipoprotein (apo)E(-/-) mice (n = 20) were fed a Western diet (WD) to induce CAVD. After 3 months, half of the animals was treated with doxycycline, while the others continued WD alone. After 6 months, we evaluated the effect of doxycycline on CAVD progression by echocardiography, MMP-targeted micro single photon emission computed tomography (SPECT)/computed tomography (CT), and tissue analysis. Despite therapeutic blood levels, doxycycline had no significant effect on MMP activation, aortic valve leaflet separation or flow velocity. This lack of effect on in vivo images was confirmed on tissue analysis which showed a similar level of aortic valve gelatinase activity, and inflammation between the two groups of animals. In conclusion, doxycycline (100 mg/kg/day) had no effect on CAVD progression in apoE(-/-) mice with early disease. Studies with more potent and specific inhibitors are needed to establish any potential role of MMP inhibition in CAVD development and progression.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lottes, S.A.; Bojanowski, C.; Shen, J.
2012-04-09
The computational fluid dynamics (CFD) and computational structural mechanics (CSM) focus areas at Argonne's Transportation Research and Analysis Computing Center (TRACC) initiated a project to support and complement the experimental programs at the Turner-Fairbank Highway Research Center (TFHRC) with high performance computing based analysis capabilities in August 2010. The project was established with a new interagency agreement between the Department of Energy and the Department of Transportation to provide collaborative research, development, and benchmarking of advanced three-dimensional computational mechanics analysis methods to the aerodynamics and hydraulics laboratories at TFHRC for a period of five years, beginning in October 2010. The analysis methods employ well-benchmarked and supported commercial computational mechanics software. Computational mechanics encompasses the areas of Computational Fluid Dynamics (CFD), Computational Wind Engineering (CWE), Computational Structural Mechanics (CSM), and Computational Multiphysics Mechanics (CMM) applied in Fluid-Structure Interaction (FSI) problems. The major areas of focus of the project are wind and water effects on bridges - superstructure, deck, cables, and substructure (including soil), primarily during storms and flood events - and the risks that these loads pose to structural failure. For flood events at bridges, another major focus of the work is assessment of the risk to bridges caused by scour of stream and riverbed material away from the foundations of a bridge. Other areas of current research include modeling of flow through culverts to improve design allowing for fish passage, modeling of the salt spray transport into bridge girders to address suitability of using weathering steel in bridges, and CFD analysis of the operation of the wind tunnel in the TFHRC wind engineering laboratory.
This quarterly report documents technical progress on the project tasks for the period of October through December 2011.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lottes, S.A.; Bojanowski, C.; Shen, J.
2012-06-28
The computational fluid dynamics (CFD) and computational structural mechanics (CSM) focus areas at Argonne's Transportation Research and Analysis Computing Center (TRACC) initiated a project to support and complement the experimental programs at the Turner-Fairbank Highway Research Center (TFHRC) with high performance computing based analysis capabilities in August 2010. The project was established with a new interagency agreement between the Department of Energy and the Department of Transportation to provide collaborative research, development, and benchmarking of advanced three-dimensional computational mechanics analysis methods to the aerodynamics and hydraulics laboratories at TFHRC for a period of five years, beginning in October 2010. The analysis methods employ well-benchmarked and supported commercial computational mechanics software. Computational mechanics encompasses the areas of Computational Fluid Dynamics (CFD), Computational Wind Engineering (CWE), Computational Structural Mechanics (CSM), and Computational Multiphysics Mechanics (CMM) applied in Fluid-Structure Interaction (FSI) problems. The major areas of focus of the project are wind and water effects on bridges - superstructure, deck, cables, and substructure (including soil), primarily during storms and flood events - and the risks that these loads pose to structural failure. For flood events at bridges, another major focus of the work is assessment of the risk to bridges caused by scour of stream and riverbed material away from the foundations of a bridge. Other areas of current research include modeling of flow through culverts to improve design allowing for fish passage, modeling of the salt spray transport into bridge girders to address suitability of using weathering steel in bridges, and CFD analysis of the operation of the wind tunnel in the TFHRC wind engineering laboratory.
This quarterly report documents technical progress on the project tasks for the period of January through March 2012.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lottes, S.A.; Kulak, R.F.; Bojanowski, C.
2011-08-26
The computational fluid dynamics (CFD) and computational structural mechanics (CSM) focus areas at Argonne's Transportation Research and Analysis Computing Center (TRACC) initiated a project to support and compliment the experimental programs at the Turner-Fairbank Highway Research Center (TFHRC) with high performance computing based analysis capabilities in August 2010. The project was established with a new interagency agreement between the Department of Energy and the Department of Transportation to provide collaborative research, development, and benchmarking of advanced three-dimensional computational mechanics analysis methods to the aerodynamics and hydraulics laboratories at TFHRC for a period of five years, beginning in October 2010. Themore » analysis methods employ well-benchmarked and supported commercial computational mechanics software. Computational mechanics encompasses the areas of Computational Fluid Dynamics (CFD), Computational Wind Engineering (CWE), Computational Structural Mechanics (CSM), and Computational Multiphysics Mechanics (CMM) applied in Fluid-Structure Interaction (FSI) problems. The major areas of focus of the project are wind and water loads on bridges - superstructure, deck, cables, and substructure (including soil), primarily during storms and flood events - and the risks that these loads pose to structural failure. For flood events at bridges, another major focus of the work is assessment of the risk to bridges caused by scour of stream and riverbed material away from the foundations of a bridge. Other areas of current research include modeling of flow through culverts to assess them for fish passage, modeling of the salt spray transport into bridge girders to address suitability of using weathering steel in bridges, vehicle stability under high wind loading, and the use of electromagnetic shock absorbers to improve vehicle stability under high wind conditions. 
This quarterly report documents technical progress on the project tasks for the period of April through June 2011.
Report on Information Retrieval and Library Automation Studies.
ERIC Educational Resources Information Center
Alberta Univ., Edmonton. Dept. of Computing Science.
Short abstracts of works in progress or completed in the Department of Computing Science at the University of Alberta are presented under five major headings. The five categories are: Storage and search techniques for document data bases, Automatic classification, Study of indexing and classification languages through computer manipulation of data…
The Computer's Debt to Science.
ERIC Educational Resources Information Center
Branscomb, Lewis M.
1984-01-01
Discusses discoveries and applications of science that have enabled the computer industry to introduce new technology each year and produce 25 percent more for the customer at constant cost. Potential limits to progress, disc storage technology, programming and end-user interface, and designing for ease of use are considered. Glossary is included.…
Research in progress in applied mathematics, numerical analysis, and computer science
NASA Technical Reports Server (NTRS)
1990-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science is summarized. The Institute conducts unclassified basic research in applied mathematics in order to extend and improve problem solving capabilities in science and engineering, particularly in aeronautics and space.
Computer-Assisted Learning in Elementary Reading: A Randomized Control Trial
ERIC Educational Resources Information Center
Shannon, Lisa Cassidy; Styers, Mary Koenig; Wilkerson, Stephanie Baird; Peery, Elizabeth
2015-01-01
This study evaluated the efficacy of Accelerated Reader, a computer-based learning program, at improving student reading. Accelerated Reader is a progress-monitoring, assessment, and practice tool that supports classroom instruction and guides independent reading. Researchers used a randomized controlled trial to evaluate the program with 344…
A Computer-Aided Exercise for Checking Novices' Understanding of Market Equilibrium Changes.
ERIC Educational Resources Information Center
Katz, Arnold
1999-01-01
Describes a computer-aided supplement to the introductory microeconomics course that enhances students' understanding with simulation-based tools for reviewing what they have learned from lectures and conventional textbooks about comparing market equilibria. Includes a discussion of students' learning progressions and retention after using the…
ERIC Educational Resources Information Center
Moore, Colleen F.; And Others
1991-01-01
Examined the development of proportional reasoning by means of a temperature mixture task. Results show the importance of distinguishing between intuitive knowledge and formal computational knowledge of proportional concepts. Provides a new perspective on the relation of intuitive and computational knowledge during development. (GLR)
A Deep Learning Approach to Neuroanatomical Characterisation of Alzheimer's Disease.
Ambastha, Abhinit Kumar; Leong, Tze-Yun
2017-01-01
Alzheimer's disease (AD) is a neurological degenerative disorder that leads to progressive mental deterioration. This work introduces a computational approach to improve our understanding of the progression of AD. We use ensemble learning methods and deep neural networks to identify salient structural correlations among brain regions that degenerate together in AD; this provides an understanding of how AD progresses in the brain. The proposed technique has a classification accuracy of 81.79% for AD against healthy subjects using a single modality imaging dataset.
Integration of progressive hedging and dual decomposition in stochastic integer programs
Watson, Jean-Paul; Guo, Ge; Hackebeil, Gabriel; ...
2015-04-07
We present a method for integrating the Progressive Hedging (PH) algorithm and the Dual Decomposition (DD) algorithm of Carøe and Schultz for stochastic mixed-integer programs. Based on the correspondence between lower bounds obtained with PH and DD, a method to transform weights from PH into Lagrange multipliers in DD is found. Fast progress in early iterations of PH speeds up convergence of DD to an exact solution. We report computational results on server location and unit commitment instances.
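The PH-to-DD correspondence described above rests on the PH weight update w_s <- w_s + rho*(x_s - x_bar), whose fixed point gives the Lagrange multipliers of the nonanticipativity constraints. A minimal sketch on a toy problem with closed-form scenario subproblems (the scenario data, rho, and iteration count are illustrative assumptions, not values from the paper):

```python
# Toy Progressive Hedging sketch: minimize sum_s p_s * (x - a_s)^2 over
# scenarios s, enforcing a common (nonanticipative) decision x.
# Subproblems are solved in closed form so the example stays self-contained.

def progressive_hedging(scenarios, rho=1.0, n_iter=100):
    """scenarios: list of (probability, a_s) pairs.
    Returns the consensus solution x_bar and the PH weights w_s."""
    w = [0.0] * len(scenarios)                     # PH weights (dual estimates)
    x_bar = sum(p * a for p, a in scenarios)       # initial consensus guess
    for _ in range(n_iter):
        # Augmented scenario subproblem:
        #   argmin_x (x - a_s)^2 + w_s*x + (rho/2)*(x - x_bar)^2
        # Setting the derivative to zero gives the closed form below.
        xs = [(2 * a - w_s + rho * x_bar) / (2 + rho)
              for (p, a), w_s in zip(scenarios, w)]
        x_bar = sum(p * x for (p, _), x in zip(scenarios, xs))
        # PH weight update: w_s <- w_s + rho * (x_s - x_bar)
        w = [w_s + rho * (x - x_bar) for w_s, x in zip(w, xs)]
    return x_bar, w
```

At convergence the weights satisfy sum_s p_s * w_s = 0 and coincide with the nonanticipativity multipliers, which is what makes a transformation from PH weights to DD Lagrange multipliers possible in this setting.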
ERIC Educational Resources Information Center
White, Sheida; Kim, Young Yee; Chen, Jing; Liu, Fei
2015-01-01
This study examined whether or not fourth-graders could fully demonstrate their writing skills on the computer and factors associated with their performance on the National Assessment of Educational Progress (NAEP) computer-based writing assessment. The results suggest that high-performing fourth-graders (those who scored in the upper 20 percent…
Technology in Education. The Progress of Education Reform, 2006. Volume 6, Number 6
ERIC Educational Resources Information Center
Weiss, Suzanne
2006-01-01
For policymakers, educators and others interested in learning more about the one-to-one computing movement, this issue of "The Progress of Education Reform" spotlights three particularly useful resources: (1) a detailed review of the challenges faced by states and districts implementing laptop programs, and of lessons learned to date in…
ERIC Educational Resources Information Center
Koffarnus, Mikhail N.; DeFulio, Anthony; Sigurdsson, Sigurdur O.; Silverman, Kenneth
2013-01-01
Advancing the education of low-income adults could increase employment and income, but adult education programs have not successfully engaged low-income adults. Monetary reinforcement may be effective in promoting progress in adult education. This experiment evaluated the benefits of providing incentives for performance in a job-skills training…
Research in Progress--Update April 1990. Occasional Paper InTER/14/90.
ERIC Educational Resources Information Center
Boots, Maureen, Comp.
This document contains abstracts of 29 research projects in progress in Great Britain divided into six sections: (1) the current phase of Information Technology in Education Research (InTER) programs on groupwork with computers, tools for exploratory learning, conceptual change in science, and bubble dialogue as an ethnographic research tool; (2)…
Quantification of Energy Release in Composite Structures
NASA Technical Reports Server (NTRS)
Minnetyan, Levon
2003-01-01
Energy release rate is usually suggested as a quantifier for assessing structural damage tolerance. Computational prediction of energy release rate is based on composite mechanics with micro-stress level damage assessment, finite element structural analysis and damage progression tracking modules. This report examines several issues associated with energy release rates in composite structures as follows: Chapter I demonstrates computational simulation of an adhesively bonded composite joint and validates the computed energy release rates by comparison with acoustic emission signals in the overall sense. Chapter II investigates the effect of crack plane orientation with respect to fiber direction on the energy release rates. Chapter III quantifies the effects of contiguous constraint plies on the residual stiffness of a 90 deg ply subjected to transverse tensile fractures. Chapter IV compares ICAN and ICAN/JAVA solutions of composites. Chapter V examines the effects of composite structural geometry and boundary conditions on damage progression characteristics.
Quantification of Energy Release in Composite Structures
NASA Technical Reports Server (NTRS)
Minnetyan, Levon; Chamis, Christos C. (Technical Monitor)
2003-01-01
Energy release rate is usually suggested as a quantifier for assessing structural damage tolerance. Computational prediction of energy release rate is based on composite mechanics with micro-stress level damage assessment, finite element structural analysis and damage progression tracking modules. This report examines several issues associated with energy release rates in composite structures as follows: Chapter I demonstrates computational simulation of an adhesively bonded composite joint and validates the computed energy release rates by comparison with acoustic emission signals in the overall sense. Chapter II investigates the effect of crack plane orientation with respect to fiber direction on the energy release rates. Chapter III quantifies the effects of contiguous constraint plies on the residual stiffness of a 90 deg ply subjected to transverse tensile fractures. Chapter IV compares ICAN and ICAN/JAVA solutions of composites. Chapter V examines the effects of composite structural geometry and boundary conditions on damage progression characteristics.
Multiscale Multifunctional Progressive Fracture of Composite Structures
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Minnetyan, L.
2012-01-01
A new approach is described for evaluating fracture in composite structures. This approach is independent of classical fracture mechanics parameters like fracture toughness. It relies on computational simulation and is programmed in a stand-alone integrated computer code. It is multiscale and multifunctional because it includes composite mechanics for the composite behavior and finite element analysis for predicting the structural response. It contains seven modules: layered composite mechanics (micro, macro, laminate), finite element, updating scheme, local fracture, global fracture, stress-based failure modes, and fracture progression. The computer code is called CODSTRAN (COmposite Durability STRuctural ANalysis). It is used in the present paper to evaluate the global fracture of four composite shell problems and one composite built-up structure. Results show that, for the composite shells, global fracture is enhanced when internal pressure is combined with shear loads. An earlier reference indicates that nothing has been added to this comprehensive report since then.
The Mesa Arizona Pupil Tracking System
NASA Technical Reports Server (NTRS)
Wright, D. L.
1973-01-01
A computer-based Pupil Tracking/Teacher Monitoring System was designed for Mesa Public Schools, Mesa, Arizona. The established objectives of the system were to: (1) facilitate the economical collection and storage of student performance data necessary to objectively evaluate the relative effectiveness of teachers, instructional methods, materials, and applied concepts; and (2) identify, on a daily basis, those students requiring special attention in specific subject areas. The system encompasses computer hardware/software and integrated curricula progression/administration devices. It provides daily evaluation and monitoring of performance as students progress at class or individualized rates. In the process, it notifies the student and collects information necessary to validate or invalidate subject presentation devices, methods, materials, and measurement devices in terms of direct benefit to the students. The system utilizes a small-scale computer (e.g., IBM 1130) to assure low-cost replicability, and may be used for many subjects of instruction.
NASA Astrophysics Data System (ADS)
Kubina, Stanley J.
1989-09-01
The review of the status of computational electromagnetics by Miller, the exposition by Burke of developments in the Numerical Electromagnetic Code (NEC), one of the more important computer codes applying the electric field integral equation method, and Molinet's summary of progress in techniques based on the Geometrical Theory of Diffraction (GTD) together provide a clear perspective on the maturity of the modern discipline of computational electromagnetics and its potential. Audone's exposition of the application to the computation of Radar Scattering Cross-section (RCS) indicates the breadth of practical applications, and his exploitation of modern near-field measurement techniques is a reminder of progress in the measurement discipline, which is essential to the validation or calibration of computational modeling methodology applied to complex structures such as aircraft and ships. The latter monograph also presents some comparison results with computational models. Some of the results presented for scale-model and flight measurements show serious disagreements in the lobe structure that would require detailed examination. This also applies to the radiation patterns obtained by flight measurement compared with those obtained using wire-grid models and integral equation modeling methods. In the examples that follow, an attempt is made to match measurement results completely over the entire 2 to 30 MHz HF range for antennas on a large patrol aircraft. The problems of validating computer models of HF antennas on a helicopter and of using computer models to generate radiation pattern information that cannot be obtained by measurement are discussed. The use of NEC computer models to analyze top-side ship configurations, where measurement results are not available and only self-validation measures or, at best, comparisons with an alternate GTD computer modeling technique are possible, is also discussed.
NASA Technical Reports Server (NTRS)
1993-01-01
This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics and computer science during the period April 1, 1993 through September 30, 1993. The major categories of the current ICASE research program are: (1) applied and numerical mathematics, including numerical analysis and algorithm development; (2) theoretical and computational research in fluid mechanics in selected areas of interest to LaRC, including acoustic and combustion; (3) experimental research in transition and turbulence and aerodynamics involving LaRC facilities and scientists; and (4) computer science.
Chinese Herbal Medicine Meets Biological Networks of Complex Diseases: A Computational Perspective
Gu, Shuo
2017-01-01
With the rapid development of cheminformatics, computational biology, and systems biology, great progress has been made recently in the computational research of Chinese herbal medicine with in-depth understanding towards pharmacognosy. This paper summarized these studies in the aspects of computational methods, traditional Chinese medicine (TCM) compound databases, and TCM network pharmacology. Furthermore, we chose arachidonic acid metabolic network as a case study to demonstrate the regulatory function of herbal medicine in the treatment of inflammation at network level. Finally, a computational workflow for the network-based TCM study, derived from our previous successful applications, was proposed. PMID:28690664
Chinese Herbal Medicine Meets Biological Networks of Complex Diseases: A Computational Perspective.
Gu, Shuo; Pei, Jianfeng
2017-01-01
With the rapid development of cheminformatics, computational biology, and systems biology, great progress has been made recently in the computational research of Chinese herbal medicine with in-depth understanding towards pharmacognosy. This paper summarized these studies in the aspects of computational methods, traditional Chinese medicine (TCM) compound databases, and TCM network pharmacology. Furthermore, we chose arachidonic acid metabolic network as a case study to demonstrate the regulatory function of herbal medicine in the treatment of inflammation at network level. Finally, a computational workflow for the network-based TCM study, derived from our previous successful applications, was proposed.
NASA Technical Reports Server (NTRS)
1994-01-01
This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period October 1, 1993 through March 31, 1994. The major categories of the current ICASE research program are: (1) applied and numerical mathematics, including numerical analysis and algorithm development; (2) theoretical and computational research in fluid mechanics in selected areas of interest to LaRC, including acoustics and combustion; (3) experimental research in transition and turbulence and aerodynamics involving LaRC facilities and scientists; and (4) computer science.
Damage Tolerance of Large Shell Structures
NASA Technical Reports Server (NTRS)
Minnetyan, L.; Chamis, C. C.
1999-01-01
Progressive damage and fracture of large shell structures is investigated. A computer model is used for the assessment of structural response, progressive fracture resistance, and defect/damage tolerance characteristics. Critical locations of a stiffened conical shell segment are identified. Defective and defect-free computer models are simulated to evaluate structural damage/defect tolerance. Safe pressurization levels are assessed for the retention of structural integrity at the presence of damage/ defects. Damage initiation, growth, accumulation, and propagation to fracture are included in the simulations. Damage propagation and burst pressures for defective and defect-free shells are compared to evaluate damage tolerance. Design implications with regard to defect and damage tolerance of a large steel pressure vessel are examined.
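Damage-progression simulations of the kind summarized above iterate three steps: increment the load, check a failure criterion, and redistribute load from failed material until either equilibrium or burst. A minimal sketch under an equal-load-sharing fiber-bundle simplification (the function name, strength values, and step size are illustrative assumptions, not the report's model):

```python
# Toy progressive-failure model: a bundle of elements with differing
# strengths is loaded incrementally; when an element's share of the load
# exceeds its strength it fails and sheds load to the survivors, possibly
# triggering a cascade. The load at which no element survives is the
# "burst" load of the bundle.

def burst_load(strengths, step=0.01):
    """Return the applied load per original element at which the whole
    bundle fails (equal-load-sharing fiber bundle)."""
    intact = sorted(strengths)                 # surviving strengths, ascending
    n = len(strengths)
    load = 0.0
    while intact:
        load += step                           # damage step: increment load
        per_element = load * n / len(intact)   # share carried by each survivor
        # Cascade: fail the weakest survivor, redistribute, re-check
        while intact and per_element > intact[0]:
            intact.pop(0)
            if intact:
                per_element = load * n / len(intact)
        if not intact:
            return load                        # burst: total loss of integrity
    return load
```

For strengths [1, 2, 3, 4] the bundle carries at most 1.5 per original element (reached just before the second failure triggers a full cascade), which mirrors how burst pressure emerges from tracked damage accumulation rather than from a single-defect criterion.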
Quantum Computation: Entangling with the Future
NASA Technical Reports Server (NTRS)
Jiang, Zhang
2017-01-01
Commercial applications of quantum computation have become viable due to the rapid progress of the field in recent years. Efficient quantum algorithms have been discovered to cope with the most challenging real-world problems that are too hard for classical computers. Manufactured quantum hardware has reached unprecedented precision and controllability, enabling fault-tolerant quantum computation. Here, I give a brief introduction to the principles of quantum mechanics that promise its unparalleled computational power. I will discuss several important quantum algorithms that achieve exponential or polynomial speedup over any classical algorithm. Building a quantum computer is a daunting task, and I will talk about the criteria and various implementations of quantum computers. I conclude the talk with near-future commercial applications of a quantum computer.
Numerical propulsion system simulation: An interdisciplinary approach
NASA Technical Reports Server (NTRS)
Nichols, Lester D.; Chamis, Christos C.
1991-01-01
The tremendous progress being made in computational engineering and the rapid growth in computing power that is resulting from parallel processing now make it feasible to consider the use of computer simulations to gain insights into the complex interactions in aerospace propulsion systems and to evaluate new concepts early in the design process before a commitment to hardware is made. Described here is a NASA initiative to develop a Numerical Propulsion System Simulation (NPSS) capability.
MIT Laboratory for Computer Science Progress Report, July 1984-June 1985
1985-06-01
Excerpts from the report cover work on larger (up to several thousand machines) multiprocessor systems, a facility funded by the newly formed Strategic Computing Program of the Defense...; a clinical decision-making group (Szolovits, Group Leader; R. Patil; collaborating investigators M. Criscitiello, M.D., Tufts-New England Medical Center Hospital; J. Dzierzanowski, Ph.D., Dept. ...); and the Computation Structures group (academic staff: J. B. Dennis, Group Leader; research staff: W. B. Ackerman, G. A. Boughton, W. Y-P. Lim; graduate students: T-A. Chu, S...).
ERIC Educational Resources Information Center
Yau, Maria; And Others
Fifty-six Toronto (Ontario, Canada) seventh-grade and eighth-grade learning-disabled students whose handwriting was very difficult to read were randomly assigned to either an experimental or comparison group. Experimental group students were loaned a portable computer to use freely at school and at home during the course of the experiment.…
Remote Science Operation Center research
NASA Technical Reports Server (NTRS)
Banks, P. M.
1986-01-01
Progress in the following areas is discussed: the design, planning and operation of a remote science payload operations control center; design and planning of a data link via satellite; and the design and prototyping of an advanced workstation environment for multi-media (3-D computer aided design/computer aided engineering, voice, video, text) communications and operations.
Alternative Delivery Systems for the Computer-Aided Instruction Study Management System (CAISMS).
ERIC Educational Resources Information Center
Nievergelt, Jurg; And Others
The Computer-Assisted Instruction Study Management System (CAISMS) was developed and implemented on the PLATO system to monitor and guide student study of text materials. It administers assignments, gives quizzes, and automatically keeps track of a student's progress. This report describes CAISMS and several hypothetical implementations of CAISMS…
Computer-Aided Engineering Tools | Water Power | NREL
energy converters that will provide a full range of simulation capabilities for single devices and arrays. Simulation of water power technologies on high-performance computers enables the study of complex systems and experimentation; such simulation is critical to accelerating progress in energy programs within the U.S. Department of Energy.
Lumber Grading With A Computer Vision System
Richard W. Conners; Tai-Hoon Cho; Philip A. Araman
1989-01-01
Over the past few years significant progress has been made in developing a computer vision system for locating and identifying defects on surfaced hardwood lumber. Unfortunately, until September of 1988 little research had gone into developing methods for analyzing rough lumber. This task is arguably more complex than the analysis of surfaced lumber. The prime...
An Empirical Generative Framework for Computational Modeling of Language Acquisition
ERIC Educational Resources Information Center
Waterfall, Heidi R.; Sandbank, Ben; Onnis, Luca; Edelman, Shimon
2010-01-01
This paper reports progress in developing a computer model of language acquisition in the form of (1) a generative grammar that is (2) algorithmically learnable from realistic corpus data, (3) viable in its large-scale quantitative performance and (4) psychologically real. First, we describe new algorithmic methods for unsupervised learning of…
ERIC Educational Resources Information Center
King, Dolores
2013-01-01
African American women are underrepresented in computer technology disciplines in institutions of higher education throughout the United States. Although equitable gender representation is progressing in most fields, much less information is available on why institutions are still lagging in workforce diversity, a problem which can be lessened by…
ERIC Educational Resources Information Center
Mitzel, Harold E.; Brandon, George L.
A series of five reports is presented which describes the activities carried out by the Pennsylvania State University group engaged in research in computer-assisted instruction (CAI) in vocational-technical education. The reports cover the period January 1968-June 1968 and deal with: 1) prior knowledge and individualized instruction; 2) numerical…
Tying Theory To Practice: Cognitive Aspects of Computer Interaction in the Design Process.
ERIC Educational Resources Information Center
Mikovec, Amy E.; Dake, Dennis M.
The new medium of computer-aided design requires changes to the creative problem-solving methodologies typically employed in the development of new visual designs. Most theoretical models of creative problem-solving suggest a linear progression from preparation and incubation to some type of evaluative study of the "inspiration." These…
Creation and Development of an Integrated Model of New Technologies and ESP
ERIC Educational Resources Information Center
Garcia Laborda, Jesus
2004-01-01
It seems irrefutable that the world is progressing in concert with computer science. Educational applications and projects for first and second language acquisition have not been left behind. However, currently it seems that the reputation of completely computer-based language learning courses has taken a nosedive, and, consequently there has been…
Component-Based Approach for Educating Students in Bioinformatics
ERIC Educational Resources Information Center
Poe, D.; Venkatraman, N.; Hansen, C.; Singh, G.
2009-01-01
There is an increasing need for an effective method of teaching bioinformatics. Increased progress and availability of computer-based tools for educating students have led to the implementation of a computer-based system for teaching bioinformatics as described in this paper. Bioinformatics is a recent, hybrid field of study combining elements of…
Performance Measures in Courses Using Computer-Aided Personalized System of Instruction
ERIC Educational Resources Information Center
Springer, C. R.; Pear, J. J.
2008-01-01
Archived data from four courses taught with computer-aided personalized system of instruction (CAPSI)--an online, self-paced, instructional program--were used to explore the relationship between objectively rescored final exam grades, peer reviewing, and progress rate--i.e., the rate at which students completed unit tests. There was a strong…
2013-01-01
Background: The glacial and interglacial cycles that characterized the Quaternary greatly affected the distribution and genetic diversity of plants. In the Neotropics, few phylogeographic studies have focused on coastal species outside of the Atlantic Rainforest. Climatic and sea level changes during the Quaternary played an important role in the evolutionary history of many organisms found in coastal regions. To contribute to a better understanding of plant evolution in this environment in Southern South America, we focused on Calibrachoa heterophylla (Solanaceae), an endemic and vulnerable wild petunia species from the South Atlantic Coastal Plain (SACP). Results: We assessed DNA sequences from two cpDNA intergenic spacers and analyzed them using a phylogeographic approach. The present phylogeographic study reveals the influence of complex geologic and climatic events on patterns of genetic diversification. The results indicate that C. heterophylla originated inland and subsequently colonized the SACP; the data show that the inland haplogroup is more ancient than the coastal one and that the inland was not affected by sea level changes in the Quaternary. The major diversification of C. heterophylla that occurred after 0.4 Myr was linked to sea level oscillations in the Quaternary, and any diversification that occurred before this time was obscured by marine transgressions that occurred before the coastal sand barrier's formation. Results of the Bayesian skyline plot showed that the recent population expansion detected in C. heterophylla seems to be related to an increase in temperature and humidity that occurred at the beginning of the Holocene. Conclusions: The geographic clades formed when the coastal plain was deeply dissected by paleochannels, and these paleochannels correlate very well with the distributional limits of the clades.
The four major sea transgressions formed a series of four sand barriers parallel to the coast that progressively increased the availability of coastal areas after the regressions and that may have promoted the geographic structuring of genetic diversity observed today. The recent population expansion for the entire species may be linked with the event of marine regression after the most recent sea transgression at ~5 kya. PMID:23987105
Mesoscale Models of Fluid Dynamics
NASA Astrophysics Data System (ADS)
Boghosian, Bruce M.; Hadjiconstantinou, Nicolas G.
During the last half century, enormous progress has been made in the field of computational materials modeling, to the extent that in many cases computational approaches are used in a predictive fashion. Despite this progress, modeling of general hydrodynamic behavior remains a challenging task. One of the main challenges stems from the fact that hydrodynamics manifests itself over a very wide range of length and time scales. On one end of the spectrum, one finds the fluid's "internal" scale characteristic of its molecular structure (in the absence of quantum effects, which we omit in this chapter). On the other end, the "outer" scale is set by the characteristic sizes of the problem's domain. The resulting scale separation or lack thereof, as well as the existence of intermediate scales, is key to determining the optimal approach. Successful treatments require a judicious choice of the level of description, which is a delicate balancing act between the conflicting requirements of fidelity and manageable computational cost: a coarse description typically requires models for underlying processes occurring at smaller length and time scales; on the other hand, a fine-scale model will incur a significantly larger computational cost.
Progress in fast, accurate multi-scale climate simulations
Collins, W. D.; Johansen, H.; Evans, K. J.; ...
2015-06-01
We present a survey of physical and computational techniques that have the potential to contribute to the next generation of high-fidelity, multi-scale climate simulations. Examples of the climate science problems that can be investigated in more depth with these computational improvements include the capture of remote forcings of localized hydrological extreme events, an accurate representation of cloud features over a range of spatial and temporal scales, and parallel, large ensembles of simulations to more effectively explore model sensitivities and uncertainties. Numerical techniques, such as adaptive mesh refinement, implicit time integration, and separate treatment of fast physical time scales, are enabling improved accuracy and fidelity in simulation of dynamics and allowing more complete representations of climate features at the global scale. At the same time, partnerships with computer science teams have focused on taking advantage of evolving computer architectures such as many-core processors and GPUs. As a result, approaches which were previously considered prohibitively costly have become both more efficient and scalable. In combination, progress in these three critical areas is poised to transform climate modeling in the coming decades.
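Implicit time integration appears in the list above because it remains stable on fast physical time scales where explicit schemes must take prohibitively small steps. A minimal sketch on the stiff linear test equation y' = -k*y (parameter values in the usage note are illustrative, not from the survey):

```python
# Contrast of explicit (forward) and implicit (backward) Euler on y' = -k*y.
# The true solution decays; the question is whether the scheme does too
# when the step size dt is large relative to the fast time scale 1/k.

def forward_euler(k, y0, dt, n):
    y = y0
    for _ in range(n):
        y = y + dt * (-k * y)        # explicit update: y *= (1 - dt*k)
    return y

def backward_euler(k, y0, dt, n):
    y = y0
    for _ in range(n):
        # Implicit update: solve y_next = y + dt*(-k*y_next) for y_next.
        # For this linear equation the solve is closed form.
        y = y / (1.0 + dt * k)
    return y
```

With k = 1000 and dt = 0.01 (so dt*k = 10), forward Euler multiplies the solution by |1 - dt*k| = 9 each step and diverges, while backward Euler damps it by 1/(1 + dt*k) = 1/11 each step; this unconditional damping is what lets implicit schemes step over fast time scales.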
Recent achievements in restorative neurology: Progressive neuromuscular diseases
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dimitrijevic, M.R.; Kakulas, B.A.; Vrbova, G.
1986-01-01
This book contains 27 chapters. Some of the chapter titles are: Computed Tomography of Muscles in Neuromuscular Disease; Mapping the Genes for Muscular Dystrophy; Trophic Factors and Motor Neuron Development; Size of Motor Units and Firing Rate in Muscular Dystrophy; Restorative Possibilities in Relation to the Pathology of Progressive Neuromuscular Disease; and An Approach to the Pathogenesis of some Congenital Myopathies.
Celestial mechanics during the last two decades
NASA Technical Reports Server (NTRS)
Szebehely, V.
1978-01-01
The unprecedented progress in celestial mechanics (orbital mechanics, astrodynamics, space dynamics) is reviewed from 1957 to date. The engineering, astronomical and mathematical aspects are synthesized. The measuring and computational techniques developed parallel with the theoretical advances are outlined. Major unsolved problem areas are listed with proposed approaches for their solutions. Extrapolations and predictions of the progress for the future conclude the paper.
COSMIC monthly progress report
NASA Technical Reports Server (NTRS)
1994-01-01
Activities of the Computer Software Management and Information Center (COSMIC) are summarized for the month of January 1994. Tables showing the current inventory of programs available from COSMIC are presented and program processing and evaluation activities are discussed. Marketing and customer service activities in this period are presented as is the progress report of NASTRAN maintenance and support. Tables of disseminations and budget summary conclude the report.
Computational composite mechanics for aerospace propulsion structures
NASA Technical Reports Server (NTRS)
Chamis, C. C.
1986-01-01
Specialty methods are presented for the computational simulation of specific composite behavior. These methods encompass all aspects of composite mechanics, impact, progressive fracture and component specific simulation. Some of these methods are structured to computationally simulate, in parallel, the composite behavior and history from the initial fabrication through several missions and even to fracture. Select methods and typical results obtained from such simulations are described in detail in order to demonstrate the effectiveness of computationally simulating (1) complex composite structural behavior in general and (2) specific aerospace propulsion structural components in particular.
Computational composite mechanics for aerospace propulsion structures
NASA Technical Reports Server (NTRS)
Chamis, Christos C.
1987-01-01
Specialty methods are presented for the computational simulation of specific composite behavior. These methods encompass all aspects of composite mechanics, impact, progressive fracture and component specific simulation. Some of these methods are structured to computationally simulate, in parallel, the composite behavior and history from the initial fabrication through several missions and even to fracture. Select methods and typical results obtained from such simulations are described in detail in order to demonstrate the effectiveness of computationally simulating: (1) complex composite structural behavior in general, and (2) specific aerospace propulsion structural components in particular.
In Search of the Neural Circuits of Intrinsic Motivation
Kaplan, Frederic; Oudeyer, Pierre-Yves
2007-01-01
Children seem to acquire new know-how in a continuous and open-ended manner. In this paper, we hypothesize that an intrinsic motivation to progress in learning is at the origins of the remarkable structure of children's developmental trajectories. In this view, children engage in exploratory and playful activities for their own sake, not as steps toward other extrinsic goals. The central hypothesis of this paper is that intrinsically motivating activities correspond to expected decrease in prediction error. This motivation system pushes the infant to avoid both predictable and unpredictable situations in order to focus on the ones that are expected to maximize progress in learning. Based on a computational model and a series of robotic experiments, we show how this principle can lead to organized sequences of behavior of increasing complexity characteristic of several behavioral and developmental patterns observed in humans. We then discuss the putative circuitry underlying such an intrinsic motivation system in the brain and formulate two novel hypotheses. The first one is that tonic dopamine acts as a learning progress signal. The second is that this progress signal is directly computed through a hierarchy of microcortical circuits that act both as prediction and metaprediction systems. PMID:18982131
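The "maximize learning progress" selection rule described above can be illustrated with a toy simulation (hypothetical error curves, not the authors' model): each activity yields a prediction error per practice step, and the agent picks whichever activity showed the largest recent drop in error. Already-predictable and unpredictable activities show zero progress, so the agent settles on the learnable one.

```python
def learning_progress_agent(activities, window=5, steps=200):
    """Pick, at each step, the activity whose prediction error dropped most
    over its recent history (its 'learning progress'). Activities with
    fewer than 2*window samples are explored first."""
    practice = {a: 0 for a in activities}
    errors = {a: [] for a in activities}
    choices = []
    for _ in range(steps):
        def progress(a):
            h = errors[a]
            if len(h) < 2 * window:
                return float("inf")   # force initial exploration
            older = sum(h[-2 * window:-window]) / window
            recent = sum(h[-window:]) / window
            return older - recent     # positive = error is decreasing
        best = max(activities, key=progress)
        practice[best] += 1
        errors[best].append(activities[best](practice[best]))
        choices.append(best)
    return choices

# Hypothetical activities: prediction error as a function of practice count.
acts = {
    "predictable": lambda n: 0.0,                # nothing left to learn
    "unpredictable": lambda n: 1.0,              # noise floor never improves
    "learnable": lambda n: 1.0 / (1 + 0.1 * n),  # steady progress
}
choices = learning_progress_agent(acts)
```

After a brief exploration phase the agent focuses exclusively on the learnable activity, mirroring the paper's claim that the system avoids both predictable and unpredictable situations.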
Rejniak, Katarzyna A.; Gerlee, Philip
2013-01-01
In this review we summarize our recent efforts using mathematical modeling and computation to simulate cancer invasion, with a special emphasis on the tumor microenvironment. We consider cancer progression as a complex multiscale process and approach it with three single-cell based mathematical models that examine the interactions between tumor microenvironment and cancer cells at several scales. The models exploit distinct mathematical and computational techniques, yet they share core elements and can be compared and/or related to each other. The overall aim of using mathematical models is to uncover the fundamental mechanisms that lend cancer progression its direction towards invasion and metastasis. The models effectively simulate various modes of cancer cell adaptation to the microenvironment in a growing tumor. All three point to a general mechanism underlying cancer invasion: competition for adaptation between distinct cancer cell phenotypes, driven by a tumor microenvironment with scarce resources. These theoretical predictions pose an intriguing experimental challenge: test the hypothesis that invasion is an emergent property of cancer cell populations adapting to selective microenvironment pressure, rather than culmination of cancer progression producing cells with the “invasive phenotype”. In broader terms, we propose that fundamental insights into cancer can be achieved by experimentation interacting with theoretical frameworks provided by computational and mathematical modeling. PMID:18524624
Validation of CBCT for the computation of textural biomarkers
NASA Astrophysics Data System (ADS)
Paniagua, Beatriz; Ruellas, Antonio C.; Benavides, Erika; Marron, Steve; Wolford, Larry; Cevidanes, Lucia
2015-03-01
Osteoarthritis (OA) is associated with significant pain and 42.6% of patients with TMJ disorders present with evidence of TMJ OA. However, OA diagnosis and treatment remain controversial, since there are no clear symptoms of the disease. The subchondral bone in the TMJ is believed to play a major role in the progression of OA. We hypothesize that the textural imaging biomarkers computed in high resolution Conebeam CT (hr-CBCT) and μCT scans are comparable. The purpose of this study is to test the feasibility of computing textural imaging biomarkers in-vivo using hr-CBCT, compared to those computed in μCT scans as our Gold Standard. Specimens of condylar bones obtained from condylectomies were scanned using μCT and hr-CBCT. Nine different textural imaging biomarkers (four co-occurrence features and five run-length features) from each pair of μCT and hr-CBCT were computed and compared. Pearson correlation coefficients were computed to compare textural biomarkers values of μCT and hr-CBCT. Four of the nine computed textural biomarkers showed a strong positive correlation between biomarkers computed in μCT and hr-CBCT. Higher correlations in Energy and Contrast, and in GLN (grey-level non-uniformity) and RLN (run length non-uniformity) indicate quantitative texture features can be computed reliably in hr-CBCT, when compared with μCT. The textural imaging biomarkers computed in-vivo hr-CBCT have captured the structure, patterns, contrast between neighboring regions and uniformity of healthy and/or pathologic subchondral bone. The ability to quantify bone texture non-invasively now makes it possible to evaluate the progression of subchondral bone alterations, in TMJ OA.
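The co-occurrence features mentioned (Energy, Contrast) are standard grey-level co-occurrence matrix (GLCM) statistics. A minimal sketch on a tiny made-up 4×4 patch with 4 grey levels (the study's actual feature definitions may differ in offsets and normalization):

```python
def glcm(image, dx=1, dy=0, levels=4):
    """Grey-level co-occurrence matrix: counts of level pairs (i, j) for
    pixel pairs separated by offset (dx, dy), normalized to probabilities."""
    h, w = len(image), len(image[0])
    counts = [[0] * levels for _ in range(levels)]
    total = 0
    for y in range(h):
        for x in range(w):
            y2, x2 = y + dy, x + dx
            if 0 <= y2 < h and 0 <= x2 < w:
                counts[image[y][x]][image[y2][x2]] += 1
                total += 1
    return [[c / total for c in row] for row in counts]

def energy(p):
    """Sum of squared co-occurrence probabilities (angular second moment)."""
    return sum(v * v for row in p for v in row)

def contrast(p):
    """Probability-weighted squared grey-level difference."""
    return sum(p[i][j] * (i - j) ** 2
               for i in range(len(p)) for j in range(len(p)))

patch = [[0, 0, 1, 1],
         [0, 0, 1, 1],
         [2, 2, 3, 3],
         [2, 2, 3, 3]]
P = glcm(patch)
```

High Energy indicates an orderly texture dominated by a few level pairs; high Contrast indicates large local grey-level differences, which is how these statistics separate healthy from pathologic bone texture.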
Validation of CBCT for the computation of textural biomarkers
Paniagua, Beatriz; Ruellas, Antonio Carlos; Benavides, Erika; Marron, Steve; Woldford, Larry; Cevidanes, Lucia
2015-01-01
Osteoarthritis (OA) is associated with significant pain and 42.6% of patients with TMJ disorders present with evidence of TMJ OA. However, OA diagnosis and treatment remain controversial, since there are no clear symptoms of the disease. The subchondral bone in the TMJ is believed to play a major role in the progression of OA. We hypothesize that the textural imaging biomarkers computed in high resolution Conebeam CT (hr-CBCT) and μCT scans are comparable. The purpose of this study is to test the feasibility of computing textural imaging biomarkers in-vivo using hr-CBCT, compared to those computed in μCT scans as our Gold Standard. Specimens of condylar bones obtained from condylectomies were scanned using μCT and hr-CBCT. Nine different textural imaging biomarkers (four co-occurrence features and five run-length features) from each pair of μCT and hr-CBCT were computed and compared. Pearson correlation coefficients were computed to compare textural biomarkers values of μCT and hr-CBCT. Four of the nine computed textural biomarkers showed a strong positive correlation between biomarkers computed in μCT and hr-CBCT. Higher correlations in Energy and Contrast, and in GLN (grey-level non-uniformity) and RLN (run length non-uniformity) indicate quantitative texture features can be computed reliably in hr-CBCT, when compared with μCT. The textural imaging biomarkers computed in-vivo hr-CBCT have captured the structure, patterns, contrast between neighboring regions and uniformity of healthy and/or pathologic subchondral bone. The ability to quantify bone texture non-invasively now makes it possible to evaluate the progression of subchondral bone alterations, in TMJ OA. PMID:26085710
Validation of CBCT for the computation of textural biomarkers.
Paniagua, Beatriz; Ruellas, Antonio Carlos; Benavides, Erika; Marron, Steve; Woldford, Larry; Cevidanes, Lucia
2015-03-17
Osteoarthritis (OA) is associated with significant pain and 42.6% of patients with TMJ disorders present with evidence of TMJ OA. However, OA diagnosis and treatment remain controversial, since there are no clear symptoms of the disease. The subchondral bone in the TMJ is believed to play a major role in the progression of OA. We hypothesize that the textural imaging biomarkers computed in high resolution Conebeam CT (hr-CBCT) and μCT scans are comparable. The purpose of this study is to test the feasibility of computing textural imaging biomarkers in-vivo using hr-CBCT, compared to those computed in μCT scans as our Gold Standard. Specimens of condylar bones obtained from condylectomies were scanned using μCT and hr-CBCT. Nine different textural imaging biomarkers (four co-occurrence features and five run-length features) from each pair of μCT and hr-CBCT were computed and compared. Pearson correlation coefficients were computed to compare textural biomarkers values of μCT and hr-CBCT. Four of the nine computed textural biomarkers showed a strong positive correlation between biomarkers computed in μCT and hr-CBCT. Higher correlations in Energy and Contrast, and in GLN (grey-level non-uniformity) and RLN (run length non-uniformity) indicate quantitative texture features can be computed reliably in hr-CBCT, when compared with μCT. The textural imaging biomarkers computed in-vivo hr-CBCT have captured the structure, patterns, contrast between neighboring regions and uniformity of healthy and/or pathologic subchondral bone. The ability to quantify bone texture non-invasively now makes it possible to evaluate the progression of subchondral bone alterations, in TMJ OA.
Theranostics of Neuroendocrine Tumors.
Lee, Sze Ting; Kulkarni, Harshad R; Singh, Aviral; Baum, Richard P
2017-10-01
Somatostatin receptor positron emission tomography/computed tomography using 68Ga-labeled somatostatin analogs is the mainstay for the evaluation of receptor status in neuroendocrine tumors (NETs). This translates towards better therapy options, with increasing evidence of peptide receptor radionuclide therapy (PRRT) as the treatment of choice for advanced or progressive NETs. There are benefits in progression-free and overall survival as well as a significant improvement in clinical condition. In patients with progressive NETs, fractionated, personalized PRRT results in good therapeutic responses with no significant severe hematological and/or renal toxicity, thus improving quality of life.
NASA Technical Reports Server (NTRS)
Manohar, Mareboyana; Tilton, James C.
1994-01-01
A progressive vector quantization (VQ) compression approach is discussed which decomposes image data into a number of levels using full search VQ. The final level is losslessly compressed, enabling lossless reconstruction. The computational difficulties are addressed by implementation on a massively parallel SIMD machine. We demonstrate progressive VQ on multispectral imagery obtained from the Advanced Very High Resolution Radiometer instrument and other Earth observation image data, and investigate the trade-offs in selecting the number of decomposition levels and codebook training method.
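The level-wise decomposition with a lossless final level can be sketched with a scalar stand-in for vector quantization (real VQ quantizes blocks of pixels against trained codebooks; the data and codebooks below are made up): each level full-search quantizes the residual left by the previous one, and keeping the final residual exactly makes reconstruction lossless, while the partial sums give progressively better lossy previews.

```python
def quantize(values, codebook):
    """Full-search quantization: map each value to its nearest codebook entry."""
    return [min(codebook, key=lambda c: abs(v - c)) for v in values]

def progressive_vq(values, codebooks):
    """Decompose a signal into successive quantization levels plus a final
    residual; storing the residual exactly enables lossless reconstruction."""
    levels = []
    residual = list(values)
    for cb in codebooks:
        q = quantize(residual, cb)
        levels.append(q)
        residual = [r - qv for r, qv in zip(residual, q)]
    return levels, residual  # residual is kept losslessly

def reconstruct(levels, residual=None):
    """Sum the decoded levels; adding the residual restores the original."""
    out = [0.0] * len(levels[0])
    for q in levels:
        out = [o + qv for o, qv in zip(out, q)]
    if residual is not None:
        out = [o + r for o, r in zip(out, residual)]
    return out

data = [3.2, -1.5, 0.7, 2.9]
codebooks = [[-2, 0, 2, 4], [-0.5, 0, 0.5]]
levels, residual = progressive_vq(data, codebooks)
```

Decoding `reconstruct(levels)` alone yields a coarse preview after one level and a finer one after two, mirroring the trade-off the abstract notes between the number of decomposition levels and fidelity.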
Artificial intelligence and deep learning - Radiology's next frontier?
Mayo, Ray Cody; Leung, Jessica
Tracing the use of computers in the radiology department from administrative functions through image acquisition, storage, and reporting, to early attempts at improved diagnosis, we begin to imagine possible new frontiers for their use in exam interpretation. Given their initially slow but ultimately substantial progress in the noninterpretive areas, we are left desiring and even expecting more in the interpretation realm. New technological advances may provide the next wave of progress and radiologists should be early adopters. Several potential applications are discussed and hopefully will serve to inspire future progress. Published by Elsevier Inc.
Fatty replacement of lower paraspinal muscles: normal and neuromuscular disorders
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hader, H.; Gadoth, N.; Heifetz, H.
1983-11-01
The physiologic replacement of the lower paraspinal muscles by fat was evaluated in 157 patients undergoing computed tomography for reasons unrelated to abnormalities of the locomotor system. Five patients with neuromuscular disorders were similarly evaluated. The changes were graded according to severity at three spinal levels: lower thoracic-upper lumbar, midlumbar, and lumbosacral. The results were analyzed in relation to age and gender. It was found that fatty replacement of paraspinal muscles is a normal age-progressive phenomenon most prominent in females. It progresses down the spine, being most advanced in the lumbosacral region. The severest changes in the five patients with neuromuscular disorders (three with poliomyelitis and two with progressive muscular dystrophy) consisted of complete muscle group replacement by fat. In postpoliomyelitis atrophy, the distribution was typically asymmetric and sometimes lacked clinical correlation. In muscular dystrophy, fatty replacement was symmetric, showing relative sparing of the psoas and multifidus muscles. In patients with neuromuscular diseases, computed tomography of muscles may be helpful in planning a better rehabilitation regimen.
Progressive Fracture of Composite Structures
NASA Technical Reports Server (NTRS)
Minnetyan, Levon
2001-01-01
This report includes the results of a research in which the COmposite Durability STRuctural ANalysis (CODSTRAN) computational simulation capabilities were augmented and applied to various structures for demonstration of the new features and verification. The first chapter of this report provides an introduction to the computational simulation or virtual laboratory approach for the assessment of damage and fracture progression characteristics in composite structures. The second chapter outlines the details of the overall methodology used, including the failure criteria and the incremental/iterative loading procedure with the definitions of damage, fracture, and equilibrium states. The subsequent chapters each contain an augmented feature of the code and/or demonstration examples. All but one of the presented examples contains laminated composite structures with various fiber/matrix constituents. For each structure simulated, damage initiation and progression mechanisms are identified and the structural damage tolerance is quantified at various degradation stages. Many chapters contain the simulation of defective and defect free structures to evaluate the effects of existing defects on structural durability.
Maldonado, Fabien; Duan, Fenghai; Raghunath, Sushravya M.; Rajagopalan, Srinivasan; Karwoski, Ronald A.; Garg, Kavita; Greco, Erin; Nath, Hrudaya; Robb, Richard A.; Bartholmai, Brian J.
2015-01-01
Rationale: Screening for lung cancer using low-dose computed tomography (CT) reduces lung cancer mortality. However, in addition to a high rate of benign nodules, lung cancer screening detects a large number of indolent cancers that generally belong to the adenocarcinoma spectrum. Individualized management of screen-detected adenocarcinomas would be facilitated by noninvasive risk stratification. Objectives: To validate that Computer-Aided Nodule Assessment and Risk Yield (CANARY), a novel image analysis software, successfully risk stratifies screen-detected lung adenocarcinomas based on clinical disease outcomes. Methods: We retrospectively identified 294 eligible patients diagnosed with lung adenocarcinoma spectrum lesions in the low-dose CT arm of the National Lung Screening Trial. The last low-dose CT scan before the diagnosis of lung adenocarcinoma was analyzed using CANARY blinded to clinical data. Based on their parametric CANARY signatures, all the lung adenocarcinoma nodules were risk stratified into three groups. CANARY risk groups were compared using survival analysis for progression-free survival. Measurements and Main Results: A total of 294 patients were included in the analysis. Kaplan-Meier analysis of all the 294 adenocarcinoma nodules stratified into the Good, Intermediate, and Poor CANARY risk groups yielded distinct progression-free survival curves (P < 0.0001). This observation was confirmed in the unadjusted and adjusted (age, sex, race, and smoking status) progression-free survival analysis of all stage I cases. Conclusions: CANARY allows the noninvasive risk stratification of lung adenocarcinomas into three groups with distinct post-treatment progression-free survival. Our results suggest that CANARY could ultimately facilitate individualized management of incidentally or screen-detected lung adenocarcinomas. PMID:26052977
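The progression-free survival comparison rests on Kaplan-Meier curves. A minimal estimator sketch with invented follow-up data (times in arbitrary units; event = 1 for observed progression, 0 for censoring):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier estimator of (progression-free) survival.
    times[i] is follow-up time; events[i] is 1 if progression was observed
    then, 0 if the patient was censored. Returns (t, S(t)) steps at each
    time where at least one event occurred."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    s = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        j, events_at_t = i, 0
        while j < len(data) and data[j][0] == t:
            events_at_t += data[j][1]
            j += 1
        if events_at_t:
            s *= 1 - events_at_t / at_risk   # multiply in this time's hazard
            curve.append((t, s))
        at_risk -= j - i                      # events and censorings leave the risk set
        i = j
    return curve
```

Stratifying patients into the three CANARY risk groups and running this estimator per group would yield the separate curves the study compares.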
Maldonado, Fabien; Duan, Fenghai; Raghunath, Sushravya M; Rajagopalan, Srinivasan; Karwoski, Ronald A; Garg, Kavita; Greco, Erin; Nath, Hrudaya; Robb, Richard A; Bartholmai, Brian J; Peikert, Tobias
2015-09-15
Screening for lung cancer using low-dose computed tomography (CT) reduces lung cancer mortality. However, in addition to a high rate of benign nodules, lung cancer screening detects a large number of indolent cancers that generally belong to the adenocarcinoma spectrum. Individualized management of screen-detected adenocarcinomas would be facilitated by noninvasive risk stratification. To validate that Computer-Aided Nodule Assessment and Risk Yield (CANARY), a novel image analysis software, successfully risk stratifies screen-detected lung adenocarcinomas based on clinical disease outcomes. We retrospectively identified 294 eligible patients diagnosed with lung adenocarcinoma spectrum lesions in the low-dose CT arm of the National Lung Screening Trial. The last low-dose CT scan before the diagnosis of lung adenocarcinoma was analyzed using CANARY blinded to clinical data. Based on their parametric CANARY signatures, all the lung adenocarcinoma nodules were risk stratified into three groups. CANARY risk groups were compared using survival analysis for progression-free survival. A total of 294 patients were included in the analysis. Kaplan-Meier analysis of all the 294 adenocarcinoma nodules stratified into the Good, Intermediate, and Poor CANARY risk groups yielded distinct progression-free survival curves (P < 0.0001). This observation was confirmed in the unadjusted and adjusted (age, sex, race, and smoking status) progression-free survival analysis of all stage I cases. CANARY allows the noninvasive risk stratification of lung adenocarcinomas into three groups with distinct post-treatment progression-free survival. Our results suggest that CANARY could ultimately facilitate individualized management of incidentally or screen-detected lung adenocarcinomas.
Wavelet-enabled progressive data Access and Storage Protocol (WASP)
NASA Astrophysics Data System (ADS)
Clyne, J.; Frank, L.; Lesperance, T.; Norton, A.
2015-12-01
Current practices for storing numerical simulation outputs hail from an era when the disparity between compute and I/O performance was not as great as it is today. The memory contents for every sample, computed at every grid point location, are simply saved at some prescribed temporal frequency. Though straightforward, this approach fails to take advantage of the coherency in neighboring grid points that invariably exists in numerical solutions to mathematical models. Exploiting such coherence is essential to digital multimedia; DVD-Video, digital cameras, streaming movies and audio are all possible today because of transform-based compression schemes that make substantial reductions in data possible by taking advantage of the strong correlation between adjacent samples in both space and time. Such methods can also be exploited to enable progressive data refinement in a manner akin to that used in ubiquitous digital mapping applications: views from far away are shown in coarsened detail to provide context, and can be progressively refined as the user zooms in on a localized region of interest. The NSF funded WASP project aims to provide a common, NetCDF-compatible software framework for supporting wavelet-based, multi-scale, progressive data, enabling interactive exploration of large data sets for the geoscience communities. This presentation will provide an overview of this work in progress to develop community cyber-infrastructure for the efficient analysis of very large data sets.
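The progressive-refinement idea can be illustrated with the simplest wavelet, the Haar transform (a toy stand-in, not WASP's actual scheme): store one coarse mean plus detail coefficients per level, transmit the coarse value first, and let each detail level double the resolution, exactly like zooming in on a digital map.

```python
def haar_decompose(signal):
    """One full Haar wavelet decomposition of a length-2^k signal.
    Returns (coarse_mean, details) where details[level] holds the
    coefficients needed to refine from that level to the next finer one."""
    detail_levels = []
    s = list(signal)
    while len(s) > 1:
        pairs = [(s[i], s[i + 1]) for i in range(0, len(s), 2)]
        s = [(a + b) / 2 for a, b in pairs]                 # coarse approximation
        detail_levels.append([(a - b) / 2 for a, b in pairs])
    detail_levels.reverse()  # coarsest-level details first
    return s[0], detail_levels

def haar_refine(coarse, details):
    """Progressively reconstruct: each detail level doubles the resolution."""
    s = [coarse]
    for d in details:
        s = [v for avg, diff in zip(s, d) for v in (avg + diff, avg - diff)]
    return s
```

Reading only the first detail levels gives a coarsened view for context; reading all of them reconstructs the grid exactly, which is what makes transform-based storage attractive when I/O, not compute, is the bottleneck.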
Light reflection models for computer graphics.
Greenberg, D P
1989-04-14
During the past 20 years, computer graphic techniques for simulating the reflection of light have progressed so that today images of photorealistic quality can be produced. Early algorithms considered direct lighting only, but global illumination phenomena with indirect lighting, surface interreflections, and shadows can now be modeled with ray tracing, radiosity, and Monte Carlo simulations. This article describes the historical development of computer graphic algorithms for light reflection and pictorially illustrates what will be commonly available in the near future.
1990-02-01
Tobias B. Orloff. Work began on developing a high quality rendering algorithm based on the radiosity method. The algorithm is similar to previous progressive radiosity algorithms except for the following improvements: 1. At each iteration, vertex radiosities are computed using a modified scan-line approach, thus eliminating the quadratic cost associated with a ray tracing computation of vertex radiosities. 2. At each iteration the scene is
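A progressive-refinement ("shooting") radiosity loop, in outline (equal patch areas assumed; the form factors here are made-up inputs, whereas real implementations compute them from scene geometry):

```python
def progressive_radiosity(emission, reflectance, form_factors, iterations=100):
    """Progressive radiosity sketch: repeatedly pick the patch with the most
    unshot energy and distribute it to every other patch.
    form_factors[i][j] is the fraction of energy leaving patch i that
    reaches patch j (patches assumed to have equal area)."""
    n = len(emission)
    radiosity = list(emission)   # light emitted plus light received so far
    unshot = list(emission)      # energy received but not yet redistributed
    for _ in range(iterations):
        i = max(range(n), key=lambda k: unshot[k])
        if unshot[i] < 1e-12:
            break                # converged: nothing significant left to shoot
        shoot, unshot[i] = unshot[i], 0.0
        for j in range(n):
            if j != i:
                delta = reflectance[j] * form_factors[i][j] * shoot
                radiosity[j] += delta
                unshot[j] += delta
    return radiosity
```

Because the brightest unshot patch is processed first, intermediate results already approximate the final image, which is the "progressive" property the record refers to.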
Multimodal neuroelectric interface development
NASA Technical Reports Server (NTRS)
Trejo, Leonard J.; Wheeler, Kevin R.; Jorgensen, Charles C.; Rosipal, Roman; Clanton, Sam T.; Matthews, Bryan; Hibbs, Andrew D.; Matthews, Robert; Krupka, Michael
2003-01-01
We are developing electromyographic and electroencephalographic methods, which draw control signals for human-computer interfaces from the human nervous system. We have made progress in four areas: 1) real-time pattern recognition algorithms for decoding sequences of forearm muscle activity associated with control gestures; 2) signal-processing strategies for computer interfaces using electroencephalogram (EEG) signals; 3) a flexible computation framework for neuroelectric interface research; and 4) noncontact sensors, which measure electromyogram or EEG signals without resistive contact to the body.
Computed Flow Through An Artificial Heart And Valve
NASA Technical Reports Server (NTRS)
Rogers, Stuart E.; Kwak, Dochan; Kiris, Cetin; Chang, I-Dee
1994-01-01
NASA technical memorandum discusses computations of flow of blood through artificial heart and through tilting-disk artificial heart valve. Represents further progress in research described in "Numerical Simulation of Flow Through an Artificial Heart" (ARC-12478). One purpose of research to exploit advanced techniques of computational fluid dynamics and capabilities of supercomputers to gain understanding of complicated internal flows of viscous, essentially incompressible fluids like blood. Another to use understanding to design better artificial hearts and valves.
HOME COMPUTER USE AND THE DEVELOPMENT OF HUMAN CAPITAL*
Malamud, Ofer; Pop-Eleches, Cristian
2012-01-01
This paper uses a regression discontinuity design to estimate the effect of home computers on child and adolescent outcomes by exploiting a voucher program in Romania. Our main results indicate that home computers have both positive and negative effects on the development of human capital. Children who won a voucher to purchase a computer had significantly lower school grades but show improved computer skills. There is also some evidence that winning a voucher increased cognitive skills, as measured by Raven’s Progressive Matrices. We do not find much evidence for an effect on non-cognitive outcomes. Parental rules regarding homework and computer use attenuate the effects of computer ownership, suggesting that parental monitoring and supervision may be important mediating factors. PMID:22719135
Model equations for the Eiffel Tower profile: historical perspective and new results
NASA Astrophysics Data System (ADS)
Weidman, Patrick; Pinelis, Iosif
2004-07-01
Model equations for the shape of the Eiffel Tower are investigated. One model purported to be based on Eiffel's writing does not give a tower with the correct curvature. A second popular model not connected with Eiffel's writings provides a fair approximation to the tower's skyline profile of 29 contiguous panels. Reported here is a third model derived from Eiffel's concern about wind loads on the tower, as documented in his communication to the French Civil Engineering Society on 30 March 1885. The result is a nonlinear, integro-differential equation which is solved to yield an exponential tower profile. It is further verified that, as Eiffel wrote, "in reality the curve exterior of the tower reproduces, at a determined scale, the same curve of the moments produced by the wind". An analysis of the actual tower profile shows that it is composed of two piecewise continuous exponentials with different growth rates. This is explained by specific safety factors for wind loading that Eiffel & Company incorporated in the design of the free-standing tower. To cite this article: P. Weidman, I. Pinelis, C. R. Mecanique 332 (2004).
Campbell, M A; Lopéz, J A
2014-02-01
Mitochondrial genetic variability among populations of the blackfish genus Dallia (Esociformes) across Beringia was examined. Levels of divergence and patterns of geographic distribution of mitochondrial DNA lineages were characterized using phylogenetic inference, median-joining haplotype networks, Bayesian skyline plots, mismatch analysis and spatial analysis of molecular variance (SAMOVA) to infer genealogical relationships and to assess patterns of phylogeography among extant mitochondrial lineages in populations of species of Dallia. The observed variation includes extensive standing mitochondrial genetic diversity and patterns of distinct spatial segregation corresponding to historical and contemporary barriers with minimal or no mixing of mitochondrial haplotypes between geographic areas. Mitochondrial diversity is highest in the common delta formed by the Yukon and Kuskokwim Rivers where they meet the Bering Sea. Other regions sampled in this study host comparatively low levels of mitochondrial diversity. The observed levels of mitochondrial diversity and the spatial distribution of that diversity are consistent with persistence of mitochondrial lineages in multiple refugia through the last glacial maximum. © 2014 The Fisheries Society of the British Isles.
Mastalerz, Maria; Hower, J.C.
1996-01-01
Botryococcus-derived alginites from the Westphalian Skyline, No. 5 Block, Leatherwood (eastern Kentucky) and Breckinridge (western Kentucky) coal beds have been analyzed for elemental composition and functional group distribution using an electron microprobe and micro-FTIR, respectively. The alginites from Kentucky show a carbon range of 81.6 to 92% and oxygen content of 3.5 to 9.5%. Sulphur content ranges from 0.66 to 0.84% and Fe, Si, Al and Ca occur in minor quantities. FTIR analysis demonstrates dominant CH2, CH3 bands and subordinate aromatic carbon in all alginites. The major differences between alginites are in the ratios of CH2 and CH3 groups and ratios between aromatic bands in the out-of-plane region. These differences suggest that, although the ancient Botryococcus derives from a selective preservation of a resistant polymer, it undergoes molecular and some elemental changes through the rank equivalent to vitrinite reflectance of 0.5-0.85%. Other differences, such as intensities of ether bridges and those of carboxyl/carbonyl groups, are attributed to differences in depositional environments.
NASA Astrophysics Data System (ADS)
Morrison, S. M.; Downs, R. T.; Golden, J. J.; Pires, A.; Fox, P. A.; Ma, X.; Zednik, S.; Eleish, A.; Prabhu, A.; Hummer, D. R.; Liu, C.; Meyer, M.; Ralph, J.; Hystad, G.; Hazen, R. M.
2016-12-01
We have developed a comprehensive database of copper (Cu) mineral characteristics. These data include crystallographic, paragenetic, chemical, locality, age, structural complexity, and physical property information for the 689 Cu mineral species approved by the International Mineralogical Association (rruff.info/ima). Synthesis of this large, varied dataset allows for in-depth exploration of statistical trends and visualization techniques. With social network analysis (SNA) and cluster analysis of minerals, we create sociograms and chord diagrams. SNA visualizations illustrate the relationships and connectivity between mineral species, which often form cliques associated with rock type and/or geochemistry. Using mineral ecology statistics, we analyze mineral-locality frequency distribution and predict the number of missing mineral species, visualized with accumulation curves. By assembly of 2-dimensional KLEE diagrams of co-existing elements in minerals, we illustrate geochemical trends within a mineral system. To explore mineral age and chemical oxidation state, we create skyline diagrams and compare trends with varying chemistry. These trends illustrate mineral redox changes through geologic time and correlate with significant geologic occurrences, such as the Great Oxidation Event (GOE) or Wilson Cycles.
NASA Technical Reports Server (NTRS)
Phillips, Jennifer K.
1995-01-01
Two of the current and most popular implementations of the Message-Passing Standard, Message Passing Interface (MPI), were contrasted: MPICH by Argonne National Laboratory, and LAM by the Ohio Supercomputer Center at Ohio State University. A parallel skyline matrix solver was adapted to be run in a heterogeneous environment using MPI. The Message-Passing Interface Forum was held in May 1994, which led to a specification of library functions that implement the message-passing model of parallel communication. LAM, which creates its own environment, is more robust in a highly heterogeneous network. MPICH uses the environment native to the machine architecture. While neither of these free-ware implementations provides the performance of native message-passing or vendor's implementations, MPICH begins to approach that performance on the SP-2. The machines used in this study were: IBM RS6000, 3 Sun4, SGI, and the IBM SP-2. Each machine is unique and a few machines required specific modifications during the installation. When installed correctly, both implementations worked well with only minor problems.
Using iRT, a normalized retention time for more targeted measurement of peptides.
Escher, Claudia; Reiter, Lukas; MacLean, Brendan; Ossola, Reto; Herzog, Franz; Chilton, John; MacCoss, Michael J; Rinner, Oliver
2012-04-01
Multiple reaction monitoring (MRM) has recently become the method of choice for targeted quantitative measurement of proteins using mass spectrometry. The method, however, is limited in the number of peptides that can be measured in one run. This number can be markedly increased by scheduling the acquisition if the accurate retention time (RT) of each peptide is known. Here we present iRT, an empirically derived dimensionless peptide-specific value that allows for highly accurate RT prediction. The iRT of a peptide is a fixed number relative to a standard set of reference iRT-peptides that can be transferred across laboratories and chromatographic systems. We show that iRT facilitates the setup of multiplexed experiments with acquisition windows more than four times smaller than those based on in silico RT prediction, resulting in improved quantification accuracy. iRTs can be determined by any laboratory and shared transparently. The iRT concept has been implemented in Skyline, the most widely used software for MRM experiments. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
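The normalization described here is, at its core, a linear calibration: reference peptides with fixed iRT values are measured in each run, a line is fitted between their measured RTs and their iRT values, and any other peptide's measured RT is mapped onto the iRT scale through that line. A minimal sketch with hypothetical RT values (the real reference kit defines the actual anchor iRTs):

```python
import statistics

def fit_irt_calibration(ref_measured_rt, ref_irt):
    """Least-squares line mapping measured RT (minutes) to dimensionless iRT.

    ref_measured_rt: measured RTs of the reference peptides in this run.
    ref_irt: their fixed, run-independent iRT values.
    Returns (slope, intercept).
    """
    mx = statistics.fmean(ref_measured_rt)
    my = statistics.fmean(ref_irt)
    sxx = sum((x - mx) ** 2 for x in ref_measured_rt)
    sxy = sum((x - mx) * (y - my) for x, y in zip(ref_measured_rt, ref_irt))
    slope = sxy / sxx
    intercept = my - slope * mx
    return slope, intercept

def to_irt(rt, slope, intercept):
    """Convert one measured RT to the iRT scale of this run's calibration."""
    return slope * rt + intercept
```

Because every run is re-calibrated against the same reference peptides, a peptide's iRT stays constant even when absolute RTs shift between chromatographic systems, which is what makes scheduled acquisition windows transferable.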
On the Reproducibility of Label-Free Quantitative Cross-Linking/Mass Spectrometry
NASA Astrophysics Data System (ADS)
Müller, Fränze; Fischer, Lutz; Chen, Zhuo Angel; Auchynnikava, Tania; Rappsilber, Juri
2018-02-01
Quantitative cross-linking/mass spectrometry (QCLMS) is an emerging approach to study conformational changes of proteins and multi-subunit complexes. Distinguishing protein conformations requires reproducibly identifying and quantifying cross-linked peptides. Here we analyzed the variation between multiple cross-linking reactions using bis[sulfosuccinimidyl] suberate (BS3)-cross-linked human serum albumin (HSA) and evaluated how reproducible cross-linked peptides can be identified and quantified by LC-MS analysis. To make QCLMS accessible to a broader research community, we developed a workflow that integrates the established software tools MaxQuant for spectra preprocessing, Xi for cross-linked peptide identification, and finally Skyline for quantification (MS1 filtering). Out of the 221 unique residue pairs identified in our sample, 124 were subsequently quantified across 10 analyses with coefficient of variation (CV) values of 14% (injection replica) and 32% (reaction replica). Thus our results demonstrate that the reproducibility of QCLMS is in line with the reproducibility of general quantitative proteomics and we establish a robust workflow for MS1-based quantitation of cross-linked peptides.
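The reproducibility figures quoted here (CVs of 14% for injection replicates and 32% for reaction replicates) use the standard dispersion statistic: per-peptide standard deviation divided by the mean across replicate runs. A minimal sketch with hypothetical intensities:

```python
import statistics

def coefficient_of_variation(values):
    """CV (%) = sample standard deviation / mean * 100, e.g. for one
    cross-linked peptide's intensity across replicate LC-MS runs."""
    return statistics.stdev(values) / statistics.fmean(values) * 100.0
```

In a workflow like the one described, a CV is computed per quantified residue pair and then summarized (for example, by the median) over all pairs to give the replica-level figure.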
The history of aggregate development in the Denver, CO area
Langer, W.H.
2009-01-01
At the start of the 20th century Denver's population was 203,795. Most streets were unpaved. Buildings were constructed of wood frame or masonry. Transport was by horse-drawn wagon or rail. Statewide, aggregate consumption was less than 0.25 metric tons per person per year. One hundred years later, Denver had a population of 2,365,345. Today Denver is a major metropolitan area at the crossroads of two interstates, home to a new international airport, and in the process of expanding its light rail transit system. The skyline is punctuated with skyscrapers. The urban center is surrounded by edge cities. These changes required huge amounts of aggregate. Statewide, aggregate consumption increased 50-fold to over 13 metric tons per person per year. Denver has a large potential supply of aggregate, but sand and gravel quality decreases downstream from the mountain front, and potential sources of crushed stone occur in areas prized for their scenic beauty. These issues, along with urban encroachment and citizen opposition, have complicated aggregate development and have paved a new path for future aggregate development, including sustainable resource management and reclamation techniques.
The genetic diversity and evolutionary history of hepatitis C virus in Vietnam
Li, Chunhua; Yuan, Manqiong; Lu, Ling; Lu, Teng; Xia, Wenjie; Pham, Van H.; Vo, An X.D.; Nguyen, Mindie H.; Abe, Kenji
2014-01-01
Vietnam has a unique history of association with foreign countries, which may have resulted in multiple introductions of alien HCV strains that mixed with the indigenous ones. In this study, we characterized HCV sequences in the Core-E1 and NS5B regions from 236 Vietnamese individuals. We identified multiple HCV lineages: 6a, 6e, 6h, 6k, 6l, 6o, 6p, and two novel variants may represent the indigenous strains; 1a was probably introduced from the US; 1b and 2a possibly originated in East Asia; while 2i, 2j, and 2m were likely brought by French explorers. We inferred the evolutionary history of four major subtypes: 1a, 1b, 6a, and 6e. The resulting Bayesian Skyline Plots (BSPs) consistently showed rapid HCV population growth from 1955-1963 until 1984 or later, corresponding to the era of the Vietnam War. We also estimated HCV growth rates and reconstructed phylogeographic trees to compare subtypes 1a, 1b, and HCV-2. PMID:25193655
NASA Astrophysics Data System (ADS)
Wang, X.
2018-04-01
Tourism geological resources have high value for scenic appreciation, scientific research, and public education, and need to be protected and rationally utilized. In the past, most remote sensing investigations of tourism geological resources used two-dimensional interpretation methods, which made some geological heritages difficult to interpret and led to the omission of information. The aim of this paper is to assess the value of a method that uses three-dimensional visual remote sensing imagery to extract information on geological heritages. The Skyline software system is applied to fuse 0.36 m aerial images with a 5 m interval DEM to establish a digital earth model. Based on three-dimensional shape, color tone, shadow, texture, and other image features, the distribution of tourism geological resources in Shandong Province and the locations of geological heritage sites were obtained, including geological structures, DaiGu landforms, granite landforms, volcanic landforms, sandy landforms, and waterscapes. The results show that remote sensing interpretation with this method is highly recognizable, making the interpretation more accurate and comprehensive.
Fires in Shenandoah National Park
NASA Technical Reports Server (NTRS)
2002-01-01
A large smoke plume has been streaming eastward from Virginia's Shenandoah National Park near Old Rag Mountain. Based on satellite images, it appears the blaze started sometime between October 30 and 31. This true-color image of the fire was obtained on November 1, 2000 by the Moderate-resolution Imaging Spectroradiometer (MODIS), flying aboard NASA's Terra spacecraft. Thermal Infrared data, overlaid on the color image, reveals the presence of two active fires underneath the smoke plume. The northern fire (upper) is burning near the Pinnacles Picnic Area along Skyline Drive. The southern fire (lower) is on Old Rag Mountain. Old Rag is one of the most popular hikes in the Washington, DC area, and features extremely rugged terrain, with granite cliffs up to 90 feet high. This scene was produced using MODIS direct broadcast data received and processed at the Space Science and Engineering Center, University of Wisconsin-Madison. The smoke plume appears blue-grey while the red and yellow pixels show the locations of the smoldering and flaming portions of the fire, respectively. Image by Liam Gumley, Cooperative Institute for Meteorological Satellite Studies, and Robert Simmon, NASA GSFC
2014-08-13
SAN DIEGO, Calif. – The USS Anchorage returns to Naval Base San Diego after completion of the Orion Underway Recovery Test 2 in the Pacific Ocean. The ship is framed by the skyline of the city of San Diego. NASA, Lockheed Martin and the U.S. Navy conducted the test on the Orion boilerplate test vehicle to prepare for recovery of the Orion crew module on its return from a deep space mission. The underway recovery test allowed the team to demonstrate and evaluate the recovery processes, procedures, new hardware and personnel in open waters. The Ground Systems Development and Operations Program conducted the underway recovery test. Orion is the exploration spacecraft designed to carry astronauts to destinations not yet explored by humans, including an asteroid and Mars. It will have emergency abort capability, sustain the crew during space travel and provide safe re-entry from deep space return velocities. The first unpiloted test flight of the Orion is scheduled to launch in 2014 atop a Delta IV rocket and in 2017 on NASA’s Space Launch System rocket. For more information, visit http://www.nasa.gov/orion. Photo credit: NASA/Kim Shiflett
Accounting for rate variation among lineages in comparative demographic analyses
Hope, Andrew G.; Ho, Simon Y. W.; Malaney, Jason L.; Cook, Joseph A.; Talbot, Sandra L.
2014-01-01
Genetic analyses of contemporary populations can be used to estimate the demographic histories of species within an ecological community. Comparison of these demographic histories can shed light on community responses to past climatic events. However, species experience different rates of molecular evolution, and this presents a major obstacle to comparative demographic analyses. We address this problem by using a Bayesian relaxed-clock method to estimate the relative evolutionary rates of 22 small mammal taxa distributed across northwestern North America. We found that estimates of the relative molecular substitution rate for each taxon were consistent across the range of sampling schemes that we compared. Using three different reference rates, we rescaled the relative rates so that they could be used to estimate absolute evolutionary timescales. Accounting for rate variation among taxa led to temporal shifts in our skyline-plot estimates of demographic history, highlighting both uniform and idiosyncratic evolutionary responses to directional climate trends for distinct ecological subsets of the small mammal community. Our approach can be used in evolutionary analyses of populations from multiple species, including comparative demographic studies.
It's a River, Not a Lake: A Report on Instructional Technology for the Maricopa Community Colleges.
ERIC Educational Resources Information Center
Jacobs, Alan
This report examines the effects of technological change on Arizona's Maricopa County Community College District (MCCCD) and assesses changes and progress made since the publication of MCCCD's Master Plan for Instructional Computing in 1986. The first section views constant change in computer technology as a running stream and examines the need…
Computer simulation modeling of recreation use: Current status, case studies, and future directions
David N. Cole
2005-01-01
This report compiles information about recent progress in the application of computer simulation modeling to planning and management of recreation use, particularly in parks and wilderness. Early modeling efforts are described in a chapter that provides an historical perspective. Another chapter provides an overview of modeling options, common data input requirements,...
Human Factors in the Design of a Computer-Assisted Instruction System. Technical Progress Report.
ERIC Educational Resources Information Center
Mudge, J. C.
A research project built an author-controlled computer-assisted instruction (CAI) system to study ease-of-use factors in student-system, author-system, and programer-system interfaces. Interfaces were designed and observed in use and systematically revised. Development of course material by authors, use by students, and administrative tasks were…
ERIC Educational Resources Information Center
Johnson, Erin Phinney; Perry, Justin; Shamir, Haya
2010-01-01
This study examines the effects on early reading skills of three different methods of presenting material with computer-assisted instruction (CAI): (1) learner-controlled picture menu, which allows the student to choose activities, (2) linear sequencer, which progresses the students through lessons at a pre-specified pace, and (3) mastery-based…
Comparing Computer Adaptive and Curriculum-Based Measures of Math in Progress Monitoring
ERIC Educational Resources Information Center
Shapiro, Edward S.; Dennis, Minyi Shih; Fu, Qiong
2015-01-01
The purpose of the study was to compare the use of a Computer Adaptive Test and Curriculum-Based Measurement in the assessment of mathematics. This study also investigated the degree to which slope or rate of change predicted student outcomes on the annual state assessment of mathematics above and beyond scores of single point screening…
Enterprise networks. Strategies for integrated delivery systems.
Siwicki, B
1997-02-01
More integrated delivery systems are making progress toward building computer networks that link all their care delivery sites so they can efficiently and economically coordinate care. A growing number of these systems are turning to intranets--private computer networks that use Internet-derived protocols and technologies--to move information that's essential to managing scarce health care resources.
An overview of computational simulation methods for composite structures failure and life analysis
NASA Technical Reports Server (NTRS)
Chamis, Christos C.
1993-01-01
Three parallel computational simulation methods are being developed at the LeRC Structural Mechanics Branch (SMB) for composite structures failure and life analysis: progressive fracture CODSTRAN; hierarchical methods for high-temperature composites; and probabilistic evaluation. Results to date demonstrate that these methods are effective in simulating composite structures failure/life/reliability.
How to Represent Adaptation in e-Learning with IMS Learning Design
ERIC Educational Resources Information Center
Burgos, Daniel; Tattersall, Colin; Koper, Rob
2007-01-01
Adaptation in e-learning has been an important research topic for the last few decades in computer-based education. In adaptivity the behaviour of the user triggers some actions in the system that guides the learning process. In adaptability, the user makes changes and takes decisions. Progressing from computer-based training and adaptive…
EFFECTS OF BRANCHING IN A COMPUTER-CONTROLLED AUTO-INSTRUCTIONAL DEVICE.
ERIC Educational Resources Information Center
Coulson, John E.; And Others
A study on the effectiveness of using both the student's errors on training items and his own evaluation of his learning progress was presented. Two groups of 15 high school students were given automated instruction on logic by means of a flexible-sequence, computer-controlled auto-instructional device. One group was designated the fixed-sequence…
A Framework for the Design of Computer-Assisted Simulation Training for Complex Police Situations
ERIC Educational Resources Information Center
Söderström, Tor; Åström, Jan; Anderson, Greg; Bowles, Ron
2014-01-01
Purpose: The purpose of this paper is to report progress concerning the design of a computer-assisted simulation training (CAST) platform for developing decision-making skills in police students. The overarching aim is to outline a theoretical framework for the design of CAST to facilitate police students' development of search techniques in…
Visions of CSCL: Eight Provocations for the Future of the Field
ERIC Educational Resources Information Center
Wise, Alyssa Friend; Schwarz, Baruch B.
2017-01-01
The field of Computer-Supported Collaborative Learning (CSCL) is at a critical moment in its development. Internally, we face issues of fragmentation and questions about what progress is being made. Externally, the rise of social media and a variety of research communities that study the interactions within it raise questions about our unique identity…
Caesy: A software tool for computer-aided engineering
NASA Technical Reports Server (NTRS)
Wette, Matt
1993-01-01
A new software tool, Caesy, is described. This tool provides a strongly typed programming environment for research in the development of algorithms and software for computer-aided control system design. A description of the user language and its implementation as they currently stand are presented along with a description of work in progress and areas of future work.
ERIC Educational Resources Information Center
Mirjana, Ivanovic; Zoran, Putnik; Anja, Sisarica; Zoran, Budimac
2011-01-01
This paper reports on progress and conclusions of two-year research of gender issues in studying computer science at Department of Mathematics and Informatics, Faculty of Science, University of Novi Sad. Using statistics on data gathered by a survey, the work presented here focused on identifying, understanding, and correlating both female and…
I CAN Learn®. [Secondary Mathematics.] What Works Clearinghouse Intervention Report
ERIC Educational Resources Information Center
What Works Clearinghouse, 2017
2017-01-01
"I CAN Learn"® is a computer-based math curriculum for students in middle school, high school, and college. It provides math instruction through a series of interactive lessons that students work on individually at their own computers. Students move at their own pace and must demonstrate mastery of each concept before progressing to the…
ERIC Educational Resources Information Center
Xu, Q.; Lai, L. L.; Tse, N. C. F.; Ichiyanagi, K.
2011-01-01
An interactive computer-based learning tool with multiple sessions is proposed in this paper, which teaches students to think and helps them recognize the merits and limitations of simulation tools so as to improve their practical abilities in electrical circuit simulation based on the case of a power converter with progressive problems. The…
I CAN Learn®. [Primary Mathematics.] What Works Clearinghouse Intervention Report
ERIC Educational Resources Information Center
What Works Clearinghouse, 2017
2017-01-01
"I CAN Learn"® is a computer-based math curriculum for students in middle school, high school, and college. It provides math instruction through a series of interactive lessons that students work on individually at their own computers. Students move at their own pace and must demonstrate mastery of each concept before progressing to the…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lottes, S. A.; Kulak, R. F.; Bojanowski, C.
2011-05-19
This project was established with a new interagency agreement between the Department of Energy and the Department of Transportation to provide collaborative research, development, and benchmarking of advanced three-dimensional computational mechanics analysis methods to the aerodynamics and hydraulics laboratories at the Turner-Fairbank Highway Research Center for a period of five years, beginning in October 2010. The analysis methods employ well-benchmarked and supported commercial computational mechanics software. Computational mechanics encompasses the areas of Computational Fluid Dynamics (CFD), Computational Wind Engineering (CWE), Computational Structural Mechanics (CSM), and Computational Multiphysics Mechanics (CMM) applied in Fluid-Structure Interaction (FSI) problems. The major areas of focus of the project are wind and water loads on bridges - superstructure, deck, cables, and substructure (including soil), primarily during storms and flood events - and the risks that these loads pose to structural failure. For flood events at bridges, another major focus of the work is assessment of the risk to bridges caused by scour of stream and riverbed material away from the foundations of a bridge. Other areas of current research include modeling of flow through culverts to assess them for fish passage, modeling of the salt spray transport into bridge girders to address suitability of using weathering steel in bridges, vehicle stability under high wind loading, and the use of electromagnetic shock absorbers to improve vehicle stability under high wind conditions. This quarterly report documents technical progress on the project tasks for the period of January through March 2011.
Visualizing a silicon quantum computer
NASA Astrophysics Data System (ADS)
Sanders, Barry C.; Hollenberg, Lloyd C. L.; Edmundson, Darran; Edmundson, Andrew
2008-12-01
Quantum computation is a fast-growing, multi-disciplinary research field. The purpose of a quantum computer is to execute quantum algorithms that efficiently solve computational problems intractable within the existing paradigm of 'classical' computing built on bits and Boolean gates. While collaboration between computer scientists, physicists, chemists, engineers, mathematicians and others is essential to the project's success, traditional disciplinary boundaries can hinder progress and make communicating the aims of quantum computing and future technologies difficult. We have developed a four minute animation as a tool for representing, understanding and communicating a silicon-based solid-state quantum computer to a variety of audiences, either as a stand-alone animation to be used by expert presenters or embedded into a longer movie as short animated sequences. The paper includes a generally applicable recipe for successful scientific animation production.
London, Nir; Ambroggio, Xavier
2014-02-01
Computational protein design efforts aim to create novel proteins and functions in an automated manner and, in the process, these efforts shed light on the factors shaping natural proteins. The focus of these efforts has progressed from the interior of proteins to their surface and the design of functions, such as binding or catalysis. Here we examine progress in the development of robust methods for the computational design of non-natural interactions between proteins and molecular targets such as other proteins or small molecules. This problem is referred to as the de novo computational design of interactions. Recent successful efforts in de novo enzyme design and the de novo design of protein-protein interactions open a path towards solving this problem. We examine the common themes in these efforts, and review recent studies aimed at understanding the nature of successes and failures in the de novo computational design of interactions. While several approaches culminated in success, the use of a well-defined structural model for a specific binding interaction in particular has emerged as a key strategy for a successful design, and is therefore reviewed with special consideration. Copyright © 2013 Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Bratten, Jack E.
The biology course of Theodore High School at Theodore, Alabama, was studied as a system for "processing" students and was simulated on a computer. An experimental version of the course was simulated and compared with the actual course. The purposes of this study were (1) to examine the concept of individual progress as it related to the…
ERIC Educational Resources Information Center
National Center for Education Statistics, 2012
2012-01-01
This report presents results of the 2011 National Assessment of Educational Progress (NAEP) in writing at grades 8 and 12. In this new national writing assessment sample, 24,100 eighth-graders and 28,100 twelfth-graders engaged with writing tasks and composed their responses on computer. The assessment tasks reflected writing situations common to…
2014-01-01
[Front-matter abbreviation fragment: UAV, unmanned aircraft vehicle; UCI, user-computer interface; UCS, UAS control segment; UGS, unmanned ground system; UGV, unmanned ground vehicle.] The report notes substantial progress in the deployment of more capable sensors, unmanned aircraft systems (UAS), and other unmanned systems (UxS), and in fielding more, and more capable, UAS to meet the needs of warfighters.
Thrifty: An Exascale Architecture for Energy Proportional Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Torrellas, Josep
2014-12-23
The objective of this project is to design different aspects of a novel exascale architecture called Thrifty. Our goal is to focus on the challenges of power/energy efficiency, performance, and resiliency in exascale systems. The project includes work on computer architecture (Josep Torrellas from University of Illinois), compilation (Daniel Quinlan from Lawrence Livermore National Laboratory), runtime and applications (Laura Carrington from University of California San Diego), and circuits (Wilfred Pinfold from Intel Corporation). In this report, we focus on the progress at the University of Illinois during the last year of the grant (September 1, 2013 to August 31, 2014). We also point to the progress in the other collaborating institutions when needed.
Spatial distribution of nuclei in progressive nucleation: Modeling and application
NASA Astrophysics Data System (ADS)
Tomellini, Massimo
2018-04-01
Phase transformations ruled by non-simultaneous nucleation and growth do not lead to a random distribution of nuclei. Since nucleation is only allowed in the untransformed portion of space, the positions of nuclei are correlated. In this article an analytical approach is presented for computing the pair-correlation function of nuclei in progressive nucleation. This quantity is further employed to characterize the spatial distribution of nuclei through the nearest-neighbor distribution function. The model is developed for nucleation in 2D space with a power growth law, and it is applied to describe electrochemical nucleation, where correlation effects are significant. Comparison with both computer simulations and experimental data lends support to the model, which gives insight into the transition from a Poissonian to a correlated nearest-neighbor probability density.
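The qualitative mechanism described here, that births are forbidden inside the already-transformed region and so nucleus positions become correlated, can be reproduced with a small Monte Carlo sketch (illustrative parameters and a linear growth law; this is not the paper's analytical model):

```python
import math
import random

def simulate_progressive_nucleation(n_attempts, box=1.0, growth=0.01, seed=1):
    """Sequential nucleation in a 2D box: an attempted nucleus at step t is
    accepted only if it lies outside every existing transformed disk, whose
    radius grows linearly with the time elapsed since that disk's birth."""
    rng = random.Random(seed)
    nuclei = []  # (x, y, birth_step)
    for step in range(n_attempts):
        x, y = rng.random() * box, rng.random() * box
        if all(math.hypot(x - nx, y - ny) > growth * (step - birth)
               for nx, ny, birth in nuclei):
            nuclei.append((x, y, step))
    return nuclei

def nearest_neighbor_distances(nuclei):
    """Distance from each accepted nucleus to its nearest neighbor."""
    out = []
    for i, (x, y, _) in enumerate(nuclei):
        out.append(min(math.hypot(x - nx, y - ny)
                       for j, (nx, ny, _) in enumerate(nuclei) if j != i))
    return out
```

Comparing the resulting nearest-neighbor histogram with the one for the same number of uniformly random points shows the depletion of short distances, i.e. the departure from Poissonian statistics the article quantifies analytically.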
Universal blind quantum computation for hybrid system
NASA Astrophysics Data System (ADS)
Huang, He-Liang; Bao, Wan-Su; Li, Tan; Li, Feng-Guang; Fu, Xiang-Qun; Zhang, Shuo; Zhang, Hai-Long; Wang, Xiang
2017-08-01
As progress on building quantum computers continues to advance, first-generation practical quantum computers will become available to ordinary users in the cloud, in a style similar to IBM's Quantum Experience today. Clients can remotely access the quantum servers using simple devices. In such a situation, it is of prime importance to keep the client's information secure. Blind quantum computation protocols enable a client with limited quantum technology to delegate her quantum computation to a quantum server without leaking any privacy. To date, blind quantum computation has been considered only for individual quantum systems. However, a practical universal quantum computer is likely to be a hybrid system. Here, we take the first step toward constructing a framework of blind quantum computation for the hybrid system, which provides a more feasible way toward scalable blind quantum computation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lottes, S.A.; Kulak, R.F.; Bojanowski, C.
2011-12-09
The computational fluid dynamics (CFD) and computational structural mechanics (CSM) focus areas at Argonne's Transportation Research and Analysis Computing Center (TRACC) initiated a project to support and complement the experimental programs at the Turner-Fairbank Highway Research Center (TFHRC) with high-performance-computing-based analysis capabilities in August 2010. The project was established with a new interagency agreement between the Department of Energy and the Department of Transportation to provide collaborative research, development, and benchmarking of advanced three-dimensional computational mechanics analysis methods to the aerodynamics and hydraulics laboratories at TFHRC for a period of five years, beginning in October 2010. The analysis methods employ well-benchmarked and supported commercial computational mechanics software. Computational mechanics encompasses the areas of Computational Fluid Dynamics (CFD), Computational Wind Engineering (CWE), Computational Structural Mechanics (CSM), and Computational Multiphysics Mechanics (CMM) applied in Fluid-Structure Interaction (FSI) problems. The major areas of focus of the project are wind and water effects on bridges - superstructure, deck, cables, and substructure (including soil), primarily during storms and flood events - and the risks that these loads pose to structural failure. For flood events at bridges, another major focus of the work is assessment of the risk to bridges caused by scour of stream and riverbed material away from the foundations of a bridge.
Other areas of current research include modeling of flow through culverts to assess them for fish passage, modeling of the salt spray transport into bridge girders to address suitability of using weathering steel in bridges, CFD analysis of the operation of the wind tunnel in the TFHRC wind engineering laboratory, vehicle stability under high wind loading, and the use of electromagnetic shock absorbers to improve vehicle stability under high wind conditions. This quarterly report documents technical progress on the project tasks for the period of July through September 2011.
Aquatic Toxic Analysis by Monitoring Fish Behavior Using Computer Vision: A Recent Progress
Fu, Longwen; Liu, Zuoyi
2018-01-01
Video-tracking-based biological early warning systems have made great progress with advanced computer vision and machine learning methods. The ability to video-track multiple biological organisms has improved greatly in recent years. Video-based behavioral monitoring has become a common tool for acquiring quantified behavioral data for aquatic risk assessment. Investigation of behavioral responses under chemical and environmental stress has been boosted by rapidly developing machine learning and artificial intelligence. In this paper, we introduce the fundamentals of video tracking and present pioneering work in the precise tracking of groups of individuals in 2D and 3D space. Technical and practical issues encountered in video tracking are explained. Subsequently, toxicity analysis based on fish behavioral data is summarized. Frequently used computational methods and machine learning are explained along with their applications in aquatic toxicity detection and abnormal pattern analysis. Finally, the advantages of recently developed deep learning approaches in toxicity prediction are presented. PMID:29849612
Toward a superconducting quantum computer
Tsai, Jaw-Shen
2010-01-01
Intensive research on the construction of superconducting quantum computers has produced numerous important achievements. The quantum bit (qubit), based on the Josephson junction, is at the heart of this research. This macroscopic system has the ability to control quantum coherence. This article reviews the current state of quantum computing as well as its history, and discusses its future. Although progress has been rapid, the field remains beset with unsolved issues, and there are still many new research opportunities open to physicists and engineers. PMID:20431256
Computational Age Dating of Special Nuclear Materials
DOE Office of Scientific and Technical Information (OSTI.GOV)
None, None
2012-06-30
This slide show presented an overview of the Constrained Progressive Reversal (CPR) method for computing decays, age dating, and spoof detection. The CPR method is: capable of temporally profiling an SNM sample; precise (compared with known decay codes, such as ORIGEN); and easy to implement and analyze on a computer. We illustrated the use of CPR for age dating and spoof detection with real SNM data. If the SNM is pure, CPR may be used to derive its age; if the SNM is mixed, CPR will indicate that it is mixed or spoofed.
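The slides do not disclose the CPR algorithm itself, but the generic physics behind decay-based age dating can be sketched: for a pure parent isotope with no initial daughter content (and neglecting the daughter's own, much slower decay), the elapsed time follows directly from the measured daughter/parent ratio. A hedged Python sketch; the Pu-241/Am-241 numbers are illustrative, not taken from the slides:

```python
import math

def age_from_daughter_parent(d_over_p, half_life_years):
    """Age of a sample from its daughter/parent atom ratio, assuming no
    initial daughter and a single decay step:
        t = ln(1 + D/P) / lam,  where lam = ln 2 / t_half."""
    lam = math.log(2.0) / half_life_years
    return math.log(1.0 + d_over_p) / lam

# Illustrative: Pu-241 (half-life ~14.3 y) decays to the much longer-lived
# Am-241, so a measured Am/Pu ratio of 0.05 implies an age of about 1 year.
age = age_from_daughter_parent(0.05, 14.3)
```

Real SNM age dating must handle full decay chains and mixed materials, which is what distinguishes a method like CPR from this single-step formula.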
Computational chemistry and cheminformatics: an essay on the future.
Glen, Robert Charles
2012-01-01
Computers have changed the way we do science. Surrounded by a sea of data and with phenomenal computing capacity, the methodology and approach to scientific problems is evolving into a partnership between experiment, theory and data analysis. Given the pace of change of the last twenty-five years, it seems folly to speculate on the future, but along with unpredictable leaps of progress there will be a continuous evolution of capability, which points to opportunities and improvements that will certainly appear as our discipline matures.
Toward a superconducting quantum computer. Harnessing macroscopic quantum coherence.
Tsai, Jaw-Shen
2010-01-01
Intensive research on the construction of superconducting quantum computers has produced numerous important achievements. The quantum bit (qubit), based on the Josephson junction, is at the heart of this research. This macroscopic system has the ability to control quantum coherence. This article reviews the current state of quantum computing as well as its history, and discusses its future. Although progress has been rapid, the field remains beset with unsolved issues, and there are still many new research opportunities open to physicists and engineers.
Abou-Ayash, Samir; Boldt, Johannes; Vuck, Alexander
Full-arch rehabilitation of patients with severe tooth wear due to parafunctional behavior is a challenge for dentists and dental technicians, especially when a highly esthetic outcome is desired. A variety of different treatment options and prosthetic materials are available for such a clinical undertaking. The ongoing progress of computer-aided design/computer-assisted manufacture technologies in combination with all-ceramic materials provides a predictable workflow for these complex cases. This case history report describes a comprehensive, step-by-step treatment protocol leading to an optimally predictable treatment outcome for an esthetically compromised patient.
Research in mathematical theory of computation. [computer programming applications
NASA Technical Reports Server (NTRS)
Mccarthy, J.
1973-01-01
Research progress in the following areas is reviewed: (1) a new version of the computer program LCF (logic for computable functions), including a facility to search for proofs automatically; (2) the description of the language PASCAL in terms of both LCF and first-order logic; (3) discussion of LISP semantics in LCF and an attempt to prove the correctness of the London compilers in a formal way; (4) design of both special-purpose and domain-independent proving procedures with program correctness specifically in mind; (5) design of languages for describing such proof procedures; and (6) the embedding of these ideas in the first-order checker.
Computing nucleon EDM on a lattice
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abramczyk, Michael; Izubuchi, Taku
I will discuss briefly recent changes in the methodology of computing the baryon EDM on a lattice. The associated correction substantially reduces presently existing lattice values for the proton and neutron theta-induced EDMs, so that even the most precise previous lattice results become consistent with zero. On one hand, this change removes previous disagreements between these lattice results and the phenomenological estimates of the nucleon EDM. On the other hand, the nucleon EDM becomes much harder to compute on a lattice. In addition, I will review the progress in computing quark chromo-EDM-induced nucleon EDM using chiral quark action.
Computer analysis of railcar vibrations
NASA Technical Reports Server (NTRS)
Vlaminck, R. R.
1975-01-01
Computer models and techniques for calculating railcar vibrations are discussed along with criteria for vehicle ride optimization. The effect on vibration of car body structural dynamics, suspension system parameters, vehicle geometry, and wheel and rail excitation are presented. Ride quality vibration data collected on the state-of-the-art car and standard light rail vehicle is compared to computer predictions. The results show that computer analysis of the vehicle can be performed for relatively low cost in short periods of time. The analysis permits optimization of the design as it progresses and minimizes the possibility of excessive vibration on production vehicles.
Computer-assisted cartography: an overview.
Guptill, S.C.; Starr, L.E.
1984-01-01
An assessment of the current status of computer-assisted cartography is, in part, biased by one's view of the cartographic process as a whole. From a traditional viewpoint, we are concerned with automating the mapping process; from a progressive viewpoint, we are concerned with using the tools of computer science to convey spatial information. On the surface these viewpoints appear to be in opposition; however, it is postulated that in the final analysis they share the same goal. This overview uses the perspectives of both viewpoints to depict the current state of computer-assisted cartography and to speculate on future goals, trends, and challenges.-Authors
Modeling Subsurface Reactive Flows Using Leadership-Class Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mills, Richard T; Hammond, Glenn; Lichtner, Peter
2009-01-01
We describe our experiences running PFLOTRAN - a code for simulation of coupled hydro-thermal-chemical processes in variably saturated, non-isothermal, porous media - on leadership-class supercomputers, including initial experiences running on the petaflop incarnation of Jaguar, the Cray XT5 at the National Center for Computational Sciences at Oak Ridge National Laboratory. PFLOTRAN utilizes fully implicit time-stepping and is built on top of the Portable, Extensible Toolkit for Scientific Computation (PETSc). We discuss some of the hurdles to 'at scale' performance with PFLOTRAN and the progress we have made in overcoming them on leadership-class computer architectures.
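Fully implicit time-stepping means every step requires solving a nonlinear system, which PFLOTRAN delegates to PETSc's Newton-based solvers. A minimal scalar sketch of that idea (generic backward Euler with Newton iteration, not PFLOTRAN code):

```python
def backward_euler_step(u_n, dt, f, dfdu, tol=1e-12, max_iter=50):
    """One fully implicit step for du/dt = f(u): solve the nonlinear
    residual g(u) = u - u_n - dt*f(u) = 0 by Newton's method."""
    u = u_n  # initial Newton guess: the previous state
    for _ in range(max_iter):
        g = u - u_n - dt * f(u)
        if abs(g) < tol:
            break
        u -= g / (1.0 - dt * dfdu(u))  # Newton update using g'(u)
    return u

# Nonlinear decay du/dt = -u^2, u(0) = 1; exact solution u(t) = 1/(1 + t)
u, dt = 1.0, 0.01
for _ in range(100):  # integrate to t = 1, where the exact value is 0.5
    u = backward_euler_step(u, dt, lambda x: -x * x, lambda x: -2.0 * x)
```

In a real subsurface code the scalar residual becomes a huge sparse system and the Newton update a parallel linear solve, which is where leadership-class machines come in.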
Enabling Large-Scale Biomedical Analysis in the Cloud
Lin, Ying-Chih; Yu, Chin-Sheng; Lin, Yen-Jen
2013-01-01
Recent progress in high-throughput instrumentation has led to an astonishing growth in both the volume and the complexity of biomedical data collected from various sources. These planet-size data bring serious challenges to storage and computing technologies. Cloud computing is one alternative for cracking this nut because it addresses both storage and high-performance computing on large-scale data. This work briefly introduces data-intensive computing systems and summarizes existing cloud-based resources in bioinformatics. These developments and applications should help biomedical research make its vast amounts of diverse data meaningful and usable. PMID:24288665
Baird, Amy B; Braun, Janet K; Engstrom, Mark D; Holbert, Ashlyn C; Huerta, Maritza G; Lim, Burton K; Mares, Michael A; Patton, John C; Bickham, John W
2017-01-01
Previous studies on genetics of hoary bats produced differing conclusions on the timing of their colonization of the Hawaiian Islands and whether or not North American (Aeorestes cinereus) and Hawaiian (A. semotus) hoary bats are distinct species. One study, using mtDNA COI and nuclear Rag2 and CMA1, concluded that hoary bats colonized the Hawaiian Islands no more than 10,000 years ago based on indications of population expansion at that time using Extended Bayesian Skyline Plots. The other study, using 3 mtDNA and 1 Y-chromosome locus, concluded that the Hawaiian Islands were colonized about 1 million years ago. To address the marked inconsistencies between those studies, we examined DNA sequences from 4 mitochondrial and 2 nuclear loci in lasiurine bats to investigate the timing of colonization of the Hawaiian Islands by hoary bats, test the hypothesis that Hawaiian and North American hoary bats belong to different species, and further investigate the generic level taxonomy within the tribe. Phylogenetic analysis and dating of the nodes of mtDNA haplotypes and of nuclear CMA1 alleles show that A. semotus invaded the Hawaiian Islands approximately 1.35 Ma and that multiple arrivals of A. cinereus occurred much more recently. Extended Bayesian Skyline plots show population expansion at about 20,000 years ago in the Hawaiian Islands, which we conclude does not represent the timing of colonization of the Hawaiian Islands given the high degree of genetic differentiation among A. cinereus and A. semotus (4.2% divergence at mtDNA Cytb) and the high degree of genetic diversity within A. semotus. Rather, population expansion 20,000 years ago could have resulted from colonization of additional islands, expansion after a bottleneck, or other factors. New genetic data also support the recognition of A. semotus and A. cinereus as distinct species, a finding consistent with previous morphological and behavioral studies. 
The phylogenetic analysis of CMA1 alleles shows the presence of 2 clades that are primarily associated with A. semotus mtDNA haplotypes, and are unique to the Hawaiian Islands. There is evidence for low levels of hybridization between A. semotus and A. cinereus on the Hawaiian Islands, but it is not extensive (<15% of individuals are of hybrid origin), and clearly each species is able to maintain its own genetic distinctiveness. Both mtDNA and nuclear DNA sequences show deep divergence between the 3 groups (genera) of lasiurine bats that correspond to the previously recognized morphological differences between them. We show that the Tribe Lasiurini contains the genera Aeorestes (hoary bats), Lasiurus (red bats), and Dasypterus (yellow bats).
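The 4.2% Cytb divergence quoted above is a pairwise sequence distance; its simplest, uncorrected form (the p-distance) is just the fraction of aligned sites that differ. A sketch with toy sequences, not the study's data:

```python
def p_distance(seq1, seq2):
    """Uncorrected pairwise distance between two aligned DNA sequences:
    the fraction of comparable sites that differ. Sites with gaps or
    ambiguity codes are skipped."""
    valid = [(a, b) for a, b in zip(seq1.upper(), seq2.upper())
             if a in "ACGT" and b in "ACGT"]
    if not valid:
        raise ValueError("no comparable sites")
    diffs = sum(a != b for a, b in valid)
    return diffs / len(valid)

# Toy 20-bp alignment with a single substitution -> p = 0.05 (5%)
d = p_distance("ACGTACGTACGTACGTACGT",
               "ACGTACGTACGTACGTACGA")
```

Phylogenetic studies typically apply a substitution-model correction (e.g. Kimura 2-parameter) on top of this raw count, since multiple hits at one site are invisible to the p-distance.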
Braun, Janet K.; Engstrom, Mark D.; Holbert, Ashlyn C.; Huerta, Maritza G.; Lim, Burton K.; Mares, Michael A.; Patton, John C.
2017-01-01
Previous studies on genetics of hoary bats produced differing conclusions on the timing of their colonization of the Hawaiian Islands and whether or not North American (Aeorestes cinereus) and Hawaiian (A. semotus) hoary bats are distinct species. One study, using mtDNA COI and nuclear Rag2 and CMA1, concluded that hoary bats colonized the Hawaiian Islands no more than 10,000 years ago based on indications of population expansion at that time using Extended Bayesian Skyline Plots. The other study, using 3 mtDNA and 1 Y-chromosome locus, concluded that the Hawaiian Islands were colonized about 1 million years ago. To address the marked inconsistencies between those studies, we examined DNA sequences from 4 mitochondrial and 2 nuclear loci in lasiurine bats to investigate the timing of colonization of the Hawaiian Islands by hoary bats, test the hypothesis that Hawaiian and North American hoary bats belong to different species, and further investigate the generic level taxonomy within the tribe. Phylogenetic analysis and dating of the nodes of mtDNA haplotypes and of nuclear CMA1 alleles show that A. semotus invaded the Hawaiian Islands approximately 1.35 Ma and that multiple arrivals of A. cinereus occurred much more recently. Extended Bayesian Skyline plots show population expansion at about 20,000 years ago in the Hawaiian Islands, which we conclude does not represent the timing of colonization of the Hawaiian Islands given the high degree of genetic differentiation among A. cinereus and A. semotus (4.2% divergence at mtDNA Cytb) and the high degree of genetic diversity within A. semotus. Rather, population expansion 20,000 years ago could have resulted from colonization of additional islands, expansion after a bottleneck, or other factors. New genetic data also support the recognition of A. semotus and A. cinereus as distinct species, a finding consistent with previous morphological and behavioral studies. 
The phylogenetic analysis of CMA1 alleles shows the presence of 2 clades that are primarily associated with A. semotus mtDNA haplotypes, and are unique to the Hawaiian Islands. There is evidence for low levels of hybridization between A. semotus and A. cinereus on the Hawaiian Islands, but it is not extensive (<15% of individuals are of hybrid origin), and clearly each species is able to maintain its own genetic distinctiveness. Both mtDNA and nuclear DNA sequences show deep divergence between the 3 groups (genera) of lasiurine bats that correspond to the previously recognized morphological differences between them. We show that the Tribe Lasiurini contains the genera Aeorestes (hoary bats), Lasiurus (red bats), and Dasypterus (yellow bats). PMID:29020097
Computer-Integrated Breakdown of Hardwood Sawlogs
Luis G. Occeña; Daniel L. Schmoldt; Philip A. Araman
1996-01-01
This paper describes work in progress concerning the development of an integrated approach to hardwood processing. The motivation for this work, research direction, and research developments are presented.
ERIC Educational Resources Information Center
Heidar, Davood Mashhadi; Afghari, Akbar
2015-01-01
The present paper concentrates on a web-based inquiry into synchronous computer-mediated communication (SCMC) via the Web 2.0 technologies Talk and Write and Skype. It investigates EFL learners' socio-cognitive progress through dynamic assessment (DA), which follows Vygotsky's inclination for supportive interchange in the zone of proximal…
ERIC Educational Resources Information Center
National Science Foundation, Washington, DC.
This report addresses an opportunity to accelerate progress in virtually every branch of science and engineering concurrently, while also boosting the American economy as business firms also learn to exploit these new capabilities. The successful rapid advancement in both science and technology creates its own challenges, four of which are…
Video-Based Eye Tracking to Detect the Attention Shift: A Computer Classroom Context-Aware System
ERIC Educational Resources Information Center
Kuo, Yung-Lung; Lee, Jiann-Shu; Hsieh, Min-Chai
2014-01-01
Eye and head movements are evoked in response to obvious visual attention shifts. However, there has been little progress so far on the causes of absent-mindedness. The paper proposes an attention-awareness system that captures the conditions of interacting eye gaze and head pose under various attentional switches in the computer classroom.…
Computer-Assisted Instruction: Decision Handbook.
1985-04-01
to feelings of "depersonalization" or "dehumanization." The approach is to document investigations of attitudes toward CBI held by students and...utilized within a computer-based training system that includes management of student progress, training resources, testing, and instructional materials...training time. As compared to programmed texts and workbooks, students were more attentive and stayed on task. The attentiveness to PLATO materials
MIT Laboratory for Computer Science Progress Report 27
1990-06-01
because of the natural, yet unexploited, concurrence that characterizes contemporary and prospective applications from business to sensory computing... Advanced Network Architecture: Academic Staff: D. Clark (Group Leader), D. Tennenhouse, J. Saltzer; Research Staff: J. Davin, K. Sollins; Graduate... Murray Hill, NJ, July 1989. Clinical Decision Making: Academic Staff: R. Patil, P. Szolovits (Group Leader), G. Rennels; Collaborating Investigators: M
Progress in analysis of computed tomography (CT) images of hardwood logs for defect detection
Erol Sarigul; A. Lynn Abbott; Daniel L. Schmoldt
2003-01-01
This paper addresses the problem of automatically detecting internal defects in logs using computed tomography (CT) images. The overall purpose is to assist in breakdown optimization. Several studies have shown that the commercial value of resulting boards can be increased substantially if defect locations are known in advance, and if this information is used to make...
Next generation keyboards: The importance of cognitive compatibility
NASA Technical Reports Server (NTRS)
Amell, John R.; Ewry, Michael E.; Colle, Herbert A.
1988-01-01
The computer keyboard of today is essentially the same as it has been for many years. Few advances have been made in keyboard design even though computer systems in general have made remarkable progress. This paper discusses the future of keyboards, their competition and compatibility with voice input systems, and possible special-application intelligent keyboards for controlling complex systems.
ERIC Educational Resources Information Center
Li, Dan; Wang, Jian
2014-01-01
Reading for personal interest and acquiring and using information through various reading processes are important parts of the reading literacy that students need to develop in order to progress successfully through their schooling and function fully in the information society. Computer-assisted reading instructional activities are assumed useful in…
Language, Learning, and Identity in Social Networking Sites for Language Learning: The Case of Busuu
ERIC Educational Resources Information Center
Alvarez Valencia, Jose Aldemar
2014-01-01
Recent progress in computer applications, such as the advent of web-based communication afforded by the Web 2.0, has paved the way for novel applications in language learning, namely social networking. Social networking has challenged the area of Computer Mediated Communication (CMC) to expand its research palette in order to…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, M.M.; Chao, B.T.
This technical progress report covers work on the project entitled Measurements of Solids Motion in Gas Fluidized Beds, under Grant No. DOE-F22-81PC40804, during the fifth quarter of the project, 1 October through 31 December 1982. The research concerns the measurement of solids particle velocity distributions and residence time distributions using the Computer-Aided Particle Tracking Facility (CAPTF) at the University of Illinois at Urbana-Champaign. The experimental equipment and measuring methods used to determine particle size distribution and particle motion, and the results obtained, are presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kuhl, D.E.
1993-06-01
This progress report describes the accomplishments of four programs: (1) Faster, simpler processing of positron-computing precursors: New physicochemical approaches; (2) Novel solid phase reagents and methods to improve radiosynthesis and isotope production; (3) Quantitative evaluation of the extraction of information from PET images; and (4) Optimization of tracer kinetic methods for radioligand studies in PET.
The National Shipbuilding Research Program 1985 Ship Production Symposium Volume 1
1985-09-01
though there is some impact on hull construction progress because assembly is performed in shops which provide ideal climate, lighting, and access...with the production of parts is necessary in order to pay employees, report performance to budget, and compute productivity. The latter requirement is...progress estimates in addition to the employees' normal timecards. Even this can be eliminated by using completion of previously performed tasks as
Ma, Xiaopeng; Phi Van, Valerie; Kimm, Melanie A; Prakash, Jaya; Kessler, Horst; Kosanke, Katja; Feuchtinger, Annette; Aichler, Michaela; Gupta, Aayush; Rummeny, Ernst J; Eisenblätter, Michel; Siveke, Jens; Walch, Axel K; Braren, Rickmer; Ntziachristos, Vasilis; Wildgruber, Moritz
2017-01-01
Integrins play an important role in tumor progression, invasion, and metastasis. We therefore aimed to evaluate a preclinical imaging approach applying αvβ3 integrin-targeted hybrid Fluorescence Molecular Tomography/X-ray Computed Tomography (FMT-XCT) for monitoring tumor progression as well as early therapy response in a syngeneic murine Non-Small Cell Lung Cancer (NSCLC) model. Lewis Lung Carcinomas were grown orthotopically in C57BL/6J mice and imaged in vivo using an αvβ3-targeted near-infrared fluorescence (NIRF) probe. αvβ3-targeted FMT-XCT was able to track tumor progression. Cilengitide was able to substantially block the binding of the NIRF probe and suppress the imaging signal. Additionally, mice were treated with an established chemotherapy regimen of Cisplatin and Bevacizumab or with a novel MEK inhibitor (Refametinib) for 2 weeks. While μCT revealed only a moderate slowdown of tumor growth, the αvβ3-dependent signal decreased significantly compared to non-treated mice as early as one week post treatment. αvβ3-targeted imaging might therefore become a promising tool for the assessment of early therapy response in the future. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
Recent progress in the imaging of soil processes at the microscopic scale, and a look ahead
NASA Astrophysics Data System (ADS)
Garnier, Patricia; Baveye, Philippe C.; Pot, Valérie; Monga, Olivier; Portell, Xavier
2016-04-01
Over the last few years, tremendous progress has been achieved in the visualization of soil structures at the microscopic scale. Computed tomography, based on synchrotron X-ray beams or table-top equipment, allows the visualization of pore geometry at micrometric resolution. Chemical and microbiological information obtainable in 2D cuts through soils can now be interpolated, with the support of CT-data, to produce 3-dimensional maps. In parallel with these analytical advances, significant progress has also been achieved in the computer simulation and visualization of a range of physical, chemical, and microbiological processes taking place in soil pores. In terms of water distribution and transport in soils, for example, the use of Lattice-Boltzmann models as well as models based on geometric primitives has been shown recently to reproduce very faithfully observations made with synchrotron X-ray tomography. Coupling of these models with fungal and bacterial growth models allows the description of a range of microbiologically-mediated processes of great importance at the moment, for example in terms of carbon sequestration. In this talk, we shall review progress achieved to date in this field, indicate where questions remain unanswered, and point out areas where further advances are expected in the next few years.
NASA Astrophysics Data System (ADS)
Christensen, C.; Summa, B.; Scorzelli, G.; Lee, J. W.; Venkat, A.; Bremer, P. T.; Pascucci, V.
2017-12-01
Massive datasets are becoming more common due to increasingly detailed simulations and higher resolution acquisition devices. Yet accessing and processing these huge data collections for scientific analysis is still a significant challenge. Solutions that rely on extensive data transfers are increasingly untenable and often impossible due to lack of sufficient storage at the client side as well as insufficient bandwidth to conduct such large transfers, that in some cases could entail petabytes of data. Large-scale remote computing resources can be useful, but utilizing such systems typically entails some form of offline batch processing with long delays, data replications, and substantial cost for any mistakes. Both types of workflows can severely limit the flexible exploration and rapid evaluation of new hypotheses that are crucial to the scientific process and thereby impede scientific discovery. In order to facilitate interactivity in both analysis and visualization of these massive data ensembles, we introduce a dynamic runtime system suitable for progressive computation and interactive visualization of arbitrarily large, disparately located spatiotemporal datasets. Our system includes an embedded domain-specific language (EDSL) that allows users to express a wide range of data analysis operations in a simple and abstract manner. The underlying runtime system transparently resolves issues such as remote data access and resampling while at the same time maintaining interactivity through progressive and interruptible processing. Computations involving large amounts of data can be performed remotely in an incremental fashion that dramatically reduces data movement, while the client receives updates progressively thereby remaining robust to fluctuating network latency or limited bandwidth. This system facilitates interactive, incremental analysis and visualization of massive remote datasets up to petabytes in size. 
Our system is now available for general use in the community through both docker and anaconda.
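The progressive, interruptible style of computation the runtime provides can be illustrated with a hypothetical sketch (not the authors' system): a generator yields successively refined estimates over chunks of a large dataset, so a client can render partial results and stop whenever the answer is good enough:

```python
def progressive_mean(data, chunk_size=1000):
    """Yield successively refined estimates of the mean, one per chunk,
    so a caller can display partial results and interrupt at any point."""
    total, count = 0.0, 0
    for start in range(0, len(data), chunk_size):
        chunk = data[start:start + chunk_size]
        total += sum(chunk)
        count += len(chunk)
        yield total / count  # current best estimate

# A client consumes updates until satisfied (or the network stalls):
data = list(range(10000))  # stand-in for a large remote dataset
estimates = list(progressive_mean(data))
```

The final estimate equals the exact mean; earlier entries are coarser but arrive immediately, which is the property that keeps exploration interactive over slow links.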
Perspectives on the Future of CFD
NASA Technical Reports Server (NTRS)
Kwak, Dochan
2000-01-01
This viewgraph presentation gives an overview of the future of computational fluid dynamics (CFD), a field that has pioneered flow simulation. Over time, CFD has progressed along with computing power: numerical methods have advanced as CPU and memory capacity have increased. Complex configurations are now routinely computed, and direct numerical simulations (DNS) and large eddy simulations (LES) are used to study turbulence. As computing resources shifted to parallel and distributed platforms, computer science aspects such as scalability (algorithmic and implementation), portability, and transparent coding have advanced. Examples of potential future (or current) challenges include risk assessment, limitations of heuristic models, and the development of CFD and information technology (IT) tools.
NASA Technical Reports Server (NTRS)
Aggrawal, Bharat
1994-01-01
This viewgraph presentation describes the development of user interfaces for OS/2 versions of computer codes for the analysis of seals. Current status, new features, work in progress, and future plans are discussed.
An assessment of laser velocimetry in hypersonic flow
NASA Technical Reports Server (NTRS)
1992-01-01
Although extensive progress has been made in computational fluid mechanics, reliable flight vehicle designs and modifications still cannot be made without recourse to extensive wind tunnel testing. Future progress in the computation of hypersonic flow fields is restricted by the need for a reliable mean flow and turbulence modeling data base which could be used to aid in the development of improved empirical models for use in numerical codes. Currently, there are few compressible flow measurements which could be used for this purpose. In this report, the results of experiments designed to assess the potential for laser velocimeter measurements of mean flow and turbulent fluctuations in hypersonic flow fields are presented. Details of a new laser velocimeter system which was designed and built for this test program are described.
NASA Technical Reports Server (NTRS)
Daw, Murray S.; Mills, Michael J.
2003-01-01
We report on the progress made during the first year of the project. Most of the progress at this point has been on the theoretical and computational side. Here are the highlights: (1) A new code, tailored for high-end desktop computing, now combines modern Accelerated Dynamics (AD) with the well-tested Embedded Atom Method (EAM); (2) The new Accelerated Dynamics allows the study of relatively slow, thermally-activated processes, such as diffusion, which are much too slow for traditional Molecular Dynamics; (3) We have benchmarked the new AD code on a rather simple and well-known process: vacancy diffusion in copper; and (4) We have begun application of the AD code to the diffusion of vacancies in ordered intermetallics.
NASA Astrophysics Data System (ADS)
Vrettaros, John; Vouros, George; Drigas, Athanasios S.
This article studies the expediency of using neural network technology and the development of back-propagation network (BPN) models for the automated evaluation of the answers and progress of deaf students who possess basic knowledge of the English language and computer skills, within a virtual e-learning environment. The performance of the developed neural models is evaluated using the correlation factor between the neural networks' response values and the real data, as well as the percentage error between the networks' estimates and the real data, both during the training process and afterwards with unknown data that were not used in training.
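The correlation factor used above to score the models is the standard Pearson coefficient between the networks' outputs and the real values; a generic sketch (not tied to the authors' BPN implementation):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between model predictions xs
    and real target values ys (lists of equal length)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Perfectly linear predictions give r = 1.0
r = pearson_r([1, 2, 3, 4], [2, 4, 6, 8])
```

A value near 1 means the network's estimates track the real scores; the percentage-error measure the abstract also mentions would complement this by capturing absolute accuracy.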
Lighting Simulation and Design Program (LSDP)
NASA Astrophysics Data System (ADS)
Smith, D. A.
This computer program simulates a user-defined lighting configuration. It has been developed as a tool to aid in the design of exterior lighting systems. Although the program is used primarily for perimeter security lighting design, it has potential use in any application where the light source can be approximated by a point source. A database of luminaire photometric information is maintained for use with this program. The user defines the surface area to be illuminated with a rectangular grid and specifies luminaire positions. Illumination values are calculated for regularly spaced points in that area, and isolux contour plots are generated. The numerical and graphical output for a particular site model are then available for analysis. The amount of time spent on point-to-point illumination computation with this program is much less than that required for tedious hand calculations. The ease with which various parameters can be interactively modified with the program also reduces the time and labor expended. Consequently, the feasibility of design ideas can be examined, modified, and retested more thoroughly, and overall design costs can be substantially lessened by using this program as an adjunct to the design process.
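The point-to-point computation such a program automates rests on the inverse-square cosine law for a point source, E = I cos(theta) / d^2. A hedged sketch; the luminaire height and intensity below are illustrative, and a real tool would interpolate measured photometric data rather than assume uniform intensity:

```python
import math

def illuminance(lum_xyz, intensity_cd, px, py):
    """Horizontal illuminance (lux) at ground point (px, py, 0) from a
    point source at lum_xyz emitting intensity_cd candela uniformly,
    via E = I * cos(theta) / d^2 with theta measured from the vertical."""
    lx, ly, lz = lum_xyz
    dx, dy = px - lx, py - ly
    d2 = dx * dx + dy * dy + lz * lz
    d = math.sqrt(d2)
    cos_theta = lz / d  # angle between the downward ray and vertical
    return intensity_cd * cos_theta / d2

# Illustrative luminaire: 8 m mounting height, 10,000 cd.
# Directly beneath it, E = 10000 / 8^2 = 156.25 lux.
e0 = illuminance((0.0, 0.0, 8.0), 10000.0, 0.0, 0.0)

# Regularly spaced grid of illumination values over a 20 m x 20 m area,
# the kind of point-by-point table the program contours as isolux plots.
grid = [[illuminance((0.0, 0.0, 8.0), 10000.0, float(x), float(y))
         for x in range(-10, 11, 5)] for y in range(-10, 11, 5)]
```

Summing this function over several luminaire positions at each grid point gives the multi-source illumination map from which isolux contours are drawn.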
Thermal sensation, rate of temperature change, and the heat dissipation design for tablet computers.
Zhang, Han; Hedge, Alan; Cosley, Daniel
2017-07-01
Past research has shown that the rate of change of skin surface temperature can affect thermal sensation. This study investigated users' thermal responses to a tablet heating surface with different heat pads and different temperature change rates. The test conditions were: A, keeping the surface at a constant 42 °C; B, increasing the surface temperature from 38 °C to 42 °C at a rate of 0.02 °C/s in progressive intervals; C, increasing the temperature at 0.15 °C/s in progressive intervals; and D, heating left and right side pads alternately from 38 °C to 42 °C at 0.15 °C/s in progressive intervals. Overall, the lowest temperature change rate of 0.02 °C/s was most preferred in terms of thermal comfort. The findings suggest a potential to improve users' thermal experience by dissipating tablet computer heat at a lower temperature change rate, or by alternating the dissipation areas. Copyright © 2017 Elsevier Ltd. All rights reserved.
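The ramp durations implied by the test conditions follow from simple arithmetic: a 4 °C rise (38 °C to 42 °C) takes 200 s at 0.02 °C/s but only about 27 s at 0.15 °C/s, which is the contrast the study manipulates:

```python
def ramp_time_s(t_start_c, t_end_c, rate_c_per_s):
    """Seconds needed to ramp between two temperatures at a constant
    rate of change (degrees Celsius per second)."""
    return (t_end_c - t_start_c) / rate_c_per_s

slow = ramp_time_s(38.0, 42.0, 0.02)  # condition B: 200 s
fast = ramp_time_s(38.0, 42.0, 0.15)  # conditions C and D: about 27 s
```

The preference for the slow ramp thus corresponds to spreading the same 4 °C rise over roughly seven times as long an interval.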