Sample records for USA parallel analysis

  1. Parallel Epidemics of Community-Associated Methicillin-Resistant Staphylococcus aureus USA300 Infection in North and South America.

    PubMed

    Planet, Paul J; Diaz, Lorena; Kolokotronis, Sergios-Orestis; Narechania, Apurva; Reyes, Jinnethe; Xing, Galen; Rincon, Sandra; Smith, Hannah; Panesso, Diana; Ryan, Chanelle; Smith, Dylan P; Guzman, Manuel; Zurita, Jeannete; Sebra, Robert; Deikus, Gintaras; Nolan, Rathel L; Tenover, Fred C; Weinstock, George M; Robinson, D Ashley; Arias, Cesar A

    2015-12-15

    The community-associated methicillin-resistant Staphylococcus aureus (CA-MRSA) epidemic in the United States is attributed to the spread of the USA300 clone. An epidemic of CA-MRSA closely related to USA300 has occurred in northern South America (USA300 Latin-American variant, USA300-LV). Using phylogenomic analysis, we aimed to understand the relationships between these 2 epidemics. We sequenced the genomes of 51 MRSA clinical isolates collected between 1999 and 2012 from the United States, Colombia, Venezuela, and Ecuador. Phylogenetic analysis was used to infer the relationships and times since the divergence of the major clades. Phylogenetic analyses revealed 2 dominant clades that segregated by geographical region, had a putative common ancestor in 1975, and originated in 1989, in North America, and in 1985, in South America. Emergence of these parallel epidemics coincides with the independent acquisition of the arginine catabolic mobile element (ACME) in North American isolates and a novel copper and mercury resistance (COMER) mobile element in South American isolates. Our results reveal the existence of 2 parallel USA300 epidemics that shared a recent common ancestor. The simultaneous rapid dissemination of these 2 epidemic clades suggests the presence of shared, potentially convergent adaptations that enhance fitness and ability to spread. © The Author 2015. Published by Oxford University Press on behalf of the Infectious Diseases Society of America. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  2. Culminating Point and the 38th Parallel

    DTIC Science & Technology

    1994-01-01

    Air War College, Air University. The Culminating Point and the 38th Parallel, by James L. Bryan, Lieutenant Colonel, USA. ... securing the only attainable objective the following Spring. Why do this analysis on the Korean War when so much has already been written about it

  3. FastID: Extremely Fast Forensic DNA Comparisons

    DTIC Science & Technology

    2017-05-19

    FastID: Extremely Fast Forensic DNA Comparisons. Darrell O. Ricke, PhD, Bioengineering Systems & Technologies, Massachusetts Institute of Technology Lincoln Laboratory, Lexington, MA, USA (Darrell.Ricke@ll.mit.edu). Abstract: Rapid analysis of DNA forensic samples can have a critical impact on time-sensitive investigations. Analysis of forensic DNA samples by massively parallel sequencing is creating the next gold standard for DNA

  4. Improved hydrogeophysical characterization and monitoring through parallel modeling and inversion of time-domain resistivity and induced-polarization data

    USGS Publications Warehouse

    Johnson, Timothy C.; Versteeg, Roelof J.; Ward, Andy; Day-Lewis, Frederick D.; Revil, André

    2010-01-01

    Electrical geophysical methods have found wide use in the growing discipline of hydrogeophysics for characterizing the electrical properties of the subsurface and for monitoring subsurface processes in terms of the spatiotemporal changes in subsurface conductivity, chargeability, and source currents they govern. Presently, multichannel and multielectrode data collection systems can collect large data sets in relatively short periods of time. Practitioners, however, often are unable to fully utilize these large data sets and the information they contain because of standard desktop-computer processing limitations. These limitations can be addressed by utilizing the storage and processing capabilities of parallel computing environments. We have developed a parallel distributed-memory forward and inverse modeling algorithm for analyzing resistivity and time-domain induced polarization (IP) data. The primary components of the parallel computations include distributed computation of the pole solutions in forward mode, distributed storage and computation of the Jacobian matrix in inverse mode, and parallel execution of the inverse equation solver. We have tested the corresponding parallel code in three efforts: (1) resistivity characterization of the Hanford 300 Area Integrated Field Research Challenge site in Hanford, Washington, U.S.A., (2) resistivity characterization of a volcanic island in the southern Tyrrhenian Sea in Italy, and (3) resistivity and IP monitoring of biostimulation at a Superfund site in Brandywine, Maryland, U.S.A. Inverse analysis of each of these data sets would be limited or impossible in a standard serial computing environment, which underscores the need for parallel high-performance computing to fully utilize the potential of electrical geophysical methods in hydrogeophysical applications.
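    The distributed storage and computation of the Jacobian described in this abstract can be illustrated with a minimal thread-parallel sketch. This is illustrative only: the published code is a distributed-memory MPI implementation, and the toy forward model and finite-difference sensitivities below are hypothetical stand-ins, not the resistivity/IP physics.

```python
from concurrent.futures import ThreadPoolExecutor

def forward_model(params):
    # Toy nonlinear forward model standing in for a resistivity simulation:
    # maps model parameters to predicted data values.
    return [params[0] * params[1], params[0] + params[1] ** 2]

def jacobian_column(params, j, h=1e-6):
    # Finite-difference sensitivity of all predicted data to parameter j.
    bumped = list(params)
    bumped[j] += h
    base = forward_model(params)
    pert = forward_model(bumped)
    return [(p - b) / h for p, b in zip(pert, base)]

def parallel_jacobian(params, workers=2):
    # Each worker computes the sensitivity column for one parameter,
    # mimicking distributed computation of the Jacobian matrix.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        cols = list(pool.map(lambda j: jacobian_column(params, j),
                             range(len(params))))
    # Transpose columns into rows (data x parameters).
    return [list(row) for row in zip(*cols)]
```

    For params = [2.0, 3.0] this returns approximately [[3, 2], [1, 6]], each column having been computed independently, which is what makes the Jacobian assembly embarrassingly parallel.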

  5. Canadian Space Launch: Exploiting Northern Latitudes For Efficient Space Launch

    DTIC Science & Technology

    2015-04-01

    Peoples' Republic of China ... USA Launch ... taxation and legislation that make Canada an attractive destination for commercial space companies. General Definitions: Highly Inclined Orbit ... launches from sites north of the 35th parallel. USA Launch Facilities: There are 3 US-based launch facilities that conduct launch operations north

  6. Current globalization of drug interventional clinical trials: characteristics and associated factors, 2011-2013.

    PubMed

    Jeong, Sohyun; Sohn, Minji; Kim, Jae Hyun; Ko, Minoh; Seo, Hee-Won; Song, Yun-Kyoung; Choi, Boyoon; Han, Nayoung; Na, Han-Sung; Lee, Jong Gu; Kim, In-Wha; Oh, Jung Mi; Lee, Euni

    2017-06-21

    Clinical trial globalization is a major trend for industry-sponsored clinical trials. There has been a shift in clinical trial sites towards emerging regions of Eastern Europe, Latin America, Asia, the Middle East, and Africa. Our study objectives were to evaluate the current characteristics of clinical trials and to identify the multiple associated factors that could explain clinical trial globalization and its implications in 2011-2013. The data elements of "phase," "recruitment status," "type of sponsor," "age groups," and "design of trial" from 30 countries were extracted from the ClinicalTrials.gov website. Ten continental representative countries including the USA were selected and the design elements were compared to those of the USA. Factors associated with trial site distribution were chosen for a multilinear regression analysis. The USA, Germany, France, Canada, and the United Kingdom were the "top five" countries which most frequently held clinical trials. The design elements from nine continental representative countries were quite different from those of the USA; phase 1 trials were more prevalent in India (OR 1.517, p < 0.001) while phase 3 trials were much more prevalent in all nine representative countries than in the USA. A larger number of "child" age group trials was performed in Poland (OR 1.852, p < 0.001), Israel (OR 1.546, p = 0.005), and South Africa (OR 1.963, p < 0.001) than in the USA. Multivariate analysis showed that health care expenditure per capita, Economic Freedom Index, Human Capital Index, and Intellectual Property Rights Index could explain 63.6% of the variance in the regional distribution of clinical trials. The globalization of clinical trials in the emerging regions of Asia, South Africa, and Eastern Europe developed in parallel with the factors of economic drive, population for recruitment, and regulatory constraints.

  7. Massively parallel sequencing-enabled mixture analysis of mitochondrial DNA samples.

    PubMed

    Churchill, Jennifer D; Stoljarova, Monika; King, Jonathan L; Budowle, Bruce

    2018-02-22

    The mitochondrial genome has a number of characteristics that provide useful information to forensic investigations. Massively parallel sequencing (MPS) technologies offer improvements to the quantitative analysis of the mitochondrial genome, specifically the interpretation of mixed mitochondrial samples. Two-person mixtures with nuclear DNA ratios of 1:1, 5:1, 10:1, and 20:1 of individuals from different and similar phylogenetic backgrounds and three-person mixtures with nuclear DNA ratios of 1:1:1 and 5:1:1 were prepared using the Precision ID mtDNA Whole Genome Panel and Ion Chef, and sequenced on the Ion PGM or Ion S5 sequencer (Thermo Fisher Scientific, Waltham, MA, USA). These data were used to evaluate whether and to what degree MPS mixtures could be deconvolved. Analysis was effective in identifying the major contributor in each instance, while SNPs from the minor contributor's haplotype only were identified in the 1:1, 5:1, and 10:1 two-person mixtures. While the major contributor was identified from the 5:1:1 mixture, analysis of the three-person mixtures was more complex, and the mixed haplotypes could not be completely parsed. These results indicate that mixed mitochondrial DNA samples may be interpreted with the use of MPS technologies.

  8. A randomised, single-blind, single-dose, three-arm, parallel-group study in healthy subjects to demonstrate pharmacokinetic equivalence of ABP 501 and adalimumab

    PubMed Central

    Kaur, Primal; Chow, Vincent; Zhang, Nan; Moxness, Michael; Kaliyaperumal, Arunan; Markus, Richard

    2017-01-01

    Objective To demonstrate pharmacokinetic (PK) similarity of biosimilar candidate ABP 501 relative to adalimumab reference product from the USA and European Union (EU) and evaluate safety, tolerability and immunogenicity of ABP 501. Methods Randomised, single-blind, single-dose, three-arm, parallel-group study; healthy subjects were randomised to receive ABP 501 (n=67), adalimumab (USA) (n=69) or adalimumab (EU) (n=67) 40 mg subcutaneously. Primary end points were area under the serum concentration-time curve from time 0 extrapolated to infinity (AUCinf) and the maximum observed concentration (Cmax). Secondary end points included safety and immunogenicity. Results AUCinf and Cmax were similar across the three groups. Geometrical mean ratio (GMR) of AUCinf was 1.11 between ABP 501 and adalimumab (USA), and 1.04 between ABP 501 and adalimumab (EU). GMR of Cmax was 1.04 between ABP 501 and adalimumab (USA) and 0.96 between ABP 501 and adalimumab (EU). The 90% CIs for the GMRs of AUCinf and Cmax were within the prespecified standard PK equivalence criteria of 0.80 to 1.25. Treatment-related adverse events were mild to moderate and were reported for 35.8%, 24.6% and 41.8% of subjects in the ABP 501, adalimumab (USA) and adalimumab (EU) groups; incidence of antidrug antibodies (ADAbs) was similar among the study groups. Conclusions Results of this study demonstrated PK similarity of ABP 501 with adalimumab (USA) and adalimumab (EU) after a single 40-mg subcutaneous injection. No new safety signals with ABP 501 were identified. The safety and tolerability of ABP 501 was similar to the reference products, and similar ADAb rates were observed across the three groups. Trial registration number EudraCT number 2012-000785-37; Results. PMID:27466231
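    The equivalence criterion this study applies is the standard PK bioequivalence check: the 90% CI of the geometric mean ratio must lie entirely within 0.80 to 1.25. A minimal sketch follows; the functions and numbers are illustrative only, not the trial's actual statistics, which come from log-scale ANOVA-based confidence intervals.

```python
import math

def geometric_mean(values):
    # Geometric mean via the mean of logarithms.
    return math.exp(sum(math.log(v) for v in values) / len(values))

def gmr(test_values, reference_values):
    # Geometric mean ratio of test vs reference exposure (e.g. AUCinf, Cmax).
    return geometric_mean(test_values) / geometric_mean(reference_values)

def within_equivalence(ci_lower, ci_upper, lo=0.80, hi=1.25):
    # Standard PK equivalence criterion: the 90% CI of the GMR
    # must fall entirely within [0.80, 1.25].
    return lo <= ci_lower and ci_upper <= hi
```

    Under this rule, a 90% CI of (0.95, 1.18) passes while (0.78, 1.10) fails, because the lower bound falls below 0.80.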

  9. LLMapReduce: Multi-Lingual Map-Reduce for Supercomputing Environments

    DTIC Science & Technology

    2015-11-20

    ... 1990s. Popularized by Google [36] and Apache Hadoop [37], map-reduce has become a staple technology of the ever-growing big data community ... Lexington, MA, U.S.A. Abstract: The map-reduce parallel programming model has become extremely popular in the big data community. Many big data ... to big data users running on a supercomputer. LLMapReduce dramatically simplifies map-reduce programming by providing simple parallel programming
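    The map-reduce model that LLMapReduce builds on can be sketched in a few lines. This is a generic word-count illustration of the map/shuffle/reduce phases, not LLMapReduce's actual multi-lingual API.

```python
from collections import defaultdict
from concurrent.futures import ThreadPoolExecutor

def map_reduce(records, mapper, reducer, workers=2):
    # Map phase: apply the mapper to each record in parallel,
    # producing lists of (key, value) pairs.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        mapped = list(pool.map(mapper, records))
    # Shuffle phase: group all values by key.
    groups = defaultdict(list)
    for pairs in mapped:
        for key, value in pairs:
            groups[key].append(value)
    # Reduce phase: collapse each key's values to a single result.
    return {key: reducer(key, values) for key, values in groups.items()}

# Classic word count as the mapper/reducer pair.
def count_words(line):
    return [(word, 1) for word in line.split()]

def sum_counts(word, counts):
    return sum(counts)
```

    For example, map_reduce(["big data", "big wins"], count_words, sum_counts) yields {"big": 2, "data": 1, "wins": 1}: the two "big" pairs are grouped in the shuffle and summed in the reduce.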

  10. Parallel evolution of image processing tools for multispectral imagery

    NASA Astrophysics Data System (ADS)

    Harvey, Neal R.; Brumby, Steven P.; Perkins, Simon J.; Porter, Reid B.; Theiler, James P.; Young, Aaron C.; Szymanski, John J.; Bloch, Jeffrey J.

    2000-11-01

    We describe the implementation and performance of a parallel, hybrid evolutionary-algorithm-based system, which optimizes image processing tools for feature-finding tasks in multi-spectral imagery (MSI) data sets. Our system uses an integrated spatio-spectral approach and is capable of combining suitably-registered data from different sensors. We investigate the speed-up obtained by parallelization of the evolutionary process via multiple processors (a workstation cluster) and develop a model for prediction of run-times for different numbers of processors. We demonstrate our system on Landsat Thematic Mapper MSI, covering the recent Cerro Grande fire at Los Alamos, NM, USA.
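    A standard way to model parallel speed-up of the kind predicted here is Amdahl's law; the sketch below is a generic stand-in, not the run-time model developed in the paper.

```python
def amdahl_speedup(serial_fraction, processors):
    # Amdahl's law: speedup is bounded by the fraction of work
    # that cannot be parallelized.
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / processors)

def predicted_runtime(t1, serial_fraction, processors):
    # Predicted run time on p processors given single-processor time t1.
    return t1 / amdahl_speedup(serial_fraction, processors)
```

    With no serial fraction, 4 processors give a 4x speedup (100 s becomes 25 s); with even a 10% serial fraction, 10 processors give only about a 5.3x speedup, which is why measured cluster speed-ups fall below the processor count.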

  11. Scan Directed Load Balancing for Highly-Parallel Mesh-Connected Computers

    DTIC Science & Technology

    1991-07-01

    AD-A242 045. Scan Directed Load Balancing for Highly-Parallel Mesh-Connected Computers. Edoardo S. Biagioni, Jan F. Prins, Department of Computer Science, University of North Carolina, Chapel Hill, N.C. 27599-3175, USA (biagioni@cs.unc.edu, prins@cs.unc.edu). Abstract: Scan Directed ... MasPar Computer Corporation. Bibliography: [1] Edoardo S. Biagioni. Scan Directed Load Balancing. PhD thesis, University of North Carolina, Chapel Hill

  12. A randomised, single-blind, single-dose, three-arm, parallel-group study in healthy subjects to demonstrate pharmacokinetic equivalence of ABP 501 and adalimumab.

    PubMed

    Kaur, Primal; Chow, Vincent; Zhang, Nan; Moxness, Michael; Kaliyaperumal, Arunan; Markus, Richard

    2017-03-01

    To demonstrate pharmacokinetic (PK) similarity of biosimilar candidate ABP 501 relative to adalimumab reference product from the USA and European Union (EU) and evaluate safety, tolerability and immunogenicity of ABP 501. Randomised, single-blind, single-dose, three-arm, parallel-group study; healthy subjects were randomised to receive ABP 501 (n=67), adalimumab (USA) (n=69) or adalimumab (EU) (n=67) 40 mg subcutaneously. Primary end points were area under the serum concentration-time curve from time 0 extrapolated to infinity (AUC inf ) and the maximum observed concentration (C max ). Secondary end points included safety and immunogenicity. AUC inf and C max were similar across the three groups. Geometrical mean ratio (GMR) of AUC inf was 1.11 between ABP 501 and adalimumab (USA), and 1.04 between ABP 501 and adalimumab (EU). GMR of C max was 1.04 between ABP 501 and adalimumab (USA) and 0.96 between ABP 501 and adalimumab (EU). The 90% CIs for the GMRs of AUC inf and C max were within the prespecified standard PK equivalence criteria of 0.80 to 1.25. Treatment-related adverse events were mild to moderate and were reported for 35.8%, 24.6% and 41.8% of subjects in the ABP 501, adalimumab (USA) and adalimumab (EU) groups; incidence of antidrug antibodies (ADAbs) was similar among the study groups. Results of this study demonstrated PK similarity of ABP 501 with adalimumab (USA) and adalimumab (EU) after a single 40-mg subcutaneous injection. No new safety signals with ABP 501 were identified. The safety and tolerability of ABP 501 was similar to the reference products, and similar ADAb rates were observed across the three groups. EudraCT number 2012-000785-37; Results. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  13. China and the United States--Global partners, competitors and collaborators in nanotechnology development.

    PubMed

    Gao, Yu; Jin, Biyu; Shen, Weiyu; Sinko, Patrick J; Xie, Xiaodong; Zhang, Huijuan; Jia, Lee

    2016-01-01

    USA and China are two leading countries engaged in nanotechnology research and development. They compete with each other for the fruits of this innovative area in a parallel and compatible manner. Understanding the status and developmental prospects of nanotechnology in USA and China is important for policy-makers to decide nanotechnology priorities and funding, and to explore new ways for global cooperation on key issues. We here present the nanoscience and nanomedicine research and the related productivity measured by publications, patent applications, governmental funding, policies and regulations, institutional translational research, and industrial and enterprise growth in nanotechnology-related fields across China and USA. The comparison reveals some marked asymmetries of nanotechnology development in China and USA, which may be helpful for future directions to strengthen nanotechnology collaboration for both countries, and for the world as a whole. Copyright © 2015 Elsevier Inc. All rights reserved.

  14. MRSA Causing Infections in Hospitals in Greater Metropolitan New York: Major Shift in the Dominant Clonal Type between 1996 and 2014.

    PubMed

    Pardos de la Gandara, Maria; Curry, Marie; Berger, Judith; Burstein, David; Della-Latta, Phyllis; Kopetz, Virgina; Quale, John; Spitzer, Eric; Tan, Rexie; Urban, Carl; Wang, Guiqing; Whittier, Susan; de Lencastre, Herminia; Tomasz, Alexander

    2016-01-01

    A surveillance study in 1996 identified the USA100 clone (ST5/SCCmecII)-also known as the "New York/Japan" clone-as the most prevalent MRSA causing infections in 12 New York City hospitals. Here we update the epidemiology of MRSA in seven of the same hospitals eighteen years later in 2013/14. Most of the current MRSA isolates (78 of 121) belonged to the MRSA clone USA300 (CC8/SCCmecIV) but the USA100 clone-dominant in the 1996 survey-still remained the second most frequent MRSA (25 of the 121 isolates) causing 32% of blood stream infections. The USA300 clone was most common in skin and soft tissue infections (SSTIs) and was associated with 84.5% of SSTIs compared to 5% caused by the USA100 clone. Our data indicate that by 2013/14, the USA300 clone replaced the New York/Japan clone as the most frequent cause of MRSA infections in hospitals in Metropolitan New York. In parallel with this shift in the clonal type of MRSA, there was also a striking change in the types of MRSA infections from 1996 to 2014.

  15. Diazotrophy in the Deep: An analysis of the distribution, magnitude, geochemical controls, and biological mediators of deep-sea benthic nitrogen fixation

    NASA Astrophysics Data System (ADS)

    Dekas, Anne Elizabeth

    Biological nitrogen fixation (the conversion of N2 to NH3) is a critical process in the oceans, counteracting the production of N2 gas by dissimilatory bacterial metabolisms and providing a source of bioavailable nitrogen to many nitrogen-limited ecosystems. One currently poorly studied and potentially underappreciated habitat for diazotrophic organisms is the sediments of the deep-sea. Although nitrogen fixation was once thought to be negligible in non-photosynthetically driven benthic ecosystems, the present study demonstrates the occurrence and expression of a diversity of nifH genes (those necessary for nitrogen fixation), as well as a widespread ability to fix nitrogen at high rates in these locations. The following research explores the distribution, magnitude, geochemical controls, and biological mediators of nitrogen fixation at several deep-sea sediment habitats, including active methane seeps (Mound 12, Costa Rica; Eel River Basin, CA, USA; Hydrate Ridge, OR, USA; and Monterey Canyon, CA, USA), whale-fall sites (Monterey Canyon, CA), and background deep-sea sediment (off-site Mound 12 Costa Rica, off-site Hydrate Ridge, OR, USA; and Monterey Canyon, CA, USA). The first of the five chapters describes the FISH-NanoSIMS method, which we optimized for the analysis of closely associated microbial symbionts in marine sediments. The second describes an investigation of methane seep sediment from the Eel River Basin, where we recovered nifH sequences from extracted DNA, and used FISH-NanoSIMS to identify methanotrophic archaea (ANME-2) as diazotrophs, when associated with functional sulfate-reducing bacterial symbionts. The third and fourth chapters focus on the distribution and diversity of active diazotrophs (respectively) in methane seep sediment from Mound 12, Costa Rica, using a combination of 15N-labeling experiments, FISH-NanoSIMS, and RNA and DNA analysis. 
The fifth chapter expands the scope of the investigation by targeting diverse samples from methane seep, whale-fall, and background sediment collected along the Eastern Pacific Margin, and comparing the rates of nitrogen fixation observed to geochemical measurements collected in parallel. Together, these analyses represent the most extensive investigation of deep-sea nitrogen fixation to date, and work towards understanding the contribution of benthic nitrogen fixation to global marine nitrogen cycling.

  16. Structural analysis of the Lombard thrust sheet and adjacent areas in the Helena salient, southwest Montana, USA

    NASA Astrophysics Data System (ADS)

    Whisner, Stephen C.; Schmidt, Christopher J.; Whisner, Jennifer B.

    2014-12-01

    The Helena salient is a prominent craton-convex curve in the Cordillera thrust belt of Montana, USA. The Lombard thrust sheet is the primary sheet in the salient. Structural analysis of fold trends, cleavage attitudes, and movement on minor faults is used to better understand both the geometry of the Lombard thrust and the kinematic development of the salient. Early W-E to WNW-ENE shortening directions in the Lombard sheet are indicated by fold trends in the center of the thrust sheet. The same narrow range of shortening directions is inferred from kinematic analysis of movement on minor faults and the orientations of unrotated cleavage planes along the southern lateral ramp boundary of the salient. As the salient developed, the amount and direction of shortening were locally modified as listric detachment faults rotated some tight folds to the NW, and as right-lateral simple shear, caused by lock-up and folding of the Jefferson Canyon fault above the lateral ramp, rotated other folds northeastward. Where the lateral ramp and frontal-oblique ramp intersect, folds were rotated back to the NW. Our interpretation of dominant W-E to WNW-ESE shortening in the Lombard sheet, later altered by local rotations, supports a model of salient formation by primary parallel transport modified by interactions with a lateral ramp.

  17. Limnodrilus sulphurensis n. sp., from a sulfur cave in Colorado, USA, with notes on the morphologically similar L. profundicola (Clitellata, Naididae, Tubificinae).

    PubMed

    Fend, Steven V; Liu, Yingkui; Steinmann, David; Giere, Olav; Barton, Hazel A; Luiszer, Fred; Erséus, Christer

    2016-01-18

    A new species of the tubificine genus Limnodrilus is described and COI barcoded from Sulphur Cave and associated springs in Colorado, USA. The habitats are characterized by high sulfide concentrations. The new species, L. sulphurensis, is distinguished from all congeners by the elongate, nearly parallel teeth of chaetae in its anterior segments. It has a penis sheath resembling that of L. profundicola; consequently, museum specimens and new collections are examined here to resolve some of the taxonomic confusion surrounding that widespread, but uncommon species.

  18. Antarctic Exploration Parallels for Future Human Planetary Exploration: The Role and Utility of Long Range, Long Duration Traverses

    NASA Technical Reports Server (NTRS)

    Hoffman, Stephen J. (Editor); Voels, Stephen A. (Editor)

    2012-01-01

    Topics covered include: Antarctic Exploration Parallels for Future Human Planetary Exploration: Science Operations Lessons Learned, Planning, and Equipment Capabilities for Long Range, Long Duration Traverses; Parallels Between Antarctic Travel in 1950 and Planetary Travel in 2050 (to Accompany Notes on "The Norwegian British-Swedish Antarctic Expedition 1949-52"); My IGY in Antarctica; Short Trips and a Traverse; Geologic Traverse Planning for Apollo Missions; Desert Research and Technology Studies (DRATS) Traverse Planning; Science Traverses in the Canadian High Arctic; NOR-USA Scientific Traverse of East Antarctica: Science and Logistics on a Three-Month Expedition Across Antarctica's Farthest Frontier; A Notional Example of Understanding Human Exploration Traverses on the Lunar Surface; and The Princess Elisabeth Station.

  19. Immigration and the modern welfare state: the case of USA and Germany.

    PubMed

    Wenzel, U; Bos, M

    1997-10-01

    "This article presents a comparison of the inclusion of migrants into welfare programmes in the USA and in Germany. In the first part of the article a brief overview is provided of immigration categories in both countries in order to demonstrate the relevance of these administrative regulations for the opportunities of individual migrants to participate in the welfare system. In the second part we elaborate in more detail on how welfare programmes have developed as basic mechanisms to include or exclude migrants. Our findings illustrate an increasing differentiation of membership statuses parallel to the expansion of modern welfare systems. In both the USA and Germany, the territorial principle and participation in the labour market are of prime importance to the access to social rights. In both cases all migrants may profit from contributory programmes." excerpt

  20. GrigoraSNPs: Optimized Analysis of SNPs for DNA Forensics.

    PubMed

    Ricke, Darrell O; Shcherbina, Anna; Michaleas, Adam; Fremont-Smith, Philip

    2018-04-16

    High-throughput sequencing (HTS) of single nucleotide polymorphisms (SNPs) enables additional DNA forensic capabilities not attainable using traditional STR panels. However, the inclusion of sets of loci selected for mixture analysis, extended kinship, phenotype, biogeographic ancestry prediction, etc., can result in large panel sizes that are difficult to analyze in a rapid fashion. GrigoraSNPs was developed to address the allele-calling bottleneck that was encountered when analyzing SNP panels with more than 5000 loci using HTS. GrigoraSNPs uses MapReduce parallel data processing on multiple computational threads plus a novel locus-identification hashing strategy leveraging target sequence tags. This tool optimizes the SNP calling module of the DNA analysis pipeline with runtimes that scale linearly with the number of HTS reads. Results are compared with SNP analysis pipelines implemented with SAMtools and GATK. GrigoraSNPs removes a computational bottleneck for processing forensic samples with large HTS SNP panels. Published 2018. This article is a U.S. Government work and is in the public domain in the USA.
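    The tag-hashing idea described here (assigning a read to a locus by a single dictionary lookup on its target sequence tag) can be sketched as follows; the tag length, SNP offset, and locus names are hypothetical illustrations, not GrigoraSNPs' actual parameters.

```python
def build_tag_index(loci_tags):
    # Map each fixed-length target sequence tag to its locus name,
    # so a read is assigned to a locus with one hash lookup
    # instead of an alignment.
    return {tag: locus for locus, tag in loci_tags.items()}

def call_allele(read, tag_index, tag_len=8, snp_offset=8):
    # Identify the locus from the read's leading tag, then report the
    # base at the (hypothetical) SNP position immediately after it.
    tag = read[:tag_len]
    locus = tag_index.get(tag)
    if locus is None:
        return None  # read does not match any targeted locus
    return locus, read[snp_offset]
```

    With tags {"rs001": "ACGTACGT"}, a read beginning "ACGTACGTC" resolves to ("rs001", "C") in constant time per read, which is consistent with the linear scaling in read count reported above.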

  21. OpenACC acceleration of an unstructured CFD solver based on a reconstructed discontinuous Galerkin method for compressible flows

    DOE PAGES

    Xia, Yidong; Lou, Jialin; Luo, Hong; ...

    2015-02-09

    Here, an OpenACC directive-based graphics processing unit (GPU) parallel scheme is presented for solving the compressible Navier–Stokes equations on 3D hybrid unstructured grids with a third-order reconstructed discontinuous Galerkin method. The developed scheme requires the minimum code intrusion and algorithm alteration for upgrading a legacy solver with the GPU computing capability at very little extra effort in programming, which leads to a unified and portable code development strategy. A face coloring algorithm is adopted to eliminate the memory contention because of the threading of internal and boundary face integrals. A number of flow problems are presented to verify the implementation of the developed scheme. Timing measurements were obtained by running the resulting GPU code on one Nvidia Tesla K20c GPU card (Nvidia Corporation, Santa Clara, CA, USA) and compared with those obtained by running the equivalent Message Passing Interface (MPI) parallel CPU code on a compute node (consisting of two AMD Opteron 6128 eight-core CPUs (Advanced Micro Devices, Inc., Sunnyvale, CA, USA)). Speedup factors of up to 24× and 1.6× for the GPU code were achieved with respect to one and 16 CPU cores, respectively. The numerical results indicate that this OpenACC-based parallel scheme is an effective and extensible approach to port unstructured high-order CFD solvers to GPU computing.

  22. Genomic characterization of malonate positive Cronobacter sakazakii serotype O:2, sequence type 64 strains, isolated from clinical, food, and environment samples.

    PubMed

    Gopinath, Gopal R; Chase, Hannah R; Gangiredla, Jayanthi; Eshwar, Athmanya; Jang, Hyein; Patel, Isha; Negrete, Flavia; Finkelstein, Samantha; Park, Eunbi; Chung, TaeJung; Yoo, YeonJoo; Woo, JungHa; Lee, YouYoung; Park, Jihyeon; Choi, Hyerim; Jeong, Seungeun; Jun, Soyoung; Kim, Mijeong; Lee, Chaeyoon; Jeong, HyeJin; Fanning, Séamus; Stephan, Roger; Iversen, Carol; Reich, Felix; Klein, Günter; Lehner, Angelika; Tall, Ben D

    2018-01-01

    Malonate utilization, an important differential trait, well recognized as being possessed by six of the seven Cronobacter species is thought to be largely absent in Cronobacter sakazakii (Csak). The current study provides experimental evidence that confirms the presence of a malonate utilization operon in 24 strains of sequence type (ST) 64, obtained from Europe, Middle East, China, and USA; it offers explanations regarding the genomic diversity and phylogenetic relatedness among these strains, and that of other C. sakazakii strains. In this study, the presence of a malonate utilization operon in these strains was initially identified by DNA microarray analysis (MA) out of a pool of 347 strains obtained from various surveillance studies involving clinical, spices, milk powder sources and powdered infant formula production facilities in Ireland and Germany, and dried dairy powder manufacturing facilities in the USA. All ST64 C. sakazakii strains tested could utilize malonate. Zebrafish embryo infection studies showed that C. sakazakii ST64 strains are as virulent as other Cronobacter species. Parallel whole genome sequencing (WGS) and MA showed that the strains phylogenetically grouped as a separate clade among the Csak species cluster. Additionally, these strains possessed the Csak O:2 serotype. The nine-gene, ~ 7.7 kbp malonate utilization operon was located in these strains between two conserved flanking genes, gyrB and katG. Plasmidotyping results showed that these strains possessed the virulence plasmid pESA3, but in contrast to the USA ST64 Csak strains, ST64 Csak strains isolated from sources in Europe and the Middle East, did not possess the type six secretion system effector vgrG gene. Until this investigation, the presence of malonate-positive Csak strains, which are associated with foods and clinical cases, was under appreciated. If this trait was used solely to identify Cronobacter strains, many strains would likely be misidentified. 
Parallel WGS and MA were useful in characterizing the total genome content of these Csak O:2, ST64, malonate-positive strains and further provides an understanding of their phylogenetic relatedness among other virulent C. sakazakii strains.

  23. Analysis, tuning and comparison of two general sparse solvers for distributed memory computers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amestoy, P.R.; Duff, I.S.; L'Excellent, J.-Y.

    2000-06-30

    We describe the work performed in the context of a Franco-Berkeley funded project between NERSC-LBNL located in Berkeley (USA) and CERFACS-ENSEEIHT located in Toulouse (France). We discuss both the tuning and performance analysis of two distributed memory sparse solvers (superlu from Berkeley and mumps from Toulouse) on the 512 processor Cray T3E from NERSC (Lawrence Berkeley National Laboratory). This project gave us the opportunity to improve the algorithms and add new features to the codes. We then quite extensively analyze and compare the two approaches on a set of large problems from real applications. We further explain the main differences in the behavior of the approaches on artificial regular grid problems. As a conclusion to this activity report, we mention a set of parallel sparse solvers on which this type of study should be extended.
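At a much smaller scale than the distributed solvers studied here, the same style of sparse direct solution (factorize once, then reuse the factors for solves) can be sketched with SciPy's sequential SuperLU interface. The 1-D Poisson matrix below is an illustrative stand-in, not one of the application problems from the report:

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import splu

# Sparse tridiagonal (1-D Poisson) system A x = b as a small test problem
n = 1000
A = diags([-1.0, 2.0, -1.0], offsets=[-1, 0, 1], shape=(n, n), format="csc")
b = np.ones(n)

lu = splu(A)        # sparse LU factorization (sequential SuperLU)
x = lu.solve(b)     # triangular solves reuse the factorization for any rhs

residual = np.linalg.norm(A @ x - b)  # should be near machine precision
```

Distributed solvers such as superlu_dist and mumps parallelize exactly these two phases (factorization and triangular solve) across processors, which is where the tuning effort in this report was spent.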

  4. Combining qualitative and quantitative research within mixed method research designs: a methodological review.

    PubMed

    Östlund, Ulrika; Kidd, Lisa; Wengström, Yvonne; Rowa-Dewar, Neneh

    2011-03-01

    It has been argued that mixed methods research can be useful in nursing and health science because of the complexity of the phenomena studied. However, how to integrate qualitative and quantitative approaches continues to be a topic of much debate, and there is a need for a rigorous framework for designing and interpreting mixed methods research. This paper explores the analytical approaches (i.e. parallel, concurrent or sequential) used in mixed methods studies within healthcare and exemplifies the use of triangulation as a methodological metaphor for drawing inferences from qualitative and quantitative findings originating from such analyses. This review of the literature used systematic principles in searching CINAHL, Medline and PsycINFO for healthcare research studies which employed a mixed methods approach and were published in the English language between January 1999 and September 2009. In total, 168 studies were included in the results. Most studies originated in the United States of America (USA), the United Kingdom (UK) and Canada. The analytic approach most widely used was parallel data analysis. A number of studies used sequential data analysis; far fewer studies employed concurrent data analysis. Very few of these studies clearly articulated the purpose for using a mixed methods design. The use of the methodological metaphor of triangulation on convergent, complementary, and divergent results from mixed methods studies is exemplified, and an example of developing theory from such data is provided. A trend for conducting parallel data analysis on quantitative and qualitative data in mixed methods healthcare research was identified in the studies included in this review. Using triangulation as a methodological metaphor can facilitate the integration of qualitative and quantitative findings and help researchers to clarify their theoretical propositions and the basis of their results.
This can offer a better understanding of the links between theory and empirical findings, challenge theoretical assumptions and develop new theory. Copyright © 2010 Elsevier Ltd. All rights reserved.

  5. Cultural adaptation in measuring common client characteristics with an urban Mainland Chinese sample.

    PubMed

    Song, Xiaoxia; Anderson, Timothy; Beutler, Larry E; Sun, Shijin; Wu, Guohong; Kimpara, Satoko

    2015-01-01

    This study aimed to develop a culturally adapted version of the Systematic Treatment Selection-Innerlife (STS) in China. A total of 300 nonclinical participants from Mainland China and 240 nonclinical US participants were drawn from archival data. A Chinese version of the STS was developed using translation and back-translation procedures. After confirmatory factor analysis (CFA) of the original STS subscales failed on both samples, exploratory factor analysis (EFA) was used to assess whether a simple structure would emerge from the STS treatment items. Parallel analysis and the minimum average partial procedure were used to determine the number of factors to retain. Three cross-cultural factors were found in this study: Internalized Distress, Externalized Distress, and Interpersonal Relations. This supports the view that, across the presumably different cultural contexts of the USA and China, psychological distress is expressed through a few basic channels of internalized distress, externalized distress, and interpersonal relations; culture-specific manifestations of these channels are also discussed.
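The factor-retention step mentioned above, parallel analysis (Horn's method), retains factors whose observed eigenvalues exceed those of random data of the same dimensions. A minimal NumPy sketch, with the simulation count and percentile as illustrative settings (this is not the STS analysis itself):

```python
import numpy as np

def parallel_analysis(data, n_sims=100, percentile=95, seed=0):
    """Horn's parallel analysis: count factors whose correlation-matrix
    eigenvalues exceed the chosen percentile of eigenvalues obtained from
    random normal data of the same shape."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    obs = np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]
    sims = np.empty((n_sims, p))
    for s in range(n_sims):
        r = rng.standard_normal((n, p))
        sims[s] = np.sort(np.linalg.eigvalsh(np.corrcoef(r, rowvar=False)))[::-1]
    thresh = np.percentile(sims, percentile, axis=0)
    return int((obs > thresh).sum())
```

Applied to data with two strong latent factors, this typically returns 2; the minimum average partial (MAP) criterion used alongside it in the study is not shown here.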

  6. Coccidioidomycosis among Scholarship Athletes and Other College Students, Arizona, USA

    PubMed Central

    Stern, Nicole G.

    2010-01-01

    To compare coccidioidomycosis case rates among groups of young adults in a disease-endemic region, we reviewed medical charts for serologic testing and coding. Case rates were higher for scholarship athletes than for other students and paralleled 5× more serologic testing. Our findings underscore the need to routinely test patients for coccidioidomycosis. PMID:20113571

  7. Implementing Cycles of Assess, Plan, Do, Review: A Literature Review of Practitioner Perspectives

    ERIC Educational Resources Information Center

    Greenwood, Jo; Kelly, Catherine

    2017-01-01

    This article uses a literature review process to explore current literature on Response to Intervention (RtI), an approach to the identification of and provision for students with special educational needs introduced in the USA by the Individuals with Disabilities Education Improvement Act of 2004. Parallels are made between RtI and the graduated…

  8. An Anthropologist's Reflections on Defining Quality in Education Research

    ERIC Educational Resources Information Center

    Tobin, Joseph

    2007-01-01

    In the USA there is a contemporary discourse of crisis about the state of education and a parallel discourse that lays a large portion of the blame on the poor quality of educational research. The solution offered is "scientific research." This article presents critiques of the core assumptions of the scientific research argument.…

  9. Graphics processing unit based computation for NDE applications

    NASA Astrophysics Data System (ADS)

    Nahas, C. A.; Rajagopal, Prabhu; Balasubramaniam, Krishnan; Krishnamurthy, C. V.

    2012-05-01

    Advances in parallel processing in recent years are helping to reduce the cost of numerical simulation. Breakthroughs in graphics processing unit (GPU) based computation now offer the prospect of further drastic improvements. The introduction of the 'compute unified device architecture' (CUDA) by NVIDIA (the global technology company based in Santa Clara, California, USA) has made programming GPUs for general purpose computing accessible to the average programmer. Here we use CUDA to develop parallel finite difference schemes applicable to two problems of interest to the NDE community, namely heat diffusion and elastic wave propagation. The implementations are two-dimensional. The performance improvement of the GPU implementation over a serial CPU implementation is then discussed.
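As a serial reference point for the first of these problems, the explicit (FTCS) finite-difference update for 2-D heat diffusion can be written in a few lines of NumPy; a CUDA kernel would apply the same five-point stencil with one thread per grid point. The grid size, diffusivity, and step sizes below are illustrative, not taken from the paper:

```python
import numpy as np

def heat_step(u, alpha, dx, dt):
    """One explicit finite-difference (FTCS) step of the 2-D heat equation
    with fixed boundary values; stable when alpha*dt/dx**2 <= 0.25."""
    un = u.copy()
    lap = (u[2:, 1:-1] + u[:-2, 1:-1] + u[1:-1, 2:] + u[1:-1, :-2]
           - 4.0 * u[1:-1, 1:-1]) / dx**2
    un[1:-1, 1:-1] += alpha * dt * lap
    return un

# Hot square in the centre of a cold plate, diffusing over 200 steps
u = np.zeros((64, 64))
u[28:36, 28:36] = 1.0
for _ in range(200):
    u = heat_step(u, alpha=1.0, dx=1.0, dt=0.25)
```

On a GPU, the interior update is embarrassingly parallel: every point depends only on its four neighbours from the previous time level, which is what makes this scheme a natural CUDA benchmark.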

  10. Tidally generated sea-floor lineations in Bristol Bay, Alaska, USA

    USGS Publications Warehouse

    Marlow, M. S.; Stevenson, A.J.; Chezar, H.; McConnaughey, R.A.

    1999-01-01

    Highly reflective linear features occur in water depths of 20-30 m in northern Bristol Bay (Alaska, USA) and are, in places, over 600 m in length. Their length-to-width ratio is over 100:1. The lineations are usually characterized by large transverse ripples with wavelengths of 1-2 m. The lineations trend about N60°E and are spaced between 20 and 350 m apart. The main tidal directions near the lineations are N60°E (flood) and S45°W (ebb), which are parallel to subparallel to the lineations, suggesting that the lineations may be tidally generated. The lineations may be bright sonar reflections from a winnowed lag concentrate of coarse sand.

  11. PREFACE: International Conference on Computing in High Energy and Nuclear Physics (CHEP 2010)

    NASA Astrophysics Data System (ADS)

    Lin, Simon C.; Shen, Stella; Neufeld, Niko; Gutsche, Oliver; Cattaneo, Marco; Fisk, Ian; Panzer-Steindel, Bernd; Di Meglio, Alberto; Lokajicek, Milos

    2011-12-01

    The International Conference on Computing in High Energy and Nuclear Physics (CHEP) was held at Academia Sinica in Taipei from 18-22 October 2010. CHEP is a major series of international conferences for physicists and computing professionals from the worldwide High Energy and Nuclear Physics community, Computer Science, and Information Technology. The CHEP conference provides an international forum to exchange information on computing progress and needs for the community, and to review recent, ongoing and future activities. CHEP conferences are held at roughly 18 month intervals, alternating between Europe, Asia, America and other parts of the world. Recent CHEP conferences have been held in Prague, Czech Republic (2009); Victoria, Canada (2007); Mumbai, India (2006); Interlaken, Switzerland (2004); San Diego, California (2003); Beijing, China (2001); and Padova, Italy (2000). CHEP 2010 was organized by the Academia Sinica Grid Computing Centre. There was an International Advisory Committee (IAC) setting the overall themes of the conference, a Programme Committee (PC) responsible for the content, and a Conference Secretariat responsible for the conference infrastructure. There were over 500 attendees, with a program that included plenary sessions of invited speakers, a number of parallel sessions comprising around 260 oral and 200 poster presentations, and industrial exhibitions. We thank all the presenters for the excellent scientific content of their contributions to the conference. Conference tracks covered topics on Online Computing; Event Processing; Software Engineering, Data Stores, and Databases; Distributed Processing and Analysis; Computing Fabrics and Networking Technologies; Grid and Cloud Middleware; and Collaborative Tools.
The conference included excursions to various attractions in Northern Taiwan, including Sanhsia Tsu Shih Temple, Yingko, Chiufen Village, the Northeast Coast National Scenic Area, Keelung, Yehliu Geopark, and Wulai Aboriginal Village, as well as two banquets held at the Grand Hotel and Grand Formosa Regent in Taipei. The next CHEP conference will be held in New York, the United States on 21-25 May 2012. We would like to thank the National Science Council of Taiwan, the EU ACEOLE project, commercial sponsors, and the International Advisory Committee and the Programme Committee members for all their support and help. Special thanks to the Programme Committee members for their careful choice of conference contributions and enormous effort in reviewing and editing about 340 post conference proceedings papers. Simon C Lin CHEP 2010 Conference Chair and Proceedings Editor Taipei, Taiwan November 2011 Track Editors/ Programme Committee Chair Simon C Lin, Academia Sinica, Taiwan Online Computing Track Y H Chang, National Central University, Taiwan Harry Cheung, Fermilab, USA Niko Neufeld, CERN, Switzerland Event Processing Track Fabio Cossutti, INFN Trieste, Italy Oliver Gutsche, Fermilab, USA Ryosuke Itoh, KEK, Japan Software Engineering, Data Stores, and Databases Track Marco Cattaneo, CERN, Switzerland Gang Chen, Chinese Academy of Sciences, China Stefan Roiser, CERN, Switzerland Distributed Processing and Analysis Track Kai-Feng Chen, National Taiwan University, Taiwan Ulrik Egede, Imperial College London, UK Ian Fisk, Fermilab, USA Fons Rademakers, CERN, Switzerland Torre Wenaus, BNL, USA Computing Fabrics and Networking Technologies Track Harvey Newman, Caltech, USA Bernd Panzer-Steindel, CERN, Switzerland Antonio Wong, BNL, USA Ian Fisk, Fermilab, USA Niko Neufeld, CERN, Switzerland Grid and Cloud Middleware Track Alberto Di Meglio, CERN, Switzerland Markus Schulz, CERN, Switzerland Collaborative Tools Track Joao Correia Fernandes, CERN, Switzerland Philippe Galvez, 
Caltech, USA Milos Lokajicek, FZU Prague, Czech Republic International Advisory Committee Chair: Simon C. Lin , Academia Sinica, Taiwan Members: Mohammad Al-Turany , FAIR, Germany Sunanda Banerjee, Fermilab, USA Dario Barberis, CERN & Genoa University/INFN, Switzerland Lothar Bauerdick, Fermilab, USA Ian Bird, CERN, Switzerland Amber Boehnlein, US Department of Energy, USA Kors Bos, CERN, Switzerland Federico Carminati, CERN, Switzerland Philippe Charpentier, CERN, Switzerland Gang Chen, Institute of High Energy Physics, China Peter Clarke, University of Edinburgh, UK Michael Ernst, Brookhaven National Laboratory, USA David Foster, CERN, Switzerland Merino Gonzalo, CIEMAT, Spain John Gordon, STFC-RAL, UK Volker Guelzow, Deutsches Elektronen-Synchrotron DESY, Hamburg, Germany John Harvey, CERN, Switzerland Frederic Hemmer, CERN, Switzerland Hafeez Hoorani, NCP, Pakistan Viatcheslav Ilyin, Moscow State University, Russia Matthias Kasemann, DESY, Germany Nobuhiko Katayama, KEK, Japan Milos Lokajícek, FZU Prague, Czech Republic David Malon, ANL, USA Pere Mato Vila, CERN, Switzerland Mirco Mazzucato, INFN CNAF, Italy Richard Mount, SLAC, USA Harvey Newman, Caltech, USA Mitsuaki Nozaki, KEK, Japan Farid Ould-Saada, University of Oslo, Norway Ruth Pordes, Fermilab, USA Hiroshi Sakamoto, The University of Tokyo, Japan Alberto Santoro, UERJ, Brazil Jim Shank, Boston University, USA Alan Silverman, CERN, Switzerland Randy Sobie , University of Victoria, Canada Dongchul Son, Kyungpook National University, South Korea Reda Tafirout , TRIUMF, Canada Victoria White, Fermilab, USA Guy Wormser, LAL, France Frank Wuerthwein, UCSD, USA Charles Young, SLAC, USA

  12. Time-series analysis to study the impact of an intersection on dispersion along a street canyon.

    PubMed

    Richmond-Bryant, Jennifer; Eisner, Alfred D; Hahn, Intaek; Fortune, Christopher R; Drake-Richman, Zora E; Brixey, Laurie A; Talih, M; Wiener, Russell W; Ellenson, William D

    2009-12-01

    This paper presents data analysis from the Brooklyn Traffic Real-Time Ambient Pollutant Penetration and Environmental Dispersion (B-TRAPPED) study to assess the transport of ultrafine particulate matter (PM) across urban intersections. Experiments were performed in a street canyon perpendicular to a highway in Brooklyn, NY, USA. Real-time ultrafine PM samplers were positioned on either side of an intersection at multiple locations along a street to collect time-series number concentration data. Meteorology equipment was positioned within the street canyon and at an upstream background site to measure wind speed and direction. Time-series analysis was performed on the PM data to compute a transport velocity along the direction of the street for the cases where background winds were parallel and perpendicular to the street. The data were analyzed for sampler pairs located (1) on opposite sides of the intersection and (2) on the same block. The time-series analysis demonstrated along-street transport, including across the intersection, when background winds were parallel to the street canyon, and minimal transport with no communication across the intersection when background winds were perpendicular to the street canyon. Low but significant values of the cross-correlation function (CCF) underscore the turbulent nature of plume transport along the street canyon. The low correlations suggest that flow switching around corners or traffic-induced turbulence at the intersection may have aided dilution of the PM plume from the highway. This observation supports similar findings in the literature. Furthermore, the time-series analysis methodology applied in this study is introduced as a technique for studying spatiotemporal variation in the urban microscale environment.
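The core of such a time-series method, finding the lag at which the cross-correlation function between an upwind and a downwind sampler peaks, and converting that lag and the sampler separation into a transport velocity, can be sketched as follows (the pulse data are synthetic, not B-TRAPPED measurements):

```python
import numpy as np

def peak_ccf_lag(x, y):
    """Lag (in samples) at which the cross-correlation of y against x peaks.
    A positive lag means the feature seen at x arrives at y that many samples later."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    ccf = np.correlate(y, x, mode="full") / len(x)
    lags = np.arange(-len(x) + 1, len(x))
    return lags[np.argmax(ccf)]

# Synthetic plume pulse passing sampler x, then sampler y 15 samples later
t = np.arange(400)
x = np.exp(-0.5 * ((t - 100) / 10.0) ** 2)
y = np.exp(-0.5 * ((t - 115) / 10.0) ** 2)
lag = peak_ccf_lag(x, y)
# transport velocity ~ sampler_separation_m / (lag * sample_interval_s)
```

A low but well-defined CCF peak, as reported in the study, gives a usable lag estimate even when turbulence keeps the overall correlation small.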

  13. COMMITTEES: Proceedings of the 13th Gravitational Waves Data Analysis Workshop (GWDAW13), San Juan, Puerto Rico, 19-22 January 2009

    NASA Astrophysics Data System (ADS)

    2009-10-01

    Science Organising Committee (SOC) Bruce Allen, AEI, Germany Patrick Brady, University of Wisconsin Milwaukee, USA Deepto Chakrabarty, MIT, USA Eugenio Coccia, INFN, Gran Sasso, Italy James Cordes, Cornell University, USA Mario Díaz (Chair), University of Texas Brownsville, USA Sam Finn, Penn State, USA Neil Gehrels, NASA GSFC, USA Fredrick A Jenet, University of Texas Brownsville, USA Nobuyuki Kanda, Osaka City University, Japan Erik Katsavounides, MIT, USA Dick Manchester, ATNF, Australia Soumya Mohanty, University of Texas Brownsville, USA Benoit Mours, LAPP-Annecy, France Maria Alessandra Papa, AEI, Germany Kate Scholberg, Duke University, USA Susan Scott, The Australian National University Alberto Vecchio, University of Birmingham, UK Andrea Vicere, INFN - Sezione di Firenze, Italy Stan Whitcomb, LIGO CALTECH, USA Local Organising Committee (LOC) Paulo Freire (Arecibo Observatory, Puerto Rico) Murray Lewis (Arecibo Observatory, Puerto Rico) Wanda Wiley (University of Texas Brownsville, USA)

  14. National Bone Health Alliance Bone Turnover Marker Project: current practices and the need for US harmonization, standardization, and common reference ranges.

    PubMed

    Bauer, D; Krege, J; Lane, N; Leary, E; Libanati, C; Miller, P; Myers, G; Silverman, S; Vesper, H W; Lee, D; Payette, M; Randall, S

    2012-10-01

    This position paper reviews how the National Bone Health Alliance (NBHA) will execute a project to help assure health professionals of the clinical utility of bone turnover markers; the current clinical approaches concerning osteoporosis and the status and use of bone turnover markers in the USA; the rationale for focusing this effort around two specific bone turnover markers; the need to standardize bone marker sample collection procedures, reference ranges, and bone turnover marker assays in clinical laboratories; and the importance of harmonization for future research of bone turnover markers. Osteoporosis is a major global health problem, with an estimated 44 million Americans at risk. The potential of bone markers as an additional tool for health care professionals to improve patient outcomes and impact morbidity and mortality is crucial in providing better health care and addressing rising health care costs. This need to advance the field of bone turnover markers has been recognized by a number of organizations, including the International Osteoporosis Foundation (IOF), the National Osteoporosis Foundation, the International Federation of Clinical Chemistry and Laboratory Medicine (IFCC), and the NBHA. This position paper elucidates how this project will standardize bone turnover marker sample collection procedures in the USA, establish USA reference ranges for one bone formation marker (serum procollagen type I N propeptide, s-PINP) and one bone resorption marker (serum C-terminal telopeptide of type I collagen, s-CTX), and standardize bone turnover marker assays used in clinical laboratories. This effort will allow clinicians in the USA to have confidence in their use of bone turnover markers to help monitor osteoporosis treatment and assess future fracture risk. This project builds on the recommendations of the IOF/IFCC Bone Marker Standards Working Group by developing USA reference standards for s-PINP and s-CTX, the markers identified as most promising for use as reference markers. The goals of this project will be realized through the NBHA and will include its governmental, academic, for-profit, and non-profit sector stakeholders as well as major academic and commercial laboratories. Upon completion, a parallel effort will be pursued to make bone turnover marker measurements reliable and accepted by all health care professionals for facilitating treatment decisions, and ultimately reimbursed by all health insurance payers. Successful completion of this project will help assure health professionals in the USA of the clinical utility of bone turnover markers and ties in with the parallel IOF/IFCC effort to develop worldwide bone turnover marker reference ranges.

  15. Microbial community stratification linked to utilization of carbohydrates and phosphorus limitation in a boreal peatland at Marcell Experimental Forest, Minnesota, USA.

    PubMed

    Lin, Xueju; Tfaily, Malak M; Steinweg, J Megan; Chanton, Patrick; Esson, Kaitlin; Yang, Zamin K; Chanton, Jeffrey P; Cooper, William; Schadt, Christopher W; Kostka, Joel E

    2014-06-01

    This study investigated the abundance, distribution, and composition of microbial communities at the watershed scale in a boreal peatland within the Marcell Experimental Forest (MEF), Minnesota, USA. Through a close coupling of next-generation sequencing, biogeochemistry, and advanced analytical chemistry, a biogeochemical hot spot was revealed in the mesotelm (30- to 50-cm depth) as a pronounced shift in microbial community composition in parallel with elevated peat decomposition. The relative abundance of Acidobacteria and the Syntrophobacteraceae, including known hydrocarbon-utilizing genera, was positively correlated with carbohydrate and organic acid content, showing a maximum in the mesotelm. The abundance of Archaea (primarily crenarchaeal groups 1.1c and 1.3) increased with depth, reaching up to 60% of total small-subunit (SSU) rRNA gene sequences in the deep peat below the 75-cm depth. Stable isotope geochemistry and potential rates of methane production paralleled vertical changes in methanogen community composition to indicate a predominance of acetoclastic methanogenesis mediated by the Methanosarcinales in the mesotelm, while hydrogen-utilizing methanogens predominated in the deeper catotelm. RNA-derived pyrosequence libraries corroborated DNA sequence data to indicate that the above-mentioned microbial groups are metabolically active in the mid-depth zone. Fungi showed a maximum in rRNA gene abundance above the 30-cm depth, which comprised only an average of 0.1% of total bacterial and archaeal rRNA gene abundance, indicating prokaryotic dominance. Ratios of C to P enzyme activities approached 0.5 at the acrotelm and catotelm, indicating phosphorus limitation. In contrast, P limitation pressure appeared to be relieved in the mesotelm, likely due to P solubilization by microbial production of organic acids and C-P lyases. 
Based on path analysis and the modeling of community spatial turnover, we hypothesize that P limitation outweighs N limitation at MEF, and microbial communities are structured by the dominant shrub, Chamaedaphne calyculata, which may act as a carbon source for major consumers in the peatland.

  16. Performance Analysis of Multilevel Parallel Applications on Shared Memory Architectures

    NASA Technical Reports Server (NTRS)

    Biegel, Bryan A. (Technical Monitor); Jost, G.; Jin, H.; Labarta J.; Gimenez, J.; Caubet, J.

    2003-01-01

    Parallel programming paradigms include process-level parallelism, thread-level parallelism, and multilevel parallelism. This viewgraph presentation describes a detailed performance analysis of these paradigms on shared memory architectures (SMA). The analysis uses the Paraver Performance Analysis System. The presentation includes diagrams of the flow of useful computations.

  17. Spatial data analytics on heterogeneous multi- and many-core parallel architectures using python

    USGS Publications Warehouse

    Laura, Jason R.; Rey, Sergio J.

    2017-01-01

    Parallel vector spatial analysis concerns the application of parallel computational methods to facilitate vector-based spatial analysis. The history of parallel computation in spatial analysis is reviewed, and this work is placed into the broader context of high-performance computing (HPC) and parallelization research. The rise of cyberinfrastructure and its manifestation in spatial analysis as CyberGIScience is seen as a main driver of renewed interest in parallel computation in the spatial sciences. Key problems in spatial analysis that have been the focus of parallel computing are covered. Chief among these are spatial optimization problems, computational geometric problems including polygonization and spatial contiguity detection, the use of Markov chain Monte Carlo simulation in spatial statistics, and parallel implementations of spatial econometric methods. Future directions for research on parallelization in computational spatial analysis are outlined.
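A toy version of the pattern this work surveys, partitioning a spatial computation across workers and concatenating the partial results, can be sketched with Python's standard library; thread workers and a brute-force nearest-neighbour search stand in here for the HPC frameworks and spatial indices the review discusses:

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def nn_dist_chunk(args):
    """Nearest-neighbour distance for each point in `chunk` against all `points`."""
    chunk, points = args
    d = np.linalg.norm(chunk[:, None, :] - points[None, :, :], axis=2)
    d[d == 0.0] = np.inf          # ignore self-distances
    return d.min(axis=1)

rng = np.random.default_rng(0)
points = rng.random((2000, 2))    # illustrative random point pattern

chunks = np.array_split(points, 8)
with ThreadPoolExecutor(max_workers=4) as pool:
    parts = list(pool.map(nn_dist_chunk, [(c, points) for c in chunks]))
nn = np.concatenate(parts)        # one nearest-neighbour distance per point
```

For CPU-bound work a process pool (or the distributed runtimes covered in the review) avoids the GIL; threads are used here only to keep the sketch dependency-free.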

  18. Ancient human mitochondrial DNA and radiocarbon analysis of archived quids from the Mule Spring Rockshelter, Nevada, USA.

    PubMed

    Hamilton-Brehm, Scott D; Hristova, Lidia T; Edwards, Susan R; Wedding, Jeffrey R; Snow, Meradeth; Kruger, Brittany R; Moser, Duane P

    2018-01-01

    Chewed and expectorated quids, indigestible stringy fibers from the roasted inner pulp of agave or yucca root, have proven resilient over long periods of time in dry cave environments and correspondingly, although little studied, are common in archaeological archives. In the late 1960s, thousands of quids were recovered from Mule Spring Rockshelter (Nevada, USA) deposits and stored without consideration to DNA preservation in a museum collection, remaining unstudied for over fifty years. To assess the utility of these materials as repositories for genetic information about past inhabitants of the region and their movements, twenty-one quids were selected from arbitrary excavation depths for detailed analysis. Human mitochondrial DNA sequences from the quids were amplified by PCR and screened for diagnostic single nucleotide polymorphisms. Most detected single nucleotide polymorphisms were consistent with recognized Native American haplogroup subclades B2a5, B2i1, C1, C1c, C1c2, and D1; with the majority of the sample set consistent with subclades C1, C1c, and C1c2. In parallel with the DNA analysis, each quid was radiocarbon dated, revealing a time-resolved pattern of occupancy from 347 to 977 calibrated years before present. In particular, this dataset reveals strong evidence for the presence of haplogroup C1/C1c at the Southwestern edge of the US Great Basin from ~670 to 980 cal YBP, which may temporally correspond with the beginnings of the so-called Numic Spread into the region. The research described here demonstrates an approach which combines targeted DNA analysis with radiocarbon age dating; thus enabling the genetic analysis of archaeological materials of uncertain stratigraphic context. Here we present a survey of the maternal genetic profiles from people who used the Mule Spring Rockshelter and the historic timing of their utilization of a key natural resource.

  19. FY2015 Analysis of the Teamwork USA Program. Memorandum

    ERIC Educational Resources Information Center

    Howard, Mark

    2015-01-01

    The Department of Research and Evaluation (DRE) has completed an analysis of the performance of students who participated in the Teamwork USA Program, administered in FY2014 at three District schools. Teamwork USA hopes to improve student achievement at select Title I elementary schools via its Instrumental Music Program grant. This memorandum to…

  20. Chromophoric dissolved organic matter (CDOM) variability in Barataria Basin using excitation-emission matrix (EEM) fluorescence and parallel factor analysis (PARAFAC).

    PubMed

    Singh, Shatrughan; D'Sa, Eurico J; Swenson, Erick M

    2010-07-15

    Chromophoric dissolved organic matter (CDOM) variability in Barataria Basin, Louisiana, USA, was examined by excitation-emission matrix (EEM) fluorescence combined with parallel factor analysis (PARAFAC). CDOM absorption and fluorescence at 355 nm along an axial transect (36 stations) during March, April, and May 2008 showed an increasing trend from the marine end member to the upper basin, with mean CDOM absorption of 11.06 ± 5.01, 10.05 ± 4.23, and 11.67 ± 6.03 m(-1) and fluorescence of 0.80 ± 0.37, 0.78 ± 0.39, and 0.75 ± 0.51 RU, respectively. PARAFAC analysis identified two terrestrial humic-like components (components 1 and 2), one non-humic-like component (component 3), and one soil-derived humic acid-like component (component 4). The spatial variation of the components showed an increasing trend from station 1 (near the mouth of the basin) to station 36 (end member of the bay; upper basin). Deviations from this increasing trend were observed at a bayou channel with very high chlorophyll-a concentrations, especially for component 3 in May 2008, which suggested autochthonous production of CDOM. The variability of the components with salinity indicated conservative mixing along the middle part of the transect. Components 1 and 4 were found to be relatively constant, while components 2 and 3 revealed an inverse relationship over the sampling period. Total organic carbon showed an increasing trend for each of the components. An increase in humification and a decrease in fluorescence indices along the transect indicated an increase in terrestrially derived organic matter and reduced microbial activity from the lower to the upper basin. The use of these indices along with PARAFAC results improved dissolved organic matter characterization in the Barataria Basin. Copyright 2010 Elsevier B.V. All rights reserved.
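PARAFAC decomposes the sample × excitation × emission EEM cube into trilinear components. A minimal unconstrained alternating-least-squares sketch in NumPy (dimensions and rank are illustrative; EEM practice adds non-negativity constraints and rank diagnostics such as core consistency, which are not shown):

```python
import numpy as np

def khatri_rao(U, V):
    """Column-wise Kronecker product of U (I x R) and V (J x R) -> (I*J, R)."""
    I, R = U.shape
    J, _ = V.shape
    return (U[:, None, :] * V[None, :, :]).reshape(I * J, R)

def cp_als(X, rank, n_iter=200, seed=0):
    """Minimal PARAFAC (CP) decomposition of a 3-way tensor X (I x J x K)
    via alternating least squares; returns factor matrices A, B, C."""
    rng = np.random.default_rng(seed)
    I, J, K = X.shape
    A = rng.standard_normal((I, rank))
    B = rng.standard_normal((J, rank))
    C = rng.standard_normal((K, rank))
    X1 = X.reshape(I, J * K)                      # mode-1 unfolding
    X2 = np.moveaxis(X, 1, 0).reshape(J, I * K)   # mode-2 unfolding
    X3 = np.moveaxis(X, 2, 0).reshape(K, I * J)   # mode-3 unfolding
    for _ in range(n_iter):
        A = X1 @ khatri_rao(B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        B = X2 @ khatri_rao(A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = X3 @ khatri_rao(A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    return A, B, C
```

Fitting a noiseless synthetic low-rank tensor recovers it closely; with real EEM data the number of components is chosen by split-half validation rather than assumed.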

  1. Consistency between hydrological models and field observations: Linking processes at the hillslope scale to hydrological responses at the watershed scale

    USGS Publications Warehouse

    Clark, M.P.; Rupp, D.E.; Woods, R.A.; Tromp-van, Meerveld; Peters, N.E.; Freer, J.E.

    2009-01-01

    The purpose of this paper is to identify simple connections between observations of hydrological processes at the hillslope scale and observations of the response of watersheds following rainfall, with a view to building a parsimonious model of catchment processes. The focus is on the well-studied Panola Mountain Research Watershed (PMRW), Georgia, USA. Recession analysis of discharge Q shows that while the relationship between dQ/dt and Q is approximately consistent with a linear reservoir for the hillslope, there is a deviation from linearity that becomes progressively larger with increasing spatial scale. To account for these scale differences conceptual models of streamflow recession are defined at both the hillslope scale and the watershed scale, and an assessment made as to whether models at the hillslope scale can be aggregated to be consistent with models at the watershed scale. Results from this study show that a model with parallel linear reservoirs provides the most plausible explanation (of those tested) for both the linear hillslope response to rainfall and non-linear recession behaviour observed at the watershed outlet. In this model each linear reservoir is associated with a landscape type. The parallel reservoir model is consistent with both geochemical analyses of hydrological flow paths and water balance estimates of bedrock recharge. Overall, this study demonstrates that standard approaches of using recession analysis to identify the functional form of storage-discharge relationships identify model structures that are inconsistent with field evidence, and that recession analysis at multiple spatial scales can provide useful insights into catchment behaviour. Copyright © 2008 John Wiley & Sons, Ltd.
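The distinction the study draws, a single linear reservoir versus parallel linear reservoirs, shows up directly in the recession curve: one reservoir decays at a constant fractional rate, while a mixture slows down as the fast store empties. A minimal sketch (the reservoir constants and storages are illustrative, not PMRW calibrations):

```python
import numpy as np

def recession(storages, rate_constants, dt=1.0, n_steps=60):
    """Total discharge from parallel linear reservoirs (Q_i = k_i * S_i),
    stepped forward explicitly; requires k_i * dt < 1 for stability."""
    S = np.asarray(storages, dtype=float).copy()
    k = np.asarray(rate_constants, dtype=float)
    Q = np.empty(n_steps)
    for t in range(n_steps):
        q = k * S          # outflow from each reservoir
        Q[t] = q.sum()     # watershed discharge is the sum of the outflows
        S -= q * dt
    return Q

single = recession([1.0], [0.1])            # one linear reservoir
mixed = recession([1.0, 1.0], [0.5, 0.05])  # fast and slow stores in parallel
```

For the single reservoir Q[t+1]/Q[t] is constant (a linear dQ/dt versus Q relationship); for the mixture the ratio drifts upward as the slow reservoir takes over, the kind of scale-dependent deviation from linearity described above.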

  2. Hans Driesch and the problems of "normal psychology". Rereading his Crisis in Psychology (1925).

    PubMed

    Allesch, Christian G

    2012-06-01

    In 1925, the German biologist and philosopher Hans Driesch published a booklet entitled The Crisis in Psychology. It was originally published in English and was based on lectures given at various universities in China, Japan and the USA. The "crisis" in psychology of that time, in Driesch's opinion, lies in the necessity to decide about "the road which psychology is to follow in the future". This necessity refers to five "critical points", namely (1) to develop the theory of psychic elements to a theory of meaning by phenomenological analysis, (2) the overcoming of association theory, (3) to acknowledge that the unconscious is a fact and a "normal" aspect of mental life, (4) to reject "psychomechanical parallelism" or any other epiphenomenalistic solution of the mind-body problem, and (5) the extension of psychical research to new facts as described by parapsychology, for instance. Driesch saw close parallels between the development of modern psychology and that of biology, namely in a theoretical shift from "sum-concepts" like association and mechanics, to "totality-concepts" like soul and entelechy. The German translation of 1926 was entitled Grundprobleme der Psychologie (Fundamental Problems of Psychology) while "the crisis in psychology" forms just the subtitle of this book. This underlines that Driesch's argumentation--in contrast to that of Buehler--dealt with ontological questions rather than with paradigms. Copyright © 2011 Elsevier Ltd. All rights reserved.

  3. Magnetic fabric constraints of the emplacement of igneous intrusions

    NASA Astrophysics Data System (ADS)

    Maes, Stephanie M.

    Fabric analysis is critical to evaluating the history, kinematics, and dynamics of geological deformation. This is particularly true of igneous intrusions, where the development of fabric is used to constrain magmatic flow and emplacement mechanisms. Fabric analysis was applied to three mafic intrusions, with different tectonic and petrogenetic histories, to study emplacement and magma flow: the Insizwa sill (Mesozoic Karoo Large Igneous Province, South Africa), Sonju Lake intrusion (Proterozoic Midcontinent Rift, Minnesota, USA), and Palisades sill (Mesozoic rift basin, New Jersey, USA). Multiple fabric analysis techniques were used to define the fabric in each intrusive body. Using digital image analysis techniques on multiple thin sections, the three-dimensional shape-preferred orientation (SPO) of populations of mineral phases was calculated. Low-field anisotropy of magnetic susceptibility (AMS) measurements were used as a proxy for the mineral fabric of the ferromagnetic phases (e.g., magnetite). In addition, a new technique---high-field AMS---was used to isolate the paramagnetic component of the fabric (e.g., silicate fabric). Each fabric analysis technique was then compared to observable field fabrics as a framework for interpretation. In the Insizwa sill, magnetic properties were used to corroborate vertical petrologic zonation and distinguish sub-units within lithologically defined units. Abrupt variation in magnetic properties provides evidence supporting the formation of the Insizwa sill by separate magma intrusions. Low-field AMS fabrics in the Sonju Lake intrusion exhibit consistent SW-plunging lineations and SW-dipping foliations. These fabric orientations provide evidence that the cumulate layers in the intrusion were deposited in a dynamic environment, and indicate magma flowed from southwest to northeast, parallel to the pre-existing rift structures. 
In the Palisades sill, the magnetite SPO and low-field AMS lineation have developed orthogonal to the plagioclase SPO and high-field AMS lineation. Magma flow in the Palisades magmatic system is interpreted to have originated from a point source feeder. Low-field AMS records the flow direction, whereas high-field AMS records extension within the igneous sheet. The multiple fabric analysis techniques presented in this dissertation have advanced our understanding of the development of fabric and its relationship to internal structure, emplacement, and magma dynamics in mafic igneous systems.

  4. Microbial Community Stratification Linked to Utilization of Carbohydrates and Phosphorus Limitation in a Boreal Peatland at Marcell Experimental Forest, Minnesota, USA

    PubMed Central

    Tfaily, Malak M.; Steinweg, J. Megan; Chanton, Patrick; Esson, Kaitlin; Yang, Zamin K.; Chanton, Jeffrey P.; Cooper, William; Schadt, Christopher W.

    2014-01-01

    This study investigated the abundance, distribution, and composition of microbial communities at the watershed scale in a boreal peatland within the Marcell Experimental Forest (MEF), Minnesota, USA. Through a close coupling of next-generation sequencing, biogeochemistry, and advanced analytical chemistry, a biogeochemical hot spot was revealed in the mesotelm (30- to 50-cm depth) as a pronounced shift in microbial community composition in parallel with elevated peat decomposition. The relative abundance of Acidobacteria and the Syntrophobacteraceae, including known hydrocarbon-utilizing genera, was positively correlated with carbohydrate and organic acid content, showing a maximum in the mesotelm. The abundance of Archaea (primarily crenarchaeal groups 1.1c and 1.3) increased with depth, reaching up to 60% of total small-subunit (SSU) rRNA gene sequences in the deep peat below the 75-cm depth. Stable isotope geochemistry and potential rates of methane production paralleled vertical changes in methanogen community composition to indicate a predominance of acetoclastic methanogenesis mediated by the Methanosarcinales in the mesotelm, while hydrogen-utilizing methanogens predominated in the deeper catotelm. RNA-derived pyrosequence libraries corroborated DNA sequence data to indicate that the above-mentioned microbial groups are metabolically active in the mid-depth zone. Fungi showed a maximum in rRNA gene abundance above the 30-cm depth, which comprised only an average of 0.1% of total bacterial and archaeal rRNA gene abundance, indicating prokaryotic dominance. Ratios of C to P enzyme activities approached 0.5 at the acrotelm and catotelm, indicating phosphorus limitation. In contrast, P limitation pressure appeared to be relieved in the mesotelm, likely due to P solubilization by microbial production of organic acids and C-P lyases. 
Based on path analysis and the modeling of community spatial turnover, we hypothesize that P limitation outweighs N limitation at MEF, and microbial communities are structured by the dominant shrub, Chamaedaphne calyculata, which may act as a carbon source for major consumers in the peatland. PMID:24682300

  5. A Proposed Solution to the Problem with Using Completely Random Data to Assess the Number of Factors with Parallel Analysis

    ERIC Educational Resources Information Center

    Green, Samuel B.; Levy, Roy; Thompson, Marilyn S.; Lu, Min; Lo, Wen-Juo

    2012-01-01

    A number of psychometricians have argued for the use of parallel analysis to determine the number of factors. However, parallel analysis must be viewed at best as a heuristic approach rather than a mathematically rigorous one. The authors suggest a revision to parallel analysis that could improve its accuracy. A Monte Carlo study is conducted to…
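
    Horn's parallel analysis, the baseline that such revisions build on, retains a factor only if its observed eigenvalue exceeds the corresponding eigenvalue obtained from completely random data of the same dimensions. A minimal NumPy sketch of the classical procedure (using the mean random eigenvalue as the threshold; the revision proposed in the article is not implemented here):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated sample: two blocks of five variables, each loading on its own
# factor, so about two factors should be retained (illustrative data only).
n, p = 500, 10
latent = rng.standard_normal((n, 2))
loadings = np.zeros((2, p))
loadings[0, :5] = 0.8
loadings[1, 5:] = 0.8
X = latent @ loadings + 0.5 * rng.standard_normal((n, p))

obs_eig = np.sort(np.linalg.eigvalsh(np.corrcoef(X, rowvar=False)))[::-1]

# Parallel analysis: average eigenvalues of correlation matrices computed
# from completely random data with the same n and p.
reps = 200
rand_eig = np.mean(
    [np.sort(np.linalg.eigvalsh(
        np.corrcoef(rng.standard_normal((n, p)), rowvar=False)))[::-1]
     for _ in range(reps)],
    axis=0)

# Number of observed eigenvalues exceeding their random counterparts
n_factors = int(np.sum(obs_eig > rand_eig))
print(n_factors)
```

    The critique summarized above applies to the random-data threshold itself: because the comparison data are completely random, the procedure is a heuristic, and variants (e.g., the 95th percentile criterion, or the article's proposed revision) change only how `rand_eig` is computed.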

  6. All roads lead to weediness: Patterns of genomic divergence reveal extensive recurrent weedy rice origins from South Asian Oryza.

    PubMed

    Huang, Zhongyun; Young, Nelson D; Reagon, Michael; Hyma, Katie E; Olsen, Kenneth M; Jia, Yulin; Caicedo, Ana L

    2017-06-01

    Weedy rice (Oryza spp.), a weedy relative of cultivated rice (O. sativa), infests and persists in cultivated rice fields worldwide. Many weedy rice populations have evolved similar adaptive traits, considered part of the 'agricultural weed syndrome', making this an ideal model to study the genetic basis of parallel evolution. Understanding parallel evolution hinges on accurate knowledge of the genetic background and origins of existing weedy rice groups. Using population structure analyses of South Asian and US weedy rice, we show that weeds in South Asia have highly heterogeneous genetic backgrounds, with ancestry contributions both from cultivated varieties (aus and indica) and wild rice. Moreover, the two main groups of weedy rice in the USA, which are also related to aus and indica cultivars, constitute a separate origin from that of Asian weeds. Weedy rice populations in South Asia largely converge on presence of red pericarps and awns and on ease of shattering. Genomewide divergence scans between weed groups from the USA and South Asia, and their crop relatives are enriched for loci involved in metabolic processes. Some candidate genes related to iconic weedy traits and competitiveness are highly divergent between some weed-crop pairs, but are not shared among all weed-crop comparisons. Our results show that weedy rice is an extreme example of recurrent evolution, and suggest that most populations are evolving their weedy traits through different genetic mechanisms. © 2017 John Wiley & Sons Ltd.

  7. PREFACE: The 15th International Conference on X-ray Absorption Fine Structure (XAFS15)

    NASA Astrophysics Data System (ADS)

    Wu, Z. Y.

    2013-04-01

    The 15th International Conference on X-ray Absorption Fine Structure (XAFS15) was held on 22-28 July 2012 in Beijing, P. R. China. About 340 scientists from 34 countries attended this important international event. Figure 1. Main hall of XAFS15. The rapidly increasing application of XAFS to the study of a large variety of materials and the operation of new synchrotron radiation sources led to the first meeting of XAFS users in 1981 in England; since then, a further 14 international conferences have been held. A breakdown of attendees according to their national origin makes it clear that participation is spreading to include attendees from more and more countries every year. Science and education in China are developing quickly thanks to large investment in scientific and technological research and infrastructure. There are three synchrotron radiation facilities in mainland China: the Hefei Light Source (HLS) at the National Synchrotron Radiation Laboratory (NSRL), the Beijing Synchrotron Radiation Facility (BSRF) at the Institute of High Energy Physics, and the Shanghai Synchrotron Radiation Facility (SSRF) at the Shanghai Institute of Applied Physics. More than 10000 users have run over 5000 proposals at these facilities, including many teams from the USA, Japan, Germany, Italy, Russia, and other countries. More than 3000 manuscripts have been published in SCI journals, including (incompletely) Science (7), Nature (10), Nature series (7), PNAS (3), JACS (12), Angew. Chem. Int. Ed. (15), Nano Lett. (2), etc. At XAFS15, the participants contributed 18 plenary invited talks, 16 parallel invited talks, 136 oral presentations, 12 special talks, and 219 poster presentations. Wide communication was promoted in the conference halls, the classical banquet restaurant, and at the Great Wall. Figure 2. Parallel hall. Figure 3. Communication. Figure 4. Poster room. 
This volume contains 136 invited and contributed papers, accepted after a rigorous peer-review procedure. A group of about 90 outstanding scientists in the field reviewed the manuscripts and suggested revisions to improve their scientific presentation. As a result, we believe the entire volume has reached a high standard. The 19 topics covered are: Theory; Data analysis; New technology and devices of XAFS; Applications in Nano science and technology; Applications in Life Science; Applications in Chemistry; Applications in Catalytic Science; Applications in Surface and Interface Science; Applications in Material Science; Applications in Energy and Environmental Science; Applications in Magnetic and Related Material Science; Applications in Nuclear Science; Applications in Disordered Systems; Applications in Extreme Conditions; Applications for Time-resolved experiments; XMCD technology and its applications; Advanced methods (e.g., new coherent sources and spectroscopic imaging techniques); XAFS combined with other experimental methods; and Other related studies. We hope this volume will be a useful reference for the ongoing scientific activity in XAFS. We would also like to express our sincere appreciation to the sponsors for their generous support: the Chinese Academy of Sciences, the National Natural Science Foundation of China, the China Center of Advanced Science and Technology World Laboratory, the University of Science and Technology of China, the National Synchrotron Radiation Laboratory, the Institute of High Energy Physics of the Chinese Academy of Sciences, and our commercial sponsors (AREVA, Xi'an Action Power Electric Co., Ltd). Finally, we would like to acknowledge the entire local organizing staff (names are given below) and particularly the collaborators and members of the XAS group at the National Synchrotron Radiation Laboratory and the Institute of High Energy Physics, Chinese Academy of Sciences, for their efforts to make the XAFS15 conference a success. 
Ziyu Wu Chair of the Conference and Proceedings Editor Hefei, P. R. China, 28 September 2012 Committees and Staff Chair of the Conference Ziyu Wu International Advisory Committee Adam Hitchcock, Canada Adriano Filipponi, Italy Alain Manceau, France Alexander Soldatov, Russia Andrea Di Cicco, Italy Britt Hedman, USA Bruce Bunker, USA Calogero R. Natoli, Italy Christopher T. Chantler, Australia Frank M. F. De Groot, Netherlands Hiroyuki Oyanagi, Japan Ingolf Lindau, USA J. Mustre de Leon, México James E Penner-Hahn, USA Joaquin Garcia Ruiz, Spain John Evans, UK John J. Rehr, USA Kiyotaka Asakura, Japan Majed Chergui, Switzerland Mark Newton, UK Shiqiang Wei, P. R. China Tsun-Kong Sham, Canada Ziyu Wu, P. R. China International Program Committee Antonio Bianconi, Italy Augusto Marcelli, Italy Emad Flear Aziz, Germany Jinghua Guo, USA Joly Yves, France Masaharu Nomura, Japan Maurizio Benfatto, Italy Pieter Glatzel, France Shiqiang Wei, China Tiandou Hu, China Toshihiko Yokoyama, Japan Way-Faung Pong, Taiwan Xinyi Zhang, China Yi Xie, China Yuying Huang, China Zhonghua Wu, China Ziyu Wu, China Local Organizing Committee Bo He Fengchun Hu Haifeng Zhao Jing Zhang Meijuan Yu Qin Yu Shuo Zhang Wangsheng Chu Wei He Wei Xu Wensheng Yan Xiaomei Gong Xing Chen Yang Zou Yi Xia Zheng Jiang Zhi Xie Zhihu Sun Zhiyun Pan Additional Staff Chengxun Liu

  8. Molecular insights into the progression of crystalline silica-induced pulmonary toxicity in rats.

    PubMed

    Sellamuthu, Rajendran; Umbright, Christina; Roberts, Jenny R; Cumpston, Amy; McKinney, Walter; Chen, Bean T; Frazer, David; Li, Shengqiao; Kashon, Michael; Joseph, Pius

    2013-04-01

    Identification of molecular target(s) and mechanism(s) of silica-induced pulmonary toxicity is important for the intervention and/or prevention of diseases associated with exposure to silica. Rats were exposed to crystalline silica by inhalation (15 mg m(-3), 6 h per day, 5 days) and global gene expression profile was determined in the lungs by microarray analysis at 1, 2, 4, 8 and 16 weeks following termination of silica exposure. The number of significantly differentially expressed genes (>1.5-fold change and <0.01 false discovery rate P-value) detected in the lungs during the post-exposure time intervals analyzed exhibited a steady increase in parallel with the progression of silica-induced pulmonary toxicity noticed in the rats. Quantitative real-time PCR analysis of a representative set of 10 genes confirmed the microarray findings. The number of biological functions, canonical pathways and molecular networks significantly affected by silica exposure, as identified by the bioinformatics analysis of the significantly differentially expressed genes detected during the post-exposure time intervals, also exhibited a steady increase similar to the silica-induced pulmonary toxicity. Genes involved in oxidative stress, inflammation, respiratory diseases, cancer, and tissue remodeling and fibrosis were significantly differentially expressed in the rat lungs; however, unresolved inflammation was the single most significant biological response to pulmonary exposure to silica. Excessive mucus production, as implicated by significant overexpression of the pendrin coding gene, SLC26A4, was identified as a potential novel mechanism for silica-induced pulmonary toxicity. Collectively, the findings of our study provided insights into the molecular mechanisms underlying the progression of crystalline silica-induced pulmonary toxicity in the rat. Published 2012. This article is a US Government work and is in the public domain in the USA.

  9. Computational fluid dynamics study of the end-side and sequential coronary artery bypass anastomoses in a native coronary occlusion model.

    PubMed

    Matsuura, Kaoru; Jin, Wei Wei; Liu, Hao; Matsumiya, Goro

    2018-04-01

    The objective of this study was to evaluate the haemodynamic patterns of each anastomosis configuration using a computational fluid dynamics study in a native coronary occlusion model. Fluid dynamic computations were carried out with ANSYS CFX (ANSYS Inc., Canonsburg, PA, USA) software. The incision lengths for parallel and diamond anastomoses were fixed at 2 mm. Native vessels were set to be totally occluded. The diameter of both the native and graft vessels was set to be 2 mm. The inlet boundary condition was set by a sample of the transient time flow measurement which was measured intraoperatively. The diamond anastomosis was observed to reduce flow to the native outlet and increase flow to the bypass outlet; the opposite was observed in the parallel anastomosis. Total energy efficiency was higher in the diamond anastomosis than in the parallel anastomosis. Wall shear stress was higher in the diamond anastomosis than in the parallel anastomosis; it was highest at the top of the outlet. A high oscillatory shear index was observed at the bypass inlet in the parallel anastomosis and at the native inlet in the diamond anastomosis. The diamond sequential anastomosis would be an effective option for multiple sequential bypasses because of the better flow to the bypass outlet than with the parallel anastomosis. However, flow competition should be kept in mind while using the diamond anastomosis for moderately stenotic vessels because of worsened flow to the native outlet. Care should be taken to ensure that the fluid dynamics patterns are optimal and prevent future native and bypass vessel disease progression.

  10. Probing entropic repulsion through mesoscopic simulations

    NASA Astrophysics Data System (ADS)

    Vaiwala, Rakesh; Thaokar, Rochish

    2017-11-01

    Following the publication of Freund's work on entropic pressure (Freund L., Proc. Natl. Acad. Sci. U.S.A., 110 (2013) 2047), which states that the undulation pressure for a biomembrane confined between two parallel rigid walls varies inversely with the wall separation d, contradictory views on the pressure law evolved, some supporting Helfrich's prediction that the entropic pressure scales as 1/d^3. Attempts were made by distinct groups of researchers to resolve this stark contradiction. In this work, using dissipative particle dynamics simulations it has been shown for the first time that the height fluctuations are suppressed by wall confinement. An analysis of the fluctuation spectrum reveals that the entropic loss manifests as a membrane tension for a membrane that conserves its local area, and this fact is confirmed by the rise in stresses with an increase in wall confinement. Furthermore, we theorize a pressure law, which interestingly is congruous with Freund's prediction that the entropic pressure scales as 1/d.

  11. Semiautomatic and Automatic Cooperative Inversion of Seismic and Magnetotelluric Data

    NASA Astrophysics Data System (ADS)

    Le, Cuong V. A.; Harris, Brett D.; Pethick, Andrew M.; Takam Takougang, Eric M.; Howe, Brendan

    2016-09-01

    Natural source electromagnetic methods have the potential to recover rock property distributions from the surface to great depths. Unfortunately, results in complex 3D geo-electrical settings can be disappointing, especially where significant near-surface conductivity variations exist. In such settings, unconstrained inversion of magnetotelluric data is inexorably non-unique. We believe that: (1) correctly introduced information from seismic reflection can substantially improve MT inversion, (2) a cooperative inversion approach can be automated, and (3) massively parallel computing can make such a process viable. Nine inversion strategies including baseline unconstrained inversion and new automated/semiautomated cooperative inversion approaches are applied to industry-scale co-located 3D seismic and magnetotelluric data sets. These data sets were acquired in one of the Carlin gold deposit districts in north-central Nevada, USA. In our approach, seismic information feeds directly into the creation of sets of prior conductivity model and covariance coefficient distributions. We demonstrate how statistical analysis of the distribution of selected seismic attributes can be used to automatically extract subvolumes that form the framework for prior model 3D conductivity distribution. Our cooperative inversion strategies result in detailed subsurface conductivity distributions that are consistent with seismic, electrical logs and geochemical analysis of cores. Such 3D conductivity distributions would be expected to provide clues to 3D velocity structures that could feed back into full seismic inversion for an iterative practical and truly cooperative inversion process. We anticipate that, with the aid of parallel computing, cooperative inversion of seismic and magnetotelluric data can be fully automated, and we hold confidence that significant and practical advances in this direction have been accomplished.

  12. Parallel-vector computation for linear structural analysis and non-linear unconstrained optimization problems

    NASA Technical Reports Server (NTRS)

    Nguyen, D. T.; Al-Nasra, M.; Zhang, Y.; Baddourah, M. A.; Agarwal, T. K.; Storaasli, O. O.; Carmona, E. A.

    1991-01-01

    Several parallel-vector computational improvements to the unconstrained optimization procedure are described which speed up the structural analysis-synthesis process. A fast parallel-vector Choleski-based equation solver, pvsolve, is incorporated into the well-known SAP-4 general-purpose finite-element code. The new code, denoted PV-SAP, is tested for static structural analysis. Initial results on a four processor CRAY 2 show that using pvsolve reduces the equation solution time by a factor of 14-16 over the original SAP-4 code. In addition, parallel-vector procedures for the Golden Block Search technique and the BFGS method are developed and tested for nonlinear unconstrained optimization. A parallel version of an iterative solver and the pvsolve direct solver are incorporated into the BFGS method. Preliminary results on nonlinear unconstrained optimization test problems, using pvsolve in the analysis, show excellent parallel-vector performance indicating that these parallel-vector algorithms can be used in a new generation of finite-element based structural design/analysis-synthesis codes.
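
    The Choleski-based solution strategy at the core of pvsolve can be illustrated in its serial, dense form with NumPy: factor the symmetric positive-definite stiffness matrix K once, then obtain the displacements from two triangular solves. This is only a schematic stand-in, since the actual solver operates on skyline-stored matrices with parallel-vector loops:

```python
import numpy as np

# Schematic dense stand-in for a Choleski-based structural solve K u = f,
# with K a symmetric positive-definite, stiffness-like banded matrix.
n = 8
K = (np.diag(np.full(n, 4.0))
     + np.diag(np.full(n - 1, -1.0), 1)
     + np.diag(np.full(n - 1, -1.0), -1))
f = np.ones(n)

L = np.linalg.cholesky(K)      # K = L @ L.T, factored once
y = np.linalg.solve(L, f)      # forward substitution on the lower factor
u = np.linalg.solve(L.T, y)    # backward substitution on the upper factor

print(np.linalg.norm(K @ u - f))  # residual should be near machine precision
```

    Because the factorization is done once, repeated solves for new load vectors f (as in design and optimization iterations) reuse L and cost only the two triangular sweeps, which is what makes a fast factorization so valuable in an analysis-synthesis loop.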

  13. Methods for Functional Connectivity Analyses

    DTIC Science & Technology

    2012-12-13

    motor, or hand motor function (green, red, or blue shading, respectively). Thus, this work produced the first comprehensive analysis of ECoG... Affiliations: Department of Electrical and Computer Engineering, University of Texas at El Paso, TX, USA; Department of Neurology, Albany Medical College, Albany, NY, USA; New York State Department of Health, Albany, NY, USA

  14. Parallel computing for probabilistic fatigue analysis

    NASA Technical Reports Server (NTRS)

    Sues, Robert H.; Lua, Yuan J.; Smith, Mark D.

    1993-01-01

    This paper presents the results of Phase I research to investigate the most effective parallel processing software strategies and hardware configurations for probabilistic structural analysis. We investigate the efficiency of both shared and distributed-memory architectures via a probabilistic fatigue life analysis problem. We also present a parallel programming approach, the virtual shared-memory paradigm, that is applicable across both types of hardware. Using this approach, problems can be solved on a variety of parallel configurations, including networks of single or multiprocessor workstations. We conclude that it is possible to effectively parallelize probabilistic fatigue analysis codes; however, special strategies will be needed to achieve large-scale parallelism, to keep a large number of processors busy, and to treat problems with the large memory requirements encountered in practice. We also conclude that distributed-memory architecture is preferable to shared-memory for achieving large-scale parallelism; however, in the future, the currently emerging hybrid-memory architectures will likely be optimal.
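
    Probabilistic fatigue analysis of this kind is naturally parallel: each worker can draw and evaluate its own block of random samples independently and deposit results in shared memory. A toy sketch in Python, using a hypothetical Basquin-type life model with invented parameter values (not the codes or data from the study):

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def fatigue_lives(seed, n_samples=25_000):
    """Toy Basquin-type model N = (sigma_f / stress)**(1/b), with random
    scatter in the fatigue-strength coefficient (illustrative values only)."""
    rng = np.random.default_rng(seed)
    sigma_f = rng.normal(900.0, 50.0, n_samples)  # MPa, sampled coefficient
    stress, b = 400.0, 0.1                        # applied amplitude, exponent
    return (sigma_f / stress) ** (1.0 / b)        # cycles to failure

# Shared-memory parallelism: each worker computes an independent chunk,
# and the chunks are concatenated in the common address space.
with ThreadPoolExecutor(max_workers=4) as pool:
    lives = np.concatenate(list(pool.map(fatigue_lives, range(4))))

prob_early = (lives < 1e3).mean()  # estimated probability of early failure
print(lives.mean(), prob_early)
```

    Because the Monte Carlo chunks are independent, the same decomposition maps onto distributed memory by replacing the thread pool with message passing, which is the portability idea behind the virtual shared-memory paradigm described above.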

  15. Parallel Gene Expression Differences between Low and High Latitude Populations of Drosophila melanogaster and D. simulans

    PubMed Central

    Zhao, Li; Wit, Janneke; Svetec, Nicolas; Begun, David J.

    2015-01-01

    Gene expression variation within species is relatively common; however, the role of natural selection in the maintenance of this variation is poorly understood. Here we investigate low and high latitude populations of Drosophila melanogaster and its sister species, D. simulans, to determine whether the two species show similar patterns of population differentiation, consistent with a role for spatially varying selection in maintaining gene expression variation. We compared, at two temperatures, the whole male transcriptomes of D. melanogaster and D. simulans sampled from Panama City (Panama) and Maine (USA). We observed a significant excess of genes exhibiting differential expression in both species, consistent with parallel adaptation to heterogeneous environments. Moreover, the majority of genes showing parallel expression differentiation showed the same direction of differential expression in the two species, and the magnitudes of expression differences between high and low latitude populations were correlated across species, further bolstering the conclusion that parallelism for expression phenotypes results from spatially varying selection. However, the species also exhibited important differences in expression phenotypes. For example, genotype × environment interaction was much more common across the genome in D. melanogaster. Highly differentiated SNPs between low and high latitudes were enriched in the 3’ UTRs and CDS of the geographically differently expressed genes in both species, consistent with an important role for cis-acting variants in driving local adaptation for expression-related phenotypes. PMID:25950438

  16. Parallel Gene Expression Differences between Low and High Latitude Populations of Drosophila melanogaster and D. simulans.

    PubMed

    Zhao, Li; Wit, Janneke; Svetec, Nicolas; Begun, David J

    2015-05-01

    Gene expression variation within species is relatively common; however, the role of natural selection in the maintenance of this variation is poorly understood. Here we investigate low and high latitude populations of Drosophila melanogaster and its sister species, D. simulans, to determine whether the two species show similar patterns of population differentiation, consistent with a role for spatially varying selection in maintaining gene expression variation. We compared, at two temperatures, the whole male transcriptomes of D. melanogaster and D. simulans sampled from Panama City (Panama) and Maine (USA). We observed a significant excess of genes exhibiting differential expression in both species, consistent with parallel adaptation to heterogeneous environments. Moreover, the majority of genes showing parallel expression differentiation showed the same direction of differential expression in the two species, and the magnitudes of expression differences between high and low latitude populations were correlated across species, further bolstering the conclusion that parallelism for expression phenotypes results from spatially varying selection. However, the species also exhibited important differences in expression phenotypes. For example, genotype × environment interaction was much more common across the genome in D. melanogaster. Highly differentiated SNPs between low and high latitudes were enriched in the 3' UTRs and CDS of the geographically differently expressed genes in both species, consistent with an important role for cis-acting variants in driving local adaptation for expression-related phenotypes.

  17. Methodology of Comparative Analysis of Public School Teachers' Continuing Professional Development in Great Britain, Canada and the USA

    ERIC Educational Resources Information Center

    Mukan, Nataliya; Kravets, Svitlana

    2015-01-01

    In the article the methodology of comparative analysis of public school teachers' continuing professional development (CPD) in Great Britain, Canada and the USA has been presented. The main objectives are defined as theoretical analysis of scientific and pedagogical literature, which highlights different aspects of the problem under research;…

  18. "USA Today": Can the Nation's Newspaper Survive?

    ERIC Educational Resources Information Center

    Wicks, Robert H.

    The failure of 17 newspaper markets between 1957 and 1975 raises the question of whether the 1982 entrance of "USA Today" into the newspaper market demonstrated fiscal prudence. A 20-month advertising content analysis was conducted to assess advertising trends in "USA Today." These data were compared with industry statistics…

  19. "USA Today": Comparative Analysis with Two National and Two Los Angeles Daily Newspapers. Research Bulletin.

    ERIC Educational Resources Information Center

    Ames, Steve; And Others

    Sections of the newspaper "USA Today" were compared with corresponding sections of four major newspapers--the "New York Times," the "Wall Street Journal," the "Los Angeles Herald Examiner," and the "Los Angeles Times"--to determine what editorial components made "USA Today" different and…

  20. HPCC Methodologies for Structural Design and Analysis on Parallel and Distributed Computing Platforms

    NASA Technical Reports Server (NTRS)

    Farhat, Charbel

    1998-01-01

    In this grant, we have proposed a three-year research effort focused on developing High Performance Computation and Communication (HPCC) methodologies for structural analysis on parallel processors and clusters of workstations, with emphasis on reducing the structural design cycle time. Besides consolidating and further improving the FETI solver technology to address plate and shell structures, we have proposed to tackle the following design related issues: (a) parallel coupling and assembly of independently designed and analyzed three-dimensional substructures with non-matching interfaces, (b) fast and smart parallel re-analysis of a given structure after it has undergone design modifications, (c) parallel evaluation of sensitivity operators (derivatives) for design optimization, and (d) fast parallel analysis of mildly nonlinear structures. While our proposal was accepted, support was provided only for one year.

  1. A new parallel-vector finite element analysis software on distributed-memory computers

    NASA Technical Reports Server (NTRS)

    Qin, Jiangning; Nguyen, Duc T.

    1993-01-01

    A new parallel-vector finite element analysis software package MPFEA (Massively Parallel-vector Finite Element Analysis) is developed for large-scale structural analysis on massively parallel computers with distributed memory. MPFEA is designed for parallel generation and assembly of the global finite element stiffness matrices as well as parallel solution of the simultaneous linear equations, since these are often the major time-consuming parts of a finite element analysis. A block-skyline storage scheme, along with vector-unrolling techniques, is used to enhance the vector performance. Communications among processors are carried out concurrently with arithmetic operations to reduce the total execution time. Numerical results on the Intel iPSC/860 computers (such as the Intel Gamma with 128 processors and the Intel Touchstone Delta with 512 processors) are presented, including an aircraft structure and some very large truss structures, to demonstrate the efficiency and accuracy of MPFEA.

  2. Evaluation of Parallel Analysis Methods for Determining the Number of Factors

    ERIC Educational Resources Information Center

    Crawford, Aaron V.; Green, Samuel B.; Levy, Roy; Lo, Wen-Juo; Scott, Lietta; Svetina, Dubravka; Thompson, Marilyn S.

    2010-01-01

    Population and sample simulation approaches were used to compare the performance of parallel analysis using principal component analysis (PA-PCA) and parallel analysis using principal axis factoring (PA-PAF) to identify the number of underlying factors. Additionally, the accuracies of the mean eigenvalue and the 95th percentile eigenvalue criteria…

  3. Using Horn's Parallel Analysis Method in Exploratory Factor Analysis for Determining the Number of Factors

    ERIC Educational Resources Information Center

    Çokluk, Ömay; Koçak, Duygu

    2016-01-01

    In this study, the number of factors obtained from parallel analysis, a method used for determining the number of factors in exploratory factor analysis, was compared to that of the factors obtained from eigenvalue and scree plot--two traditional methods for determining the number of factors--in terms of consistency. Parallel analysis is based on…

  4. Increasing incidence of Dirofilaria immitis in dogs in USA with focus on the southeast region 2013-2016.

    PubMed

    Drake, Jason; Wiseman, Scott

    2018-01-17

    A recent American Heartworm Society (AHS) survey on the incidence of adult heartworm infections in dogs in the United States of America showed a 21.7% increase in the average number of cases per veterinary clinic from 2013 to 2016. The analysis reported here was performed to see whether heartworm testing results available via the Companion Animal Parasite Council (CAPC) aligned with the AHS survey and whether changes in heartworm preventive dispensing account for the increased incidence. The resistance of Dirofilaria immitis to macrocyclic lactones (MLs) has been previously reported. An analysis of the 7-9 million heartworm antigen tests reported annually to CAPC from 2013 to 2016 was conducted and compared to the 2016 AHS survey. A state-by-state analysis across the southeastern USA was also performed. National heartworm preventive dispensing data were obtained from Vetstreet LLC and analyzed. All oral, topical and injectable heartworm preventives were included in this analysis, with injectable moxidectin counting as six doses. Positive antigen tests increased by 15.28% from 2013 to 2016, similar to the 21.7% increase reported by the AHS survey. Incidence in the southeastern USA increased by 17.9%, while incidence in the rest of the USA increased by 11.4%. State-by-state analysis across the southeastern USA revealed an increase in positive test frequency greater than 10% in 9 of 12 states evaluated. During this time, the overall proportion of dogs receiving heartworm prophylaxis remained relatively unchanged. Approximately two thirds of the dogs in the USA received no heartworm prevention each year. These CAPC data show the rate of positive heartworm tests increasing significantly (P < 0.0001) in the USA from 2013 to 2016, with a higher rate of increase in the southeastern USA than nationally. Only one third of dogs in the USA were dispensed one or more doses of heartworm prevention annually by veterinarians, averaging 8.6 monthly doses per year. Veterinarians and pet owners should work together to follow CAPC and AHS guidelines to protect dogs from infection with D. immitis. Lack of preventive use and the emergence of heartworm resistance to MLs could both be contributing to the increased rate of positive heartworm tests in dogs.

  5. Thread concept for automatic task parallelization in image analysis

    NASA Astrophysics Data System (ADS)

    Lueckenhaus, Maximilian; Eckstein, Wolfgang

    1998-09-01

    Parallel processing of image analysis tasks is an essential method to speed up image processing and helps to exploit the full capacity of distributed systems. However, writing parallel code is a difficult and time-consuming process and often leads to an architecture-dependent program that has to be re-implemented when the hardware changes. It is therefore highly desirable to perform the parallelization automatically. For this we have developed a special kind of thread concept for image analysis tasks. Threads derived from one subtask may share objects and run in the same context but may follow different threads of execution and work on different data in parallel. In this paper we describe the basics of our thread concept and show how it can be used as the basis of an automatic task parallelization to speed up image processing. We further illustrate the design and implementation of an agent-based system that uses image analysis threads for generating and processing parallel programs while taking into account the available hardware. The tests made with our system prototype show that the thread concept combined with the agent paradigm is suitable for speeding up image processing by an automatic parallelization of image analysis tasks.
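
    The shared-object, separate-execution idea can be sketched with Python's standard thread pool. This is an illustration only, not the authors' system (and CPython threads interleave rather than run truly in parallel for pure-Python work): tiles of one shared image are processed by independent threads and the partial results are merged.

    ```python
    from concurrent.futures import ThreadPoolExecutor

    # A shared "image": a 64x64 grid of pixel values in 0..255.
    image = [[(x * y) % 256 for x in range(64)] for y in range(64)]

    def tile_histogram(rows):
        """Subtask: brightness histogram for one horizontal tile.
        All threads share the same image object but scan disjoint rows."""
        hist = [0] * 256
        for y in rows:
            for value in image[y]:
                hist[value] += 1
        return hist

    # Four tiles of 16 rows each, one thread of execution per tile.
    tiles = [range(i, i + 16) for i in range(0, 64, 16)]
    with ThreadPoolExecutor(max_workers=4) as pool:
        partials = list(pool.map(tile_histogram, tiles))

    # Merging the per-tile histograms reproduces a sequential whole-image pass.
    merged = [sum(counts) for counts in zip(*partials)]
    ```
    
    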

  6. Predictive relationship of osteopathic manual medicine grades and COMLEX-USA Level 1 total scores and osteopathic principles and practice subscores.

    PubMed

    Lewis, Drew D; Johnson, Mary T; Finnerty, Edward P

    2014-06-01

    Osteopathic manual medicine (OMM) encompasses hands-on diagnosis and treatment as part of patient care. The area of osteopathic principles and practice (OPP) is considered a core competency for students and practitioners of this medical tradition. The Comprehensive Osteopathic Medical Licensing Examination-USA (COMLEX-USA) is a useful tool for assessing candidates' competency. To examine the relationship of COMLEX-USA Level 1 total scores and OPP subscores with OMM course grades, and to determine whether these grades are predictive of COMLEX-USA Level 1 OPP performance. The authors collected the following data for a cohort of osteopathic medical students at a single institution: COMLEX-USA Level 1 total and OPP subscores; OMM grades (written, practical, and total for the first and second academic years); sex; and age. These data were then analyzed by means of correlation analysis. Records were obtained from a second-year class of osteopathic medical students (N=217). The authors' analysis of total scores and OPP subscores on COMLEX-USA Level 1 yielded statistically significant correlations with all variables. Although the correlations were moderate, second-year written examination grades showed the strongest association with the COMLEX-USA Level 1 OPP subscores (r=0.530) and total scores (r=0.566). Performance on the second-year OMM written examination could identify students potentially at risk for poor performance on COMLEX-USA Level 1. © 2014 The American Osteopathic Association.

  7. Relationship between COMLEX-USA scores and performance on the American Osteopathic Board of Emergency Medicine Part I certifying examination.

    PubMed

    Li, Feiming; Gimpel, John R; Arenson, Ethan; Song, Hao; Bates, Bruce P; Ludwin, Fredric

    2014-04-01

    Few studies have investigated how well scores from the Comprehensive Osteopathic Medical Licensing Examination-USA (COMLEX-USA) series predict resident outcomes, such as performance on board certification examinations. To determine how well COMLEX-USA predicts performance on the American Osteopathic Board of Emergency Medicine (AOBEM) Part I certification examination. The target study population was first-time examinees who took AOBEM Part I in 2011 and 2012 with matched performances on COMLEX-USA Level 1, Level 2-Cognitive Evaluation (CE), and Level 3. Pearson correlations were computed between AOBEM Part I first-attempt scores and COMLEX-USA performances to measure the association between these examinations. Stepwise linear regression analysis was conducted to predict AOBEM Part I scores from the 3 COMLEX-USA scores. An independent t test was conducted to compare mean COMLEX-USA performances between candidates who passed and who failed AOBEM Part I, and a stepwise logistic regression analysis was used to predict the log-odds of passing AOBEM Part I on the basis of COMLEX-USA scores. Scores from AOBEM Part I had the highest correlation with COMLEX-USA Level 3 scores (.57) and a slightly lower correlation with COMLEX-USA Level 2-CE scores (.53). The lowest correlation was between AOBEM Part I and COMLEX-USA Level 1 scores (.47). According to the stepwise regression model, COMLEX-USA Level 1 and Level 2-CE scores, which residency programs often use as selection criteria, together explained 30% of the variance in AOBEM Part I scores; adding Level 3 scores raised the explained variance to 37%. The independent t test indicated that the 397 examinees who passed AOBEM Part I performed significantly better than the 54 examinees who failed on all 3 COMLEX-USA levels (P<.001 for all 3 levels). The logistic regression model showed that COMLEX-USA Level 1 and Level 3 scores predicted the log-odds of passing AOBEM Part I (P=.03 and P<.001, respectively). The present study empirically supported the predictive and discriminant validity of the COMLEX-USA series in relation to the AOBEM Part I certification examination. Although residency programs may use COMLEX-USA Level 1 and Level 2-CE scores as partial criteria in selecting residents, Level 3 scores, though typically not available at the time of application, are statistically the most strongly related to performance on AOBEM Part I.
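
    As background for the statistics reported here, a small sketch of the two quantities involved: a Pearson correlation and the share of variance explained (R²) by a two-predictor least-squares model. The scores are synthetic and illustrative only, not study data.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical standardized scores (names and data are illustrative only).
    n = 400
    pred1 = rng.normal(size=n)                       # first predictor
    pred2 = 0.6 * pred1 + 0.8 * rng.normal(size=n)   # correlated second predictor
    outcome = 0.5 * pred1 + 0.4 * pred2 + rng.normal(size=n)

    # Pearson correlation between one predictor and the outcome.
    r = np.corrcoef(pred1, outcome)[0, 1]

    # R^2 of a two-predictor least-squares model = share of variance explained.
    X = np.column_stack([np.ones(n), pred1, pred2])
    beta, *_ = np.linalg.lstsq(X, outcome, rcond=None)
    resid = outcome - X @ beta
    r2 = 1.0 - (resid ** 2).sum() / ((outcome - outcome.mean()) ** 2).sum()
    ```

    Adding a predictor to a least-squares model can never lower R², which is why the study's explained variance rises from 30% to 37% when Level 3 scores are added.
    
    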

  8. Diel distribution of age-0 largemouth bass, Micropterus salmoides, in B. E. Jordan Lake, North Carolina (USA) and its relation to cover

    USGS Publications Warehouse

    Irwin, E.R.; Noble, R.L.

    2000-01-01

    We used prepositioned area electrofishers (PAEs, 10X1.5 m) to assess diel differences in distribution of age-0 largemouth bass, Micropterus salmoides, in August 1992-1993 in a paired sampling design. PAEs were placed parallel to shore in an embayment of an unvegetated reservoir (B. E. Jordan Lake, North Carolina, USA). The catch per unit effort (CPUE=fish/PAE) was significantly higher at night than during the day in both years, indicating that age-0 largemouth bass exhibit nocturnal inshore movements. Age-0 largemouth bass captured inshore during day were smaller than those captured at night, indicating that movement patterns may change ontogenetically. Inshore-offshore movements of age-0 largemouth bass were significantly reduced in the presence of cover, suggesting that diel movements were influenced by specific habitat components. Diel movements likely were related to foraging, resting and predator avoidance behavior and could affect population dynamics and introduce bias in assessment programs.

  9. Strongyloidiasis in Latin American immigrants: a pilot study.

    PubMed

    Ostera, G; Blum, J; Cornejo, C; Burgula, S; Jeun, R; Bryan, P E; Mejia, R

    2017-03-01

    The United States of America (USA) has the largest immigrant population of any nation in the world. Immigrants from Latin American countries, where intestinal parasites are endemic, comprise more than half of this population. This study aims to determine the prevalence of strongyloidiasis, a potentially deadly parasitic infection, in foreign-born individuals. We conducted a cross-sectional study in Washington, DC, to determine the seroprevalence of Strongyloides stercoralis infection using an NIE-ELISA IgG antibody assay. Multi-parallel quantitative real-time polymerase chain reaction (qPCR) was performed on stool samples of NIE-ELISA-positive patients to investigate possible polyparasitism. The NIE-ELISA assay detected an S. stercoralis prevalence of 4.2% in a group of 119 volunteers. Combining NIE-ELISA and qPCR detected a parasite prevalence of 5.0%. Our results underscore the relevance of systematic testing for gastrointestinal parasites in individuals from endemic regions. They also make a case for a survey in the USA to identify immigrants' risk for strongyloidiasis and other gastrointestinal parasitic infections.

  10. An Analysis of Counterinsurgency in Iraq: Mosul, Ramadi, and Samarra from 2003-2005

    DTIC Science & Technology

    2006-12-01


  11. Physicochemical and sensory analysis of USA rice varieties developed for the basmati and jasmine markets

    USDA-ARS?s Scientific Manuscript database

    There is a steady demand for imported basmati and jasmine rice in the USA. Rice varieties that can be domestically produced and compete with these imports have been developed from basmati, jasmine, and other aromatic germplasm sources. This study evaluated differences among eight USA aromatic varie...

  12. Reported awareness of tobacco advertising and promotion in China compared to Thailand, Australia and the USA.

    PubMed

    Li, L; Yong, H-H; Borland, R; Fong, G T; Thompson, M E; Jiang, Y; Yang, Y; Sirirassamee, B; Hastings, G; Harris, F

    2009-06-01

    China currently does not have comprehensive laws or regulations on tobacco advertising and promotion, although it ratified the World Health Organization (WHO) Framework Convention on Tobacco Control (FCTC) in October 2005 and promised to ban all tobacco advertising by January 2011. Much effort is needed to monitor the current situation of tobacco advertising and promotion in China. This study aims to examine levels of awareness of tobacco advertising and promotion among smokers in China as compared to other countries with different levels of restrictions. One developing country (Thailand) and two developed countries (Australia and the USA) were selected for comparison. All four countries are part of the International Tobacco Control (ITC) Policy Evaluation Survey project. Between 2005 and 2006, parallel ITC surveys were conducted among adult smokers (at least smoked weekly) in China (n = 4763), Thailand (n = 2000), Australia (n = 1767) and the USA (n = 1780). Unprompted and prompted recall of noticing tobacco advertising and promotion were measured. Chinese respondents reported noticing tobacco advertisements in a range of channels and venues, with highest exposure levels on television (34.5%), billboards (33.4%) and in stores (29.2%). A quarter of respondents noticed tobacco sponsorships, and a high level of awareness of promotion was reported. Cross-country comparison reveals that overall reported awareness was significantly higher in China than in Thailand (particularly) and Australia, but lower than in the USA. There is a big gap between China and the better-performing countries such as Thailand and Australia regarding tobacco promotion restrictions. China needs to do more, including enhanced policy and more robust enforcement.

  13. Computational mechanics analysis tools for parallel-vector supercomputers

    NASA Technical Reports Server (NTRS)

    Storaasli, Olaf O.; Nguyen, Duc T.; Baddourah, Majdi; Qin, Jiangning

    1993-01-01

    Computational algorithms for structural analysis on parallel-vector supercomputers are reviewed. These parallel algorithms, developed by the authors, are for the assembly of structural equations, 'out-of-core' strategies for linear equation solution, massively distributed-memory equation solution, unsymmetric equation solution, general eigensolution, geometrically nonlinear finite element analysis, design sensitivity analysis for structural dynamics, optimization search analysis and domain decomposition. The source code for many of these algorithms is available.
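
    As one concrete instance of the iterative equation-solution strategies such reviews cover, here is a minimal conjugate-gradient solver for a symmetric positive-definite "stiffness" system. This is a generic textbook sketch, not the authors' parallel-vector code.

    ```python
    import numpy as np

    def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
        """Solve A x = b for symmetric positive-definite A by the classic
        conjugate-gradient iteration."""
        x = np.zeros_like(b)
        r = b - A @ x          # residual
        p = r.copy()           # search direction
        rs = r @ r
        for _ in range(max_iter):
            Ap = A @ p
            alpha = rs / (p @ Ap)
            x += alpha * p
            r -= alpha * Ap
            rs_new = r @ r
            if np.sqrt(rs_new) < tol:
                break
            p = r + (rs_new / rs) * p
            rs = rs_new
        return x

    # 1-D bar stiffness matrix (tridiagonal, SPD) as a stand-in structural system.
    n = 50
    K = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
    f = np.ones(n)
    u = conjugate_gradient(K, f)
    ```

    In a distributed-memory setting, the matrix-vector product and the two dot products per iteration are the operations that get parallelized.
    
    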

  14. Evolving Epidemiology of Staphylococcus aureus Bacteremia.

    PubMed

    Rhee, Yoona; Aroutcheva, Alla; Hota, Bala; Weinstein, Robert A; Popovich, Kyle J

    2015-12-01

    Methicillin-resistant Staphylococcus aureus (MRSA) infections due to USA300 have become widespread in community and healthcare settings. It is unclear whether risk factors for bloodstream infections (BSIs) differ by strain type. To examine the epidemiology of S. aureus BSIs, including USA300 and non-USA300 MRSA strains. Retrospective observational study with molecular analysis. Large urban public hospital. Individuals with S. aureus BSIs from January 1, 2007 through December 31, 2013. We used electronic surveillance data to identify cases of S. aureus BSI. Available MRSA isolates were analyzed by pulsed-field gel electrophoresis. Poisson regression was used to evaluate changes in BSI incidence over time. Risk factor data were collected by medical chart review and logistic regression was used for multivariate analysis of risk factors. A total of 1,015 cases of S. aureus BSIs were identified during the study period; 36% were due to MRSA. The incidence of hospital-onset (HO) MRSA BSIs decreased while that of community-onset (CO) MRSA BSIs remained stable. The rates of CO and HO methicillin-susceptible S. aureus infections both decreased over time. More than half of HO-MRSA BSIs were due to the USA300 strain type and for 4 years, the proportion of HO-MRSA BSIs due to USA300 exceeded 60%. On multivariate analysis, current or former drug use was the only epidemiologic risk factor for CO- or HO-MRSA BSIs due to USA300 strains. USA300 MRSA is endemic in communities and hospitals and certain populations (eg, those who use illicit drugs) may benefit from enhanced prevention efforts in the community.

  15. College Students' Use of Social Media for Health in the USA and Korea

    ERIC Educational Resources Information Center

    Oh, Sanghee; Kim, Soojung

    2014-01-01

    Purpose: This exploratory study aims to understand college students' use and perception of social media for health information by comparing college students in the USA and Korea. Method: This study surveyed 342 college students from two state-level universities in the USA and Korea (one from each country) using a convenience sample. Analysis:…

  16. System Hazard Analysis of TACOM’s Crew Station/Turret Motion Base Simulator

    DTIC Science & Technology

    1992-01-01

    Safety devices have been located on the equipment where necessary and are described in Contraves USA Manual No. IM-27751, "INSTRUCTION MANUAL FOR TACOM'S CREW STATION/TURRET MOTION BASE SIMULATOR". The CS/TMBS was designed by Contraves USA and assembled jointly by Contraves USA and TACOM; all control compensation was performed by TACOM.

  17. AFLP analysis of a worldwide collection of Didymella bryoniae.

    PubMed

    Kothera, Ronald T; Keinath, Anthony P; Dean, Ralph A; Farnham, Mark W

    2003-03-01

    Didymella bryoniae (anamorph Phoma cucurbitacearum) is an ascomycete that causes gummy stem blight, a foliar disease that occurs on cucurbits in greenhouses and fields throughout the world. In a previous study using RAPD analysis, little genetic diversity was found among isolates of D. bryoniae from New York and South Carolina, USA. Here we report the use of amplified fragment length polymorphism (AFLP) analysis to assess the genetic variation within a worldwide collection of D. bryoniae: 102 field and greenhouse isolates from ten states in the USA (California, Delaware, Florida, Georgia, Indiana, Maryland, Michigan, Oklahoma, South Carolina, and Texas) and seven other countries (Australia, Canada, China, Greece, Israel, Sweden, and The Netherlands) were examined. Seven AFLP primer-pair combinations generated 450 bands, of which 134 (30%) were polymorphic. Using cluster analysis, two groups and a total of seven subgroups were delineated. Representative isolates varied in their virulence on muskmelon and watermelon seedlings, but the degree of virulence was not strongly associated with AFLP groupings. However, isolates from the northern USA grouped separately from isolates originating from the southern USA.
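
    AFLP band patterns are presence/absence vectors, and similarity between isolates is commonly summarized with a coefficient such as Jaccard's before clustering. A minimal sketch with hypothetical band profiles (not the study's data):

    ```python
    # Jaccard similarity between binary band profiles: shared bands divided
    # by the union of bands present in either isolate.
    def jaccard(a, b):
        shared = sum(1 for x, y in zip(a, b) if x and y)
        union = sum(1 for x, y in zip(a, b) if x or y)
        return shared / union if union else 1.0

    # Three hypothetical isolates scored for 10 bands (1 = band present).
    iso_a = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
    iso_b = [1, 1, 0, 1, 0, 1, 0, 0, 1, 1]  # similar to iso_a
    iso_c = [0, 0, 1, 0, 1, 0, 1, 1, 0, 0]  # a distinct group

    sim_ab = jaccard(iso_a, iso_b)
    sim_ac = jaccard(iso_a, iso_c)
    ```

    A cluster analysis like the one in the study then groups isolates by such pairwise similarities (here iso_a and iso_b would cluster together).
    
    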

  18. Parallel Algorithms for Image Analysis.

    DTIC Science & Technology

    1982-06-01

    Technical report TR-1180. Keywords: image processing, image analysis, parallel processing, cellular computers. Author: Azriel Rosenfeld; grant AFOSR-77-3271.

  19. Performance Analysis and Optimization on the UCLA Parallel Atmospheric General Circulation Model Code

    NASA Technical Reports Server (NTRS)

    Lou, John; Ferraro, Robert; Farrara, John; Mechoso, Carlos

    1996-01-01

    An analysis is presented of several factors influencing the performance of a parallel implementation of the UCLA atmospheric general circulation model (AGCM) on massively parallel computer systems. Several modifications to the original parallel AGCM code, aimed at improving its numerical efficiency, interprocessor communication cost, load balance, and single-node code performance, are discussed.
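
    The performance quantities discussed in such analyses (speedup, parallel efficiency, load imbalance) reduce to simple arithmetic over per-processor timings. A sketch with hypothetical timings, not measurements from the AGCM runs:

    ```python
    # Hypothetical per-processor timings (seconds) for one model time step.
    node_times = [12.1, 11.8, 12.4, 14.9]   # one slow node

    serial_time = 46.0                       # hypothetical single-node run
    parallel_time = max(node_times)          # a step ends when the slowest node finishes

    speedup = serial_time / parallel_time
    efficiency = speedup / len(node_times)   # fraction of ideal linear speedup
    # Load imbalance: how much the slowest node exceeds the mean workload.
    imbalance = max(node_times) / (sum(node_times) / len(node_times)) - 1.0
    ```

    Reducing the imbalance term (here about 16%) is exactly the kind of optimization the paper describes for the parallel AGCM code.
    
    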

  20. INDUCTION OF CYTOKINE PRODUCTION IN CHEETAH (ACINONYX JUBATUS) PERIPHERAL BLOOD MONONUCLEAR CELLS AND VALIDATION OF FELINE-SPECIFIC CYTOKINE ASSAYS FOR ANALYSIS OF CHEETAH SERUM.

    PubMed

    Franklin, Ashley D; Crosier, Adrienne E; Vansandt, Lindsey M; Mattson, Elliot; Xiao, Zhengguo

    2015-06-01

    Peripheral blood mononuclear cells (PBMCs) were isolated from the whole blood of cheetahs (Acinonyx jubatus; n=3) and stimulated with lipopolysaccharides (LPS) to induce the production of the proinflammatory cytokines TNF-α, IL-1β, and IL-6 for establishment of cross-reactivity between these cheetah cytokines and the feline-specific cytokine antibodies provided in commercially available Feline DuoSet® ELISA kits (R&D Systems, Inc., Minneapolis, Minnesota 55413, USA). This study found that feline-specific cytokine antibodies bind specifically to the cheetah proinflammatory cytokines TNF-α, IL-1β, and IL-6 from cell culture supernatants. The assays also revealed that cheetah PBMCs produce a measurable, cell concentration-dependent increase in proinflammatory cytokine production after LPS stimulation. To enable the use of these kits, which are designed for cell culture supernatants, for analyzing cytokine concentrations in cheetah serum, the percent recovery and parallelism of feline cytokine standards in cheetah serum were also evaluated. Cytokine concentrations in cheetah serum were approximated based on the use of domestic cat standards in the absence of cheetah standard material. In all cases (for cytokines TNF-α, IL-1β, and IL-6), percent recovery increased as the serum sample dilution increased, though percent recovery varied between cytokines at a given dilution factor. A 1:2 dilution of serum resulted in approximately 45, 82, and 7% recovery of the TNF-α, IL-1β, and IL-6 standards, respectively. Adequate parallelism was observed across a large range of cytokine concentrations for TNF-α and IL-1β; however, a significant departure from parallelism was observed between the IL-6 standard and the serum samples (P=0.004). Therefore, based on our results, the Feline DuoSet ELISA (R&D Systems, Inc.) kits are valid assays for the measurement of TNF-α and IL-1β in cheetah serum but should not be used for accurate measurement of IL-6.
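
    Percent recovery in this context is the measured concentration of a spiked standard, corrected for dilution, relative to its known value. A minimal sketch with illustrative numbers, not the study's measurements:

    ```python
    # Percent recovery of a spiked standard after dilution correction.
    def percent_recovery(measured, expected, dilution_factor):
        return 100.0 * measured * dilution_factor / expected

    # Illustrative: a standard spiked at 200 pg/mL, serum diluted 1:2,
    # instrument reads 90 pg/mL in the diluted sample.
    rec = percent_recovery(measured=90.0, expected=200.0, dilution_factor=2)
    ```

    Values well below 100% (as for IL-6 above) indicate that serum components suppress the assay signal, which is one reason a kit validated for culture supernatants must be re-validated for serum.
    
    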

  1. Mapping behavioral landscapes for animal movement: a finite mixture modeling approach

    USGS Publications Warehouse

    Tracey, Jeff A.; Zhu, Jun; Boydston, Erin E.; Lyren, Lisa M.; Fisher, Robert N.; Crooks, Kevin R.

    2013-01-01

    Because of its role in many ecological processes, movement of animals in response to landscape features is an important subject in ecology and conservation biology. In this paper, we develop models of animal movement in relation to objects or fields in a landscape. We take a finite mixture modeling approach in which the component densities are conceptually related to different choices for movement in response to a landscape feature, and the mixing proportions are related to the probability of selecting each response as a function of one or more covariates. We combine particle swarm optimization and an Expectation-Maximization (EM) algorithm to obtain maximum likelihood estimates of the model parameters. We use this approach to analyze data for movement of three bobcats in relation to urban areas in southern California, USA. A behavioral interpretation of the models revealed similarities and differences in bobcat movement response to urbanization. All three bobcats avoided urbanization by moving either parallel to urban boundaries or toward less urban areas as the proportion of urban land cover in the surrounding area increased. However, one bobcat, a male with a dispersal-like large-scale movement pattern, avoided urbanization at lower densities and responded strictly by moving parallel to the urban edge. The other two bobcats, which were both residents and occupied similar geographic areas, avoided urban areas using a combination of movements parallel to the urban edge and movement toward areas of less urbanization. However, the resident female appeared to exhibit greater repulsion at lower levels of urbanization than the resident male, consistent with empirical observations of bobcats in southern California. Using the parameterized finite mixture models, we mapped behavioral states to geographic space, creating a representation of a behavioral landscape. This approach can provide guidance for conservation planning based on analysis of animal movement data using statistical models, thereby linking connectivity evaluations to empirical data.
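
    The EM core of such a mixture fit can be sketched for the simplest case: a two-component one-dimensional Gaussian mixture. The study combines particle swarm optimization with EM and uses movement-specific component densities; this illustration, on synthetic data, covers only the EM iteration.

    ```python
    import math
    import random

    random.seed(3)

    # Synthetic 1-D "step length" data drawn from two behavioral modes.
    data = ([random.gauss(1.0, 0.3) for _ in range(200)] +
            [random.gauss(4.0, 0.5) for _ in range(200)])

    def normal_pdf(x, mu, sigma):
        return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

    # EM: the E-step computes each point's responsibility for each component;
    # the M-step re-estimates weights, means, and standard deviations.
    w, mu, sigma = [0.5, 0.5], [0.0, 5.0], [1.0, 1.0]
    for _ in range(50):
        resp = []
        for x in data:
            p = [w[k] * normal_pdf(x, mu[k], sigma[k]) for k in range(2)]
            s = sum(p)
            resp.append([pk / s for pk in p])
        for k in range(2):
            nk = sum(r[k] for r in resp)
            w[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            sigma[k] = math.sqrt(sum(r[k] * (x - mu[k]) ** 2
                                     for r, x in zip(resp, data)) / nk)

    mu_lo, mu_hi = sorted(mu)
    ```

    Mapping each observation to the component with the larger responsibility is the step that, in the paper, assigns behavioral states to locations in geographic space.
    
    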

  2. Deep crustal deformation by sheath folding in the Adirondack Mountains, USA

    NASA Technical Reports Server (NTRS)

    Mclelland, J. M.

    1988-01-01

    As described by McLelland and Isachsen, the southern half of the Adirondacks is underlain by major isoclinal (F sub 1) and open-upright (F sub 2) folds whose axes are parallel, trend approximately E-W, and plunge gently about the horizontal. These large structures are themselves folded by open upright folds trending NNE (F sub 3). It is pointed out that elongation lineations in these rocks are parallel to X of the finite strain ellipsoid developed during progressive rotational strain. The parallelism between F sub 1 and F sub 2 fold axes and elongation lineations led to the hypothesis that progressive rotational strain, with a west-directed tectonic transport, rotated earlier F sub 1-folds into parallelism with the evolving elongation lineation. Rotation is accomplished by ductile, passive flow of F sub 1-axes into extremely arcuate, E-W hinges. In order to test these hypotheses, a number of large folds were mapped in the eastern Adirondacks. Other evidence supporting the existence of sheath folds in the Adirondacks is the presence, on a map scale, of synforms whose limbs pass through the vertical and into antiforms. This type of outcrop pattern is best explained by intersecting a horizontal plane with the double curvature of sheath folds. It is proposed that sheath folding is a common response of hot, ductile rocks to rotational strain at deep crustal levels. The recognition of sheath folds in the Adirondacks reconciles the E-W orientation of fold axes with an E-W elongation lineation.

  3. Parallel processing for nonlinear dynamics simulations of structures including rotating bladed-disk assemblies

    NASA Technical Reports Server (NTRS)

    Hsieh, Shang-Hsien

    1993-01-01

    The principal objective of this research is to develop, test, and implement coarse-grained, parallel-processing strategies for nonlinear dynamic simulations of practical structural problems. There are contributions to four main areas: finite element modeling and analysis of rotational dynamics, numerical algorithms for parallel nonlinear solutions, automatic partitioning techniques to effect load-balancing among processors, and an integrated parallel analysis system.
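
    The load-balancing step can be illustrated with a classic greedy heuristic: longest-processing-time-first assignment of task costs (e.g. per-substructure element counts) to processors. This is a generic sketch, not the automatic partitioning technique developed in this work.

    ```python
    import heapq

    def lpt_partition(costs, n_procs):
        """Longest-processing-time-first: repeatedly give the largest
        remaining task to the currently least-loaded processor."""
        heap = [(0, p) for p in range(n_procs)]   # (load, processor id)
        heapq.heapify(heap)
        assignment = [[] for _ in range(n_procs)]
        for cost in sorted(costs, reverse=True):
            load, p = heapq.heappop(heap)
            assignment[p].append(cost)
            heapq.heappush(heap, (load + cost, p))
        return [sum(tasks) for tasks in assignment], assignment

    # Hypothetical per-substructure workloads (e.g. scaled element counts).
    costs = [9, 7, 6, 5, 5, 4, 3, 1]
    loads, assignment = lpt_partition(costs, 2)
    ```

    On this example the heuristic yields loads of 19 and 21 against an ideal 20/20 split; real partitioners must also weigh interprocessor communication, which this sketch ignores.
    
    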

  4. A Green's function method for two-dimensional reactive solute transport in a parallel fracture-matrix system

    NASA Astrophysics Data System (ADS)

    Chen, Kewei; Zhan, Hongbin

    2018-06-01

    The reactive solute transport in a single fracture bounded by upper and lower matrixes is a classical problem that captures the dominant factors affecting transport behavior beyond pore scale. A parallel fracture-matrix system which considers the interaction among multiple paralleled fractures is an extension to a single fracture-matrix system. The existing analytical or semi-analytical solution for solute transport in a parallel fracture-matrix simplifies the problem to various degrees, such as neglecting the transverse dispersion in the fracture and/or the longitudinal diffusion in the matrix. The difficulty of solving the full two-dimensional (2-D) problem lies in the calculation of the mass exchange between the fracture and matrix. In this study, we propose an innovative Green's function approach to address the 2-D reactive solute transport in a parallel fracture-matrix system. The flux at the interface is calculated numerically. It is found that the transverse dispersion in the fracture can be safely neglected due to the small scale of fracture aperture. However, neglecting the longitudinal matrix diffusion would overestimate the concentration profile near the solute entrance face and underestimate the concentration profile at the far side. The error caused by neglecting the longitudinal matrix diffusion decreases with increasing Peclet number. The longitudinal matrix diffusion does not have obvious influence on the concentration profile in long-term. The developed model is applied to a non-aqueous-phase-liquid (DNAPL) contamination field case in New Haven Arkose of Connecticut in USA to estimate the Trichloroethylene (TCE) behavior over 40 years. The ratio of TCE mass stored in the matrix and the injected TCE mass increases above 90% in less than 10 years.

  5. Comparative Analysis of Pedagogical Technologies in the Context of Future Agrarians' Multicultural Education in the USA

    ERIC Educational Resources Information Center

    Kravets, Ruslan

    2015-01-01

    In the article the comparative analysis of pedagogical technologies in the USA has been carried out in the context of future agrarians' multicultural education. The essence of traditional and innovative pedagogical technologies and the peculiarities of their realization at higher educational establishments have been viewed. The expediency of…

  6. A Comparative Analysis of Numbers and Biology Content Domains between Turkey and the USA

    ERIC Educational Resources Information Center

    Incikabi, Lutfi; Ozgelen, Sinan; Tjoe, Hartono

    2012-01-01

    This study aimed to compare Mathematics and Science programs focusing on TIMSS content domains of Numbers and Biology that produced the largest achievement gap among students from Turkey and the USA. Specifically, it utilized the content analysis method within Turkish and New York State (NYS) frameworks. The procedures of study included matching…

  7. Parallel processing in finite element structural analysis

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.

    1987-01-01

    A brief review is made of the fundamental concepts and basic issues of parallel processing. Discussion focuses on parallel numerical algorithms, performance evaluation of machines and algorithms, and parallelism in finite element computations. A computational strategy is proposed for maximizing the degree of parallelism at different levels of the finite element analysis process including: 1) formulation level (through the use of mixed finite element models); 2) analysis level (through additive decomposition of the different arrays in the governing equations into the contributions to a symmetrized response plus correction terms); 3) numerical algorithm level (through the use of operator splitting techniques and application of iterative processes); and 4) implementation level (through the effective combination of vectorization, multitasking and microtasking, whenever available).

  8. Transmission Index Research of Parallel Manipulators Based on Matrix Orthogonal Degree

    NASA Astrophysics Data System (ADS)

    Shao, Zhu-Feng; Mo, Jiao; Tang, Xiao-Qiang; Wang, Li-Ping

    2017-11-01

    A performance index is the standard by which performance is evaluated, and it is the foundation of both performance analysis and optimal design for a parallel manipulator. Finding suitable kinematic indices remains an important and challenging issue for parallel manipulators: despite extensive study in this field, few existing indices meet all the requirements of being simple, intuitive, and universal. To address this problem, the matrix orthogonal degree is adopted, and generalized transmission indices that can evaluate the motion/force transmissibility of fully parallel manipulators are proposed. Transmission performance analyses of typical branches, end effectors, and parallel manipulators are given to illustrate the proposed indices and analysis methodology. Simulation and analysis results reveal that the proposed transmission indices possess significant advantages: they are normalized and finite (ranging from 0 to 1), dimensionally homogeneous, frame-free, intuitive, and easy to calculate. Moreover, the proposed indices indicate the good-transmission region and the proximity to singularity with better resolution than the traditional local conditioning index, and they provide a novel tool for the kinematic analysis and optimal design of fully parallel manipulators.

  9. Parallel Event Analysis Under Unix

    NASA Astrophysics Data System (ADS)

    Looney, S.; Nilsson, B. S.; Oest, T.; Pettersson, T.; Ranjard, F.; Thibonnier, J.-P.

    The ALEPH experiment at LEP, the CERN CN division and Digital Equipment Corp. have, in a joint project, developed a parallel event analysis system. The parallel physics code is identical to ALEPH's standard analysis code, ALPHA; only the organisation of input/output is changed. The user may switch between sequential and parallel processing simply by changing one input "card". The initial implementation runs on an 8-node DEC 3000/400 farm using the PVM software, and exhibits near-perfect speed-up linearity, reducing the turn-around time by a factor of 8.

  10. Type I and Type II Error Rates and Overall Accuracy of the Revised Parallel Analysis Method for Determining the Number of Factors

    ERIC Educational Resources Information Center

    Green, Samuel B.; Thompson, Marilyn S.; Levy, Roy; Lo, Wen-Juo

    2015-01-01

    Traditional parallel analysis (T-PA) estimates the number of factors by sequentially comparing sample eigenvalues with eigenvalues for randomly generated data. Revised parallel analysis (R-PA) sequentially compares the "k"th eigenvalue for sample data to the "k"th eigenvalue for generated data sets, conditioned on "k"-…
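    The sequential comparison that T-PA performs can be sketched in a few lines. The following is a minimal illustration (the function name and the use of the mean random eigenvalue as the retention threshold are our assumptions, not the authors' implementation), assuming NumPy is available:

    ```python
    import numpy as np

    def parallel_analysis(data, n_sims=100, seed=0):
        """Traditional (T-PA) Horn's parallel analysis: retain the k-th factor
        while the k-th sample eigenvalue exceeds the mean k-th eigenvalue of
        correlation matrices computed from random normal data of the same shape."""
        rng = np.random.default_rng(seed)
        n, p = data.shape
        # Eigenvalues of the sample correlation matrix, largest first.
        sample_eigs = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]
        random_eigs = np.empty((n_sims, p))
        for i in range(n_sims):
            noise = rng.standard_normal((n, p))
            random_eigs[i] = np.linalg.eigvalsh(np.corrcoef(noise, rowvar=False))[::-1]
        thresholds = random_eigs.mean(axis=0)
        # Sequentially compare: stop at the first eigenvalue that falls below
        # its random-data counterpart.
        n_factors = 0
        for k in range(p):
            if sample_eigs[k] > thresholds[k]:
                n_factors += 1
            else:
                break
        return n_factors
    ```

    For data with one strong common factor, the first sample eigenvalue far exceeds its random-data threshold while the rest fall below, so the sketch returns 1.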

  11. Considering Horn's Parallel Analysis from a Random Matrix Theory Point of View.

    PubMed

    Saccenti, Edoardo; Timmerman, Marieke E

    2017-03-01

    Horn's parallel analysis is a widely used method for assessing the number of principal components and common factors. We discuss the theoretical foundations of parallel analysis for principal components based on a covariance matrix by making use of arguments from random matrix theory. In particular, we show that (i) for the first component, parallel analysis is an inferential method equivalent to the Tracy-Widom test, (ii) its use to test high-order eigenvalues is equivalent to the use of the joint distribution of the eigenvalues, and thus should be discouraged, and (iii) a formal test for higher-order components can be obtained based on a Tracy-Widom approximation. We illustrate the performance of the two testing procedures using simulated data generated under both a principal component model and a common factors model. For the principal component model, the Tracy-Widom test performs consistently in all conditions, while parallel analysis shows unpredictable behavior for higher-order components. For the common factor model, including major and minor factors, both procedures are heuristic approaches, with variable performance. We conclude that the Tracy-Widom procedure is preferred over parallel analysis for statistically testing the number of principal components based on a covariance matrix.
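    The Tracy-Widom test for the first component can be sketched as follows, using Johnstone's centring and scaling constants for the largest eigenvalue of a white-noise covariance matrix. The critical value is the commonly tabulated approximate 95th percentile of the Tracy-Widom (beta = 1) law, and the function name is our own; this is an illustration of the idea, not the authors' procedure:

    ```python
    import numpy as np

    TW1_CRIT_95 = 0.9793  # approximate 95th percentile of the Tracy-Widom (beta=1) law

    def tracy_widom_first_component(X):
        """Test whether the largest eigenvalue of the unnormalised covariance
        X'X of an n-by-p white-noise matrix exceeds what the Tracy-Widom law
        predicts, using Johnstone's centring (mu) and scaling (sigma)."""
        n, p = X.shape
        l1 = np.linalg.eigvalsh(X.T @ X)[-1]  # largest eigenvalue
        mu = (np.sqrt(n - 1) + np.sqrt(p)) ** 2
        sigma = (np.sqrt(n - 1) + np.sqrt(p)) * (
            1 / np.sqrt(n - 1) + 1 / np.sqrt(p)) ** (1 / 3)
        stat = (l1 - mu) / sigma
        return stat, stat > TW1_CRIT_95
    ```

    A matrix with one column of inflated variance produces a standardized statistic far above the critical value, so the test rejects the pure-noise null for the first component.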

  12. Two Distinct Clones of Methicillin-Resistant Staphylococcus aureus (MRSA) with the Same USA300 Pulsed-Field Gel Electrophoresis Profile: a Potential Pitfall for Identification of USA300 Community-Associated MRSA▿

    PubMed Central

    Larsen, Anders Rhod; Goering, Richard; Stegger, Marc; Lindsay, Jodi A.; Gould, Katherine A.; Hinds, Jason; Sørum, Marit; Westh, Henrik; Boye, Kit; Skov, Robert

    2009-01-01

    Analysis of methicillin-resistant Staphylococcus aureus (MRSA) characterized as USA300 by pulsed-field gel electrophoresis identified two distinct clones. One was similar to community-associated USA300 MRSA (ST8-IVa, t008, and Panton-Valentine leukocidin positive). The second (ST8-IVa, t024, and PVL negative) had different molecular characteristics and epidemiology, suggesting independent evolution. We recommend spa typing and/or PCR to discriminate between the two clones. PMID:19759225

  13. Computational mechanics analysis tools for parallel-vector supercomputers

    NASA Technical Reports Server (NTRS)

    Storaasli, O. O.; Nguyen, D. T.; Baddourah, M. A.; Qin, J.

    1993-01-01

    Computational algorithms for structural analysis on parallel-vector supercomputers are reviewed. These parallel algorithms, developed by the authors, are for the assembly of structural equations, 'out-of-core' strategies for linear equation solution, massively distributed-memory equation solution, unsymmetric equation solution, general eigen-solution, geometrically nonlinear finite element analysis, design sensitivity analysis for structural dynamics, optimization algorithm and domain decomposition. The source code for many of these algorithms is available from NASA Langley.

  14. Improved preconditioned conjugate gradient algorithm and application in 3D inversion of gravity-gradiometry data

    NASA Astrophysics Data System (ADS)

    Wang, Tai-Han; Huang, Da-Nian; Ma, Guo-Qing; Meng, Zhao-Hai; Li, Ye

    2017-06-01

    With the continuous development of full tensor gradiometer (FTG) measurement techniques, three-dimensional (3D) inversion of FTG data is becoming increasingly used in oil and gas exploration. For the fast processing and interpretation of large-scale high-precision data, the use of the graphics processing unit (GPU) and of preconditioning methods is very important in data inversion. In this paper, an improved preconditioned conjugate gradient algorithm is proposed by combining the symmetric successive over-relaxation (SSOR) technique with the incomplete Cholesky decomposition conjugate gradient algorithm (ICCG). Since preparing the preconditioner requires extra time, a parallel implementation based on the GPU is proposed. The improved method is then applied to the inversion of noise-contaminated synthetic data to prove its adaptability for the inversion of 3D FTG data. Results show that the parallel SSOR-ICCG algorithm based on an NVIDIA Tesla C2050 GPU achieves a speedup of approximately 25 times that of a serial program using a 2.0 GHz central processing unit (CPU). Real airborne gravity-gradiometry data from the Vinton salt dome (southwest Louisiana, USA) are also considered. Good results are obtained, which verifies the efficiency and feasibility of the proposed parallel method for fast inversion of 3D FTG data.
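    The SSOR-preconditioned conjugate gradient step itself is straightforward. Below is a small dense illustrative sketch (not the paper's GPU code; the relaxation factor omega is an arbitrary choice of ours), assuming a symmetric positive definite system:

    ```python
    import numpy as np

    def ssor_pcg(A, b, omega=1.5, tol=1e-8, max_iter=500):
        """Conjugate gradients for SPD A with the SSOR preconditioner
        M = (D/w + L) * (w/(2-w)) * D^{-1} * (D/w + U), applied via two
        triangular solves per iteration. Dense, illustrative only."""
        n = len(b)
        D = np.diag(np.diag(A))
        L = np.tril(A, -1)
        U = np.triu(A, 1)
        lower = D / omega + L
        upper = D / omega + U
        scale = (2 - omega) / omega

        def apply_Minv(r):
            y = np.linalg.solve(lower, r)     # forward (lower-triangular) solve
            y = (np.diag(D) * y) * scale      # diagonal scaling
            return np.linalg.solve(upper, y)  # backward (upper-triangular) solve

        x = np.zeros(n)
        r = b - A @ x
        z = apply_Minv(r)
        p = z.copy()
        rz = r @ z
        for _ in range(max_iter):
            Ap = A @ p
            alpha = rz / (p @ Ap)
            x += alpha * p
            r -= alpha * Ap
            if np.linalg.norm(r) < tol:
                break
            z = apply_Minv(r)
            rz_new = r @ z
            p = z + (rz_new / rz) * p
            rz = rz_new
        return x
    ```

    Because the triangular solves are inherently sequential, preparing and applying such preconditioners is the part that benefits from the GPU parallelization the paper describes.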

  15. Genomic and transcriptomic differences in community acquired methicillin resistant Staphylococcus aureus USA300 and USA400 strains.

    PubMed

    Jones, Marcus B; Montgomery, Christopher P; Boyle-Vavra, Susan; Shatzkes, Kenneth; Maybank, Rosslyn; Frank, Bryan C; Peterson, Scott N; Daum, Robert S

    2014-12-19

    Staphylococcus aureus is a human pathogen responsible for substantial morbidity and mortality through its ability to cause a number of human infections including bacteremia, pneumonia and soft tissue infections. Of great concern is the emergence and dissemination of methicillin-resistant Staphylococcus aureus (MRSA) strains that are resistant to nearly all β-lactams. The emergence of the USA300 MRSA genetic background among community-associated S. aureus infections (CA-MRSA) in the USA was followed by the disappearance of USA400 CA-MRSA isolates. To gain a greater understanding of the potential fitness advantages and virulence capacity of S. aureus USA300 clones, we performed whole genome sequencing of 15 USA300 and 4 USA400 clinical isolates. A comparison of representative genomes of the USA300 and USA400 pulsotypes indicates a number of differences in mobile genome elements. We examined the in vitro gene expression profiles by microarray hybridization, and the in vivo transcriptomes of a USA300 and a USA400 MRSA strain during lung infection in mice, by performing complete-genome qRT-PCR analysis. The unique presence and increased expression of 6 exotoxins in USA300 (12- to 600-fold) compared to USA400 may contribute to the increased virulence of USA300 clones. Importantly, we also observed the up-regulation of prophage genes in USA300 (compared with USA400) during mouse lung infection (including genes encoded by both prophages ΦSa2usa and ΦSa3usa), suggesting that these prophages may play an important role in vivo by contributing to the elevated virulence characteristic of the USA300 clone. We observed differences in the genetic content of USA300 and USA400 strains, as well as significant differences in in vitro and in vivo gene expression of mobile elements in a lung pneumonia model. This is the first study to document the global transcription differences between USA300 and USA400 strains during both in vitro and in vivo growth.

  16. Reported awareness of tobacco advertising and promotion in China compared to Thailand, Australia and the USA

    PubMed Central

    Li, L; Yong, H-H; Borland, R; Fong, G T; Thompson, M E; Jiang, Y; Yang, Y; Sirirassamee, B; Hastings, G; Harris, F

    2009-01-01

    Background: China currently does not have comprehensive laws or regulations on tobacco advertising and promotion, although it ratified the World Health Organization (WHO) Framework Convention on Tobacco Control (FCTC) in October 2005 and promised to ban all tobacco advertising by January 2011. Much effort is needed to monitor the current situation of tobacco advertising and promotion in China. Objective: This study aims to examine levels of awareness of tobacco advertising and promotion among smokers in China as compared to other countries with different levels of restrictions. Methods: One developing country (Thailand) and two developed countries (Australia and the USA) were selected for comparison. All four countries are part of the International Tobacco Control (ITC) Policy Evaluation Survey project. Between 2005 and 2006, parallel ITC surveys were conducted among adult smokers (at least smoked weekly) in China (n = 4763), Thailand (n = 2000), Australia (n = 1767) and the USA (n = 1780). Unprompted and prompted recall of noticing tobacco advertising and promotion were measured. Results: Chinese respondents reported noticing tobacco advertisements in a range of channels and venues, with highest exposure levels on television (34.5%), billboards (33.4%) and in stores (29.2%). A quarter of respondents noticed tobacco sponsorships, and a high level of awareness of promotion was reported. Cross-country comparison reveals that overall reported awareness was significantly higher in China than in Thailand (particularly) and Australia, but lower than in the USA. Conclusions: There is a big gap between China and the better-performing countries such as Thailand and Australia regarding tobacco promotion restrictions. China needs to do more, including enhanced policy and more robust enforcement. PMID:19332425

  17. Parallel Guessing: A Strategy for High-Speed Computation

    DTIC Science & Technology

    1984-09-19

    for using additional hardware to obtain higher processing speed). In this paper we argue that parallel guessing for image analysis is a useful...from a true solution, or the correctness of a guess, can be readily checked. We review image-analysis algorithms having a parallel guessing or

  18. A hedonic analysis of big game hunting club dues in Georgia, USA

    Treesearch

    James C. Mingie; Neelam C. Poudyal; J. M.  Bowker; Michael T.  Mengak; Jacek P.  Siry

    2017-01-01

    Hunting lease revenue can be a reliable supplemental income for forest landowners. Although studies have examined factors influencing per acre lease rates, little is known about how various characteristics are capitalized in hunting club dues. The objective of this study was to conduct a hedonic analysis of big game hunting club dues in Georgia, USA using a variety of...

  19. Robust Variance Estimation with Dependent Effect Sizes: Practical Considerations Including a Software Tutorial in Stata and SPSS

    ERIC Educational Resources Information Center

    Tanner-Smith, Emily E.; Tipton, Elizabeth

    2014-01-01

    Methodologists have recently proposed robust variance estimation as one way to handle dependent effect sizes in meta-analysis. Software macros for robust variance estimation in meta-analysis are currently available for Stata (StataCorp LP, College Station, TX, USA) and SPSS (IBM, Armonk, NY, USA), yet there is little guidance for authors regarding…

  20. The Analysis of Content and Operational Components of Public School Teachers' Continuing Professional Development in Great Britain, Canada and the USA

    ERIC Educational Resources Information Center

    Mukan, Nataliya; Kravets, Svitlana; Khamulyak, Nataliya

    2016-01-01

    In the article the content and operational components of continuing professional development of public school teachers in Great Britain, Canada, the USA have been characterized. The main objectives are defined as the theoretical analysis of scientific-pedagogical literature, which highlights different aspects of the problem under research;…

  1. When Does a Nation-Level Analysis Make Sense? ESD and Educational Governance in Brazil, South Africa, and the USA

    ERIC Educational Resources Information Center

    Feinstein, Noah Weeth; Jacobi, Pedro Roberto; Lotz-Sisitka, Heila

    2013-01-01

    International policy analysis tends to simplify the nation state, portraying countries as coherent units that can be described by one statistic or placed into one category. As scholars from Brazil, South Africa, and the USA, we find the nation-centric research perspective particularly challenging. In each of our home countries, the effective…

  2. Anticipating forest and range land development in central Oregon (USA) for landscape analysis, with an example application involving mule deer

    Treesearch

    Jeffrey D. Kline; Alissa Moses; Theresa Burcsu

    2010-01-01

    Forest policymakers, public lands managers, and scientists in the Pacific Northwest (USA) seek ways to evaluate the landscape-level effects of policies and management through the multidisciplinary development and application of spatially explicit methods and models. The Interagency Mapping and Analysis Project (IMAP) is an ongoing effort to generate landscape-wide...

  3. Internet search query analysis can be used to demonstrate the rapidly increasing public awareness of palliative care in the USA.

    PubMed

    McLean, Sarah; Lennon, Paul; Glare, Paul

    2017-01-27

    A lack of public awareness of palliative care (PC) has been identified as one of the main barriers to appropriate PC access. Internet search query analysis is a novel methodology, which has been used effectively in the surveillance of infectious diseases and can be used to monitor public awareness of health-related topics. We aimed to demonstrate the utility of internet search query analysis for evaluating changes in public awareness of PC in the USA between 2005 and 2015. Google Trends provides a referenced score for the popularity of a search term, for defined regions over defined time periods. The popularity of the search term 'palliative care' was measured monthly between 1/1/2005 and 31/12/2015 in the USA and in the UK. Results were analysed using independent t-tests and joinpoint analysis. The mean monthly popularity of the search term increased between 2008-2009 (p<0.001), 2011-2012 (p<0.001), 2013-2014 (p=0.004) and 2014-2015 (p=0.002) in the USA. Joinpoint analysis was used to evaluate the monthly percentage change (MPC) in the popularity of the search term. In the USA, the MPC increase was 0.6%/month (p<0.05); in the UK the MPC of 0.05% was non-significant. Although internet search query surveillance is a novel methodology, it is freely accessible and has significant potential for monitoring health-seeking behaviour among the public. PC is growing rapidly in the USA, and the rapidly increasing public awareness of PC demonstrated in this study, in comparison with the UK, where PC is relatively well established, is encouraging for ensuring appropriate PC access for all. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
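    Within one joinpoint segment, the monthly percentage change is simply the exponentiated slope of a log-linear fit to the series. A minimal sketch (the function name is ours, and this ignores the joinpoint selection and significance testing that the full method performs):

    ```python
    import numpy as np

    def monthly_percentage_change(popularity):
        """Estimate the monthly percentage change (MPC) of a popularity
        series via log-linear regression, as joinpoint software does within
        each segment: log(y_t) = a + b*t, with MPC = (e^b - 1) * 100."""
        t = np.arange(len(popularity), dtype=float)
        slope, _ = np.polyfit(t, np.log(popularity), 1)  # degree-1 fit, slope first
        return (np.exp(slope) - 1.0) * 100.0
    ```

    Applied to a series that compounds at 0.6% per month, the sketch recovers an MPC of 0.6, matching the scale of the USA estimate reported above.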

  4. Defining the challenges of the Modern Analytical Laboratory (CPSA USA 2014): the risks and reality of personalized healthcare.

    PubMed

    Weng, Naidong; Needham, Shane; Lee, Mike

    2015-01-01

    The 17th Annual Symposium on Clinical and Pharmaceutical Solutions through Analysis (CPSA), held 29 September-2 October 2014 at the Sheraton Bucks County Hotel, Langhorne, PA, USA, brought together the various analytical fields to define the challenges of the modern analytical laboratory. Ongoing discussions focused on the future application of bioanalysis and other disciplines to support investigational new drug (IND) and new drug application (NDA) submissions, the clinical diagnostics and pathology laboratory personnel that support patient sample analysis, and the clinical researchers that provide insights into new biomarkers, all within the context of the modern laboratory and personalized medicine.

  5. Performance Evaluation of Parallel Branch and Bound Search with the Intel iPSC (Intel Personal SuperComputer) Hypercube Computer.

    DTIC Science & Technology

    1986-12-01

    Excerpt from the report's table of contents: III. Analysis of Parallel Design; Parallel Abstract Data Types; Abstract Data Type; Parallel ADT; Data-Structure Design; Object-Oriented Design.

  6. Programming Probabilistic Structural Analysis for Parallel Processing Computer

    NASA Technical Reports Server (NTRS)

    Sues, Robert H.; Chen, Heh-Chyun; Twisdale, Lawrence A.; Chamis, Christos C.; Murthy, Pappu L. N.

    1991-01-01

    The ultimate goal of this research program is to make Probabilistic Structural Analysis (PSA) computationally efficient and hence practical for the design environment by achieving large scale parallelism. The paper identifies the multiple levels of parallelism in PSA, identifies methodologies for exploiting this parallelism, describes the development of a parallel stochastic finite element code, and presents results of two example applications. It is demonstrated that speeds within five percent of those theoretically possible can be achieved. A special-purpose numerical technique, the stochastic preconditioned conjugate gradient method, is also presented and demonstrated to be extremely efficient for certain classes of PSA problems.

  7. Slope movements triggered by heavy rainfall, November 3–5, 1985, in Virginia and West Virginia, U.S.A.

    USGS Publications Warehouse

    Jacobson, Robert B.; Cron, Elizabeth D.; McGeehin, John P.

    1989-01-01

    Study of slope movements triggered by the storm of November 3–5, 1985, in the central Appalachian Mountains, U.S.A., has helped to define the meteorologic conditions leading to slope movements and the relative importance of land cover, bedrock, surficial geology, and geomorphology in slope movement location. This long-duration rainfall at moderate intensities triggered more than 1,000 slope movements in a 1,040-km2 study area. Most were shallow slips and slip-flows in thin colluvium and residuum on shale slopes. Locations of these failures were sensitive to land cover and slope aspect but were relatively insensitive to topographic setting. A few shallow slope movements were triggered by the same rainfall on interbedded limestone, shale, and sandstone. Several large debris slide-avalanches were triggered in sandstone regolith high on ridges in areas of the highest measured rainfall. Most of these sites were on slopes that dip 30 to 35° and lie parallel to bedding planes, presumably the sites of least stability.

  8. Conduct of the International Multigrid Conference

    NASA Technical Reports Server (NTRS)

    Mccormick, S.

    1984-01-01

    The 1983 International Multigrid Conference was held at Colorado's Copper Mountain Ski Resort, April 5-8. It was organized jointly by the Institute for Computational Studies at Colorado State University, U.S.A., and the Gesellschaft für Mathematik und Datenverarbeitung Bonn, F.R. Germany, and was sponsored by the Air Force Office of Scientific Research and National Aeronautics and Space Administration Headquarters. The conference was attended by 80 scientists, divided by institution almost equally among private industry, research laboratories, and academia. Fifteen attendees came from countries other than the U.S.A. In addition to the fruitful discussions, the most significant feature of the conference was of course the lectures. The lecturers included most of the leaders in the field of multigrid research, and the program offered a nicely integrated blend of theory, numerical studies, basic research, and applications. Some of the new areas of research that have surfaced since the Köln-Porz conference include: the algebraic multigrid approach; multigrid treatment of the Euler equations for inviscid fluid flow problems; 3-D problems; and the application of MG methods on vector and parallel computers.

  9. Morphology, ecology and biogeography of Stauroneis pachycephala P.T. Cleve (Bacillariophyta) and its transfer to the genus Envekadea

    USGS Publications Warehouse

    Atazadeh, Islam; Edlund, Mark B.; van de Vijver, Bart; Mills, Keely; Spaulding, Sarah A.; Gell, Peter A.; Crawford, Simon; Barton, Andrew F.; Lee, Sylvia S.; Smith, Kathryn E.L.; Newall, Peter; Potapova, Marina

    2014-01-01

    Stauroneis pachycephala was described in 1881 from the Baakens River, Port Elizabeth, South Africa. Recently, it was found during surveys of the MacKenzie River (Victoria, Australia), the Florida Everglades (USA) and coastal marshes of Louisiana (USA). The morphology, ecology and geographic distribution of this species are described in this article. This naviculoid species is characterised by lanceolate valves with a gibbous centre, a sigmoid raphe, an axial area narrowing toward the valve ends, and capitate valve apices. The central area is a distinct stauros that is slightly widened near the valve margin. The raphe is straight and filiform, and the terminal raphe fissures are strongly deflected in opposite directions. Striae are fine and radiate in the middle of the valve, becoming parallel and eventually convergent toward the valve ends. The external surface of the valves and copulae is smooth and lacks ornamentation. We also examined the type material of S. pachycephala. Our observations show this species has morphological characteristics that fit within the genus Envekadea. Therefore, the transfer of S. pachycephala to Envekadea is proposed and a lectotype is designated.

  10. Motion capability analysis of a quadruped robot as a parallel manipulator

    NASA Astrophysics Data System (ADS)

    Yu, Jingjun; Lu, Dengfeng; Zhang, Zhongxiang; Pei, Xu

    2014-12-01

    This paper presents the forward and inverse displacement analysis of a quadruped robot, MANA, treated as a parallel manipulator in the quadruple stance phase, which is used to obtain the workspace and control the motion of the body. The robot MANA, designed on the basis of the structure of a quadruped mammal, is able not only to walk and turn on uneven terrain, but also to accomplish various manipulating tasks as a parallel manipulator in the quadruple stance phase; the latter is the focus of this paper. For this purpose, the leg kinematics is analyzed first, which lays the foundation for gait planning in terms of locomotion and for body kinematics analysis as a parallel manipulator. When all four feet of the robot contact the ground, by assuming there is no slipping at the feet, each contact point is treated as a passive spherical joint and the kinematic model of the parallel manipulator is established. The method for choosing six non-redundant actuated joints for the parallel manipulator from all twelve candidate joints is elaborated. The inverse and forward displacement analysis of the parallel manipulator is carried out using the method of coordinate transformation. Finally, based on the inverse and forward kinematic models, two tasks, obtaining the reachable workspace of the parallel manipulator and planning the motion of the body, are implemented and verified by ADAMS simulation.

  11. Parallel simulation today

    NASA Technical Reports Server (NTRS)

    Nicol, David; Fujimoto, Richard

    1992-01-01

    This paper surveys topics that presently define the state of the art in parallel simulation. Included in the tutorial are discussions on new protocols, mathematical performance analysis, time parallelism, hardware support for parallel simulation, load balancing algorithms, and dynamic memory management for optimistic synchronization.

  12. Evaluation of the Nova StatSensor® XpressTM Creatinine Point-Of-Care Handheld Analyzer

    PubMed Central

    Kosack, Cara Simone; de Kieviet, Wim; Bayrak, Kubra; Milovic, Anastacija; Page, Anne Laure

    2015-01-01

    Creatinine is a parameter that is required to monitor renal function and is important to follow in patients under treatment with potentially nephrotoxic drugs, such as the anti-HIV drug Tenofovir. A point-of-care instrument to measure creatinine would be useful for patient monitoring in resource-limited settings, where more sophisticated instruments are not available. The StatSensor Xpress Creatinine (Nova Biomedical Corporation, Waltham, MA, USA) point-of-care analyzer was evaluated for its diagnostic performance in indicating drug therapy change. Creatinine was measured in parallel using the Nova StatSensor Xpress Creatinine analyzer and the Vitros 5,1FS (Ortho Clinical Diagnostics, Inc, Rochester, USA), which served as the reference standard. The precision (i.e., repeatability and reproducibility) and accuracy of the StatSensor Xpress Creatinine analyzer were calculated using a panel of specimens with normal, low pathological and high pathological values. Two different Nova StatSensor Xpress Creatinine analyzers were used for the assessment of accuracy using repeated measurements. The coefficient of variation of the StatSensor Xpress Creatinine analyzers ranged from 2.3 to 5.9% for repeatability and from 4.2 to 9.0% for between-run reproducibility. The concordance correlation agreement was good except for high values (>600 µmol/L). The Bland-Altman analysis in high pathological specimens suggests that the Nova StatSensor Xpress Creatinine test tends to underestimate high creatinine values (i.e., >600 µmol/L). The Nova StatSensor Xpress Creatinine analyzers showed acceptable to good results in terms of repeatability, inter-device reproducibility and between-run reproducibility over time using quality control reagents. The analyzer was found sufficiently accurate for detecting pathological values in patients (age >10 years) and can be used with a moderate risk of misclassification. PMID:25886375
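    The Bland-Altman computation used in such method-comparison studies reduces to a mean bias and 95% limits of agreement over the paired differences. A minimal sketch (the function name is ours, and this is not tied to the study's data):

    ```python
    import numpy as np

    def bland_altman(method_a, method_b):
        """Bland-Altman agreement statistics for paired measurements from two
        methods: mean bias and 95% limits of agreement (bias +/- 1.96 SD of
        the differences)."""
        a = np.asarray(method_a, dtype=float)
        b = np.asarray(method_b, dtype=float)
        diffs = a - b
        bias = diffs.mean()
        sd = diffs.std(ddof=1)  # sample standard deviation of the differences
        return bias, (bias - 1.96 * sd, bias + 1.96 * sd)
    ```

    A systematic underestimation by the test device, as reported above for high creatinine values, would appear as a bias clearly different from zero with limits of agreement shifted in the same direction.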

  14. Comparison of Outcomes of antibiotic Drugs and Appendectomy (CODA) trial: a protocol for the pragmatic randomised study of appendicitis treatment

    PubMed Central

    Davidson, Giana H; Flum, David R; Talan, David A; Kessler, Larry G; Lavallee, Danielle C; Bizzell, Bonnie J; Farjah, Farhood; Stewart, Skye D; Krishnadasan, Anusha; Carney, Erin E; Wolff, Erika M; Comstock, Bryan A; Monsell, Sarah E; Heagerty, Patrick J; Ehlers, Annie P; DeUgarte, Daniel A; Kaji, Amy H; Evans, Heather L; Yu, Julianna T; Mandell, Katherine A; Doten, Ian C; Clive, Kevin S; McGrane, Karen M; Tudor, Brandon C; Foster, Careen S; Saltzman, Darin J; Thirlby, Richard C; Lange, Erin O; Sabbatini, Amber K; Moran, Gregory J

    2017-01-01

    Introduction Several European studies suggest that some patients with appendicitis can be treated safely with antibiotics. A portion of patients eventually undergo appendectomy within a year, with 10%–15% failing to respond in the initial period and a similar additional proportion with suspected recurrent episodes requiring appendectomy. Nearly all patients with appendicitis in the USA are still treated with surgery. A rigorous comparative effectiveness trial in the USA that is sufficiently large and pragmatic to incorporate usual variations in care and measures the patient experience is needed to determine whether antibiotics are as good as appendectomy. Objectives The Comparing Outcomes of Antibiotic Drugs and Appendectomy (CODA) trial for acute appendicitis aims to determine whether the antibiotic treatment strategy is non-inferior to appendectomy. Methods/Analysis CODA is a randomised, pragmatic non-inferiority trial that aims to recruit 1552 English-speaking and Spanish-speaking adults with imaging-confirmed appendicitis. Participants are randomised to appendectomy or 10 days of antibiotics (including an option for complete outpatient therapy). A total of 500 patients who decline randomisation but consent to follow-up will be included in a parallel observational cohort. The primary analytic outcome is quality of life (measured by the EuroQol five dimension index) at 4 weeks. Clinical adverse events, rate of eventual appendectomy, decisional regret, return to work/school, work productivity and healthcare utilisation will be compared. Planned exploratory analyses will identify subpopulations that may have a differential risk of eventual appendectomy in the antibiotic treatment arm. Ethics and dissemination This trial was approved by the University of Washington’s Human Subjects Division. Results from this trial will be presented in international conferences and published in peer-reviewed journals. Trial registration number NCT02800785. PMID:29146633

  15. Chlorhexidine gluconate reduces transmission of methicillin-resistant Staphylococcus aureus USA300 among Marine recruits.

    PubMed

    Whitman, Timothy J; Schlett, Carey D; Grandits, Greg A; Millar, Eugene V; Mende, Katrin; Hospenthal, Duane R; Murray, Patrick R; Tribble, David R

    2012-08-01

    Methicillin-resistant Staphylococcus aureus (MRSA) pulsed-field type (PFT) USA300 causes skin and soft tissue infections in military recruits and invasive disease in hospitals. Chlorhexidine gluconate (CHG) is used to reduce MRSA colonization and infection. The impact of CHG on the molecular epidemiology of MRSA is not known. To evaluate the impact of 2% CHG-impregnated cloths on the molecular epidemiology of MRSA colonization. Cluster-randomized, double-blind, controlled trial. Marine Officer Candidate School, Quantico, Virginia, in 2007. Military recruits. Thrice-weekly application of CHG-impregnated or control (Comfort Bath; Sage) cloths over the entire body. Baseline and serial (every 2 weeks) nasal and/or axillary swab samples were assessed for MRSA colonization. Molecular analysis was performed with pulsed-field gel electrophoresis. During training, 77 subjects (4.9%) acquired MRSA, 26 (3.3%) in the CHG group and 51 (6.5%) in the control group (P=.004). When analyzed for PFT, 24 subjects (3.1%) in the control group but only 6 subjects (0.8%) in the CHG group (P=.001) had USA300. Of the 167 colonizing isolates recovered from 77 subjects, 99 were recovered from the control group, including USA300 (40.4%), USA800 (38.4%), USA1000 (12.1%), and USA100 (6.1%), and 68 were recovered from the CHG group, including USA800 (51.5%), USA100 (23.5%), and USA300 (13.2%). CHG decreased the transmission of MRSA, more specifically USA300, among military recruits. In addition, USA300 and USA800 outcompeted other MRSA PFTs at incident colonization. Future studies should evaluate the broad-based use of CHG to decrease transmission of USA300 in hospital settings.

  16. Chlorhexidine Gluconate Reduces Transmission of Methicillin-Resistant Staphylococcus aureus USA300 among Marine Recruits

    PubMed Central

    Whitman, Timothy J.; Schlett, Carey D.; Grandits, Greg A.; Millar, Eugene V.; Mende, Katrin; Hospenthal, Duane R.; Murray, Patrick R.; Tribble, David R.

    2018-01-01

    BACKGROUND Methicillin-resistant Staphylococcus aureus (MRSA) pulsed-field type (PFT) USA300 causes skin and soft tissue infections in military recruits and invasive disease in hospitals. Chlorhexidine gluconate (CHG) is used to reduce MRSA colonization and infection. The impact of CHG on the molecular epidemiology of MRSA is not known. OBJECTIVE To evaluate the impact of 2% CHG–impregnated cloths on the molecular epidemiology of MRSA colonization. DESIGN Cluster-randomized, double-blind, controlled trial. SETTING Marine Officer Candidate School, Quantico, Virginia, in 2007. PARTICIPANTS Military recruits. INTERVENTION Thrice-weekly application of CHG-impregnated or control (Comfort Bath; Sage) cloths over the entire body. MEASUREMENTS Baseline and serial (every 2 weeks) nasal and/or axillary swab samples were assessed for MRSA colonization. Molecular analysis was performed with pulsed-field gel electrophoresis. RESULTS During training, 77 subjects (4.9%) acquired MRSA, 26 (3.3%) in the CHG group and 51 (6.5%) in the control group (P = .004). When analyzed for PFT, 24 subjects (3.1%) in the control group but only 6 subjects (0.8%) in the CHG group (P = .001) had USA300. Of the 167 colonizing isolates recovered from 77 subjects, 99 were recovered from the control group, including USA300 (40.4%), USA800 (38.4%), USA1000 (12.1%), and USA100 (6.1%), and 68 were recovered from the CHG group, including USA800 (51.5%), USA100 (23.5%), and USA300 (13.2%). CONCLUSIONS CHG decreased the transmission of MRSA—more specifically, USA300—among military recruits. In addition, USA300 and USA800 outcompeted other MRSA PFTs at incident colonization. Future studies should evaluate the broad-based use of CHG to decrease transmission of USA300 in hospital settings. PMID:22759549

  17. Parallel processing considerations for image recognition tasks

    NASA Astrophysics Data System (ADS)

    Simske, Steven J.

    2011-01-01

    Many image recognition tasks are well-suited to parallel processing. The most obvious example is that many imaging tasks require the analysis of multiple images; from this standpoint, parallel processing need be no more complicated than assigning individual images to individual processors. However, there are three less trivial categories of parallel processing considered in this paper: parallel processing (1) by task, (2) by image region, and (3) by meta-algorithm. Parallel processing by task allows the assignment of multiple workflows, as diverse as optical character recognition (OCR), document classification, and barcode reading, to parallel pipelines. This can substantially decrease time to completion for the document tasks. In this approach, each parallel pipeline generally performs a different task. Parallel processing by image region allows a larger imaging task to be sub-divided into a set of parallel pipelines, each performing the same task but on a different data set. This type of image analysis is readily addressed by a map-reduce approach; examples include document skew detection and multiple face detection and tracking. Finally, parallel processing by meta-algorithm allows different algorithms to be deployed on the same image simultaneously. This approach may result in improved accuracy.
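
    The region-based, map-reduce style described above can be sketched in a few lines. This is an illustrative example only; the dark-pixel-counting task and the horizontal-strip partitioning are assumptions for the sketch, not details from the paper:

```python
from concurrent.futures import ThreadPoolExecutor

def count_dark(strip):
    # "Map" step: count dark pixels (value < 128) in one horizontal strip.
    return sum(1 for row in strip for px in row if px < 128)

def dark_pixel_count(image, n_workers=4):
    # Split the image into per-worker strips, map the counting task over
    # them, then "reduce" the partial counts by summation.  A process pool
    # would replace the thread pool for true CPU parallelism.
    step = max(1, len(image) // n_workers)
    strips = [image[i:i + step] for i in range(0, len(image), step)]
    with ThreadPoolExecutor(n_workers) as ex:
        return sum(ex.map(count_dark, strips))
```

    Each strip is processed by the same task on different data, which is exactly the "parallel processing by image region" category.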

  18. Parallel-vector solution of large-scale structural analysis problems on supercomputers

    NASA Technical Reports Server (NTRS)

    Storaasli, Olaf O.; Nguyen, Duc T.; Agarwal, Tarun K.

    1989-01-01

    A direct linear equation solution method based on the Choleski factorization procedure is presented which exploits both parallel and vector features of supercomputers. The new equation solver is described, and its performance is evaluated by solving structural analysis problems on three high-performance computers. The method has been implemented using Force, a generic parallel FORTRAN language.
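
    As a minimal illustration of the factorization the solver is built on, a dense Choleski solve looks like the following. This is a serial Python sketch, not the Force/FORTRAN implementation from the report:

```python
import math

def cholesky(A):
    # Factor a symmetric positive-definite matrix A as A = L * L^T.
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            L[i][j] = math.sqrt(A[i][i] - s) if i == j else (A[i][j] - s) / L[j][j]
    return L

def solve_spd(A, b):
    # Solve A x = b: forward substitution (L y = b), then back
    # substitution (L^T x = y).  The inner dot products are what a vector
    # unit pipelines; independent columns are what a parallel machine
    # can distribute.
    L = cholesky(A)
    n = len(b)
    y = [0.0] * n
    for i in range(n):
        y[i] = (b[i] - sum(L[i][k] * y[k] for k in range(i))) / L[i][i]
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (y[i] - sum(L[k][i] * x[k] for k in range(i + 1, n))) / L[i][i]
    return x
```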

  19. Parallel auto-correlative statistics with VTK.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pebay, Philippe Pierre; Bennett, Janine Camille

    2013-08-01

    This report summarizes the existing statistical engines in VTK and presents both the serial and parallel auto-correlative statistics engines. It is a sequel to [PT08, BPRT09b, PT09, BPT09, PT10], which studied the parallel descriptive, correlative, multi-correlative, principal component analysis, contingency, k-means, and order statistics engines. The ease of use of the new parallel auto-correlative statistics engine is illustrated by means of C++ code snippets, and algorithm verification is provided. This report justifies the design of the statistics engines with parallel scalability in mind, and provides scalability and speed-up analysis results for the auto-correlative statistics engine.
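
    A serial reference for what an auto-correlative engine computes, sketched in Python rather than the report's C++, with the usual sample autocorrelation formula assumed:

```python
def autocorrelation(x, lag):
    # Sample autocorrelation of a series at the given lag: covariance of
    # (x_t, x_{t+lag}) pairs divided by the series variance.
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x)
    cov = sum((x[i] - mean) * (x[i + lag] - mean) for i in range(n - lag))
    return cov / var
```

    A parallel engine computes the partial sums (counts, means, cross-products) per data block and merges them, which is the design the report's scalability analysis targets.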

  20. Introducing parallelism to histogramming functions for GEM systems

    NASA Astrophysics Data System (ADS)

    Krawczyk, Rafał D.; Czarski, Tomasz; Kolasinski, Piotr; Pozniak, Krzysztof T.; Linczuk, Maciej; Byszuk, Adrian; Chernyshova, Maryna; Juszczyk, Bartlomiej; Kasprowicz, Grzegorz; Wojenski, Andrzej; Zabolotny, Wojciech

    2015-09-01

    This article assesses the potential for parallelizing histogramming algorithms in a GEM detector system. Histogramming and preprocessing algorithms in MATLAB were analyzed with regard to adding parallelism, and a preliminary implementation of parallel strip histogramming yielded a speedup. An analysis of the algorithms' parallelizability is presented, and potential hardware and software support for implementing the parallel algorithms is discussed.
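
    Strip histogramming generalizes to a standard two-phase pattern: each worker bins its own share of the data, and the partial histograms are merged bin-by-bin. A minimal sketch, in Python rather than the article's MATLAB, with the chunking scheme assumed:

```python
from concurrent.futures import ThreadPoolExecutor

def local_histogram(chunk, n_bins):
    # Phase 1: each worker bins its own chunk independently.
    h = [0] * n_bins
    for v in chunk:
        h[v] += 1
    return h

def parallel_histogram(data, n_bins, n_workers=4):
    step = max(1, len(data) // n_workers)
    chunks = [data[i:i + step] for i in range(0, len(data), step)]
    with ThreadPoolExecutor(n_workers) as ex:
        parts = ex.map(lambda c: local_histogram(c, n_bins), chunks)
    # Phase 2: merge by per-bin summation of the partial histograms.
    return [sum(bins) for bins in zip(*parts)]
```

    Because the merge is a per-bin sum, no locking of shared bins is needed during the binning phase.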

  1. Reliability Modeling Methodology for Independent Approaches on Parallel Runways Safety Analysis

    NASA Technical Reports Server (NTRS)

    Babcock, P.; Schor, A.; Rosch, G.

    1998-01-01

    This document is an adjunct to the final report, An Integrated Safety Analysis Methodology for Emerging Air Transport Technologies. That report presents the results of our analysis of the problem of simultaneous but independent approaches of two aircraft on parallel runways (independent approaches on parallel runways, or IAPR). This introductory chapter presents a brief overview and perspective of approaches and methodologies for performing safety analyses for complex systems. Ensuing chapters provide the technical details that underlie the approach we have taken in performing the safety analysis for the IAPR concept.

  2. Genetic differentiation of Mexican Holstein cattle and its relationship with Canadian and U.S. Holsteins

    PubMed Central

    García-Ruiz, Adriana; Ruiz-López, Felipe de J.; Van Tassell, Curtis P.; Montaldo, Hugo H.; Huson, Heather J.

    2015-01-01

    The Mexican Holstein (HO) industry has imported Canadian and US (CAN + USA) HO germplasm for use in two different production systems, the conventional (Conv) and the low income (Lowi) system. The objective of this work was to study the genetic composition and differentiation of the Mexican HO cattle, considering the production system in which they perform and their relationship with the Canadian and US HO populations. The analysis included information from 149, 303, and 173 unrelated or with unknown pedigree HO animals from the Conv, Lowi, and CAN + USA populations, respectively. Canadian and US Jersey (JE) and Brown Swiss (BS) genotypes (162 and 86, respectively) were used to determine if Mexican HOs were hybridized with either of these breeds. After quality control filtering, a total of 6,617 out of 6,836 single nucleotide polymorphism markers were used. To describe the genetic diversity across the populations, principal component (PC), admixture composition, and linkage disequilibrium (LD; r2) analyses were performed. Through the PC analysis, HO × JE and HO × BS crossbreeding was detected in the Lowi system. The Conv system appeared to be in between Lowi and CAN + USA populations. Admixture analysis differentiated between the genetic composition of the Conv and Lowi systems, and five ancestry groups associated to sire’s country of origin were identified. The minimum distance between markers to estimate a useful LD was found to be 54.5 kb for the Mexican HO populations. At this average distance, the persistence of phase across autosomes of Conv and Lowi systems was 0.94, for Conv and CAN + USA was 0.92 and for the Lowi and CAN + USA was 0.91. Results supported the flow of germplasm among populations being Conv a source for Lowi, and dependent on migration from CAN + USA. Mexican HO cattle in Conv and Lowi populations share common ancestry with CAN + USA but have different genetic signatures. PMID:25709615

  3. Genetic differentiation of Mexican Holstein cattle and its relationship with Canadian and U.S. Holsteins.

    PubMed

    García-Ruiz, Adriana; Ruiz-López, Felipe de J; Van Tassell, Curtis P; Montaldo, Hugo H; Huson, Heather J

    2015-01-01

    The Mexican Holstein (HO) industry has imported Canadian and US (CAN + USA) HO germplasm for use in two different production systems, the conventional (Conv) and the low income (Lowi) system. The objective of this work was to study the genetic composition and differentiation of the Mexican HO cattle, considering the production system in which they perform and their relationship with the Canadian and US HO populations. The analysis included information from 149, 303, and 173 unrelated or with unknown pedigree HO animals from the Conv, Lowi, and CAN + USA populations, respectively. Canadian and US Jersey (JE) and Brown Swiss (BS) genotypes (162 and 86, respectively) were used to determine if Mexican HOs were hybridized with either of these breeds. After quality control filtering, a total of 6,617 out of 6,836 single nucleotide polymorphism markers were used. To describe the genetic diversity across the populations, principal component (PC), admixture composition, and linkage disequilibrium (LD; r2) analyses were performed. Through the PC analysis, HO × JE and HO × BS crossbreeding was detected in the Lowi system. The Conv system appeared to be in between Lowi and CAN + USA populations. Admixture analysis differentiated between the genetic composition of the Conv and Lowi systems, and five ancestry groups associated to sire's country of origin were identified. The minimum distance between markers to estimate a useful LD was found to be 54.5 kb for the Mexican HO populations. At this average distance, the persistence of phase across autosomes of Conv and Lowi systems was 0.94, for Conv and CAN + USA was 0.92 and for the Lowi and CAN + USA was 0.91. Results supported the flow of germplasm among populations being Conv a source for Lowi, and dependent on migration from CAN + USA. Mexican HO cattle in Conv and Lowi populations share common ancestry with CAN + USA but have different genetic signatures.

  4. Interactive Parallel Data Analysis within Data-Centric Cluster Facilities using the IPython Notebook

    NASA Astrophysics Data System (ADS)

    Pascoe, S.; Lansdowne, J.; Iwi, A.; Stephens, A.; Kershaw, P.

    2012-12-01

    The data deluge is making traditional analysis workflows for many researchers obsolete. Support for parallelism within popular tools such as MATLAB, IDL, and NCO is not well developed and rarely used. However, parallelism is necessary for processing modern data volumes on a timescale conducive to curiosity-driven analysis. Furthermore, for peta-scale datasets such as the CMIP5 archive, it is no longer practical to bring an entire dataset to a researcher's workstation for analysis, or even to their institutional cluster. Therefore, there is an increasing need to develop new analysis platforms which both enable processing at the point of data storage and provide parallelism. Such an environment should, where possible, maintain the convenience and familiarity of our current analysis environments to encourage curiosity-driven research. We describe how we are combining the interactive Python shell (IPython) with our JASMIN data-cluster infrastructure. IPython has been specifically designed to bridge the gap between HPC-style parallel workflows and the opportunistic, curiosity-driven analysis usually carried out using domain-specific languages and scriptable tools. IPython offers a web-based interactive environment, the IPython notebook, and a cluster engine for parallelism, all underpinned by the well-respected Python/SciPy scientific programming stack. JASMIN is designed to support the data analysis requirements of the UK and European climate and earth system modeling community. JASMIN, with its sister facility CEMS serving the earth observation community, has 4.5 PB of fast parallel disk storage alongside over 370 computing cores that provide local computation. Through the IPython interface to JASMIN, users can make efficient use of JASMIN's multi-core virtual machines to perform interactive analysis on all cores simultaneously, or can configure IPython clusters across multiple VMs. Larger-scale clusters can be provisioned through JASMIN's batch scheduling system. Outputs can be summarised and visualised using the full power of Python's many scientific tools, including SciPy, Matplotlib, Pandas and CDAT. This rich user experience is delivered through the user's web browser, maintaining the interactive feel of a workstation-based environment with the parallel power of a remote data-centric processing facility.
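
    The scatter-compute-gather pattern that an IPython cluster provides can be sketched with the standard library alone. This is a stand-in for ipyparallel's engines; the per-chunk mean task is an assumed example, not a detail of the JASMIN deployment:

```python
from concurrent.futures import ThreadPoolExecutor

def chunk_stats(chunk):
    # Computed "at the point of data storage": each engine returns only
    # small partial sums, never the raw chunk.
    return (len(chunk), sum(chunk))

def parallel_mean(chunks, n_workers=4):
    # Scatter the chunks to workers, gather the partial sums, combine.
    with ThreadPoolExecutor(n_workers) as ex:
        parts = list(ex.map(chunk_stats, chunks))
    total_n = sum(n for n, _ in parts)
    total_sum = sum(s for _, s in parts)
    return total_sum / total_n
```

    Only the combine step sees data from every chunk, which is why the pattern scales to datasets too large to move to a workstation.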

  5. EDITORIAL: Fluctuations and noise in photonics and quantum optics: a special issue in memory of Hermann Haus

    NASA Astrophysics Data System (ADS)

    Abbott, Derek; Shapiro, Jeffrey H.; Yamamoto, Yoshihisa

    2004-08-01

    This Special Issue of Journal of Optics B: Quantum and Semiclassical Optics brings together the contributions of various researchers working on theoretical and experimental aspects of fluctuational phenomena in photonics and quantum optics. The topics discussed in this issue extend from fundamental physics to applications of noise and fluctuational methods from quantum to classical systems, and include: quantum measurement; quantum squeezing; solitons and fibres; gravitational wave interferometers; fluorescence phenomena; cavity QED; photon statistics; noise in lasers and laser systems; quantum computing and information; quantum lithography; and teleportation. This Special Issue is published in connection with the SPIE International Symposium on Fluctuations and Noise, held in Santa Fe, New Mexico, on 1-4 June 2003. The symposium contained six parallel conferences, and the papers in this Special Issue are connected to the conference entitled `Fluctuations and Noise in Photonics and Quantum Optics'. This was the first in a series of symposia organized with the support of the SPIE that have greatly contributed to progress in this area. The co-founders of the symposium series were Laszlo B Kish (Texas A&M University) and Derek Abbott (The University of Adelaide). The Chairs of the `Fluctuations and Noise in Photonics and Quantum Optics' conference were Derek Abbott, Jeffrey H Shapiro and Yoshihisa Yamamoto. The practical aspects of the organization were ably handled by Kristi Kelso and Marilyn Gorsuch of the SPIE, USA. Sadly, less than two weeks before the conference, Hermann A Haus passed away. Hermann Haus was a founding father of the field of noise in optics and quantum optics. He submitted three papers to the conference and was very excited to attend; as can be seen in the collection of papers, he was certainly present in spirit.
    In honour of his creativity and pioneering work in this field, we have dedicated this Special Issue to him. The first item is an obituary reflecting on his life and work. The first technical paper in this issue represents Hermann's last sole-author publication; a special thanks is due to A P Flitney for organizing this manuscript into publishable form. We thank the members of the International Programme Committee, listed below, and all those who contributed to making the event such a success. At this point we take the opportunity to express our gratitude to both the authors and reviewers, for their unfailing efforts in preparing and ensuring the high quality of the papers in this Special Issue. International Programme Committee: David A Cardimona, Air Force Research Laboratory, USA; Howard Carmichael, University of Auckland, New Zealand; Carlton M Caves, University of New Mexico, Albuquerque, USA; Peter D Drummond, University of Queensland, St Lucia, Australia; Paul J Edwards, University of Canberra, Australia; Luca Gammaitoni, Università degli Studi di Perugia, Italy; Brage Golding, Michigan State University, East Lansing, USA; Gabriela Gonzalez, Louisiana State University, Baton Rouge, USA; Guangcan Guo, University of Science and Technology of China, Hefei, China; Salman Habib, Los Alamos National Laboratory, NM, USA; Murray Hamilton, University of Adelaide, Australia; Bei-Lok Hu, University of Maryland/College Park, USA; Daniel K Johnstone, Virginia Commonwealth University, Richmond, USA; Franz X Kärtner, Massachusetts Institute of Technology, Cambridge, USA; Prem Kumar, Northwestern University, Evanston, IL, USA; Zachary Lemnios, DARPA, Arlington, VA, USA; Gerd Leuchs, Friedrich-Alexander Universität Erlangen-Nürnberg, Germany; Hideo Mabuchi, California Institute of Technology, Pasadena, USA; Peter W Milonni, Los Alamos National Laboratory, NM, USA; Adrian C Ottewill, University College Dublin, Ireland; Martin B Plenio, Imperial College, London, UK; Rajeev J Ram, Massachusetts Institute of Technology, Cambridge, USA; Farhan Rana, Massachusetts Institute of Technology, Cambridge, USA; Peter R Smith, Loughborough University of Technology, UK; Rodney S Tucker, University of Melbourne, Australia; Howard M Wiseman, Griffith University, Brisbane, Australia; Stuart A Wolf, DARPA, Arlington, VA, USA; Anton Zeilinger, University of Vienna, Austria; Xi-Cheng Zhang, Rensselaer Polytechnic Institute, Troy, NY, USA.

  6. Zika Virus MB16-23 in Mosquitoes, Miami-Dade County, Florida, USA, 2016.

    PubMed

    Mutebi, John-Paul; Hughes, Holly R; Burkhalter, Kristen L; Kothera, Linda; Vasquez, Chalmers; Kenney, Joan L

    2018-04-17

    We isolated a strain of Zika virus, MB16-23, from Aedes aegypti mosquitoes collected in Miami Beach, Florida, USA, on September 2, 2016. Phylogenetic analysis suggests that MB16-23 most likely originated from the Caribbean region.

  7. Assessing multiple dimensions of urgency sensation: The University of South Australia Urinary Sensation Assessment (USA2).

    PubMed

    Das, Rebekah; Buckley, Jonathan; Williams, Marie

    2017-03-01

    To develop and assess structure, test-retest reliability, and discriminative validity of a self-report questionnaire (University of South Australia Urinary Sensation Assessment: USA2) to assess multiple dimensions of urgency sensation. The USA2 was designed and tested over two prospective, observational studies (2013-2014). Participants were English speaking Australians aged 50 or more with and without overactive bladder (OAB; determined by OAB awareness tool), recruited via health and recreation centers. In Study 1, exploratory factor analysis determined USA2 structure and subscales. In Study 2, confirmatory factor analysis reassessed structure; Mann-Whitney U-tests determined discriminative validity (OAB vs. non-OAB for subscale and total scores) with Cohen's d effect sizes. Thirty-three individuals completed the USA2 twice; intraclass correlation coefficients (ICCs) and Wilcoxon signed rank tests assessed test-retest reliability. Questionnaires were returned by 189 eligible participants in Study 1 and 211 in Study 2. Exploratory factor analysis revealed three subscales: "urgency," "affective," "fullness." Confirmatory factor analysis supported these subscales. Subscale and total scores were significantly different between groups with and without OAB (P < 0.001). Cohen's d effect sizes (95%CI) were total score 1.8 (0.5-3.1), "urgency" subscale 1.8 (1.3-2.3), "affective" 1.7 (0.95-2.4), and "fullness" 0.75 (0.42-1.09). Total and subscales scores demonstrated test-retest reliability; ICCs (95%CIs) of 0.95 (0.9-0.98), 0.96 (0.92-0.98), 0.94 (0.88-0.97), and 0.78 (0.56-0.89). The USA2 assesses multiple dimensions of urgency sensation, is reliable over a 2-week period, and discriminates between older adults with and without OAB. Further validation is required in conditions other than overactive bladder. Neurourol. Urodynam. 36:667-672, 2017. © 2016 Wiley Periodicals, Inc.

  8. Association of marital status and colorectal cancer screening participation in the USA.

    PubMed

    El-Haddad, B; Dong, F; Kallail, K J; Hines, R B; Ablah, E

    2015-05-01

    In the USA, for both men and women, colorectal cancer (CRC) ranks third in incidence and second in mortality. Despite evidence that it decreases mortality, CRC screening in the USA remains under-utilized. Some European studies have suggested that marital status affects participation in CRC screening, but the effect of marital status on CRC screening participation in the USA is unknown. In this study, the aim was to compare CRC screening participation rates among married adults, members of unmarried couples, and separated, widowed, never married, and divorced adults living in the USA. This was a retrospective data analysis of the 2010 Behavioural Risk Factor Surveillance System survey. The population studied included 239,300 participants, aged 50-75 years, who completed the 2010 survey. Logistic regression analysis was conducted to assess the association between adherence with CRC screening guidelines and marital status while accounting for survey stratum/weight and covariates. Individuals who were divorced or separated, never married, or widowed had decreased odds of adherence with CRC screening guidelines compared with individuals who were married or members of unmarried couples. In this study, individuals living in the USA who were married or members of unmarried couples had increased odds of undergoing CRC screening compared to individuals in other marital status groups. Public health interventions are needed to promote CRC screening participation in these other groups. Colorectal Disease © 2015 The Association of Coloproctology of Great Britain and Ireland.

  9. Comparing and decomposing differences in preventive and hospital care: USA versus Taiwan.

    PubMed

    Hsiou, Tiffany R; Pylypchuk, Yuriy

    2012-07-01

    As the USA expands health insurance coverage, comparing utilization of healthcare services with countries like Taiwan that already have universal coverage can highlight problematic areas of each system. The universal coverage plan of Taiwan is the newest among developed countries, and it is known for readily providing access to care at low costs. However, Taiwan experiences problems on the supply side, such as inadequate compensation for providers, especially in the area of preventive care. We compare the use of preventive, hospital, and emergency care between the USA and Taiwan. The rate of preventive care use is much higher in the USA than in Taiwan, whereas the use of hospital and emergency care is about the same. Results of our decomposition analysis suggest that higher levels of education and income, along with inferior health status in the USA, are significant factors, each explaining between 7% and 15% of the gap in preventive care use. Our analysis suggests that, in addition to universal coverage, proper remuneration schemes, education levels, and cultural attitudes towards health care are important factors that influence the use of preventive care. Copyright © 2011 John Wiley & Sons, Ltd.

  10. The effects of AST-120 on chronic kidney disease progression in the United States of America: a post hoc subgroup analysis of randomized controlled trials.

    PubMed

    Schulman, Gerald; Berl, Tomas; Beck, Gerald J; Remuzzi, Giuseppe; Ritz, Eberhard; Shimizu, Miho; Shobu, Yuko; Kikuchi, Mami

    2016-09-30

    The orally administered spherical carbon adsorbent AST-120 is used on-label in Asian countries to slow renal disease progression in patients with progressive chronic kidney disease (CKD). Recently, two multinational, randomized, double-blind, placebo-controlled, phase 3 trials (Evaluating Prevention of Progression in Chronic Kidney Disease [EPPIC] trials) examined AST-120's efficacy in slowing CKD progression. This study assessed the efficacy of AST-120 in the subgroup of patients from the United States of America (USA) in the EPPIC trials. In the EPPIC trials, 2035 patients with moderate to severe CKD were studied, of which 583 were from the USA. The patients were randomly assigned to two groups of equal size that were treated with AST-120 or placebo (9 g/day). The primary end point was a composite of dialysis initiation, kidney transplantation, or serum creatinine doubling. The Kaplan-Meier curve for the time to achieve the primary end point in the placebo-treated patients from the USA was similar to that projected before the study. The per protocol subgroup analysis of the population from the USA which included patients with compliance rates of ≥67 % revealed a significant difference between the treatment groups in the time to achieve the primary end point (Hazard Ratio, 0.74; 95 % Confidence Interval, 0.56-0.97). This post hoc subgroup analysis of EPPIC study data suggests that treatment with AST-120 might delay the time to primary end point in CKD patients from the USA. A further randomized controlled trial in progressive CKD patients in the USA is necessary to confirm the beneficial effect of adding AST-120 to standard therapy regimens. ClinicalTrials.gov NCT00500682 ; NCT00501046 .

  11. BERGMANN USA SOIL SEDIMENT WASHING TECHNOLOGY - APPLICATIONS ANALYSIS REPORT

    EPA Science Inventory

    This document provides an evaluation of the performance of the Bergmann USA Soil/Sediment Washing System and its applicability for the treatment of soils or sediments contaminated with organic and/or inorganic compounds. Both the technical and economic aspects of the technology w...

  12. 78 FR 19158 - Safety Zone; USA Triathlon, Milwaukee Harbor, Milwaukee, WI

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-29

    ...-AA00 Safety Zone; USA Triathlon, Milwaukee Harbor, Milwaukee, WI AGENCY: Coast Guard, DHS. ACTION... standards. Therefore, we did not consider the use of voluntary consensus standards. 14. Environment We have... on the human environment. A preliminary environmental analysis checklist supporting this...

  13. User's Guide for ENSAERO_FE Parallel Finite Element Solver

    NASA Technical Reports Server (NTRS)

    Eldred, Lloyd B.; Guruswamy, Guru P.

    1999-01-01

    A high fidelity parallel static structural analysis capability is created and interfaced to the multidisciplinary analysis package ENSAERO-MPI of Ames Research Center. This new module replaces ENSAERO's lower fidelity simple finite element and modal modules. Full aircraft structures may be more accurately modeled using the new finite element capability. Parallel computation is performed by breaking the full structure into multiple substructures. This approach is conceptually similar to ENSAERO's multizonal fluid analysis capability. The new substructure code is used to solve the structural finite element equations for each substructure in parallel. NASTRAN/COSMIC is utilized as a front end for this code. Its full library of elements can be used to create an accurate and realistic aircraft model. It is used to create the stiffness matrices for each substructure. The new parallel code then uses an iterative preconditioned conjugate gradient method to solve the global structural equations for the substructure boundary nodes.
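
    The boundary-node solve described above rests on the preconditioned conjugate gradient method. A minimal dense-matrix sketch with a Jacobi (diagonal) preconditioner follows; it is an illustrative stand-in, not the ENSAERO_FE code, and the choice of preconditioner is an assumption:

```python
def pcg(A, b, tol=1e-10, max_iter=100):
    # Preconditioned conjugate gradient for symmetric positive-definite A,
    # using the diagonal of A as the preconditioner M.
    n = len(b)
    matvec = lambda v: [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
    x = [0.0] * n
    r = b[:]                                 # residual b - A x, with x = 0
    z = [r[i] / A[i][i] for i in range(n)]   # z = M^-1 r
    p = z[:]
    rz = sum(r[i] * z[i] for i in range(n))
    for _ in range(max_iter):
        Ap = matvec(p)
        alpha = rz / sum(p[i] * Ap[i] for i in range(n))
        x = [x[i] + alpha * p[i] for i in range(n)]
        r = [r[i] - alpha * Ap[i] for i in range(n)]
        if sum(v * v for v in r) ** 0.5 < tol:
            break
        z = [r[i] / A[i][i] for i in range(n)]
        rz_new = sum(r[i] * z[i] for i in range(n))
        p = [z[i] + (rz_new / rz) * p[i] for i in range(n)]
        rz = rz_new
    return x
```

    In the substructured setting, the matrix-vector product is the step that parallelizes: each substructure applies its own stiffness contribution and the results are summed on the shared boundary nodes.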

  14. A Bootstrap Generalization of Modified Parallel Analysis for IRT Dimensionality Assessment

    ERIC Educational Resources Information Center

    Finch, Holmes; Monahan, Patrick

    2008-01-01

    This article introduces a bootstrap generalization to the Modified Parallel Analysis (MPA) method of test dimensionality assessment using factor analysis. This methodology, based on the use of Marginal Maximum Likelihood nonlinear factor analysis, provides for the calculation of a test statistic based on a parametric bootstrap using the MPA…
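
    Parallel analysis in Horn's sense retains a factor only when its observed eigenvalue exceeds what comparable random data produce. A deliberately tiny two-variable sketch: for two standardized variables the leading eigenvalue of the correlation matrix is 1 + |r|. The simulation count and the Gaussian reference data are assumptions of this sketch, not the article's bootstrap procedure:

```python
import random

def corr(x, y):
    # Pearson correlation of two equal-length sequences.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = sum((v - mx) ** 2 for v in x) ** 0.5
    sy = sum((v - my) ** 2 for v in y) ** 0.5
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (sx * sy)

def retain_first_factor(x, y, n_sims=200, seed=0):
    # Horn's criterion, 2-variable case: compare the observed leading
    # eigenvalue (1 + |r|) with its mean over simulated random datasets.
    rng = random.Random(seed)
    n = len(x)
    observed = 1 + abs(corr(x, y))
    sims = [1 + abs(corr([rng.gauss(0, 1) for _ in range(n)],
                         [rng.gauss(0, 1) for _ in range(n)]))
            for _ in range(n_sims)]
    return observed > sum(sims) / n_sims
```

    The bootstrap generalization replaces the random reference data with resamples of the observed data; either way the simulations are independent and embarrassingly parallel.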

  15. Age at onset in bipolar I affective disorder in the USA and Europe.

    PubMed

    Bellivier, Frank; Etain, Bruno; Malafosse, Alain; Henry, Chantal; Kahn, Jean-Pierre; Elgrabli-Wajsbrot, Orly; Jamain, Stéphane; Azorin, Jean-Michel; Frank, Ellen; Scott, Jan; Grochocinski, Victoria; Kupfer, David J; Golmard, Jean-Louis; Leboyer, Marion

    2014-07-01

    To test for differences in reported age at onset (AAO) of bipolar I affective disorder in clinical samples drawn from Europe and the USA. Admixture analysis was used to identify the model best fitting the observed AAO distributions of two large samples of bipolar I patients from Europe and USA (n = 3616 and n = 2275, respectively). Theoretical AAO functions were compared between the two samples. The model best fitting the observed distribution of AAO in both samples was a mixture of three Gaussian distributions. The theoretical AAO functions of bipolar I disorder differed significantly between the European and USA populations, with further analyses indicating that (i) the proportion of patients belonging to the early-onset subgroup was higher in the USA sample (63 vs. 25%) and (ii) mean age at onset (±SD) in the early-onset subgroup was lower for the USA sample (14.5 ± 4.9 vs. 19 ± 2.7 years). The models best describing the reported AAO distributions of European and USA bipolar I patients were remarkably stable. The intermediate- and late-onset subgroups had similar characteristics in the two samples. However, the theoretical AAO function differed significantly between the USA and European samples due to the higher proportion of patients in the early-onset subgroup and the lower mean age-at-onset in the USA sample.

  16. Churchill: an ultra-fast, deterministic, highly scalable and balanced parallelization strategy for the discovery of human genetic variation in clinical and population-scale genomics.

    PubMed

    Kelly, Benjamin J; Fitch, James R; Hu, Yangqiu; Corsmeier, Donald J; Zhong, Huachun; Wetzel, Amy N; Nordquist, Russell D; Newsom, David L; White, Peter

    2015-01-20

    While advances in genome sequencing technology make population-scale genomics a possibility, current approaches for analysis of these data rely upon parallelization strategies that have limited scalability, complex implementation and lack reproducibility. Churchill, a balanced regional parallelization strategy, overcomes these challenges, fully automating the multiple steps required to go from raw sequencing reads to variant discovery. Through implementation of novel deterministic parallelization techniques, Churchill allows computationally efficient analysis of a high-depth whole genome sample in less than two hours. The method is highly scalable, enabling full analysis of the 1000 Genomes raw sequence dataset in a week using cloud resources. http://churchill.nchri.org/.
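
    The balanced regional parallelization idea — split the genome into fixed-size regions, process them independently, and collect results in a fixed order for determinism — can be sketched as follows. This is a toy illustration, not Churchill's implementation; call_variants is a hypothetical stand-in, and threads are used only to keep the sketch portable.

```python
from concurrent.futures import ThreadPoolExecutor

def make_regions(chrom_lengths, chunk_size):
    """Split each chromosome into fixed-size regions for independent processing."""
    regions = []
    for chrom, length in chrom_lengths.items():
        for start in range(0, length, chunk_size):
            regions.append((chrom, start, min(start + chunk_size, length)))
    return regions

def call_variants(region):
    """Stand-in for per-region variant calling on (chrom, start, end)."""
    chrom, start, end = region
    return (chrom, start, end, end - start)

def run_parallel(chrom_lengths, chunk_size, workers=4):
    """Process every region concurrently; pool.map preserves region order,
    so repeated runs give identical output regardless of worker scheduling."""
    regions = make_regions(chrom_lengths, chunk_size)
    # A real pipeline would use processes or cluster jobs, since variant
    # calling is CPU-bound; threads keep this sketch self-contained.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(call_variants, regions))

results = run_parallel({"chr1": 250, "chr2": 100}, chunk_size=100)
```

    Collecting results in region order, rather than completion order, is one simple way to obtain the determinism the abstract emphasizes.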

  17. Progressive failure of sheeted rock slopes: the 2009–2010 Rhombus Wall rock falls in Yosemite Valley, California, USA

    USGS Publications Warehouse

    Stock, Greg M.; Martel, Stephen J.; Collins, Brian D.; Harp, Edwin L.

    2012-01-01

    Progressive rock-fall failures in natural rock slopes are common in many environments, but often elude detailed quantitative documentation and analysis. Here we present high-resolution photography, video, and laser scanning data that document spatial and temporal patterns of a 15-month-long sequence of at least 14 rock falls from the Rhombus Wall, a sheeted granitic cliff in Yosemite Valley, California. The rock-fall sequence began on 26 August 2009 with a small failure at the tip of an overhanging rock slab. Several hours later, a series of five rock falls totaling 736 m3 progressed upward along a sheeting joint behind the overhanging slab. Over the next 3 weeks, audible cracking occurred on the Rhombus Wall, suggesting crack propagation, while visual monitoring revealed opening of a sheeting joint adjacent to the previous failure surface. On 14 September 2009 a 110 m3 slab detached along this sheeting joint. Additional rock falls between 30 August and 20 November 2010, totaling 187 m3, radiated outward from the initial failure area along cliff (sub)parallel sheeting joints. We suggest that these progressive failures might have been related to stress redistributions accompanying propagation of sheeting joints behind the cliff face. Mechanical analyses indicate that tensile stresses should occur perpendicular to the cliff face and open sheeting joints, and that sheeting joints should propagate parallel to a cliff face from areas of stress concentrations. The analyses also account for how sheeting joints can propagate to lengths many times greater than their depths behind cliff faces. We posit that as a region of failure spreads across a cliff face, stress concentrations along its margin will spread with it, promoting further crack propagation and rock falls.

  18. Beat-the-wave evacuation mapping for tsunami hazards in Seaside, Oregon, USA

    USGS Publications Warehouse

    Priest, George R.; Stimely, Laura; Wood, Nathan J.; Madin, Ian; Watzig, Rudie

    2016-01-01

    Previous pedestrian evacuation modeling for tsunamis has not considered variable wave arrival times or critical junctures (e.g., bridges), nor has it effectively communicated multiple evacuee travel speeds. We summarize an approach that identifies evacuation corridors, recognizes variable wave arrival times, and produces a map of minimum pedestrian travel speeds to reach safety, termed a “beat-the-wave” (BTW) evacuation analysis. We demonstrate the improved approach by evaluating difficulty of pedestrian evacuation of Seaside, Oregon, for a local tsunami generated by a Cascadia subduction zone earthquake. We establish evacuation paths by calculating the least cost distance (LCD) to safety for every grid cell in a tsunami-hazard zone using geospatial, anisotropic path distance algorithms. The minimum BTW speed to safety on LCD paths is calculated for every grid cell by dividing the surface distance from that cell to safety by the tsunami arrival time at safety. We evaluated three scenarios of evacuation difficulty: (1) all bridges are intact with a 5-minute evacuation delay from the start of the earthquake, (2) only retrofitted bridges are considered intact with a 5-minute delay, and (3) only retrofitted bridges are considered intact with a 10-minute delay. BTW maps also take into account critical evacuation points along complex shorelines (e.g., peninsulas, bridges over shore-parallel estuaries) where evacuees could be caught by tsunami waves. The BTW map is able to communicate multiple pedestrian travel speeds, which are typically visualized by multiple maps with current LCD-based mapping practices. Results demonstrate that evacuation of Seaside is problematic seaward of the shore-parallel waterways for those with any limitations on mobility. Tsunami vertical-evacuation refuges or additional pedestrian bridges may be effective ways of reducing loss of life seaward of these waterways.
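
    The core beat-the-wave quantity — the minimum sustained speed needed to outrun the wave from a given cell — is the along-path distance to safety divided by the time remaining after the evacuation delay. A minimal sketch with invented numbers:

```python
def btw_speed(distance_to_safety_m, wave_arrival_s, delay_s=300.0):
    """Minimum sustained pedestrian speed (m/s) needed to reach safety
    before the wave arrives, after an evacuation delay (default 5 min)."""
    time_available = wave_arrival_s - delay_s
    if time_available <= 0:
        return float("inf")  # the wave arrives before evacuation even starts
    return distance_to_safety_m / time_available

# Invented example: 1200 m along the least-cost path, wave arrives in 25 min.
speed = btw_speed(1200.0, 25 * 60.0)
```

    Mapping this value for every grid cell is what lets a single BTW map communicate multiple required travel speeds at once.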

  19. New Insights From Whole Rock and Mineral Data on the Magmatic and Tectonic Evolution of the Columbia River Basalt Group (USA)

    NASA Astrophysics Data System (ADS)

    Caprarelli, G.; Reidel, S. P.

    2004-12-01

    The Miocene Columbia River Basalt Group (CRBG) of north-western USA was emplaced in a geologically dynamic setting characterized by a close association between magmatism and lithospheric thinning and rifting. We present and discuss electron probe microanalysis and XRFA data obtained from samples spanning the entire sequence of the CRBG. The examined basalts have near-aphyric textures. No glass is present, and plagioclase and augitic clinopyroxene are dominant matrix and groundmass phases. Plagioclase microcrysts are labradoritic to bytownitic. Whole rock compositions were taken as proxies of the liquid compositions. Application of plagioclase / melt and clinopyroxene / melt geothermobarometers indicated that during crustal ascent the magmas were dry, and that pre-eruptive pressures and temperatures ranged from 0 to 0.66 GPa and 1393 to 1495 K, respectively. In a P-T diagram most of the samples are distributed along a general CRBG trend, while some samples plot along a parallel higher temperature trend. The calculated P-T values, the positive correlation between calculated P and T, and no horizontal alignment of the data, exclude the presence of upper crustal solidification fronts, and indicate that magma aggregation zones were located deeper than 25 km, plausibly immediately below the Moho, that in this region is at a depth of approximately 35 km. Episodic stretching of the lithosphere best explains the observed parallel P-T trends. Whole rock major element abundances resulted from fractional crystallization of the magmas during ascent. To retrieve the compositions of the primitive melts we added to the bulk rock compositions variable amounts of magnesian olivine [Mg/(Mg+Fe) = 0.88], and derived the evolution of olivine fractionating magmas in equilibrium with mantle harzburgite. Two groups of samples were found, corresponding to the parallel P-T trends obtained from mineral / melt calculations. 
The highest temperature trend corresponds to samples whose calculated primitive compositions are in agreement with those obtained from peridotite melting experiments (as published in the relevant literature). Interpretation of results for rocks belonging to the general CRBG trend suggests either (a) that olivine of higher forsteritic content should be used in the calculations, or (b) that melt / ol / opx reactions occurred. Investigation of the CRBG primitive compositions is relevant to geodynamic evolution models of this region. We are currently undertaking melt inclusion studies of suitable CRBG samples.

  20. The Rigour of IFRS Education in the USA: Analysis, Reflection and Innovativeness

    ERIC Educational Resources Information Center

    Tan, Aldys; Chatterjee, Bikram; Bolt, Susan

    2014-01-01

    International Financial Reporting Standards (IFRS) are accepted throughout the world, particularly in the European Union, Australia, New Zealand and Canada. Emerging economies are also aligning their practices with IFRS. Historically, the USA has been cautious about accepting IFRS. However, following acceptance of IFRS worldwide, the US…

  1. Experiences of Asian Psychologists and Counselors Trained in the USA: An Exploratory Study

    ERIC Educational Resources Information Center

    Goh, Michael; Yon, Kyu Jin; Shimmi, Yukiko; Hirai, Tatsuya

    2014-01-01

    This study qualitatively explored the pre-departure to reentry experiences of Asian international psychologists and counselors trained in the USA. Semi-structured interviews were conducted with 10 participants from four different Asian countries. Inductive analysis with Consensual Qualitative Research methods was used to analyze the interview…

  2. Professional Training of Economists in the USA

    ERIC Educational Resources Information Center

    Rudnitska, Kateryna

    2014-01-01

    The article deals with the peculiarities of American professional undergraduate and graduate training in economics. The analysis of documents and of scientific and educational literature demonstrates the diversity of US training courses and combinations of disciplines in economics. It has been established that the leading position of the USA in the world…

  3. Regional Differences in Stratospheric Intrusions over the USA Investigated using the NASA MERRA-2 Reanalysis

    NASA Astrophysics Data System (ADS)

    Knowland, K. E.; Ott, L.; Hodges, K.; Wargan, K.; Duncan, B. N.

    2016-12-01

    Stratospheric intrusions (SI) - the introduction of ozone-rich stratospheric air into the troposphere - have been linked with surface ozone air quality exceedances, especially at the high elevations in the western USA in springtime. However, the impact of SIs in the remaining seasons and over the rest of the USA is less clear. This study investigates the atmospheric dynamics that generate SIs over the western USA and the different mechanisms through which SIs may influence atmospheric chemistry and surface air quality over the eastern USA. An analysis of the spatiotemporal variability of SIs over the continental US is performed using NASA's Modern-Era Retrospective Analysis for Research and Applications Version-2 (MERRA-2) reanalysis dataset and other Goddard Earth Observing System Model, Version 5 (GEOS-5) model products. Both upper-level and lower-level dynamical features are examined on seasonal timescales using the tracking algorithm of Hodges (1995, 1999). We show how upper-level relative vorticity maxima - representing troughs and cut-off lows - can be tracked and related to the lower-level storm tracks. The influence of both sets of tracks on the assimilated MERRA-2 ozone and meteorological parameters throughout the troposphere and lower stratosphere is quantified. By focusing on the major modes of variability that influence the weather patterns in the USA, namely the Pacific North American (PNA) pattern, Arctic Oscillation (AO) and the North Atlantic Oscillation (NAO), predictive patterns in the meteorological fields that are associated with SIs are identified for their regional effects.

  4. Electromagnetic Contact-Force Sensing Electrophysiological Catheters: How Accurate is the Technology?

    PubMed

    Bourier, Felix; Hessling, Gabriele; Ammar-Busch, Sonia; Kottmaier, Marc; Buiatti, Alessandra; Grebmer, Christian; Telishevska, Marta; Semmler, Verena; Lennerz, Carsten; Schneider, Christine; Kolb, Christof; Deisenhofer, Isabel; Reents, Tilko

    2016-03-01

    Contact-force (CF) sensing catheters are increasingly used in clinical electrophysiological practice due to their efficacy and safety profile. As data about the accuracy of this technology are scarce, we sought to quantify accuracy based on in vitro experiments. A custom-made force sensor was constructed that allowed exact force reference measurements registered via a flexible membrane. A Smarttouch Surround Flow (ST SF) ablation catheter (Biosense Webster, Diamond Bar, CA, USA) was brought into contact with the membrane of the force sensor in order to compare the ST SF force measurements to force sensor reference measurements. ST SF force sensing technology is based on deflection registration between the distal and proximal catheter tip. The experiment was repeated for n = 10 ST SF catheters, which showed no significant difference in accuracy levels. A series of measurements (n = 1200) was carried out for different angles of force acting on the catheter tip (0°/perpendicular contact, 30°, 60°, 90°/parallel contact). The mean absolute differences between reference and ST SF measurements were 1.7 ± 1.8 g (0°), 1.6 ± 1.2 g (30°), 1.4 ± 1.3 g (60°), and 6.6 ± 5.9 g (90°). Measurement accuracy was significantly higher in non-parallel contact than in parallel contact (P < 0.01). Catheter force measurements using ST SF catheters show a high level of accuracy with regard to differences from reference measurements and reproducibility. The reduced accuracy in measurements of 90° acting forces (parallel contact) might be clinically important when creating, for example, linear lesions. © 2015 Wiley Periodicals, Inc.
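
    Accuracy summaries of the kind reported above (mean absolute difference ± SD between catheter and reference readings) can be computed as follows; the paired force values are invented for illustration:

```python
import math

def mean_abs_diff(reference, measured):
    """Mean absolute difference and its (population) standard deviation
    between paired force readings."""
    diffs = [abs(m - r) for r, m in zip(reference, measured)]
    mean = sum(diffs) / len(diffs)
    var = sum((d - mean) ** 2 for d in diffs) / len(diffs)
    return mean, math.sqrt(var)

# Invented paired force readings in grams (reference sensor vs. catheter).
ref = [5.0, 10.0, 15.0, 20.0]
cath = [6.0, 9.0, 17.0, 20.0]
mean, sd = mean_abs_diff(ref, cath)
```

    Computing this summary separately for each contact angle (0°, 30°, 60°, 90°) reproduces the structure of the reported comparison.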

  5. Geographical and genospecies distribution of Borrelia burgdorferi sensu lato DNA detected in humans in the USA.

    PubMed

    Clark, Kerry L; Leydet, Brian F; Threlkeld, Clifford

    2014-05-01

    The present study investigated the cause of illness in human patients primarily in the southern USA with suspected Lyme disease based on erythema migrans-like skin lesions and/or symptoms consistent with early localized or late disseminated Lyme borreliosis. The study also included some patients from other states throughout the USA. Several PCR assays specific for either members of the genus Borrelia or only for Lyme group Borrelia spp. (Borrelia burgdorferi sensu lato), and DNA sequence analysis, were used to identify Borrelia spp. DNA in blood and skin biopsy samples from human patients. B. burgdorferi sensu lato DNA was found in both blood and skin biopsy samples from patients residing in the southern states and elsewhere in the USA, but no evidence of DNA from other Borrelia spp. was detected. Based on phylogenetic analysis of partial flagellin (flaB) gene sequences, strains that clustered separately with B. burgdorferi sensu stricto, Borrelia americana or Borrelia andersonii were associated with Lyme disease-like signs and symptoms in patients from the southern states, as well as from some other areas of the country. Strains most similar to B. burgdorferi sensu stricto and B. americana were found most commonly and appeared to be widely distributed among patients residing throughout the USA. The study findings suggest that human cases of Lyme disease in the southern USA may be more common than previously recognized and may also be caused by more than one species of B. burgdorferi sensu lato. This study provides further evidence that B. burgdorferi sensu stricto is not the only species associated with signs and/or symptoms consistent with Lyme borreliosis in the USA.

  6. An asymptotic induced numerical method for the convection-diffusion-reaction equation

    NASA Technical Reports Server (NTRS)

    Scroggs, Jeffrey S.; Sorensen, Danny C.

    1988-01-01

    A parallel algorithm for the efficient solution of a time-dependent convection-diffusion-reaction equation with a small parameter on the diffusion term is presented. The method is based on a domain decomposition that is dictated by singular perturbation analysis. The analysis is used to determine regions where certain reduced equations may be solved in place of the full equation. Parallelism is evident at two levels. Domain decomposition provides parallelism at the highest level, and within each domain there is ample opportunity to exploit parallelism. Run time results demonstrate the viability of the method.
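
    The idea of replacing the full equation by a reduced one away from layers can be illustrated on a steady 1-D model problem -εu'' + u' = 1, u(0) = u(1) = 0: for small ε the reduced equation u' = 1 (so u ≈ x) is accurate everywhere except a thin boundary layer near x = 1, which is where the full equation must still be solved. A minimal sketch of the full solve (this is a model problem, not the paper's algorithm):

```python
def solve_full(eps, n):
    """Central-difference solve of -eps*u'' + u' = 1, u(0)=u(1)=0,
    via the Thomas algorithm for the resulting tridiagonal system."""
    h = 1.0 / n
    # Interior unknowns u_1..u_{n-1}; tridiagonal coefficients.
    a = [-eps / h**2 - 1.0 / (2 * h)] * (n - 1)  # sub-diagonal
    b = [2 * eps / h**2] * (n - 1)               # diagonal
    c = [-eps / h**2 + 1.0 / (2 * h)] * (n - 1)  # super-diagonal
    d = [1.0] * (n - 1)                          # right-hand side
    # Forward elimination.
    for i in range(1, n - 1):
        w = a[i] / b[i - 1]
        b[i] -= w * c[i - 1]
        d[i] -= w * d[i - 1]
    # Back substitution.
    u = [0.0] * (n - 1)
    u[-1] = d[-1] / b[-1]
    for i in range(n - 3, -1, -1):
        u[i] = (d[i] - c[i] * u[i + 1]) / b[i]
    return [0.0] + u + [0.0]

u = solve_full(eps=0.01, n=1000)
# Away from the layer the reduced solution u(x) = x is a good approximation:
mid = u[500]  # x = 0.5
```

    A domain-decomposition method in this spirit would assign the smooth outer region (where the cheap reduced equation suffices) and the boundary layer (where the full equation is needed) to different subdomains, solved in parallel.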

  7. Conference report: Clinical and Pharmaceutical Solutions through analysis (CPSA USA 2013): connecting patients and subject numbers through analysis.

    PubMed

    Needham, Shane; Premkumar, Noel; Weng, Naidong; Lee, Mike

    2014-02-01

    The 16th Annual Symposium on Clinical and Pharmaceutical Solutions through Analysis (CPSA), 7-10 October 2013, Sheraton Bucks County Hotel, Langhorne, PA, USA. The 2013 CPSA brought together, for the first time, the various US FDA-regulated analytical fields affecting a 'patient' - bioanalysts supporting INDs and NDAs, clinical diagnostic and pathology laboratory personnel, and clinical researchers who provide insights into new biomarkers. Although the regulatory requirements are different for each of the above disciplines, the unique analytical perspectives that affect the patient were shared, and the goal of the 2013 CPSA - 'Connecting Patients and Subject Numbers Through Analysis' - was achieved.

  8. Prevalence and genetic heterogeneity of porcine group C rotaviruses in nursing and weaned piglets in Ohio, USA and identification of a potential new VP4 genotype.

    PubMed

    Amimo, J O; Vlasova, A N; Saif, L J

    2013-05-31

    Swine fecal samples collected from seven farms were screened for group C rotaviruses (RVCs) using a reverse transcription-polymerase chain reaction assay. A total of 380 samples were tested and 19.5% were positive. Of the 128 samples collected in 2012, 23.5% from nursing piglets and 8.5% from weaned piglets were RVC positive, with a higher RVC frequency in diarrheic (28.4%) than in non-diarrheic (6.6%) piglets. Two strains (RVC/Pig-wt/USA/RV0104/2011/G3PX and RVC/Pig-wt/USA/RV0143/2012/G6Px) from two different farms were characterized genetically to gain information on virus diversity based on full-length sequences of the inner capsid VP6, enterotoxin NSP4 and the outer capsid VP7 and VP4 (partial for RV0104) genes. The VP6 gene of the two strains showed high (99%) nucleotide identity to one another, 84-91% identity to other porcine RVC strains and 81-82% identity to human and bovine RVC strains. The NSP4 gene analysis revealed that the RVC/Pig-wt/USA/RV0104/2011/G3PX and RVC/Pig-wt/USA/RV0143/2012/G6Px strains were not closely related to each other (87% identity), but shared higher identity with the prototype RVC/Pig-wt/USA/Cowden/1980/G1Px strain (93% and 89%, respectively) and were more distantly related to human strains (72-76% identity). The VP7 gene analysis indicated that the two strains were distantly related to one another (72% identity). RVC/Pig-wt/USA/RV0143/2012/G6Px was most closely related to porcine RVC G6 strains (82-86% identity), whereas RVC/Pig-wt/USA/RV0104/2011/G3PX was most closely related to the porcine HF (G3) strain (94% identity). Analysis of the full-length nucleotide sequence of the VP4 gene revealed that RVC/Pig-wt/USA/RV0143/2012/G6Px was distantly related to porcine (75%), bovine (74%) and human (70%) strains. The deduced amino acid identities (69.5-75.6%) of VP4 between RVC/Pig-wt/USA/RV0143/2012/G6Px and other RVCs were low; hence, we propose that this strain comprises a new VP4 genotype. 
Our results indicate high genetic heterogeneity in RVC genes and the concurrent circulation of different genotypes. Our findings are useful for the development of more accurate diagnostic tools, for basic research to understand gene function, and for providing information on RVC diversity germane to vaccine development. Copyright © 2013 Elsevier B.V. All rights reserved.
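
    The percent-identity comparisons above reduce, in their simplest form, to counting matching positions between aligned sequences. A toy sketch (invented fragments; real comparisons use full-length gene alignments produced by an alignment tool):

```python
def percent_identity(seq_a, seq_b):
    """Percent of matching positions between two pre-aligned, equal-length sequences."""
    if len(seq_a) != len(seq_b):
        raise ValueError("sequences must be aligned to equal length")
    matches = sum(a == b for a, b in zip(seq_a, seq_b))
    return 100.0 * matches / len(seq_a)

# Toy aligned nucleotide fragments differing at one position out of ten.
ident = percent_identity("ATGGCGTACC", "ATGGCTTACC")
```

    Genotype proposals such as the new VP4 genotype above are made by comparing identities of this kind against established cutoff values.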

  9. 76 FR 2853 - Approval and Promulgation of Air Quality Implementation Plans; Delaware; Infrastructure State...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-18

    ... technical analysis submitted for parallel-processing by DNREC on December 9, 2010, to address significant... technical analysis submitted by DNREC for parallel-processing on December 9, 2010, to satisfy the... consists of a technical analysis that provides detailed support for Delaware's position that it has...

  10. National Combustion Code: Parallel Implementation and Performance

    NASA Technical Reports Server (NTRS)

    Quealy, A.; Ryder, R.; Norris, A.; Liu, N.-S.

    2000-01-01

    The National Combustion Code (NCC) is being developed by an industry-government team for the design and analysis of combustion systems. CORSAIR-CCD is the current baseline reacting flow solver for NCC. This is a parallel, unstructured grid code which uses a distributed memory, message passing model for its parallel implementation. The focus of the present effort has been to improve the performance of the NCC flow solver to meet combustor designer requirements for model accuracy and analysis turnaround time. Improving the performance of this code contributes significantly to the overall reduction in time and cost of the combustor design cycle. This paper describes the parallel implementation of the NCC flow solver and summarizes its current parallel performance on an SGI Origin 2000. Earlier parallel performance results on an IBM SP-2 are also included. The performance improvements which have enabled a turnaround of less than 15 hours for a 1.3 million element fully reacting combustion simulation are described.

  11. Decoupling Principle Analysis and Development of a Parallel Three-Dimensional Force Sensor

    PubMed Central

    Zhao, Yanzhi; Jiao, Leihao; Weng, Dacheng; Zhang, Dan; Zheng, Rencheng

    2016-01-01

    In the development of multi-dimensional force sensors, dimension coupling is the ubiquitous factor restricting the improvement of measurement accuracy. To effectively reduce the influence of dimension coupling on the parallel multi-dimensional force sensor, a novel parallel three-dimensional force sensor is proposed using a mechanical decoupling principle, and the influence of friction on dimension coupling is effectively reduced by replacing sliding friction with rolling friction. In this paper, the mathematical model is established from the structure model of the parallel three-dimensional force sensor, and the modeling and analysis of mechanical decoupling are carried out. The coupling degree (ε) of the designed sensor is defined and calculated, and the calculation results show that the mechanically decoupled parallel structure of the sensor possesses good decoupling performance. A prototype of the parallel three-dimensional force sensor was developed, and FEM analysis was carried out. A load calibration and data acquisition experiment system was built, and calibration experiments were performed. According to the calibration experiments, the measurement error is less than 2.86% and the coupling error is less than 3.02%. The experimental results show that the sensor system possesses high measuring accuracy, which provides a basis for applied research on parallel multi-dimensional force sensors. PMID:27649194

  12. Parallel processing of genomics data

    NASA Astrophysics Data System (ADS)

    Agapito, Giuseppe; Guzzi, Pietro Hiram; Cannataro, Mario

    2016-10-01

    The availability of high-throughput experimental platforms for the analysis of biological samples, such as mass spectrometry, microarrays and Next Generation Sequencing, has made it possible to analyze a whole genome in a single experiment. Such platforms produce an enormous volume of data per experiment, and the analysis of this flow of data poses several challenges in terms of data storage, preprocessing, and analysis. To face these issues, efficient, possibly parallel, bioinformatics software needs to be used to preprocess and analyze the data, for instance to highlight genetic variation associated with complex diseases. In this paper we present a parallel algorithm for the preprocessing and statistical analysis of genomics data that copes with high data dimensionality and achieves good response times. The proposed system is able to find statistically significant biological markers that discriminate between classes of patients who respond to drugs in different ways. Experiments performed on real and synthetic genomic datasets show good speed-up and scalability.
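
    The pattern the paper describes — fan per-marker preprocessing and statistics out over parallel workers — can be sketched as follows. The marker data and score function are toy stand-ins, and threads are used only to keep the sketch self-contained; a real genomics workload would use processes or a cluster.

```python
from concurrent.futures import ThreadPoolExecutor

def marker_score(args):
    """Toy per-marker statistic: absolute difference of group means between
    responder and non-responder measurements for one genetic marker."""
    responders, non_responders = args
    mean_r = sum(responders) / len(responders)
    mean_n = sum(non_responders) / len(non_responders)
    return abs(mean_r - mean_n)

def analyze(markers, workers=4):
    """Score every marker concurrently; pool.map keeps the output aligned
    with the input marker order."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(marker_score, markers))

# Two toy markers: (responder values, non-responder values).
markers = [([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]),
           ([5.0, 6.0, 7.0], [1.0, 2.0, 3.0])]
scores = analyze(markers)
```

    In a full pipeline the toy score would be replaced by a proper test statistic with multiple-testing correction, but the fan-out structure is the same.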

  13. Wikipedia mining of hidden links between political leaders

    NASA Astrophysics Data System (ADS)

    Frahm, Klaus M.; Jaffrès-Runser, Katia; Shepelyansky, Dima L.

    2016-12-01

    We describe a new method, the reduced Google matrix, which establishes direct and hidden links between a subset of nodes of a large directed network. This approach draws parallels with quantum scattering theory, developed for processes in nuclear and mesoscopic physics and quantum chaos. The method is applied to Wikipedia networks in different language editions, analyzing several groups of political leaders of the USA, UK, Germany, France, Russia and the G20. We demonstrate that this approach reliably recovers direct and hidden links among political leaders. We argue that the reduced Google matrix method can form the mathematical basis for studies in the social and political sciences analyzing Leader-Member eXchange (LMX).
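
    The reduced Google matrix is the stochastic complement G_R = G_rr + G_rs (I − G_ss)⁻¹ G_sr, where r is the subset of interest and s the rest of the network; the second term captures the "hidden" links routed through the discarded nodes. A toy sketch for a 4-node network with a 2-node subset (the link structure and damping factor are invented, and a row-stochastic convention is used here for simplicity):

```python
def reduced_google_matrix(G, subset):
    """Stochastic complement G_rr + G_rs (I - G_ss)^-1 G_sr for a row-stochastic
    4x4 G and a 2-node subset (2x2 inverse done in closed form)."""
    rest = [i for i in range(len(G)) if i not in subset]
    def block(rows, cols):
        return [[G[i][j] for j in cols] for i in rows]
    Grr, Grs = block(subset, subset), block(subset, rest)
    Gsr, Gss = block(rest, subset), block(rest, rest)
    # Closed-form inverse of the 2x2 matrix I - Gss.
    a, b = 1 - Gss[0][0], -Gss[0][1]
    c, d = -Gss[1][0], 1 - Gss[1][1]
    det = a * d - b * c
    inv = [[d / det, -b / det], [-c / det, a / det]]
    # G_R = Grr + Grs @ inv @ Gsr, written out for 2x2 blocks.
    return [[Grr[i][j]
             + sum(Grs[i][k] * sum(inv[k][m] * Gsr[m][j] for m in range(2))
                   for k in range(2))
             for j in range(2)] for i in range(2)]

# Toy row-stochastic Google matrix with damping alpha = 0.85 on a 4-node ring.
alpha, N = 0.85, 4
S = [[0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1], [1, 0, 0, 0]]  # ring links
G = [[alpha * S[i][j] + (1 - alpha) / N for j in range(N)] for i in range(N)]
GR = reduced_google_matrix(G, subset=[0, 1])
```

    The stochastic complement of a row-stochastic matrix is itself row-stochastic, so G_R can be read as an effective transition matrix on the subset alone.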

  14. Characterization of a parallel beam CCD optical-CT apparatus for 3D radiation dosimetry

    NASA Astrophysics Data System (ADS)

    Krstajić, Nikola; Doran, Simon J.

    2006-12-01

    This paper describes the initial steps we have taken in establishing CCD-based optical-CT as a viable alternative for 3-D radiation dosimetry. First, we compare optical density (OD) measurements from a high-quality test target and a variable neutral density filter (VNDF). A modulation transfer function (MTF) of individual projections is derived for three positions of the sinusoidal test target within the scanning tank. Our CCD is then characterized in terms of its signal-to-noise ratio (SNR). Finally, a sample reconstruction of a scan of a PRESAGE™ (registered trademark of Heuris Pharma, Skillman, NJ, USA) dosimeter is given, demonstrating the capabilities of the apparatus.
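
    A CCD signal-to-noise characterization of the kind mentioned reduces, in its simplest form, to the ratio of mean signal to the standard deviation of repeated readings of the same pixel; the values below are invented, and a full characterization would separate shot, read and dark-current noise.

```python
import math

def snr(readings):
    """Simple SNR estimate: mean signal over the (population) standard
    deviation of repeated readings of the same pixel."""
    mean = sum(readings) / len(readings)
    var = sum((r - mean) ** 2 for r in readings) / len(readings)
    return mean / math.sqrt(var)

# Invented repeated CCD readings (arbitrary counts) for one pixel.
pixel_readings = [100.0, 102.0, 98.0, 100.0]
ratio = snr(pixel_readings)
```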

  15. Innovations in vaccine development: can regulatory authorities keep up?

    PubMed

    Cox, Manon M J; Onraedt, Annelies

    2012-10-01

    Vaccine Production Summit, San Francisco, CA, USA, 4-6 June 2012. IBC's 3rd Vaccine Production Summit featured 28 presentations discussing regulatory challenges in vaccine development, including the use of adjuvants, vaccine manufacturing and technology transfer, process development for vaccines and the role of quality by design, how to address vaccine stability, and how vaccine development timelines can be improved. The conference was run in parallel with the Single-Use Applications for Biopharmaceutical Manufacturing conference. Approximately 250 attendees from large pharmaceutical companies, large and small biotech companies, vendors and a more limited number from academia were allowed to access sessions of either conference, including one shared session. This article summarizes the recurring themes across various presentations.

  16. Compound maar crater and co-eruptive scoria cone in the Lunar Crater Volcanic Field (Nevada, USA)

    NASA Astrophysics Data System (ADS)

    Amin, Jamal; Valentine, Greg A.

    2017-06-01

    Bea's Crater (Lunar Crater Volcanic Field, Nevada, USA) consists of two coalesced maar craters with diameters of 440 m and 1050 m, combined with a co-eruptive scoria cone that straddles the northeast rim of the larger crater. The two craters and the cone form an alignment that parallels many local and regional structures such as normal faults, and is interpreted to represent the orientation of the feeder dike near the surface. The maar formed among a dense cluster of scoria cones; the cone-cluster topography resulted in a crater rim with variable elevation. These older cones are composed of variably welded agglomerate and scoria of differing competence that subsequently affected the shape of Bea's Crater. Tephra ring deposits associated with phreatomagmatic maar-forming eruptions are rich in basaltic lithics derived from < 250 m depth, with variable contents of deeper-seated ignimbrite lithic clasts, consistent with ejection from relatively shallow explosions, although a diatreme might extend to deeper levels beneath the maar. Interbedding of deposits on the northeastern cone and in the tephra ring records variations between magmatic volatile-driven and phreatomagmatic eruption styles in both space and time along the feeder dike.

  17. PREFACE: 11th International Conference on Nucleus-Nucleus Collisions (NN2012)

    NASA Astrophysics Data System (ADS)

    Li, Bao-An; Natowitz, Joseph B.

    2013-03-01

    The 11th International Conference on Nucleus-Nucleus Collisions (NN2012) was held from 27 May to 1 June 2012, in San Antonio, Texas, USA. It was jointly organized and hosted by The Cyclotron Institute at Texas A&M University, College Station and The Department of Physics and Astronomy at Texas A&M University-Commerce. Among the approximately 300 participants were a large number of graduate students and post-doctoral fellows. The Keynote Talk of the conference, 'The State of Affairs of Present and Future Nucleus-Nucleus Collision Science', was given by Dr Robert Tribble, University Distinguished Professor and Director of the TAMU Cyclotron Institute. During the conference a very well-received public lecture on neutrino astronomy, 'The IceCube project', was given by Dr Francis Halzen, Hilldale and Gregory Breit Distinguished Professor at the University of Wisconsin, Madison. The scientific program continued in the general spirit and intention of this conference series. As is typical of this conference, a broad range of topics including fundamental areas of nuclear dynamics, structure, and applications was addressed in 42 plenary session talks, 150 parallel session talks, and 21 posters. The high quality of the work presented emphasized the vitality and relevance of the subject matter of this conference. Following tradition, the NN2012 International Advisory Committee selected the host and site of the next conference in this series. The 12th International Conference on Nucleus-Nucleus Collisions (NN2015) will be held 21-26 June 2015 in Catania, Italy. It will be hosted by the INFN Laboratori Nazionali del Sud, Catania, and the Dipartimento di Fisica e Astronomia of the University of Catania. The NN2012 Proceedings contains the conference program and 165 articles organized into the following 10 sections: 1. Heavy and Superheavy Elements 2. QCD and Hadron Physics 3. Relativistic Heavy-Ion Collisions 4. Nuclear Structure 5. 
Nuclear Energy and Applications of Nuclear Science and Technologies 6. Nuclear Reactions and Structure of Unstable Nuclei 7. Equation of State of Neutron-Rich Nuclear Matter, Clusters in Nuclei and Nuclear Reactions 8. Fusion and Fission 9. Nuclear Astrophysics 10. New Facilities and Detectors We would like to thank Texas A&M University and Texas A&M University-Commerce for their organizational support and for providing financial support for many students and postdocs and those who had special need. This support helped assure the success of NN2012. Special thanks also go to all members of the International Advisory Committee and the Local Organizing Committee (listed below) for their great work in advising upon, preparing and executing the NN2012 scientific program as well as the social events that all together made the NN2012 an enjoyable experience for both the participants and their companions. NN2012 International Advisory Committee N Auerbach (Israel) J Aysto (Finland) C Beck (France) S Cherubini (Italy) L Ferreira (Portugal) C Gagliardi (USA) S Gales (France) C Gale (Canada) W Gelletly (Great Britain) Paulo R S Gomes (Brazil) W Greiner (Germany) W Henning (USA) D Hinde (Australia) S Hofmann (Germany) M Hussein (Brazil) B Jacak (USA) S Kailas (India) W G Lynch (USA) Z Majka (Poland) L McLerran (USA) V Metag (Germany) K Morita (Japan) B Mueller (USA) D G Mueller (France) T Motobayashi (Japan) W Nazarewicz (USA) Y Oganessian (Russia) J Nolen (USA) E K Rehm (USA) N Rowley (France) B Sherrill (USA) J Schukraft (Switzerland) W Q Shen (China) A Stefanini (Italy) H Stoecker (Germany) A Szanto de Toledo (Brazil) U van Kolck (USA) W von Oertzen (Germany) M Wiescher (USA) N Xu (USA) N V Zamfir (Romania) W L Zhan (China) H Q Zhang (China) NN2012 Local Organizing Committee Marina Barbui Carlos Bertulani Robert Burch Jr Cheri Davis Cody Folden Kris Hagel John Hardy Bao-An Li (Co-Chair and Scientific Secretary) Joseph Natowitz (Co-Chair) Ralf Rapp Livius Trache Sherry 
Yennello Editors of NN2012 Proceedings Bao-An Li (Texas A&M University-Commerce) and Joseph Natowitz (Texas A&M University) 7 January 2013, Texas, USA

  18. Quantitative Relationship Between AUEC of Absolute Neutrophil Count and Duration of Severe Neutropenia for G-CSF in Breast Cancer Patients.

    PubMed

    Li, Liang; Ma, Lian; Schrieber, Sarah J; Rahman, Nam Atiqur; Deisseroth, Albert; Farrell, Ann T; Wang, Yaning; Sinha, Vikram; Marathe, Anshu

    2018-02-02

    The aim of the study was to evaluate the quantitative relationship between duration of severe neutropenia (DSN, the efficacy endpoint) and area under the effect curve of absolute neutrophil counts (ANC-AUEC, the pharmacodynamic endpoint), based on data from filgrastim products (a human granulocyte colony-stimulating factor, G-CSF). Clinical data from filgrastim product comparator and test arms of two randomized, parallel-group, phase III studies in breast cancer patients treated with myelosuppressive chemotherapy were utilized. A zero-inflated Poisson regression model best described the negative correlation between DSN and ANC-AUEC. The models predicted that with a 10 × 10⁹ day/L increase in ANC-AUEC, the mean DSN would decrease from 1.1 days to 0.93 days in Trial 1 and from 1.2 days to 1.0 day in Trial 2. The findings of the analysis provide useful information regarding the relationship between ANC and DSN that can be used for dose selection and optimization of clinical trial design for G-CSF. Published 2018. This article is a U.S. Government work and is in the public domain in the USA.
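
    The structure of such a model can be sketched in a few lines. In a zero-inflated Poisson (ZIP) model, the expected count is E[Y] = (1 - π)·λ with a log-linear rate λ = exp(β₀ + β₁·AUEC); a negative β₁ makes mean DSN fall as ANC-AUEC rises. The coefficients below are hypothetical, chosen only to illustrate the shape of the relationship, not the fitted values from the study.

```python
import math

def zip_mean_dsn(auec, beta0=0.1, beta1=-0.017, pi=0.15):
    """Expected duration of severe neutropenia (days) under a zero-inflated
    Poisson model: E[Y] = (1 - pi) * exp(beta0 + beta1 * auec).
    beta0, beta1, and pi are illustrative placeholders."""
    lam = math.exp(beta0 + beta1 * auec)   # Poisson rate, log-linear in AUEC
    return (1.0 - pi) * lam                # zero-inflation shrinks the mean

# A 10 (x 10^9 day/L) increase in ANC-AUEC lowers the predicted mean DSN:
base = zip_mean_dsn(0.0)
shifted = zip_mean_dsn(10.0)
print(round(base, 2), round(shifted, 2))
```

    With these placeholder coefficients the predicted mean drops by roughly 0.15 days per 10 × 10⁹ day/L of AUEC, comparable in magnitude to the trial-level predictions quoted above.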

  19. Elemental, isotopic, and geochronological variability in Mogollon-Datil volcanic province archaeological obsidian, southwestern USA: Solving issues of intersource discrimination

    USGS Publications Warehouse

    Shackley, M. Steven; Morgan, Leah; Pyle, Douglas

    2017-01-01

    Solving issues of intersource discrimination in archaeological obsidian is a recurring problem in geoarchaeological investigation, particularly since the number of known sources of archaeological obsidian worldwide has grown nearly exponentially in the last few decades, and the complexity of the archaeological questions asked has grown equally. These two parallel developments have required a more exacting understanding of the geological relationships between sources and more accurate analysis of these sources of archaeological obsidian. This is particularly the case in the North American Southwest, where the intensity of archaeological investigation is among the highest in the world and the theory and method used to interpret that record have become increasingly nuanced. Here, we attempt to unravel the elemental similarity of archaeological obsidian in the Mogollon-Datil volcanic province of southwestern New Mexico, where some of the most important and most widely distributed sources are located, and where the elemental similarity between the sources is great even though the distances between them are large. Uniting elemental, isotopic, and geochronological analyses in an intensive pilot study, we unpack this complexity to provide a greater understanding of these important sources of archaeological obsidian.

  20. Synchrotron-based multiple-beam FTIR chemical imaging of a multi-layered polymer in transmission and reflection: towards cultural heritage applications

    NASA Astrophysics Data System (ADS)

    Unger, Miriam; Mattson, Eric; Schmidt Patterson, Catherine; Alavi, Zahrasadet; Carson, David; Hirschmugl, Carol J.

    2013-04-01

    IRENI (infrared environmental imaging) is a recently commissioned Fourier transform infrared (FTIR) chemical imaging beamline at the Synchrotron Radiation Center in Madison, WI, USA. This novel beamline extracts 320 mrad of radiation, horizontally, from one bending magnet. The optical transport separates and recombines the beam into 12 parallel collimated beams to illuminate a commercial FTIR microspectrometer (Bruker Hyperion 3000) equipped with a focal plane array detector, where single pixels in the detector image a projected sample area of either 0.54 × 0.54 μm² or 2 × 2 μm², depending on the measurement geometry. The 12 beams are partially overlapped and defocused, similar to wide-field microscopy, homogeneously illuminating a relatively large sample area compared to single-beam arrangements. Both transmission and reflection geometries are used to examine a model cross section from a layered polymer material. The compromises for sample preparation and measurement strategies are discussed, and the chemical composition and spatial definition of the layers are distinguished in chemical images generated from the data sets. Deconvolution methods that may allow more detailed data analysis are also discussed.

  1. Comparative analysis of USA300 virulence determinants in a rabbit model of skin and soft tissue infection.

    PubMed

    Kobayashi, Scott D; Malachowa, Natalia; Whitney, Adeline R; Braughton, Kevin R; Gardner, Donald J; Long, Dan; Bubeck Wardenburg, Juliane; Schneewind, Olaf; Otto, Michael; Deleo, Frank R

    2011-09-15

    Community-associated methicillin-resistant Staphylococcus aureus (CA-MRSA) infections are frequently associated with strains harboring genes encoding Panton-Valentine leukocidin (PVL). The role of PVL in the success of the epidemic CA-MRSA strain USA300 remains unknown. Here we developed a skin and soft tissue infection model in rabbits to test the hypothesis that PVL contributes to USA300 pathogenesis and compare it with well-established virulence determinants: alpha-hemolysin (Hla), phenol-soluble modulin-alpha peptides (PSMα), and accessory gene regulator (Agr). The data indicate that Hla, PSMα, and Agr contribute to the pathogenesis of USA300 skin infections in rabbits, whereas a role for PVL could not be detected.

  2. Content Trends in Sustainable Business Education: An Analysis of Introductory Courses in the USA

    ERIC Educational Resources Information Center

    Landrum, Nancy E.; Ohsowski, Brian

    2017-01-01

    Purpose: This study aims to identify the content in introductory business sustainability courses in the USA to determine the most frequently assigned reading material and its sustainability orientation. Design/methodology/approach: In total, 81 introductory sustainable business course syllabi reading lists were analyzed from 51 US colleges and…

  3. Accuracy of the Parallel Analysis Procedure with Polychoric Correlations

    ERIC Educational Resources Information Center

    Cho, Sun-Joo; Li, Feiming; Bandalos, Deborah

    2009-01-01

    The purpose of this study was to investigate the application of the parallel analysis (PA) method for choosing the number of factors in component analysis for situations in which data are dichotomous or ordinal. Although polychoric correlations are sometimes used as input for component analyses, the random data matrices generated for use in PA…
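
    The core of the PA procedure can be sketched compactly: retain components whose observed eigenvalues exceed the mean eigenvalues from random data of the same dimensions. The minimal sketch below uses Pearson correlations on continuous data; the study itself concerns polychoric correlations for dichotomous/ordinal data, which would replace the `np.corrcoef` step.

```python
import numpy as np

rng = np.random.default_rng(0)

def parallel_analysis(data, n_iter=100):
    """Horn's parallel analysis: count components whose observed
    correlation-matrix eigenvalues exceed the mean eigenvalues obtained
    from random normal data of the same shape."""
    n, p = data.shape
    obs_eig = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]
    rand_eig = np.zeros(p)
    for _ in range(n_iter):
        r = rng.standard_normal((n, p))
        rand_eig += np.linalg.eigvalsh(np.corrcoef(r, rowvar=False))[::-1]
    rand_eig /= n_iter                      # mean random eigenvalue per rank
    return int(np.sum(obs_eig > rand_eig))  # number of components to retain

# Two correlated 3-variable blocks -> PA should suggest 2 components.
n = 500
f1, f2 = rng.standard_normal(n), rng.standard_normal(n)
X = np.column_stack([f1 + 0.5 * rng.standard_normal(n) for _ in range(3)] +
                    [f2 + 0.5 * rng.standard_normal(n) for _ in range(3)])
print(parallel_analysis(X))
```

    Comparing against the mean (rather than, say, the 95th percentile) of the random eigenvalues is the classic variant; percentile-based criteria are a common refinement.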

  4. Diderot: a Domain-Specific Language for Portable Parallel Scientific Visualization and Image Analysis.

    PubMed

    Kindlmann, Gordon; Chiw, Charisee; Seltzer, Nicholas; Samuels, Lamont; Reppy, John

    2016-01-01

    Many algorithms for scientific visualization and image analysis are rooted in the world of continuous scalar, vector, and tensor fields, but are programmed in low-level languages and libraries that obscure their mathematical foundations. Diderot is a parallel domain-specific language that is designed to bridge this semantic gap by providing the programmer with a high-level, mathematical programming notation that allows direct expression of mathematical concepts in code. Furthermore, Diderot provides parallel performance that takes advantage of modern multicore processors and GPUs. The high-level notation allows a concise and natural expression of the algorithms and the parallelism allows efficient execution on real-world datasets.

  5. Assessing Past Fracture Connectivity in Geothermal Reservoirs Using Clumped Isotopes: Proof of Concept in the Blue Mountain Geothermal Field, Nevada USA

    NASA Astrophysics Data System (ADS)

    Huntington, K. W.; Sumner, K. K.; Camp, E. R.; Cladouhos, T. T.; Uddenberg, M.; Swyer, M.; Garrison, G. H.

    2015-12-01

    Subsurface fluid flow is strongly influenced by faults and fractures, yet the transmissivity of faults and fractures changes through time due to deformation and cement precipitation, making flow paths difficult to predict. Here we assess past fracture connectivity in an active hydrothermal system in the Basin and Range, Nevada, USA, using clumped isotope geochemistry and cold cathodoluminescence (CL) analysis of fracture-filling cements from the Blue Mountain geothermal field. Calcite cements were sampled from drill cuttings and two cores at varying distances from faults. CL microscopy of some of the cements shows banding parallel to the fracture walls as well as brecciation, indicating that the cements record variations in the composition and source of fluids that moved through the fractures as they opened episodically. CL microscopy, δ¹³C and δ¹⁸O values were used to screen homogeneous samples for clumped isotope analysis. Clumped isotope thermometry of most samples indicates paleofluid temperatures of around 150°C, with several wells peaking at above 200°C. We suggest that the consistency of these temperatures is related to upwelling of fluids in the convective hydrothermal system, and interpret the similarity of the clumped isotope temperatures to modern geothermal fluid temperatures of ~160-180°C as evidence that average reservoir temperatures have changed little since precipitation of the calcite cements. In contrast, two samples, one of which was associated with fault gouge observed in drill logs, record significantly cooler temperatures of 19 and 73°C and anomalous δ¹³C and δ¹⁸O(water) values, which point to fault-controlled pathways for downwelling meteoric fluid. Finally, we interpret correspondence of paleofluid temperatures and δ¹⁸O(water) values constrained by clumped isotope thermometry of calcite from different wells to suggest past connectivity of fractures among wells within the geothermal field. 
Results show the ability of clumped isotope geothermometry to assess fracture connectivity and geothermal reservoir characteristics in the past—with the potential to help optimize resource production and injection programs and better understand structural controls on mass and heat transfer in the subsurface.

  6. Interfacing Computer Aided Parallelization and Performance Analysis

    NASA Technical Reports Server (NTRS)

    Jost, Gabriele; Jin, Haoqiang; Labarta, Jesus; Gimenez, Judit; Biegel, Bryan A. (Technical Monitor)

    2003-01-01

    When porting sequential applications to parallel computer architectures, the program developer will typically go through several cycles of source code optimization and performance analysis. We have started a project to develop an environment where the user can jointly navigate through program structure and performance data information in order to make efficient optimization decisions. In a prototype implementation we have interfaced the CAPO computer-aided parallelization tool with the Paraver performance analysis tool. We describe both tools and their interface and give an example of how the interface helps within the program development cycle of a benchmark code.

  7. Genetic structure of Aegilops cylindrica Host in its native range and in the United States of America.

    PubMed

    Gandhi, Harish T; Vales, M Isabel; Mallory-Smith, Carol; Riera-Lizarazu, Oscar

    2009-10-01

    Chloroplast and nuclear microsatellite markers were used to study genetic diversity and genetic structure of Aegilops cylindrica Host collected in its native range and in adventive sites in the USA. Our analysis suggests that Ae. cylindrica, an allotetraploid, arose from multiple hybridizations between Ae. markgrafii (Greuter) Hammer. and Ae. tauschii Coss. presumably along the Fertile Crescent, where the geographic distributions of its diploid progenitors overlap. However, the center of genetic diversity of this species now encompasses a larger area including northern Iraq, eastern Turkey, and Transcaucasia. Although the majority of accessions of Ae. cylindrica (87%) had D-type plastomes derived from Ae. tauschii, accessions with C-type plastomes (13%), derived from Ae. markgrafii, were also observed. This corroborates a previous study suggesting the dimaternal origin of Ae. cylindrica. Model-based and genetic distance-based clustering using both chloroplast and nuclear markers indicated that Ae. tauschii ssp. tauschii contributed one of its D-type plastomes and its D genome to Ae. cylindrica. Analysis of genetic structure using nuclear markers suggested that Ae. cylindrica accessions could be grouped into three subpopulations (arbitrarily named N-K1, N-K2, and N-K3). Members of the N-K1 subpopulation were the most numerous in its native range and members of the N-K2 subpopulation were the most common in the USA. Our analysis also indicated that Ae. cylindrica accessions in the USA were derived from a few founder genotypes. The frequency of Ae. cylindrica accessions with the C-type plastome in the USA (approximately 24%) was substantially higher than in its native range of distribution (approximately 3%) and all C-type Ae. cylindrica in the USA except one belonged to subpopulation N-K2. The high frequency of the C-type plastome in the USA may reflect a favorable nucleo-cytoplasmic combination.

  8. StrAuto: automation and parallelization of STRUCTURE analysis.

    PubMed

    Chhatre, Vikram E; Emerson, Kevin J

    2017-03-24

    Population structure inference using the software STRUCTURE has become an integral part of population genetic studies covering a broad spectrum of taxa including humans. The ever-expanding size of genetic data sets poses computational challenges for this analysis. Although at least one tool currently implements parallel computing to reduce the computational load of this analysis, it does not fully automate the use of replicate STRUCTURE analysis runs required for downstream inference of optimal K. There is a pressing need for a tool that can deploy population structure analysis on high performance computing clusters. We present an updated version of the popular Python program StrAuto, to streamline population structure analysis using parallel computing. StrAuto implements a pipeline that combines STRUCTURE analysis with the Evanno ΔK analysis and visualization of results using STRUCTURE HARVESTER. Using benchmarking tests, we demonstrate that StrAuto significantly reduces the computational time needed to perform iterative STRUCTURE analysis by distributing runs over two or more processors. StrAuto is the first tool to integrate STRUCTURE analysis with post-processing using a pipeline approach in addition to implementing parallel computation, a setup ideal for deployment on computing clusters. StrAuto is distributed under the GNU GPL (General Public License) and available to download from http://strauto.popgen.org.
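
    The parallelization pattern described, distributing replicate runs across a grid of K values over a pool of workers, can be sketched generically. The `run_structure` function here is a placeholder standing in for an invocation of the STRUCTURE binary; it is not StrAuto's actual API.

```python
import itertools
from concurrent.futures import ThreadPoolExecutor

def run_structure(k, rep):
    """Placeholder for a single STRUCTURE run.  In a real pipeline this
    would shell out to the STRUCTURE binary (with the given K and a
    replicate-specific seed) and return the path to its output file."""
    return (k, rep, f"results_K{k}_rep{rep}")

ks, reps = range(1, 5), range(3)   # K = 1..4, 3 replicates each
with ThreadPoolExecutor(max_workers=4) as pool:
    # map preserves submission order, so results line up with (K, rep) pairs
    runs = list(pool.map(lambda kr: run_structure(*kr),
                         itertools.product(ks, reps)))
print(len(runs))
```

    Threads suit this workload because each worker would spend its time blocked on an external subprocess; on a cluster, the same fan-out is typically expressed as a job array instead.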

  9. Parallelizing Timed Petri Net simulations

    NASA Technical Reports Server (NTRS)

    Nicol, David M.

    1993-01-01

    The possibility of using parallel processing to accelerate the simulation of Timed Petri Nets (TPNs) was studied. It was recognized that complex system development tools often transform system descriptions into TPNs or TPN-like models, which are then simulated to obtain information about system behavior. Viewed this way, it was important that the parallelization of TPNs be as automatic as possible, to admit the possibility of embedding the parallelization in the system design tool. Later years of the grant were devoted to examining the problem of joint performance and reliability analysis, to explore whether both types of analysis could be accomplished within a single framework. In this final report, the results of our studies are summarized. We believe that the problem of automatically parallelizing TPNs for MIMD architectures has been almost completely solved for a large and important class of problems. Our initial investigations into joint performance/reliability analysis are two-fold: we showed that Monte Carlo simulation, with importance sampling, offers promise of joint analysis in the context of a single tool, and we developed methods for the parallel simulation of general Continuous Time Markov Chains, a model framework within which joint performance/reliability models can be cast. However, very much more work is needed to determine the scope and generality of these approaches. The results obtained in our two studies, future directions for this type of work, and a list of publications are included.

  10. Multirate-based fast parallel algorithms for 2-D DHT-based real-valued discrete Gabor transform.

    PubMed

    Tao, Liang; Kwan, Hon Keung

    2012-07-01

    Novel algorithms for the multirate and fast parallel implementation of the 2-D discrete Hartley transform (DHT)-based real-valued discrete Gabor transform (RDGT) and its inverse transform are presented in this paper. A 2-D multirate-based analysis convolver bank is designed for the 2-D RDGT, and a 2-D multirate-based synthesis convolver bank is designed for the 2-D inverse RDGT. The parallel channels in each of the two convolver banks have a unified structure and can apply the 2-D fast DHT algorithm to speed up their computations. The computational complexity of each parallel channel is low and is independent of the Gabor oversampling rate. All the 2-D RDGT coefficients of an image are computed in parallel during the analysis process and can be reconstructed in parallel during the synthesis process. The computational complexity and time of the proposed parallel algorithms are analyzed and compared with those of the existing fastest algorithms for 2-D discrete Gabor transforms. The results indicate that the proposed algorithms are the fastest, which make them attractive for real-time image processing.
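
    The building block behind DHT-based transforms is the discrete Hartley transform itself, which is real-valued and (up to a factor N) self-inverse, the property that lets analysis and synthesis share one structure. The paper's algorithms are 2-D and multirate; the sketch below shows only the 1-D DHT, computed via the standard FFT identity H[k] = Re(F[k]) - Im(F[k]).

```python
import numpy as np

def dht(x):
    """Discrete Hartley transform via the FFT:
    H[k] = sum_n x[n] * cas(2*pi*n*k/N), with cas(t) = cos(t) + sin(t),
    which equals Re(FFT(x)) - Im(FFT(x))."""
    F = np.fft.fft(x)
    return F.real - F.imag

x = np.random.default_rng(1).standard_normal(64)
h = dht(x)                     # forward transform (real-valued output)
x_back = dht(h) / len(x)       # the DHT is self-inverse up to 1/N
assert np.allclose(x, x_back)
```

    Because the same routine serves as both forward and inverse transform, each parallel channel of an analysis/synthesis convolver bank can reuse one fast DHT kernel.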

  11. Three-dimensional anisotropy contrast periodically rotated overlapping parallel lines with enhanced reconstruction (3DAC PROPELLER) on a 3.0T system: a new modality for routine clinical neuroimaging.

    PubMed

    Nakada, Tsutomu; Matsuzawa, Hitoshi; Fujii, Yukihiko; Takahashi, Hitoshi; Nishizawa, Masatoyo; Kwee, Ingrid L

    2006-07-01

    Clinical magnetic resonance imaging (MRI) has recently entered the "high-field" era, and systems equipped with 3.0-4.0T superconductive magnets are becoming the gold standard for diagnostic imaging. While higher signal-to-noise ratio (S/N) is a definite advantage of higher-field systems, the stronger susceptibility effect remains a significant trade-off. To take advantage of a higher-field system in performing routine clinical imaging at higher anatomical resolution, we implemented a vector contrast imaging technique, three-dimensional anisotropy contrast (3DAC), on a 3.0T system with a PROPELLER (Periodically Rotated Overlapping Parallel Lines with Enhanced Reconstruction) sequence, a method capable of effectively eliminating undesired artifacts in rapid diffusion imaging sequences. One hundred subjects (20 normal volunteers and 80 volunteers with various central nervous system diseases) participated in the study. Anisotropic diffusion-weighted PROPELLER images were obtained on a General Electric (Waukesha, WI, USA) Signa 3.0T system for each axis, with a b-value of 1100 s/mm². Subsequently, 3DAC images were constructed using in-house software written in MATLAB (MathWorks, Natick, MA, USA). The vector contrast provides exquisite anatomical detail, illustrated by clear identification of all major tracts through the entire brain. 3DAC images provide better anatomical resolution for brainstem glioma than higher-resolution T2 reversed images. Degenerative processes of disease-specific tracts were clearly identified, as illustrated in cases of multiple system atrophy and Machado-Joseph disease. Anatomical images of significantly higher resolution than the current best standard, T2 reversed images, were successfully obtained. As a technique readily applicable in the routine clinical setting, 3DAC PROPELLER on a 3.0T system will be a powerful addition to diagnostic imaging.

  12. Social isolation and loneliness in later life: A parallel convergent mixed-methods case study of older adults and their residential contexts in the Minneapolis metropolitan area, USA.

    PubMed

    Finlay, Jessica M; Kobayashi, Lindsay C

    2018-07-01

    Social isolation and loneliness are increasingly prevalent among older adults in the United States, with implications for morbidity and mortality risk. Little research to date has examined the complex person-place transactions that contribute to social well-being in later life. This study aimed to characterize personal and neighborhood contextual influences on social isolation and loneliness among older adults. Interviews were conducted with independent-dwelling men and women (n = 124; mean age 71 years) in the Minneapolis metropolitan area (USA) from June to October, 2015. A convergent mixed-methods design was applied, whereby quantitative and qualitative approaches were used in parallel to gain simultaneous insights into statistical associations and in-depth individual perspectives. Logistic regression models predicted self-reported social isolation and loneliness, adjusted for age, gender, past occupation, race/ethnicity, living alone, street type, residential location, and residential density. Qualitative thematic analyses of interview transcripts probed individual experiences with social isolation and loneliness. The quantitative results suggested that African American adults, those with a higher socioeconomic status, those who did not live alone, and those who lived closer to the city center were less likely to report feeling socially isolated or lonely. The qualitative results identified and explained variation in outcomes within each of these factors. They provided insight on those who lived alone but did not report feeling lonely, finding that solitude was sought after and enjoyed by a portion of participants. Poor physical and mental health often resulted in reporting social isolation, particularly when coupled with poor weather or low-density neighborhoods. At the same time, poor health sometimes provided opportunities for valued social engagement with caregivers, family, and friends. 
The combination of group-level risk factors and in-depth personal insights provided by this mixed-methodology may be useful to develop strategies that address social isolation and loneliness in older communities. Copyright © 2018 Elsevier Ltd. All rights reserved.

  13. Comparison of intervention effects in split-mouth and parallel-arm randomized controlled trials: a meta-epidemiological study

    PubMed Central

    2014-01-01

    Background: Split-mouth randomized controlled trials (RCTs) are popular in oral health research. Meta-analyses frequently include trials of both split-mouth and parallel-arm designs to derive combined intervention effects. However, carry-over effects may induce bias in split-mouth RCTs. We aimed to assess whether intervention effect estimates differ between split-mouth and parallel-arm RCTs investigating the same questions. Methods: We performed a meta-epidemiological study. We systematically reviewed meta-analyses including both split-mouth and parallel-arm RCTs with binary or continuous outcomes published up to February 2013. Two independent authors selected studies and extracted data. We used a two-step approach to quantify the differences between split-mouth and parallel-arm RCTs for each meta-analysis: first, we derived ratios of odds ratios (ROR) for dichotomous data and differences in standardized mean differences (∆SMD) for continuous data; second, we pooled RORs or ∆SMDs across meta-analyses by random-effects meta-analysis models. Results: We selected 18 systematic reviews, yielding 15 meta-analyses with binary outcomes (28 split-mouth and 28 parallel-arm RCTs) and 19 meta-analyses with continuous outcomes (28 split-mouth and 28 parallel-arm RCTs). Effect estimates did not differ between split-mouth and parallel-arm RCTs (mean ROR, 0.96, 95% confidence interval 0.52–1.80; mean ∆SMD, 0.08, -0.14–0.30). Conclusions: Our study did not provide sufficient evidence for a difference in intervention effect estimates derived from split-mouth and parallel-arm RCTs. Authors should consider including split-mouth RCTs in their meta-analyses with suitable and appropriate analysis. PMID:24886043
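
    The first step of this approach, a ratio of odds ratios with its variance, and the second, pooling log-RORs across meta-analyses, can be sketched as follows. The 2x2 tables are made up for illustration, and the pooling shown is simple inverse-variance (fixed-effect); the study itself used random-effects models.

```python
import math

def log_or(a, b, c, d):
    """Log odds ratio and its variance from a 2x2 table
    (a, b = events/non-events in one arm; c, d = in the other)."""
    return math.log(a * d / (b * c)), 1.0/a + 1.0/b + 1.0/c + 1.0/d

def ror(sm_table, pa_table):
    """Ratio of odds ratios: split-mouth estimate over parallel-arm
    estimate for the same question, with the variance of its log."""
    lor_sm, v_sm = log_or(*sm_table)
    lor_pa, v_pa = log_or(*pa_table)
    return math.exp(lor_sm - lor_pa), v_sm + v_pa

def pool(rors_and_vars):
    """Inverse-variance pooling of log-RORs across meta-analyses
    (fixed-effect sketch of the paper's random-effects step)."""
    w = [1.0 / v for _, v in rors_and_vars]
    lsum = sum(wi * math.log(r) for (r, _), wi in zip(rors_and_vars, w))
    return math.exp(lsum / sum(w))

# Illustrative (made-up) tables: a pooled ROR near 1 means the designs agree.
meta = [((12, 38, 18, 32), (10, 40, 15, 35)),
        (( 8, 42,  9, 41), ( 7, 43, 11, 39))]
print(round(pool([ror(sm, pa) for sm, pa in meta]), 2))
```

    A ROR of 1 indicates identical effect estimates from the two designs, which is what the pooled result above (mean ROR 0.96, CI crossing 1) failed to rule out.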

  14. A Dual Super-Element Domain Decomposition Approach for Parallel Nonlinear Finite Element Analysis

    NASA Astrophysics Data System (ADS)

    Jokhio, G. A.; Izzuddin, B. A.

    2015-05-01

    This article presents a new domain decomposition method for nonlinear finite element analysis introducing the concept of dual partition super-elements. The method extends ideas from the displacement frame method and is ideally suited for parallel nonlinear static/dynamic analysis of structural systems. In the new method, domain decomposition is realized by replacing one or more subdomains in a "parent system," each with a placeholder super-element, where the subdomains are processed separately as "child partitions," each wrapped by a dual super-element along the partition boundary. The analysis of the overall system, including the satisfaction of equilibrium and compatibility at all partition boundaries, is realized through direct communication between all pairs of placeholder and dual super-elements. The proposed method has particular advantages for matrix solution methods based on the frontal scheme, and can be readily implemented for existing finite element analysis programs to achieve parallelization on distributed memory systems with minimal intervention, thus overcoming memory bottlenecks typically faced in the analysis of large-scale problems. Several examples are presented in this article which demonstrate the computational benefits of the proposed parallel domain decomposition approach and its applicability to the nonlinear structural analysis of realistic structural systems.

  15. 'Dilute-and-shoot' triple parallel mass spectrometry method for analysis of vitamin D and triacylglycerols in dietary supplements

    USDA-ARS?s Scientific Manuscript database

    A method is demonstrated for analysis of vitamin D-fortified dietary supplements that eliminates virtually all chemical pretreatment prior to analysis, and is referred to as a ‘dilute and shoot’ method. Three mass spectrometers, in parallel, plus a UV detector, an evaporative light scattering detec...

  16. Simplified Parallel Domain Traversal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Erickson III, David J

    2011-01-01

    Many data-intensive scientific analysis techniques require global domain traversal, which over the years has been a bottleneck for efficient parallelization across distributed-memory architectures. Inspired by MapReduce and other simplified parallel programming approaches, we have designed DStep, a flexible system that greatly simplifies efficient parallelization of domain traversal techniques at scale. In order to deliver both simplicity to users as well as scalability on HPC platforms, we introduce a novel two-tiered communication architecture for managing and exploiting asynchronous communication loads. We also integrate our design with advanced parallel I/O techniques that operate directly on native simulation output. We demonstrate DStep by performing teleconnection analysis across ensemble runs of terascale atmospheric CO₂ and climate data, and we show scalability results on up to 65,536 IBM BlueGene/P cores.

  17. Software Design for Real-Time Systems on Parallel Computers: Formal Specifications.

    DTIC Science & Technology

    1996-04-01

    This research investigated the important issues related to the analysis and design of real-time systems targeted to parallel architectures. In particular, the software specification models for real-time systems on parallel architectures were evaluated. A survey of current formal methods for uniprocessor real-time system specifications was conducted to determine their extensibility in specifying real-time systems on parallel architectures.

  18. The Methodological Socialization of Social Science Doctoral Students in China and the USA

    ERIC Educational Resources Information Center

    Rhoads, Robert A.; Zheng, Mi; Sun, Xiaoyang

    2017-01-01

    This qualitative study reports findings from a comparative analysis of the methodological socialization of doctoral students in the social sciences at two universities: one in China and one in the USA. Relying primarily on theories of organizational socialization, the study focuses on formal and informal processes students report as part of…

  19. The Characteristics of the Systems of Continuing Pedagogical Education in Great Britain, Canada and the USA

    ERIC Educational Resources Information Center

    Mukan, Nataliya; Myskiv, Iryna; Kravets, Svitlana

    2016-01-01

    In the article the systems of continuing pedagogical education in Great Britain, Canada and the USA have been characterized. The main objectives are defined as the theoretical analysis of scientific-pedagogical literature, which highlights different aspects of the problem under research; identification of the common and distinctive features of the…

  20. Institutional Research: What Problems Are We Trying to Solve?

    ERIC Educational Resources Information Center

    Longden, Bernard; Yorke, Mantz

    2009-01-01

    Institutional research in UK higher education is rarely consolidated into a central office function. This is in marked comparison to the position of IR in the USA where most universities accord it a high status which is absent from the UK context. The collection, analysis and interpretation of data in the USA appears, on the whole, more systematic…

  1. A Detailed Model Atmosphere Analysis of Cool White Dwarfs in the Sloan Digital Sky Survey

    DTIC Science & Technology

    2010-09-01

    …with the Leggett et al. (1998) result, but the lack of infrared photometry prevented Harris et al. (2006) from a definite conclusion about the implied…

  2. Extent of Kentucky bluegrass and its effect on native plant species diversity and ecosystem services in the Northern Great Plains of the USA

    USDA-ARS?s Scientific Manuscript database

    The geographic spread of Kentucky bluegrass in rangelands of the USA has increased significantly over the past decades. Preliminary analysis of National Resources Inventory data indicates that Kentucky bluegrass occupies a majority of ecological sites across the Northern Great Plains. Despite its fa...

  3. Museums USA: Art, History, Science, and Other Museums.

    ERIC Educational Resources Information Center

    National Endowment for the Arts, Washington, DC.

    The results and analysis of an earlier museum survey, presented in "Museum U.S.A.: Highlights" (ED 093 777), are given in this document. The purpose is to present a comprehensive picture of museums in the United States--their numbers and locations, types and functions, facilities and finances, personnel and trustees, and activities and attendance.…

  4. Retrospective Analysis of Technological Literacy of K-12 Students in the USA

    ERIC Educational Resources Information Center

    Eisenkraft, Arthur

    2010-01-01

    Assessing technological literacy in the USA will require a large expenditure of resources. While these important initiatives are taking place, it is useful to analyze existing archival data to get a sense of students' understanding of technology. Such archival data exists from the entries submitted to the Toshiba/NSTA ExploraVisions competition…

  5. Extent of Kentucky bluegrass and its effect on native plant species diversity and ecosystem services in the Northern Great Plains of the USA

USDA-ARS's Scientific Manuscript database

    The geographic spread of Kentucky bluegrass in rangelands of the USA has increased significantly over the past 3 decades. Preliminary analysis indicates that Kentucky bluegrass occupies over half of all ecological sites across the Northern Great Plains. Kentucky bluegrass has served as nutritious fo...

  6. An Analysis of Song-Leading by Kindergarten Teachers in Taiwan and the USA

    ERIC Educational Resources Information Center

    Liao, Mei-Ying; Campbell, Patricia Shehan

    2014-01-01

    The purpose of this study was to examine components of the song-leading process used by kindergarten teachers in Taiwan and the United States, including the critical matter of starting pitch. Five public school kindergarten teachers in Taipei, Taiwan, and five public kindergarten teachers in Seattle, USA, were invited to participate in this study…

  7. Translators and Interpreters Certification in Australia, Canada, the USA and Ukraine: Comparative Analysis

    ERIC Educational Resources Information Center

    Skyba, Kateryna

    2014-01-01

    The article presents an overview of the certification process by which potential translators and interpreters demonstrate minimum standards of performance to warrant official or professional recognition of their ability to translate or interpret and to practice professionally in Australia, Canada, the USA and Ukraine. The aim of the study is to…

  8. Analysis and performance of paralleling circuits for modular inverter-converter systems

    NASA Technical Reports Server (NTRS)

    Birchenough, A. G.; Gourash, F.

    1972-01-01

    As part of a modular inverter-converter development program, control techniques were developed to provide load sharing among paralleled inverters or converters. An analysis of the requirements of paralleling circuits and a discussion of the circuits developed and their performance are included in this report. The current sharing was within 5.6 percent of rated-load current for the ac modules and 7.4 percent for the dc modules for an initial output voltage unbalance of 5 volts.

  9. Biophysical Aspects of Cyclodextrin Interaction with Paraoxon

    DTIC Science & Technology

    2013-12-19

Figure 2. NMR analysis of paraoxon (PX) and β-CD … interaction. Job's plot analysis (continuous variation method) was performed for β-CD H1', H2', and H4' protons and is shown in a–c respectively. The PX … resonances were analyzed using nonlinear regression analysis for a. H1', b. H2', c. H5', d. H2 H8, and e. H3 H5. (S.-D. Soni, J. B. Bhonsle and G. E. Garcia)

  10. Hierarchical Parallelization of Gene Differential Association Analysis

    PubMed Central

    2011-01-01

Background: Microarray gene differential expression analysis is a widely used technique that deals with high dimensional data and is computationally intensive for permutation-based procedures. Microarray gene differential association analysis is even more computationally demanding and must take advantage of multicore computing technology, which is the driving force behind increasing compute power in recent years. In this paper, we present a two-layer hierarchical parallel implementation of gene differential association analysis. It takes advantage of both fine- and coarse-grain (with granularity defined by the frequency of communication) parallelism in order to effectively leverage the non-uniform nature of parallel processing available in the cutting-edge systems of today. Results: Our results show that this hierarchical strategy matches data sharing behavior to the properties of the underlying hardware, thereby reducing the memory and bandwidth needs of the application. The resulting improved efficiency reduces computation time and allows the gene differential association analysis code to scale its execution with the number of processors. The code and biological data used in this study are downloadable from http://www.urmc.rochester.edu/biostat/people/faculty/hu.cfm. Conclusions: The performance sweet spot occurs when using a number of threads per MPI process that allows the working sets of the corresponding MPI processes running on the multicore to fit within the machine cache. Hence, we suggest that practitioners follow this principle in selecting the appropriate number of MPI processes and threads within each MPI process for their cluster configurations. We believe that the principles of this hierarchical approach to parallelization can be utilized in the parallelization of other computationally demanding kernels. PMID:21936916

  11. Hierarchical parallelization of gene differential association analysis.

    PubMed

    Needham, Mark; Hu, Rui; Dwarkadas, Sandhya; Qiu, Xing

    2011-09-21

    Microarray gene differential expression analysis is a widely used technique that deals with high dimensional data and is computationally intensive for permutation-based procedures. Microarray gene differential association analysis is even more computationally demanding and must take advantage of multicore computing technology, which is the driving force behind increasing compute power in recent years. In this paper, we present a two-layer hierarchical parallel implementation of gene differential association analysis. It takes advantage of both fine- and coarse-grain (with granularity defined by the frequency of communication) parallelism in order to effectively leverage the non-uniform nature of parallel processing available in the cutting-edge systems of today. Our results show that this hierarchical strategy matches data sharing behavior to the properties of the underlying hardware, thereby reducing the memory and bandwidth needs of the application. The resulting improved efficiency reduces computation time and allows the gene differential association analysis code to scale its execution with the number of processors. The code and biological data used in this study are downloadable from http://www.urmc.rochester.edu/biostat/people/faculty/hu.cfm. The performance sweet spot occurs when using a number of threads per MPI process that allows the working sets of the corresponding MPI processes running on the multicore to fit within the machine cache. Hence, we suggest that practitioners follow this principle in selecting the appropriate number of MPI processes and threads within each MPI process for their cluster configurations. We believe that the principles of this hierarchical approach to parallelization can be utilized in the parallelization of other computationally demanding kernels.
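    The coarse/fine-grain split described in this abstract can be illustrated with a minimal sketch: blocks of gene pairs are distributed across worker processes (coarse grain, infrequent communication), and each worker iterates over the pairs in its block (fine grain). This is not the authors' MPI/threads code; all function names, the block size, and the toy statistic (absolute correlation difference between two conditions) are invented for illustration.

```python
# Sketch: two-layer parallelism for pairwise gene association statistics.
# Coarse grain: gene-pair blocks across processes; fine grain: pairs
# within a block. Names and the statistic are illustrative only.
import math
from concurrent.futures import ProcessPoolExecutor
from itertools import combinations

def pearson(x, y):
    """Plain Pearson correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def block_diff(args):
    """Fine-grain work: |correlation difference| for each pair in one block."""
    pairs, cond_a, cond_b = args
    return {(i, j): abs(pearson(cond_a[i], cond_a[j]) -
                        pearson(cond_b[i], cond_b[j]))
            for i, j in pairs}

def differential_association(cond_a, cond_b, n_workers=2, block_size=8):
    """Coarse-grain work distribution: one block of pairs per task."""
    pairs = list(combinations(range(len(cond_a)), 2))
    blocks = [(pairs[k:k + block_size], cond_a, cond_b)
              for k in range(0, len(pairs), block_size)]
    result = {}
    with ProcessPoolExecutor(max_workers=n_workers) as pool:
        for part in pool.map(block_diff, blocks):
            result.update(part)
    return result
```

    Tuning the block size so each worker's working set stays cache-resident is the serial analogue of the paper's advice on choosing threads per MPI process.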

  12. Analysis of the thermal balance characteristics for multiple-connected piezoelectric transformers.

    PubMed

    Park, Joung-Hu; Cho, Bo-Hyung; Choi, Sung-Jin; Lee, Sang-Min

    2009-08-01

    Because the amount of power that a piezoelectric transformer (PT) can handle is limited, multiple connections of PTs are necessary for the power-capacity improvement of PT-applications. In the connection, thermal imbalance between the PTs should be prevented to avoid the thermal runaway of each PT. The thermal balance of the multiple-connected PTs is dominantly affected by the electrothermal characteristics of individual PTs. In this paper, the thermal balance of both parallel-parallel and parallel-series connections are analyzed by electrical model parameters. For quantitative analysis, the thermal-balance effects are estimated by the simulation of the mechanical loss ratio between the PTs. The analysis results show that with PTs of similar characteristics, the parallel-series connection has better thermal balance characteristics due to the reduced mechanical loss of the higher temperature PT. For experimental verification of the analysis, a hardware-prototype test of a Cs-Lp type 40 W adapter system with radial-vibration mode PTs has been performed.

  13. Interactive Fringe Analysis System: Applications To Moire Contourogram And Interferogram

    NASA Astrophysics Data System (ADS)

    Yatagai, T.; Idesawa, M.; Yamaashi, Y.; Suzuki, M.

    1982-10-01

    A general purpose fringe pattern processing facility was developed in order to analyze moire photographs used for scoliosis diagnoses and interferometric patterns in optical shops. A TV camera reads a fringe profile to be analyzed, and peaks of the fringe are detected by a microcomputer. Fringe peak correction and fringe order determination are performed with the man-machine interactive software developed. A light pen facility and an image digitizer are employed for interaction. In the case of two-dimensional fringe analysis, we analyze independently analysis lines parallel to each other and a reference line perpendicular to the parallel analysis lines. Fringe orders of parallel analysis lines are uniquely determined by using the fringe order of the reference line. Some results of analysis of moire contourograms, interferometric testing of silicon wafers, and holographic measurement of thermal deformation are presented.

  14. Comparison of multihardware parallel implementations for a phase unwrapping algorithm

    NASA Astrophysics Data System (ADS)

    Hernandez-Lopez, Francisco Javier; Rivera, Mariano; Salazar-Garibay, Adan; Legarda-Sáenz, Ricardo

    2018-04-01

Phase unwrapping is an important problem in the areas of optical metrology, synthetic aperture radar (SAR) image analysis, and magnetic resonance imaging (MRI) analysis. These images are becoming larger in size and, in particular, the availability of and need for processing of SAR and MRI data have increased significantly with the acquisition of remote sensing data and the popularization of magnetic resonators in clinical diagnosis. Therefore, it is important to develop faster and more accurate phase unwrapping algorithms. We propose a parallel multigrid algorithm for a phase unwrapping method named accumulation of residual maps, which builds on a serial algorithm that consists of the minimization of a cost function, a minimization achieved by means of a serial Gauss-Seidel-type algorithm. Our algorithm also optimizes the original cost function, but unlike the original work, our algorithm is of the parallel Jacobi class with alternating minimizations. This strategy is known as the chessboard type, where red pixels can be updated in parallel in the same iteration since they are mutually independent. Similarly, black pixels can be updated in parallel in an alternating iteration. We present parallel implementations of our algorithm for different multicore architectures, such as CPU-multicore, the Xeon Phi coprocessor, and Nvidia graphics processing units. In all cases, we obtain superior performance from our parallel algorithm when compared with the original serial version. In addition, we present a detailed comparative performance analysis of the developed parallel versions.
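    The chessboard (red-black) update order mentioned in this abstract can be sketched on a toy problem. This is not the paper's accumulation-of-residual-maps cost function; the sketch uses simple Laplace smoothing, and the function name and grid are invented. Pixels with even i + j ("red") depend only on "black" neighbors, so all reds can be updated concurrently, then all blacks; here the two half-sweeps are shown serially.

```python
# Sketch of a red-black ("chessboard") half-sweep pair: update all red
# pixels (i + j even), then all black pixels (i + j odd). Each color
# class is mutually independent, so within a half-sweep every update
# could run in parallel. Toy Laplace smoothing, illustrative only.

def red_black_sweep(grid):
    rows, cols = len(grid), len(grid[0])
    for parity in (0, 1):                      # 0 = red pass, 1 = black pass
        new = [row[:] for row in grid]         # Jacobi-style: read old values
        for i in range(1, rows - 1):
            for j in range(1, cols - 1):
                if (i + j) % 2 == parity:
                    new[i][j] = (grid[i - 1][j] + grid[i + 1][j] +
                                 grid[i][j - 1] + grid[i][j + 1]) / 4.0
        grid = new                             # black pass sees fresh reds
    return grid
```

    The alternation is what makes the scheme GPU-friendly: each half-sweep is a data-parallel kernel with no write conflicts.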

  15. Symposium on Parallel Computational Methods for Large-scale Structural Analysis and Design, 2nd, Norfolk, VA, US

    NASA Technical Reports Server (NTRS)

    Storaasli, Olaf O. (Editor); Housner, Jerrold M. (Editor)

    1993-01-01

    Computing speed is leaping forward by several orders of magnitude each decade. Engineers and scientists gathered at a NASA Langley symposium to discuss these exciting trends as they apply to parallel computational methods for large-scale structural analysis and design. Among the topics discussed were: large-scale static analysis; dynamic, transient, and thermal analysis; domain decomposition (substructuring); and nonlinear and numerical methods.

  16. Physics Structure Analysis of Parallel Waves Concept of Physics Teacher Candidate

    NASA Astrophysics Data System (ADS)

    Sarwi, S.; Supardi, K. I.; Linuwih, S.

    2017-04-01

The aim of this research was to find the parallel structure of wave physics concepts and the factors that influence the formation of parallel conceptions among physics teacher candidates. The method used was qualitative research with a cross-sectional design. The subjects were five students from the third-semester basic physics course and six from the fifth-semester wave course. Data collection techniques were think-aloud protocols and written tests. Quantitative data were analysed with a descriptive percentage technique, while belief in and awareness of answers were examined with explanatory analysis. Results of the research include: 1) the structure of the concept can be displayed through the illustration of a map containing the theoretical core, supplements to the theory, and phenomena that occur daily; 2) a trend toward parallel conceptions of wave physics was identified for stationary waves, resonance of sound, and the propagation of transverse electromagnetic waves; 3) the parallel conceptions were influenced by less comprehensive textbook reading and by partial understanding of the knowledge that forms the structure of the theory.

  17. Image segmentation by iterative parallel region growing with application to data compression and image analysis

    NASA Technical Reports Server (NTRS)

    Tilton, James C.

    1988-01-01

    Image segmentation can be a key step in data compression and image analysis. However, the segmentation results produced by most previous approaches to region growing are suspect because they depend on the order in which portions of the image are processed. An iterative parallel segmentation algorithm avoids this problem by performing globally best merges first. Such a segmentation approach, and two implementations of the approach on NASA's Massively Parallel Processor (MPP) are described. Application of the segmentation approach to data compression and image analysis is then described, and results of such application are given for a LANDSAT Thematic Mapper image.
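    The "globally best merges first" idea in this abstract, which removes the dependence on processing order, can be sketched on a 1-D signal. This is not the MPP implementation from the record; the function name, the similarity measure (difference of region means), and the stopping threshold are invented for illustration.

```python
# Sketch of order-independent region growing: at each step, merge the
# globally most similar pair of adjacent regions (smallest difference
# of means) instead of scanning in image order. 1-D toy version.

def segment(values, threshold):
    regions = [[v] for v in values]            # start: one region per pixel

    def mean(r):
        return sum(r) / len(r)

    while len(regions) > 1:
        # find the globally best (most similar) adjacent pair
        best = min(range(len(regions) - 1),
                   key=lambda k: abs(mean(regions[k]) - mean(regions[k + 1])))
        if abs(mean(regions[best]) - mean(regions[best + 1])) > threshold:
            break                              # no remaining merge is good enough
        regions[best] = regions[best] + regions.pop(best + 1)
    return regions
```

    Because the next merge is always the global optimum, the final segmentation does not depend on where the scan happens to begin, which is the property the record highlights.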

  18. A mechanism for efficient debugging of parallel programs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, B.P.; Choi, J.D.

    1988-01-01

This paper addresses the design and implementation of an integrated debugging system for parallel programs running on shared memory multi-processors (SMMP). The authors describe the use of flowback analysis to provide information on causal relationships between events in a program's execution without re-executing the program for debugging. The authors introduce a mechanism called incremental tracing that, by using semantic analyses of the debugged program, makes the flowback analysis practical with only a small amount of trace generated during execution. They extend flowback analysis to apply to parallel programs and describe a method to detect race conditions in the interactions of the co-operating processes.

  19. Rotation of an immersed cylinder sliding near a thin elastic coating

    NASA Astrophysics Data System (ADS)

    Rallabandi, Bhargav; Saintyves, Baudouin; Jules, Theo; Salez, Thomas; Schönecker, Clarissa; Mahadevan, L.; Stone, Howard A.

    2017-07-01

    It is known that an object translating parallel to a soft wall in a viscous fluid produces hydrodynamic stresses that deform the wall, which in turn results in a lift force on the object. Recent experiments with cylinders sliding under gravity near a soft incline, which confirmed theoretical arguments for the lift force, also reported an unexplained steady-state rotation of the cylinders [B. Saintyves et al., Proc. Natl. Acad. Sci. USA 113, 5847 (2016), 10.1073/pnas.1525462113]. Motivated by these observations, we show, in the lubrication limit, that an infinite cylinder that translates in a viscous fluid parallel to a soft wall at constant speed and separation distance must also rotate in order to remain free of torque. Using the Lorentz reciprocal theorem, we show analytically that for small deformations of the elastic layer, the angular velocity of the cylinder scales with the cube of the sliding speed. These predictions are confirmed numerically. We then apply the theory to the gravity-driven motion of a cylinder near a soft incline and find qualitative agreement with the experimental observations, namely, that a softer elastic layer results in a greater angular speed of the cylinder.

  20. Xyce parallel electronic simulator users guide, version 6.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keiter, Eric R; Mei, Ting; Russo, Thomas V.

This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator, and has been written to support the simulation needs of the Sandia National Laboratories electrical designers. This development has focused on improving capability over the current state of the art in the following areas: capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors), including support for most popular parallel and serial computers; a differential-algebraic-equation (DAE) formulation, which better isolates the device model package from solver algorithms and allows one to develop new types of analysis without requiring the implementation of analysis-specific device models; device models that are specifically tailored to meet Sandia's needs, including some radiation-aware devices (for Sandia users only); and object-oriented code design and implementation using modern coding practices. Xyce is a parallel code in the most general sense of the phrase (a message-passing parallel implementation), which allows it to run efficiently on a wide range of computing platforms, including serial, shared-memory, and distributed-memory parallel platforms. Attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows.

  1. Xyce parallel electronic simulator users' guide, Version 6.0.1.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keiter, Eric R; Mei, Ting; Russo, Thomas V.

This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator, and has been written to support the simulation needs of the Sandia National Laboratories electrical designers. This development has focused on improving capability over the current state of the art in the following areas: capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors), including support for most popular parallel and serial computers; a differential-algebraic-equation (DAE) formulation, which better isolates the device model package from solver algorithms and allows one to develop new types of analysis without requiring the implementation of analysis-specific device models; device models that are specifically tailored to meet Sandia's needs, including some radiation-aware devices (for Sandia users only); and object-oriented code design and implementation using modern coding practices. Xyce is a parallel code in the most general sense of the phrase (a message-passing parallel implementation), which allows it to run efficiently on a wide range of computing platforms, including serial, shared-memory, and distributed-memory parallel platforms. Attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows.

  2. Xyce parallel electronic simulator users guide, version 6.0.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keiter, Eric R; Mei, Ting; Russo, Thomas V.

This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator, and has been written to support the simulation needs of the Sandia National Laboratories electrical designers. This development has focused on improving capability over the current state of the art in the following areas: capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors), including support for most popular parallel and serial computers; a differential-algebraic-equation (DAE) formulation, which better isolates the device model package from solver algorithms and allows one to develop new types of analysis without requiring the implementation of analysis-specific device models; device models that are specifically tailored to meet Sandia's needs, including some radiation-aware devices (for Sandia users only); and object-oriented code design and implementation using modern coding practices. Xyce is a parallel code in the most general sense of the phrase (a message-passing parallel implementation), which allows it to run efficiently on a wide range of computing platforms, including serial, shared-memory, and distributed-memory parallel platforms. Attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows.

  3. Multidisciplinary Optimization Methods for Aircraft Preliminary Design

    NASA Technical Reports Server (NTRS)

    Kroo, Ilan; Altus, Steve; Braun, Robert; Gage, Peter; Sobieski, Ian

    1994-01-01

    This paper describes a research program aimed at improved methods for multidisciplinary design and optimization of large-scale aeronautical systems. The research involves new approaches to system decomposition, interdisciplinary communication, and methods of exploiting coarse-grained parallelism for analysis and optimization. A new architecture, that involves a tight coupling between optimization and analysis, is intended to improve efficiency while simplifying the structure of multidisciplinary, computation-intensive design problems involving many analysis disciplines and perhaps hundreds of design variables. Work in two areas is described here: system decomposition using compatibility constraints to simplify the analysis structure and take advantage of coarse-grained parallelism; and collaborative optimization, a decomposition of the optimization process to permit parallel design and to simplify interdisciplinary communication requirements.

  4. Comparing forest fragmentation and its drivers in China and the USA with Globcover v2.2.

    PubMed

    Li, Mingshi; Mao, Lijun; Zhou, Chunguo; Vogelmann, James E; Zhu, Zhiliang

    2010-12-01

    Forest loss and fragmentation are of major concern to the international community, in large part because they impact so many important environmental processes. The main objective of this study was to assess the differences in forest fragmentation patterns and drivers between China and the conterminous United States (USA). Using the latest 300-m resolution global land cover product, Globcover v2.2, a comparative analysis of forest fragmentation patterns and drivers was made. The fragmentation patterns were characterized by using a forest fragmentation model built on the sliding window analysis technique in association with landscape indices. Results showed that China's forests were substantially more fragmented than those of the USA. This was evidenced by a large difference in the amount of interior forest area share, with China having 48% interior forest versus the 66% for the USA. China's forest fragmentation was primarily attributed to anthropogenic disturbances, driven particularly by agricultural expansion from an increasing and large population, as well as poor forest management practices. In contrast, USA forests were principally fragmented by natural land cover types. However, USA urban sprawl contributed more to forest fragmentation than in China. This is closely tied to the USA's economy, lifestyle and institutional processes. Fragmentation maps were generated from this study, which provide valuable insights and implications regarding habitat planning for rare and endangered species. Such maps enable development of strategic plans for sustainable forest management by identifying areas with high amounts of human-induced fragmentation, which improve risk assessments and enable better targeting for protection and remediation efforts. 
Because forest fragmentation is a long-term, complex process that is highly related to political, institutional, economic and philosophical arenas, both nations need to take effective and comprehensive measures to mitigate the negative effects of forest loss and fragmentation on the existing forest ecosystems. Copyright © 2010 Elsevier Ltd. All rights reserved.

  5. Solving Integer Programs from Dependence and Synchronization Problems

    DTIC Science & Technology

    1993-03-01

Solving Integer Programs from Dependence and Synchronization Problems. Jaspal Subhlok, March 1993, CMU-CS-93-130, School of Computer Science. … The method is an exact and efficient way of solving integer programming problems arising in dependence and synchronization analysis of parallel programs. … Keywords: exact dependence testing, integer programming, parallelizing compilers, parallel program analysis, synchronization analysis.

  6. SAPNEW: Parallel finite element code for thin shell structures on the Alliant FX/80

    NASA Astrophysics Data System (ADS)

    Kamat, Manohar P.; Watson, Brian C.

    1992-02-01

The results of a research activity aimed at providing a finite element capability for analyzing turbo-machinery bladed-disk assemblies in a vector/parallel processing environment are summarized. Analysis of aircraft turbofan engines is very computationally intensive. The performance limit of modern-day computers with a single processing unit was estimated at 3 billion floating-point operations per second (3 gigaflops). In view of this limit of a sequential unit, performance rates higher than 3 gigaflops can be achieved only through vectorization and/or parallelization, as on the Alliant FX/80. Accordingly, the efforts of this critically needed research were geared towards developing and evaluating parallel finite element methods for static and vibration analysis. A special purpose code, named with the acronym SAPNEW, performs static and eigenanalysis of multi-degree-of-freedom blade models built up from flat thin shell elements.

  7. Tutorial: Parallel Computing of Simulation Models for Risk Analysis.

    PubMed

    Reilly, Allison C; Staid, Andrea; Gao, Michael; Guikema, Seth D

    2016-10-01

    Simulation models are widely used in risk analysis to study the effects of uncertainties on outcomes of interest in complex problems. Often, these models are computationally complex and time consuming to run. This latter point may be at odds with time-sensitive evaluations or may limit the number of parameters that are considered. In this article, we give an introductory tutorial focused on parallelizing simulation code to better leverage modern computing hardware, enabling risk analysts to better utilize simulation-based methods for quantifying uncertainty in practice. This article is aimed primarily at risk analysts who use simulation methods but do not yet utilize parallelization to decrease the computational burden of these models. The discussion is focused on conceptual aspects of embarrassingly parallel computer code and software considerations. Two complementary examples are shown using the languages MATLAB and R. A brief discussion of hardware considerations is located in the Appendix. © 2016 Society for Risk Analysis.
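    The article's worked examples are in MATLAB and R; as a hedged Python analogue of the same embarrassingly parallel pattern, independent Monte Carlo replications can be farmed out to worker processes, since no replication depends on another. The model, function names, and parameters below are invented toy choices, not taken from the article.

```python
# Hedged sketch of embarrassingly parallel risk simulation: each
# replication is an independent run with its own seed, so replications
# are distributed across processes with no inter-task communication.
# The "model" (mean of noisy samples around 10) is a toy stand-in.
import random
from concurrent.futures import ProcessPoolExecutor

def one_replication(seed):
    """One independent simulation run, reproducible from its seed."""
    rng = random.Random(seed)
    samples = [10.0 + rng.gauss(0.0, 2.0) for _ in range(1000)]
    return sum(samples) / len(samples)

def run_study(n_replications, n_workers=2):
    """Distribute replications across processes and pool the outcomes."""
    with ProcessPoolExecutor(max_workers=n_workers) as pool:
        outcomes = list(pool.map(one_replication, range(n_replications)))
    return sum(outcomes) / len(outcomes)
```

    Seeding each replication from its index keeps runs reproducible regardless of which worker executes them, one of the practical points such tutorials emphasize.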

  8. SAPNEW: Parallel finite element code for thin shell structures on the Alliant FX/80

    NASA Technical Reports Server (NTRS)

    Kamat, Manohar P.; Watson, Brian C.

    1992-01-01

The results of a research activity aimed at providing a finite element capability for analyzing turbo-machinery bladed-disk assemblies in a vector/parallel processing environment are summarized. Analysis of aircraft turbofan engines is very computationally intensive. The performance limit of modern-day computers with a single processing unit was estimated at 3 billion floating-point operations per second (3 gigaflops). In view of this limit of a sequential unit, performance rates higher than 3 gigaflops can be achieved only through vectorization and/or parallelization, as on the Alliant FX/80. Accordingly, the efforts of this critically needed research were geared towards developing and evaluating parallel finite element methods for static and vibration analysis. A special purpose code, named with the acronym SAPNEW, performs static and eigenanalysis of multi-degree-of-freedom blade models built up from flat thin shell elements.

  9. Concurrent Probabilistic Simulation of High Temperature Composite Structural Response

    NASA Technical Reports Server (NTRS)

    Abdi, Frank

    1996-01-01

A computational structural/material analysis and design tool which would meet industry's future demand for expedience and reduced cost is presented. This unique software, GENOA, is dedicated to parallel and high-speed analysis to perform probabilistic evaluation of the high-temperature composite response of aerospace systems. The development is based on detailed integration and modification of diverse fields of specialized analysis techniques and mathematical models to combine their latest innovative capabilities into a commercially viable software package. The technique is specifically designed to exploit the availability of processors to perform computationally intense probabilistic analysis assessing uncertainties in structural reliability analysis and composite micromechanics. The primary objectives achieved in the development were: (1) utilization of the power of parallel processing and static/dynamic load-balancing optimization to make the complex simulation of the structure, material, and processing of high-temperature composites affordable; (2) computational integration and synchronization of probabilistic mathematics, structural/material mechanics, and parallel computing; (3) implementation of an innovative multi-level domain decomposition technique to identify the inherent parallelism and increase convergence rates through high- and low-level processor assignment; (4) creation of a framework for a portable parallel architecture for machine-independent Multiple Instruction Multiple Data (MIMD), Single Instruction Multiple Data (SIMD), hybrid, and distributed workstation types of computers; and (5) market evaluation. The results of the Phase-2 effort provide a good basis for continuation and warrant a Phase-3 government and industry partnership.

  10. The methodological and reporting quality of systematic reviews from China and the USA are similar.

    PubMed

    Tian, Jinhui; Zhang, Jun; Ge, Long; Yang, Kehu; Song, Fujian

    2017-05-01

To compare the methodological and reporting quality of systematic reviews by authors from China and those from the United States (USA). From systematic reviews of randomized trials published in 2014 in English, we randomly selected 100 from China and 100 from the USA. The methodological quality was assessed using the Assessing the Methodological Quality of Systematic Reviews (AMSTAR) tool, and reporting quality was assessed using the Preferred Reporting Items for Systematic Reviews and Meta-analyses (PRISMA) tool. Compared with systematic reviews from the USA, those from China were more likely to be a meta-analysis, published in low-impact journals, and a non-Cochrane review. The mean summary AMSTAR score was 6.7 (95% confidence interval: 6.5, 7.0) for reviews from China and 6.6 (6.1, 7.1) for reviews from the USA, and the mean summary PRISMA score was 21.2 (20.7, 21.6) for reviews from China and 20.6 (19.9, 21.3) for reviews from the USA. The differences in summary quality scores between China and the USA were statistically nonsignificant after adjusting for multiple review factors. The overall methodological and reporting quality of systematic reviews by authors from China is similar to that of reviews from the USA, although the quality of systematic reviews from both countries could be further improved. Copyright © 2017 The Author(s). Published by Elsevier Inc. All rights reserved.

  11. Earth Observations taken by Expedition 30 crewmember

    NASA Image and Video Library

    2012-01-29

    ISS030-E-055569 (29 Jan. 2012) --- Southeastern USA at night is featured in this image photographed by an Expedition 30 crew member on the International Space Station. The brightly lit metropolitan areas of Atlanta, GA (center) and Jacksonville, FL (lower right) appear largest in the image with numerous other urban areas forming an interconnected network of light across the region. A large dark region to the northwest of Jacksonville, FL is the Okefenokee National Wildlife Refuge; likewise the ridges of the Appalachian Mountains form dark swaths to the north of Atlanta, GA and west of Charlotte, NC (center). The faint gold and green line of airglow—caused by ultraviolet radiation exciting the gas molecules in the upper atmosphere—parallels the horizon (or Earth limb).

  12. Singularity and workspace analysis of three isoconstrained parallel manipulators with schoenflies motion

    NASA Astrophysics Data System (ADS)

    Lee, Po-Chih; Lee, Jyh-Jone

    2012-06-01

    This paper presents the analysis of three parallel manipulators with Schoenflies motion. Each parallel manipulator possesses two limbs, and the end-effector has three degrees of freedom (DOFs) in translation and one DOF in rotation about a given axis with respect to the world coordinate system. The three isoconstrained parallel manipulators have the structures denoted as C{u/u}UwHw-//-C{v/v}UwHw, CuR{u/u}Uhw-//-CvR{v/v}Uhw and CuPuUhw-//-CvPvUhw. The kinematic equations are first introduced for each manipulator. The Jacobian matrix, singularity, workspace, and performance index for each mechanism are then derived and analysed for the first time. The results can help engineers evaluate such parallel robots for possible application in industry where pick-and-place motion is required.

  13. West Nile Virus Infection among Humans, Texas, USA, 2002–2011

    PubMed Central

    Nolan, Melissa S.; Schuermann, Jim

    2013-01-01

    We conducted an epidemiologic analysis to document West Nile virus infections among humans in Texas, USA, during 2002–2011. West Nile virus has become endemic to Texas; the number of reported cases increased every 3 years. Risk for infection was greatest in rural northwestern Texas, where Culex tarsalis mosquitoes are the predominant mosquito species. PMID:23260575

  14. #Eduresistance: A Critical Analysis of the Role of Digital Media in Collective Struggles for Public Education in the Usa

    ERIC Educational Resources Information Center

    Thapliyal, Nisha

    2018-01-01

    From Facebook-coordinated high-school walkouts to compelling Internet-based protest art that has accompanied recent teacher strikes, grassroots education activism in the USA has gone digital. Despite the proliferation of research on the mediatisation of education policy, few studies have explored the ways in which activists for public education…

  15. "Truth or Consequences": A Feminist Critical Policy Analysis of the STEM Crisis

    ERIC Educational Resources Information Center

    Mansfield, Katherine Cumings; Welton, Anjalé D.; Grogan, Margaret

    2014-01-01

    STEM education has received significant attention in the USA and is largely fueled by rhetoric suggesting the USA is losing its global competitive edge and that there is a lack of qualified workers available to fill growing STEM jobs. However, a counter discourse is emerging that questions the legitimacy of these claims. In response, we employed…

  16. Topographic controls on the regional-scale biodiversity of the south-western USA

    Treesearch

    David D. Coblentz; Kurt H. Riitters

    2004-01-01

    Aim Topography is a fundamental geophysical observable that contains valuable information about the geodynamic, tectonic and climatic history of a region. Here, we extend the traditional uses of topographic analysis to evaluate the role played by topography in the distribution of regional-scale biodiversity in the south-western USA. An important aspect of our study is...

  17. Creativity and Mathematical Problem Posing: An Analysis of High School Students' Mathematical Problem Posing in China and the USA

    ERIC Educational Resources Information Center

    Van Harpen, Xianwei Y.; Sriraman, Bharath

    2013-01-01

    In the literature, problem-posing abilities are reported to be an important aspect/indicator of creativity in mathematics. The importance of problem-posing activities in mathematics is emphasized in educational documents in many countries, including the USA and China. This study was aimed at exploring high school students' creativity in…

  18. Modeling forest site productivity using mapped geospatial attributes within a South Carolina landscape, USA

    Treesearch

    B.R. Parresol; D.A. Scott; S.J. Zarnoch; L.A. Edwards; J.I. Blake

    2017-01-01

    Spatially explicit mapping of forest productivity is important to assess many forest management alternatives. We assessed the relationship between mapped variables and site index of forests ranging from southern pine plantations to natural hardwoods on a 74,000-ha landscape in South Carolina, USA. Mapped features used in the analysis were soil association, land use...

  19. Forest biomass estimated from MODIS and FIA data in the Lake States: MN, WI and MI, USA

    Treesearch

    Daolan Zheng; Linda S. Heath; Mark J. Ducey

    2007-01-01

    This study linked the Moderate Resolution Imaging Spectroradiometer (MODIS) and USDA Forest Service Forest Inventory and Analysis (FIA) data through empirical models established using high-resolution Landsat Enhanced Thematic Mapper Plus observations to estimate aboveground biomass (AGB) in three Lake States in the north-central USA. While means obtained from larger sample sizes...

  20. How Do Elementary Textbooks Address Fractions? A Review of Mathematics Textbooks in the USA, Japan, and Kuwait

    ERIC Educational Resources Information Center

    Alajmi, Amal Hussain

    2012-01-01

    Textbooks play an important part in the design of instruction. This study analyzed the presentation of fractions in textbooks designed for the elementary grades in Kuwait, Japan, and the USA. The analysis focused on the physical characteristics of the books, the structure of the lessons, and the nature of the mathematical problems presented.…

  1. HIV/AIDS Knowledge, Perception of Knowledge and Sources of Information among University Students in USA, Turkey, South Africa and Nigeria

    ERIC Educational Resources Information Center

    Abiona, Titilayo; Balogun, Joseph; Yohannes, Eden; Adefuye, Adedeji; Yakut, Yavuz; Amosun, Seyi; Frantz, Jose

    2014-01-01

    Objective: To examine HIV/AIDS knowledge, perceptions of knowledge and sources of HIV information among university students in four countries with different HIV prevalence rates. Methods: A survey was completed by 2,570 randomly selected university students from the USA, Turkey, South Africa and Nigeria. Logistic regression analysis was used to…

  2. VEMAP phase 2 bioclimatic database. I. Gridded historical (20th century) climate for modeling ecosystem dynamics across the conterminous USA

    Treesearch

    Timothy G.F. Kittel; Nan. A. Rosenbloom; J.A. Royle; C. Daly; W.P. Gibson; H.H. Fisher; P. Thornton; D.N. Yates; S. Aulenbach; C. Kaufman; R. McKeown; Dominque Bachelet; David S. Schimel

    2004-01-01

    Analysis and simulation of biospheric responses to historical forcing require surface climate data that capture those aspects of climate that control ecological processes, including key spatial gradients and modes of temporal variability. We developed a multivariate, gridded historical climate dataset for the conterminous USA as a common input database for the...

  3. The Model of Unification and the Model of Diversification of Public School Teachers' Continuing Professional Development in Great Britain, Canada and the USA

    ERIC Educational Resources Information Center

    Mukan, Nataliya; Myskiv, Iryna; Kravets, Svitlana

    2016-01-01

    In the article the theoretical framework of public school teachers' continuing professional development (CPD) in Great Britain, Canada and the USA has been presented. The main objectives have been defined as theoretical analysis of scientific and pedagogical literature, which highlights different aspects of the problem under research; presentation…

  4. Evaluation of the Infinium Methylation 450K technology.

    PubMed

    Dedeurwaerder, Sarah; Defrance, Matthieu; Calonne, Emilie; Denis, Hélène; Sotiriou, Christos; Fuks, François

    2011-12-01

    Studies of DNA methylomes hold enormous promise for biomedicine but are hampered by the technological challenges of analyzing many samples cost-effectively. Recently, a major extension of the previous Infinium HumanMethylation27 BeadChip® (Illumina, Inc. CA, USA), called Infinium HumanMethylation450 (Infinium Methylation 450K; Illumina, Inc. CA, USA) was developed. This upgraded technology is a hybrid of two different chemical assays, the Infinium I and Infinium II assays, allowing (for 12 samples in parallel) assessment of the methylation status of more than 480,000 cytosines distributed over the whole genome. In this article, we evaluate Infinium Methylation 450K on cell lines and tissue samples, highlighting some of its advantages but also some of its limitations. In particular, we compare the methylation values of the Infinium I and Infinium II assays. We used Infinium Methylation 450K to profile: first, the well-characterized HCT116 wild-type and double-knockout cell lines and then, 16 breast tissue samples (including eight normal and eight primary tumor samples). Absolute methylation values (β-values) were extracted with the GenomeStudio™ software and then subjected to detailed analysis. While this technology appeared highly robust as previously shown, we noticed a divergence between the β-values retrieved from the type I and type II Infinium assays. Specifically, the β-values obtained from Infinium II probes were less accurate and reproducible than those obtained from Infinium I probes. This suggests that data from the type I and type II assays should be considered separately in any downstream bioinformatic analysis. To be able to deal with the Infinium I and Infinium II data together, we developed and tested a new correction technique, which we called 'peak-based correction'. The idea was to rescale the Infinium II data on the basis of the Infinium I data. 
While this technique should be viewed as an approximation method, it significantly improves the quality of Infinium II data. Infinium 450K is a powerful technique in terms of reagent costs, time of labor, sample throughput and coverage. It holds great promise for the better understanding of the epigenetic component in health and disease. Yet, due to the nature of its design comprising two different chemical assays, analysis of the whole set of data is not as easy as initially anticipated. Correction strategies, such as the peak-based approach proposed here, are a step towards adequate output data analysis.
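    As an illustration of the rescaling idea only, here is a rough sketch (hypothetical helper names, a crude histogram-mode peak estimate, and synthetic M-value-like data are assumptions for illustration; this is not the authors' implementation): type II values are linearly rescaled so that their unmethylated and methylated peaks line up with the type I peaks.

    ```python
    import numpy as np

    def _peak(x):
        # Crude mode estimate: midpoint of the fullest histogram bin.
        counts, edges = np.histogram(x, bins=100)
        i = int(np.argmax(counts))
        return 0.5 * (edges[i] + edges[i + 1])

    def peak_based_correction(m_type1, m_type2):
        """Rescale type II M-values so their unmethylated (M < 0) and
        methylated (M > 0) peaks align with the type I peaks."""
        u1, p1 = _peak(m_type1[m_type1 < 0]), _peak(m_type1[m_type1 > 0])
        u2, p2 = _peak(m_type2[m_type2 < 0]), _peak(m_type2[m_type2 > 0])
        # Scale each side of the bimodal distribution by its peak ratio.
        return np.where(m_type2 < 0, m_type2 * (u1 / u2), m_type2 * (p1 / p2))
    ```

    On synthetic bimodal data whose type II peaks are compressed relative to type I, the corrected type II peaks move out to the type I positions.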

  5. Parallel Computing for Probabilistic Response Analysis of High Temperature Composites

    NASA Technical Reports Server (NTRS)

    Sues, R. H.; Lua, Y. J.; Smith, M. D.

    1994-01-01

    The objective of this Phase I research was to establish the required software and hardware strategies to achieve large scale parallelism in solving PCM problems. To meet this objective, several investigations were conducted. First, we identified the multiple levels of parallelism in PCM and the computational strategies to exploit these parallelisms. Next, several software and hardware efficiency investigations were conducted. These involved the use of three different parallel programming paradigms and solution of two example problems on both a shared-memory multiprocessor and a distributed-memory network of workstations.

  6. Hierarchical Parallelism in Finite Difference Analysis of Heat Conduction

    NASA Technical Reports Server (NTRS)

    Padovan, Joseph; Krishna, Lala; Gute, Douglas

    1997-01-01

    Based on the concept of hierarchical parallelism, this research effort resulted in highly efficient parallel solution strategies for very large scale heat conduction problems. Overall, the method of hierarchical parallelism involves the partitioning of thermal models into several substructured levels wherein an optimal balance into various associated bandwidths is achieved. The details are described in this report. Overall, the report is organized into two parts. Part 1 describes the parallel modelling methodology and associated multilevel direct, iterative and mixed solution schemes. Part 2 establishes both the formal and computational properties of the scheme.

  7. A comparison of parallel and diverging screw angles in the stability of locked plate constructs.

    PubMed

    Wähnert, D; Windolf, M; Brianza, S; Rothstock, S; Radtke, R; Brighenti, V; Schwieger, K

    2011-09-01

    We investigated the static and cyclical strength of parallel and angulated locking plate screws using rigid polyurethane foam (0.32 g/cm(3)) and bovine cancellous bone blocks. Custom-made stainless steel plates with two conically threaded screw holes at different angulations (parallel, 10° and 20° divergent) and 5 mm self-tapping locking screws underwent pull-out and cyclical pull and bending tests. The bovine cancellous blocks were only subjected to static pull-out testing. We also performed finite element analysis of the static pull-out test for the parallel and 20° configurations. In both the foam model and the bovine cancellous bone, pull-out force was significantly highest for the parallel constructs. In the finite element analysis there was 47% more damage in the 20° divergent constructs than in the parallel configuration. Under cyclical loading, the mean number of cycles to failure was significantly higher for the parallel group, followed by the 10° and 20° divergent configurations. In our laboratory setting we clearly showed the biomechanical disadvantage of a diverging locking screw angle under static and cyclical loading.

  8. Dimensionality Assessment of Ordered Polytomous Items with Parallel Analysis

    ERIC Educational Resources Information Center

    Timmerman, Marieke E.; Lorenzo-Seva, Urbano

    2011-01-01

    Parallel analysis (PA) is an often-recommended approach for assessment of the dimensionality of a variable set. PA is known in different variants, which may yield different dimensionality indications. In this article, the authors considered the most appropriate PA procedure to assess the number of common factors underlying ordered polytomously…
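    Horn-style parallel analysis compares observed eigenvalues against eigenvalues obtained from random data of the same dimensions. A minimal sketch of the baseline procedure for continuous data (the article's focus is PA variants for ordered polytomous items, which this sketch does not implement):

    ```python
    import numpy as np

    def parallel_analysis(data, n_sims=100, quantile=0.95, seed=0):
        """Horn's parallel analysis: count components whose observed
        eigenvalues exceed the simulated-data eigenvalue threshold."""
        rng = np.random.default_rng(seed)
        n, p = data.shape
        # Eigenvalues of the observed correlation matrix, descending.
        obs = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]
        # Eigenvalues from uncorrelated normal data of the same shape.
        sims = np.empty((n_sims, p))
        for i in range(n_sims):
            r = rng.standard_normal((n, p))
            sims[i] = np.linalg.eigvalsh(np.corrcoef(r, rowvar=False))[::-1]
        # Per-position threshold at the chosen quantile of the simulations.
        thresh = np.quantile(sims, quantile, axis=0)
        return int(np.sum(obs > thresh))
    ```

    On data generated from two strong uncorrelated factors, the procedure indicates two dimensions.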

  9. Automatic differentiation for design sensitivity analysis of structural systems using multiple processors

    NASA Technical Reports Server (NTRS)

    Nguyen, Duc T.; Storaasli, Olaf O.; Qin, Jiangning; Qamar, Ramzi

    1994-01-01

    An automatic differentiation tool (ADIFOR) is incorporated into a finite element based structural analysis program for shape and non-shape design sensitivity analysis of structural systems. The entire analysis and sensitivity procedures are parallelized and vectorized for high performance computation. Small scale examples to verify the accuracy of the proposed program and a medium scale example to demonstrate the parallel vector performance on multiple CRAY C90 processors are included.
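    ADIFOR itself transforms Fortran source; purely as an illustration of the forward-mode idea underlying automatic differentiation (derivative values propagated alongside function values), here is a minimal dual-number sketch, not the ADIFOR mechanism:

    ```python
    class Dual:
        """Number carrying a value and a derivative (forward-mode AD)."""
        def __init__(self, val, der=0.0):
            self.val, self.der = val, der

        def _lift(self, other):
            return other if isinstance(other, Dual) else Dual(other)

        def __add__(self, other):
            o = self._lift(other)
            return Dual(self.val + o.val, self.der + o.der)
        __radd__ = __add__

        def __mul__(self, other):
            o = self._lift(other)
            # Product rule: (uv)' = u'v + uv'
            return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
        __rmul__ = __mul__

    def derivative(f, x):
        """Evaluate df/dx at x by seeding the derivative slot with 1."""
        return f(Dual(x, 1.0)).der
    ```

    For example, `derivative(lambda x: 3*x*x + 2*x, 2.0)` evaluates the exact derivative 6x + 2 at x = 2.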

  10. Grain Boundary Sliding (GBS) as a Plastic Instability Leading to Coeval Pseudotachylyte Development in Mylonites: an EBSD Study of the Seismic Cycle in Brittle-Ductile Transition Rocks of the South Mountains Core Complex, Arizona, USA

    NASA Astrophysics Data System (ADS)

    Miranda, E.; Stewart, C.

    2017-12-01

    Exposures of coeval pseudotachylytes and mylonites are relatively rare, but are crucial for understanding the seismic cycle in the vicinity of the brittle-ductile transition (BDT). We use both field observations and electron backscatter diffraction (EBSD) analysis to investigate the coeval pseudotachylytes and granodiorite mylonites exposed in the footwall of the South Mountains core complex, Arizona, to evaluate how strain is localized both prior to and during pseudotachylyte development at the BDT. In the field, we observe numerous pseudotachylyte veins oriented parallel to mylonitic foliation; the veins have synthetic shear sense with adjacent mylonites, and are < 2 cm thick, laterally discontinuous, and confined to a few m in structural thickness. EBSD analysis reveals that deformation is strongly partitioned into quartz in mylonites, where quartz shows subgrain rotation overprinted by bulging recrystallization microstructures and lattice preferred orientation (LPO) patterns indicative of dislocation creep. Foliation-parallel zones of finely recrystallized, (< 5 μm diameter) bulge-nucleated grains in the mylonites show four-grain junctions and randomized LPO patterns consistent with grain boundary sliding (GBS). Pseudotachylyte veins have elongate polycrystalline quartz survivor clasts that also exhibit GBS traits, suggesting that pseudotachylytes form within GBS zones in mylonites. We interpret the onset of GBS as a triggering mechanism for coeval pseudotachylyte development, where the accompanying decrease in effective viscosity and increase in strain rate initiated seismic slip and pseudotachylyte formation within GBS zones. Strain became localized within the pseudotachylyte until crystallization of melt impeded flow, inducing pseudotachylyte development in other GBS zones. 
We associate the pseudotachylyte veins and host mylonites with the coseismic and interseismic parts of the seismic cycle, respectively, where the abundance and lateral discontinuity of pseudotachylyte veins suggests repeated events. We speculate that periodic, GBS-initiated pseudotachylyte generation may correlate with intermediate slip rate seismic events in the vicinity of the BDT, suggesting that coeval pseudotachylytes and mylonites are evidence of a unique class of seismic event.

  11. Automatic recognition of vector and parallel operations in a higher level language

    NASA Technical Reports Server (NTRS)

    Schneck, P. B.

    1971-01-01

    A compiler for recognizing statements of a FORTRAN program which are suited for fast execution on a parallel or pipeline machine such as Illiac-4, Star or ASC is described. The technique employs interval analysis to provide flow information to the vector/parallel recognizer. Where profitable, the compiler changes scalar variables to subscripted variables. The output of the compiler is an extension to FORTRAN which shows parallel and vector operations explicitly.

  12. Methods for design and evaluation of parallel computing systems (The PISCES project)

    NASA Technical Reports Server (NTRS)

    Pratt, Terrence W.; Wise, Robert; Haught, Mary Jo

    1989-01-01

    The PISCES project started in 1984 under the sponsorship of the NASA Computational Structural Mechanics (CSM) program. A PISCES 1 programming environment and parallel FORTRAN were implemented in 1984 for the DEC VAX (using UNIX processes to simulate parallel processes). This system was used for experimentation with parallel programs for scientific applications and AI (dynamic scene analysis) applications. PISCES 1 was ported to a network of Apollo workstations by N. Fitzgerald.

  13. National Combustion Code Parallel Performance Enhancements

    NASA Technical Reports Server (NTRS)

    Quealy, Angela; Benyo, Theresa (Technical Monitor)

    2002-01-01

    The National Combustion Code (NCC) is being developed by an industry-government team for the design and analysis of combustion systems. The unstructured grid, reacting flow code uses a distributed memory, message passing model for its parallel implementation. The focus of the present effort has been to improve the performance of the NCC code to meet combustor designer requirements for model accuracy and analysis turnaround time. Improving the performance of this code contributes significantly to the overall reduction in time and cost of the combustor design cycle. This report describes recent parallel processing modifications to NCC that have improved the parallel scalability of the code, enabling a two hour turnaround for a 1.3 million element fully reacting combustion simulation on an SGI Origin 2000.

  14. Parallel Flux Tensor Analysis for Efficient Moving Object Detection

    DTIC Science & Technology

    2011-07-01

    computing as well as parallelization to enable real time performance in analyzing complex video [3, 4]. There are a number of challenging computer vision… We use the trace of the flux tensor matrix, referred to as $\mathrm{Tr}\,J_F$, defined as $\mathrm{Tr}\,J_F = \int_{\Omega} W(x-y)\,\bigl(I_{xt}^2(y) + I_{yt}^2(y) + I_{tt}^2(y)\bigr)\,dy$.
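    The flux tensor trace is a window-weighted sum of squared spatiotemporal derivatives. A minimal NumPy sketch of that computation (finite-difference derivatives and a separable box window are assumptions for illustration, not the report's implementation):

    ```python
    import numpy as np

    def flux_tensor_trace(video, w=5):
        """Trace of the flux tensor for a (T, H, W) image sequence:
        box-windowed sum of the squared derivatives Ixt, Iyt, Itt."""
        It  = np.gradient(video, axis=0)   # temporal derivative I_t
        Ixt = np.gradient(It, axis=2)      # d/dx of I_t
        Iyt = np.gradient(It, axis=1)      # d/dy of I_t
        Itt = np.gradient(It, axis=0)      # d/dt of I_t
        energy = Ixt**2 + Iyt**2 + Itt**2
        # Separable box window W applied along both spatial axes.
        kern = np.ones(w) / w
        for axis in (1, 2):
            energy = np.apply_along_axis(
                lambda m: np.convolve(m, kern, mode='same'), axis, energy)
        return energy
    ```

    On a synthetic sequence with a drifting bright square, the trace is positive around the moving object and stays zero over the static background, which is what makes it usable for moving-object detection.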

  15. Integrity(®) bare-metal coronary stent-induced platelet and endothelial cell activation results in a higher risk of restenosis compared to Xience(®) everolimus-eluting stents in stable angina patients.

    PubMed

    Szük, Tibor; Fejes, Zsolt; Debreceni, Ildikó Beke; Kerényi, Adrienne; Édes, István; Kappelmayer, János; Nagy, Béla

    2016-07-01

    Drug-eluting stenting (DES) has become a reliable tool for coronary stenting; however, its direct effects on platelet and endothelium function differ from those of bare-metal stenting (BMS). This study involved a periprocedural analysis of various biomarkers of cellular activation after elective DES (Xience(®), Abbott Vascular, Santa Clara, CA, USA) or BMS (Integrity(®), Medtronic, Minneapolis, MN, USA). Forty-nine stable angina patients were recruited: 28 underwent BMS, and 21 received everolimus-eluting stents. Samples were collected (i) prior to stenting, (ii) at 24 hours after procedure, and (iii) after 1 month of dual antiplatelet therapy. Platelet activation was analyzed by surface P-selectin positivity in parallel with plasma levels of soluble P-selectin, CD40L and platelet-derived growth factor (PDGF). Endothelial cell (EC) activation was detected by measuring markers of early (von Willebrand factor) and delayed response (VCAM-1, ICAM-1, E-selectin). Patients were followed for 6 months for the occurrence of restenosis or stent thrombosis. Increased platelet activation was sustained regardless of stent type or antiplatelet medication. Concentrations of most EC markers were more elevated after BMS than after DES. No stent thrombosis was seen, but six BMS subjects displayed restenosis with significantly higher sCD40L (779 [397-899] vs. 381 [229-498] pg/mL; p = 0.032) and sICAM-1 (222 [181-272] vs. 162 [153-223] ng/mL; p = 0.046) levels than in those without complication, while DES patients exhibited significantly decreased PDGF (572 [428-626] vs. 244 [228-311] pg/mL; p = 0.004) after 1 month. Nonresponsiveness to antiplatelet drugs did not influence these changes. In conclusion, the degree of platelet and EC activation suggests that Xience(®) DES may be regarded a safer coronary intervention than Integrity(®) BMS, with a lower risk of in-stent restenosis.

  16. Super and parallel computers and their impact on civil engineering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamat, M.P.

    1986-01-01

    This book presents the papers given at a conference on the use of supercomputers in civil engineering. Topics considered at the conference included solving nonlinear equations on a hypercube, a custom architectured parallel processing system, distributed data processing, algorithms, computer architecture, parallel processing, vector processing, computerized simulation, and cost benefit analysis.

  17. Comparison between four dissimilar solar panel configurations

    NASA Astrophysics Data System (ADS)

    Suleiman, K.; Ali, U. A.; Yusuf, Ibrahim; Koko, A. D.; Bala, S. I.

    2017-12-01

    Several studies of photovoltaic systems have focused on how they operate and on the energy required to operate them. Little attention has been paid to their configurations, to modeling of mean time to system failure, availability, and cost-benefit, or to comparisons of parallel and series-parallel designs. In this research work, four system configurations were studied. Configuration I consists of two sub-components arranged in parallel with 24 V each, configuration II consists of four sub-components arranged logically in parallel with 12 V each, configuration III consists of four sub-components arranged in series-parallel with 8 V each, and configuration IV has six sub-components with 6 V each arranged in series-parallel. Comparative analysis was made using the Chapman-Kolmogorov method. Explicit expressions for mean time to system failure and steady-state availability were derived, and a cost-benefit analysis was performed based on the comparison. A ranking method was used to determine the optimal configuration of the systems. Analytical and numerical solutions for system availability and mean time to system failure were determined, and configuration I was found to be the optimal configuration.
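    The paper derives its expressions from a Chapman-Kolmogorov (Markov) model; as a simplified sketch assuming independent repairable sub-components with illustrative failure and repair rates (assumptions for illustration, not the paper's full model or data), the steady-state availabilities of parallel versus series-parallel layouts can be compared like this:

    ```python
    def unit_availability(lam, mu):
        """Steady-state availability of one repairable unit with failure
        rate lam and repair rate mu (two-state Markov chain)."""
        return mu / (lam + mu)

    def parallel_avail(branches):
        """System is up if at least one independent branch is up."""
        down = 1.0
        for a in branches:
            down *= (1.0 - a)
        return 1.0 - down

    def series_avail(units):
        """A string is up only if every unit in it is up."""
        up = 1.0
        for a in units:
            up *= a
        return up

    a = unit_availability(lam=0.01, mu=0.1)             # illustrative rates
    A_I   = parallel_avail([a, a])                      # two units in parallel
    A_II  = parallel_avail([a] * 4)                     # four units in parallel
    A_III = parallel_avail([series_avail([a, a])] * 2)  # two 2-unit strings
    A_IV  = parallel_avail([series_avail([a] * 3)] * 2) # grouping assumed: two 3-unit strings
    ```

    Under this independence sketch, pure parallel layouts dominate series-parallel ones in availability alone; the paper's ranking additionally weighs cost-benefit, which this sketch omits.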

  18. Advanced mathematical on-line analysis in nuclear experiments. Usage of parallel computing CUDA routines in standard root analysis

    NASA Astrophysics Data System (ADS)

    Grzeszczuk, A.; Kowalski, S.

    2015-04-01

    Compute Unified Device Architecture (CUDA) is a parallel computing platform developed by Nvidia to speed up graphics through parallel computation. Its success opened General-Purpose Graphics Processing Unit (GPGPU) technology to applications beyond graphics. GPGPU systems can be applied as an effective tool for reducing the large data volumes of pulse-shape-analysis measurements, by on-line recalculation or by very fast compression. Our poster contribution presents the simplified structure of the CUDA system and its programming model, using the Nvidia GeForce GTX 580 card as an example, both in a stand-alone version and as a ROOT application.

  19. Parallel gene analysis with allele-specific padlock probes and tag microarrays

    PubMed Central

    Banér, Johan; Isaksson, Anders; Waldenström, Erik; Jarvius, Jonas; Landegren, Ulf; Nilsson, Mats

    2003-01-01

    Parallel, highly specific analysis methods are required to take advantage of the extensive information about DNA sequence variation and of expressed sequences. We present a scalable laboratory technique suitable to analyze numerous target sequences in multiplexed assays. Sets of padlock probes were applied to analyze single nucleotide variation directly in total genomic DNA or cDNA for parallel genotyping or gene expression analysis. All reacted probes were then co-amplified and identified by hybridization to a standard tag oligonucleotide array. The technique was illustrated by analyzing normal and pathogenic variation within the Wilson disease-related ATP7B gene, both at the level of DNA and RNA, using allele-specific padlock probes. PMID:12930977

  20. Sensitivity Enhancement of an Inductively Coupled Local Detector Using a HEMT-Based Current Amplifier.

    PubMed

    Qian, Chunqi; Duan, Qi; Dodd, Steve; Koretsky, Alan; Murphy-Boesch, Joe

    2016-06-01

    To improve the signal transmission efficiency and sensitivity of a local detection coil that is weakly inductively coupled to a larger receive coil. The resonant detection coil is connected in parallel with the gate of a high electron mobility transistor (HEMT) without impedance matching. When the drain of the transistor is capacitively shunted to ground, current amplification occurs in the resonator by feedback that transforms a capacitive impedance on the transistor's source to a negative resistance on its gate. High resolution images were obtained from a mouse brain using a small, 11 mm diameter surface coil that was inductively coupled to a commercial, phased array chest coil. Although the power consumption of the amplifier was only 88 μW, 14 dB gain was obtained with excellent noise performance. An integrated current amplifier based on a HEMT can enhance the sensitivity of inductively coupled local detectors when weakly coupled. This amplifier enables efficient signal transmission between customized user coils and commercial clinical coils, without the need for a specialized signal interface. Magn Reson Med 75:2573-2578, 2016. Published 2015. This article is a U.S. Government work and is in the public domain in the USA.

  1. Rapid Globalization of Medical Device Clinical Development Programs in Japan - The Case of Drug-Eluting Stents.

    PubMed

    Murakami, Madoka; Suzuki, Yuka; Tominaga, Toshiyoshi

    2018-02-23

    Delays in the introduction to the Japanese market of drug-eluting stents (DES) developed overseas (i.e., "device lag") decreased sharply between 2004 and 2012. The reduction accompanied a shift in clinical development from a succession pattern (initial product development and approval overseas followed by eventual entrance into the Japanese market) to parallel development (employing multiregional clinical trials (MRCTs)). Although resource-intensive in the short-term, MRCTs are proving to be an effective tool in simultaneous global product development. Creative study designs and the absence of significant ethnic differences in Japanese subjects regarding DES safety and efficacy and the pharmacokinetic behavior of their coating drugs propel this process. More general factors such as medical need and industry incentivization also encourage this shift. Physicians' preference for DES over other percutaneous coronary interventions, the expanding global DES market, and streamlined development and approval prospects each motivate industry to continue investing in DES product development. The efforts of various stakeholders were also integral to overcoming practical obstacles, and contributions by 'Harmonization by Doing' and a premarket collaboration initiative between the USA and Japan were particularly effective. Today, USA/Japan regulatory cooperation is routine, and Japan is now integrated into global medical device development. MRCTs including Japanese subjects, sites, and investigators are now commonplace.

  2. A New Host for Caryospora lampropeltis (Apicomplexa: Eimeriidae) from the Eastern Hognose Snake, Heterodon platirhinos (Ophidia: Colubroidea: Dipsadinae), from Arkansas, U.S.A., with a Summary of Hosts of This Coccidian

    PubMed Central

    Seville, R. Scott; Connior, Matthew B.

    2016-01-01

    Between May 2012 and July 2013, four eastern hognose snakes (Heterodon platirhinos) were collected from Arkansas (n = 2) and Oklahoma (n = 2), U.S.A., and examined for coccidians. A single H. platirhinos from Arkansas was found to be passing oocysts of Caryospora lampropeltis Anderson, Duszynski, and Marquardt. Oocysts of C. lampropeltis were spheroidal to slightly subspheroidal with a rough, colourless, bi-layered wall, measure 23.5 × 22.8 µm, and have a length/width (L/W) ratio of 1.0; both micropyle and oocyst residuum were absent, but a prominent polar granule was present. Sporocysts are ovoidal, 16.8 × 12.8 µm, L/W 1.3; a prominent Stieda and subStieda body was present; a sporocyst residuum was present and composed of numerous spheroidal granules dispersed into small and large granules. Sporozoites lie lengthwise and parallel in a semi-spiral in sporocyst; a spheroidal anterior refractile and posterior refractile body is present; a single nucleus is located between the 2 refractile bodies. This represents the first report of a caryosporan reported from H. platirhinos as well as the only known coccidian from this host. A summary of hosts of C. lampropeltis is provided. PMID:27917072

  3. Lessons from the great egret: Cosmopolitan species as environmental guides

    NASA Astrophysics Data System (ADS)

    Lewis, Celia

    This dissertation is an experiment in environmental learning. The cosmopolitan species, the great egret (Egretta alba), is used as a guide to learning about local environmental history and local ecology in four places: Long Island Sound, USA; Delaware Bay, USA; Neusiedler See, Austria; and the Hunter River Valley in New South Wales, Australia. It is also used as a guide to the development of a cosmopolitan environmental perspective. The development of this broad perspective is based on the thesis that knowledge of the ecology of species mobility and cosmopolitanism may bring to light ecological connections within and between places, and that human migration and cultural mobility are also part of the ecological history of the environment. The concept of species guides is reviewed in nature literature, including examples from the works of Richard Nelson, Robert Michael Pyle, Terry Tempest Williams, Scott Weidensaul, and Peter Matthiessen. The author visits egret colonies, interviews biologists working at these sites, and develops narratives about the environmental history and the cultural history of each site, and the connections between egrets and humans in those places. Parallels are drawn between the migrant and cosmopolitan nature of great egrets and other species, and of the human species, and how recognition of these similarities can lead to a cosmopolitan environmental perspective.

  4. Response of Subalpine Conifers in the Sierra Nevada, California, U.S.A., to 20th-Century Warming and Decadal Climate Variability

    Treesearch

    Constance I. Millar; Robert D. Westfall; Diane L. Delany; John C. King; Lisa J. Graumlich

    2004-01-01

    Four independent studies of conifer growth between 1880 and 2002 in upper elevation forests of the central Sierra Nevada, California, U.S.A., showed correlated multidecadal and century-long responses associated with climate. Using tree-ring and ecological plot analysis, we studied annual branch growth of krummholz Pinus albicaulis; invasion by P....

  5. Exploring Racism inside and outside the Mathematics Classroom in Two Different Contexts: Colombia and USA

    ERIC Educational Resources Information Center

    Valoyes-Chávez, Luz; Martin, Danny Bernard

    2016-01-01

    We give attention to the racial contexts of mathematics education in Colombia and the USA. We discuss the particularities of these contexts but also explore how, in both contexts, Blackness and Black people are relegated to the lower rungs of the social order. In offering this comparative analysis, we call for expanded research on race, racism,…

  6. Legal ecotones: A comparative analysis of riparian policy protection in the Oregon Coast Range, USA

    Treesearch

    Brett A. Boisjolie; Mary V. Santelmann; Rebecca L. Flitcroft; Sally L. Duncan

    2017-01-01

    Waterways of the USA are protected under the public trust doctrine, placing responsibility on the state to safeguard public resources for the benefit of current and future generations. This responsibility has led to the development of management standards for lands adjacent to streams. In the state of Oregon, policy protection for riparian areas varies by ownership (e....

  7. Applied Computational Electromagnetics Society Journal and Newsletter, Volume 14 No. 1

    DTIC Science & Technology

    1999-03-01

    code validation, performance analysis, and input/output standardization; code or technique optimization and error minimization; innovations in...

  8. Predicting Children's Media Use in the USA: Differences in Cross-Sectional and Longitudinal Analysis

    ERIC Educational Resources Information Center

    Lee, Sook-Jung; Bartolic, Silvia; Vandewater, Elizabeth A.

    2009-01-01

    The purpose of this paper is to examine the predictors of children's media use in the USA, comparing cross-sectional and longitudinal analyses. Data come from Waves 1 and 2 of the Child Development Supplement (CDS-I; CDS-II), a nationally representative sample of American children aged 0-12 in 1997 and 5-18 in 2002. Twenty-four hour time use…

  9. European Symposium on Reliability of Electron Devices, Failure Physics and Analysis (5th)

    DTIC Science & Technology

    1994-10-07

    Sessions: Characterisation and Modelling; Hot Carriers; Oxide States; Power Devices; Power Devices Workshop ("Reliability of Power Semiconductors for Traction Applications"); contributions from Sandia National Laboratories, Albuquerque, New Mexico, USA.

  10. Spatial distribution of forest aboveground biomass estimated from remote sensing and forest inventory data in New England, USA

    Treesearch

    Daolan Zheng; Linda S. Heath; Mark J. Ducey

    2008-01-01

    We combined satellite (Landsat 7 and Moderate Resolution Imaging Spectrometer) and U.S. Department of Agriculture forest inventory and analysis (FIA) data to estimate forest aboveground biomass (AGB) across New England, USA. This is practical for large-scale carbon studies and may reduce uncertainty of AGB estimates. We estimate that total regional forest AGB was 1,867...

  11. A broad scale analysis of tree risk, mitigation and potential habitat for cavity-nesting birds

    Treesearch

    Brian Kane; Paige S. Warren; Susannah B. Lerman

    2015-01-01

    Trees in towns and cities provide habitat for wildlife. In particular, cavity-nesting birds nest in the dead and decayed stems and branches of these trees. The same dead and decayed stems and branches also have a greater likelihood of failure, which, in some circumstances, increases risk. We examined 1760 trees in Baltimore, MD, USA and western MA, USA, assessing tree...

  12. Molecular characterization of pea enation mosaic virus and bean leafroll virus from the Pacific Northwest, USA.

    PubMed

    Vemulapati, B; Druffel, K L; Eigenbrode, S D; Karasev, A; Pappu, H R

    2010-10-01

    The family Luteoviridae consists of eight viruses assigned to three different genera, Luteovirus, Polerovirus and Enamovirus. The complete genomic sequences of pea enation mosaic virus (genus Enamovirus) and bean leafroll virus (genus Luteovirus) from the Pacific Northwest, USA, were determined. Annotation, sequence comparisons, and phylogenetic analysis of selected genes together with those of known polero- and enamoviruses were conducted.

  13. Sensitivity analysis of the DRAINWAT model applied to an agricultural watershed in the lower coastal plain, North Carolina, USA

    Treesearch

    Hyunwoo Kim; Devendra M. Amatya; Stephen W. Broome; Dean L. Hesterberg; Minha Choi

    2011-01-01

    The DRAINWAT, DRAINmod for WATershed model, was selected for hydrological modelling to obtain water table depths and drainage outflows at Open Grounds Farm in Carteret County, North Carolina, USA. Six simulated storm events from the study period were compared with the measured data and analysed. Simulation results from the whole study period and selected rainfall...

  14. A Comparative Analysis of General Culture Courses within the Scope of Knowledge Categories in Undergraduate Teacher Education Programs "Turkey and the USA"

    ERIC Educational Resources Information Center

    Hayirsever, Fahriye; Kalayci, Nurdan

    2017-01-01

    In this study, general culture and general education courses within the scope of knowledge categories in undergraduate teacher education programs in Turkey and the USA are comparatively analyzed. The study is a comparative education study and uses a descriptive model. In the study, the general culture - general education courses taught in the…

  15. Alleviating Search Uncertainty through Concept Associations: Automatic Indexing, Co-Occurrence Analysis, and Parallel Computing.

    ERIC Educational Resources Information Center

    Chen, Hsinchun; Martinez, Joanne; Kirchhoff, Amy; Ng, Tobun D.; Schatz, Bruce R.

    1998-01-01

    Grounded on object filtering, automatic indexing, and co-occurrence analysis, an experiment was performed using a parallel supercomputer to analyze over 400,000 abstracts in an INSPEC computer engineering collection. A user evaluation revealed that system-generated thesauri were better than the human-generated INSPEC subject thesaurus in concept…

  16. Quadruple parallel mass Spectrometry for analysis of vitamin D and triacylglycerols in a dietary supplement

    USDA-ARS?s Scientific Manuscript database

    A ‘dilute-and-shoot’ method for vitamin D and triacylglycerols is demonstrated that employed four mass spectrometers, operating in different ionization modes, for a ‘quadruple parallel mass spectrometry’ analysis, plus three other detectors, for seven detectors overall. Sets of five samples of diet...

  17. Multi-GPU parallel algorithm design and analysis for improved inversion of probability tomography with gravity gradiometry data

    NASA Astrophysics Data System (ADS)

    Hou, Zhenlong; Huang, Danian

    2017-09-01

    In this paper, we first study the inversion of probability tomography (IPT) with gravity gradiometry data. The spatial resolution of the results is improved by multi-tensor joint inversion, a depth-weighting matrix, and other methods. To address the problems posed by large data volumes in exploration, we present a parallel algorithm and its performance analysis, combining Compute Unified Device Architecture (CUDA) with Open Multi-Processing (OpenMP) for Graphics Processing Unit (GPU) acceleration. In tests on a synthetic model and on real data from the Vinton Dome, we obtain improved results, demonstrating that the improved inversion algorithm is effective and feasible. The parallel algorithm we designed outperforms other CUDA implementations, with a maximum speedup of more than 200. In the performance analysis, multi-GPU speedup and multi-GPU efficiency are used to analyze the scalability of the multi-GPU programs. The designed parallel algorithm is shown to handle larger data volumes, and the new analysis method is practical.
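
    The scaling metrics named in the abstract (speedup, multi-GPU speedup, and multi-GPU efficiency) can be sketched as follows; the wall-clock times below are hypothetical illustrations, not figures from the paper.

```python
def speedup(t_baseline, t_parallel):
    """Speedup: how many times faster the parallel run is than the baseline."""
    return t_baseline / t_parallel

def multi_gpu_efficiency(t_one_gpu, t_n_gpus, n_gpus):
    """Multi-GPU efficiency: achieved speedup as a fraction of ideal
    linear scaling across n_gpus devices."""
    return speedup(t_one_gpu, t_n_gpus) / n_gpus

# Hypothetical wall-clock times (seconds) for one inversion run.
t_cpu, t_1gpu, t_4gpu = 2400.0, 12.0, 3.75
gpu_vs_cpu = speedup(t_cpu, t_1gpu)                 # 200x over the CPU run
eff_4gpu = multi_gpu_efficiency(t_1gpu, t_4gpu, 4)  # 0.8 = 80% of ideal
```

    An efficiency near 1.0 means the workload scales almost linearly with the number of GPUs; values well below 1.0 indicate communication or load-imbalance overhead.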

  18. B-MIC: An Ultrafast Three-Level Parallel Sequence Aligner Using MIC.

    PubMed

    Cui, Yingbo; Liao, Xiangke; Zhu, Xiaoqian; Wang, Bingqiang; Peng, Shaoliang

    2016-03-01

    Sequence alignment, in which raw sequencing data are mapped to a reference genome, is the central step of sequence analysis. The amount of data generated by NGS is far beyond the processing capabilities of existing alignment tools; consequently, sequence alignment has become the bottleneck of sequence analysis, and intensive computing power is required to address this challenge. Intel recently announced the MIC coprocessor, which provides massive computing power. The Tianhe-2, the world's fastest supercomputer at the time of writing, is equipped with three MIC coprocessors per compute node. A key property of sequence alignment is that different reads are independent. Exploiting this property, we proposed a MIC-oriented three-level parallelization strategy to speed up BWA, a widely used sequence alignment tool, and developed our ultrafast parallel sequence aligner, B-MIC. B-MIC contains three levels of parallelization: first, parallelization of data I/O and read alignment via a three-stage parallel pipeline; second, parallelization enabled by MIC coprocessor technology; third, inter-node parallelization implemented with MPI. In this paper, we demonstrate that B-MIC outperforms BWA by a combination of these techniques on an Inspur NF5280M server and the Tianhe-2 supercomputer. To the best of our knowledge, B-MIC is the first sequence alignment tool to run on the Intel MIC, and it achieves more than fivefold speedup over the original BWA while maintaining alignment precision.
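
    The first of B-MIC's three levels, a pipeline that overlaps input, alignment, and output, can be sketched with threads and queues; the toy `align` function and read strings below are illustrative stand-ins, not B-MIC's actual interfaces.

```python
import queue
import threading

def pipeline(reads, align):
    """Three-stage pipeline: read -> align -> write. The stages run
    concurrently; because reads are independent, the aligner needs no
    coordination beyond the two queues."""
    q_in, q_out, results = queue.Queue(), queue.Queue(), []

    def reader():
        for r in reads:
            q_in.put(r)
        q_in.put(None)          # sentinel: no more input

    def aligner():
        while (r := q_in.get()) is not None:
            q_out.put(align(r))
        q_out.put(None)         # propagate shutdown downstream

    def writer():
        while (a := q_out.get()) is not None:
            results.append(a)

    threads = [threading.Thread(target=f) for f in (reader, aligner, writer)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results

# Toy stand-in for alignment: map each read to its length.
aligned = pipeline(["ACGT", "GGTACC", "AT"], align=len)
```

    In B-MIC the remaining two levels (MIC offload within a node, MPI across nodes) wrap around this same pattern; the pipeline only hides I/O latency behind compute.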

  19. Xyce Parallel Electronic Simulator Users' Guide Version 6.8

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keiter, Eric R.; Aadithya, Karthik Venkatraman; Mei, Ting

    This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator, and has been written to support the simulation needs of the Sandia National Laboratories electrical designers. This development has focused on improving capability over the current state of the art in the following areas: capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors), including support for most popular parallel and serial computers; a differential-algebraic-equation (DAE) formulation, which better isolates the device model package from solver algorithms and allows one to develop new types of analysis without requiring the implementation of analysis-specific device models; device models that are specifically tailored to meet Sandia's needs, including some radiation-aware devices (for Sandia users only); and object-oriented code design and implementation using modern coding practices. Xyce is a parallel code in the most general sense of the phrase, a message-passing parallel implementation, which allows it to run efficiently on a wide range of computing platforms, including serial, shared-memory, and distributed-memory parallel platforms. Attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows.

  20. Influence of Segmentation of Ring-Shaped NdFeB Magnets with Parallel Magnetization on Cylindrical Actuators

    PubMed Central

    Eckert, Paulo Roberto; Goltz, Evandro Claiton; Filho, Aly Ferreira Flores

    2014-01-01

    This work analyses the effects of segmentation followed by parallel magnetization of ring-shaped NdFeB permanent magnets used in slotless cylindrical linear actuators. The main purpose of the work is to evaluate the effects of that segmentation on the performance of the actuator and to present a general overview of the influence of parallel magnetization by varying the number of segments and comparing the results with ideal radially magnetized rings. The analysis is first performed by modelling mathematically the radial and circumferential components of magnetization for both radial and parallel magnetizations, followed by an analysis carried out by means of the 3D finite element method. Results obtained from the models are validated by measuring radial and tangential components of magnetic flux distribution in the air gap on a prototype which employs magnet rings with eight segments each with parallel magnetization. The axial force produced by the actuator was also measured and compared with the results obtained from numerical models. Although this analysis focused on a specific topology of cylindrical actuator, the observed effects on the topology could be extended to others in which surface-mounted permanent magnets are employed, including rotating electrical machines. PMID:25051032

  2. Toward an Understanding of the Evolution of Staphylococcus aureus Strain USA300 during Colonization in Community Households

    PubMed Central

    Uhlemann, Anne-Catrin; Kennedy, Adam D.; Martens, Craig; Porcella, Stephen F.; DeLeo, Frank R.; Lowy, Franklin D.

    2012-01-01

    Staphylococcus aureus is a frequent cause of serious infections and also a human commensal. The emergence of community-associated methicillin-resistant S. aureus led to a dramatic increase in skin and soft tissue infections worldwide. This epidemic has been driven by a limited number of clones, such as USA300 in the United States. To better understand the extent of USA300 evolution and diversification within communities, we performed comparative whole-genome sequencing of three clinical and five colonizing USA300 isolates collected longitudinally from three unrelated households over a 15-month period. Phylogenetic analysis that incorporated additional geographically diverse USA300 isolates indicated that all but one likely arose from a common recent ancestor. Although limited genetic adaptation occurred over the study period, the greatest genetic heterogeneity occurred between isolates from different households and within one heavily colonized household. This diversity allowed for a more accurate tracking of interpersonal USA300 transmission. Sequencing of persisting USA300 isolates revealed mutations in genes involved in major aspects of S. aureus function: adhesion, cell wall biosynthesis, virulence, and carbohydrate metabolism. Genetic variations also included accumulation of multiple polymorphisms within select genes of two multigene operons, suggestive of small genome rearrangements rather than de novo single point mutations. Such rearrangements have been underappreciated in S. aureus and may represent novel means of strain variation. Subtle genetic changes may contribute to USA300 fitness and persistence. Elucidation of small genome rearrangements reveals a potentially new and intriguing mechanism of directed S. aureus genome diversification in environmental niches and during pathogen–host interactions. PMID:23104992

  3. [CMACPAR: a modified parallel neurocontroller for control processes].

    PubMed

    Ramos, E; Surós, R

    1999-01-01

    CMACPAR is a parallel neurocontroller oriented to real-time systems such as process control. Its main characteristics are a fast learning algorithm, a reduced number of calculations, great generalization capacity, local learning, and intrinsic parallelism. This type of neurocontroller is used in real-time applications required by refineries, hydroelectric plants, factories, etc. In this work we present the analysis and the parallel implementation of a modified scheme of the cerebellar model CMAC for n-dimensional space projection, using a medium-granularity parallel neurocontroller. The proposed memory management allows for a significant reduction in training time and required memory size.

  4. PFLOTRAN-E4D: A parallel open source PFLOTRAN module for simulating time-lapse electrical resistivity data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Timothy C.; Hammond, Glenn E.; Chen, Xingyuan

    Time-lapse electrical resistivity tomography (ERT) is finding increased application for remotely monitoring processes occurring in the near subsurface in three dimensions (i.e., 4D monitoring). However, there are few codes capable of simulating the evolution of subsurface resistivity and the corresponding tomographic measurements arising from a particular process, particularly in parallel and with an open source license. Herein we describe and demonstrate an electrical resistivity tomography module for the PFLOTRAN subsurface flow and reactive transport simulation code, named PFLOTRAN-E4D. The PFLOTRAN-E4D module operates in parallel using a dedicated set of compute cores in a master-slave configuration. At each time step, the master process receives subsurface states from PFLOTRAN, converts those states to bulk electrical conductivity, and instructs the slave processes to simulate a tomographic data set. The resulting multi-physics simulation capability enables accurate feasibility studies for ERT imaging, the identification of the ERT signatures that are unique to a given process, and facilitates the joint inversion of ERT data with hydrogeological data for subsurface characterization. PFLOTRAN-E4D is demonstrated herein using a field study of stage-driven groundwater/river water interaction ERT monitoring along the Columbia River, Washington, USA. Results demonstrate the complex nature of subsurface electrical conductivity changes, in both the saturated and unsaturated zones, arising from river stage fluctuations and associated river water intrusion into the aquifer. Furthermore, the results also demonstrate the sensitivity of surface-based ERT measurements to those changes over time.
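
    One standard way to convert subsurface states (porosity, water saturation, pore-water conductivity) to bulk electrical conductivity is Archie's law; this is an illustrative petrophysical transform with assumed exponents, not necessarily the mapping PFLOTRAN-E4D implements.

```python
def archie_bulk_conductivity(porosity, saturation, sigma_w, m=2.0, n=2.0):
    """Archie's law: sigma_b = sigma_w * porosity**m * saturation**n,
    with cementation exponent m and saturation exponent n (both assumed)."""
    return sigma_w * porosity ** m * saturation ** n

# River-water intrusion raises saturation, and hence bulk conductivity,
# which is what the time-lapse ERT measurements respond to.
before = archie_bulk_conductivity(0.3, 0.5, 0.05)  # partially saturated
after = archie_bulk_conductivity(0.3, 1.0, 0.05)   # fully saturated
```

    The contrast between `before` and `after` is the kind of conductivity change a feasibility study would test for detectability in the simulated tomograms.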

  5. Ergonomic analyses of downhill skiing.

    PubMed

    Clarys, J P; Publie, J; Zinzen, E

    1994-06-01

    The purpose of this study was to provide electromyographic feedback for (1) pedagogical advice in motor learning, (2) the ergonomics of materials choice and (3) competition. For these purposes: (1) EMG data were collected for the Stem Christie, the Stem Turn and the Parallel Christie (three basic ski initiation drills) and verified for the complexity of patterns; (2) integrated EMG (iEMG) and linear envelopes (LEs) were analysed from standardized positions, motions and slopes using compact, soft and competition skis; (3) in a simulated 'parallel special slalom', the muscular activity pattern and intensity of excavated and flat snow conditions were compared. The EMG data from the three studies were collected on location in the French Alps (Tignes). The analog raw EMG was recorded on the slopes with a portable seven-channel FM recorder (TEAC MR30) and with pre-amplified bipolar surface electrodes supplied with a precision instrumentation amplifier (AD 524, Analog Devices, Norwood, USA). The raw signal was full-wave rectified and enveloped using a moving average principle. This linear envelope was normalized according to the highest peak amplitude procedure per subject and was integrated in order to obtain a reference of muscular intensity. In the three studies and for all subjects (elite skiers: n = 25 in studies 1 and 2, n = 6 in study 3), we found a high level of co-contractions in the lower limb extensors and flexors, especially during the extension phase of the ski movement. The Stem Christie and the Parallel Christie showed higher levels of rhythmic movement (92 and 84%, respectively).(ABSTRACT TRUNCATED AT 250 WORDS)
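
    The signal chain described (full-wave rectification, moving-average linear envelope, normalization to the highest peak, integration to obtain iEMG) can be sketched numerically; the window length, sampling interval, and sample values below are arbitrary illustrations.

```python
import numpy as np

def linear_envelope(raw_emg, window=5):
    """Full-wave rectify, then smooth with a moving average."""
    kernel = np.ones(window) / window
    return np.convolve(np.abs(raw_emg), kernel, mode="same")

def normalize_to_peak(envelope):
    """Normalize to the highest peak amplitude (per subject)."""
    return envelope / envelope.max()

def iemg(envelope, dt):
    """Integrated EMG: area under the envelope (rectangular rule),
    used as a reference of muscular intensity."""
    return float(np.sum(envelope) * dt)

raw = np.array([0.1, -0.4, 0.9, -0.2, 0.3, -0.8])  # toy raw EMG samples
env = normalize_to_peak(linear_envelope(raw, window=3))
intensity = iemg(env, dt=0.01)
```

    Peak normalization makes envelopes comparable across subjects; the iEMG value then ranks muscular intensity between conditions (e.g., excavated versus flat snow).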

  7. Fully Parallel MHD Stability Analysis Tool

    NASA Astrophysics Data System (ADS)

    Svidzinski, Vladimir; Galkin, Sergei; Kim, Jin-Soo; Liu, Yueqiang

    2014-10-01

    Progress on the full parallelization of the plasma stability code MARS will be reported. MARS calculates eigenmodes in 2D axisymmetric toroidal equilibria in MHD-kinetic plasma models. It is a powerful tool for studying MHD and MHD-kinetic instabilities and is widely used by the fusion community. The parallel version of MARS is intended for simulations on local parallel clusters. It will be an efficient tool for simulating MHD instabilities with low, intermediate, and high toroidal mode numbers within both the fluid and kinetic plasma models already implemented in MARS. Parallelization of the code includes parallelizing the construction of the matrix for the eigenvalue problem and parallelizing the inverse-iteration algorithm implemented in MARS for solving the formulated eigenvalue problem. Construction of the matrix is parallelized by distributing the load among processors assigned to different magnetic surfaces. The solution of the eigenvalue problem is parallelized by repeating the steps of the present MARS algorithm using parallel libraries and procedures. Initial results of the code parallelization will be reported. Work is supported by the U.S. DOE SBIR program.
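
    The inverse-iteration algorithm mentioned for the eigenvalue problem can be sketched on a small dense matrix; MARS works on far larger matrices assembled surface by surface, so this shows only the numerical kernel, not the code's implementation.

```python
import numpy as np

def inverse_iteration(A, shift, tol=1e-10, max_iter=100):
    """Return the eigenpair of A whose eigenvalue lies closest to `shift`,
    by repeatedly solving (A - shift*I) y = x and renormalizing."""
    n = A.shape[0]
    M = A - shift * np.eye(n)
    x = np.random.default_rng(0).standard_normal(n)
    x /= np.linalg.norm(x)
    for _ in range(max_iter):
        y = np.linalg.solve(M, x)
        y /= np.linalg.norm(y)
        # Converged when the direction stops changing (up to sign).
        if np.linalg.norm(y - np.sign(y @ x) * x) < tol:
            x = y
            break
        x = y
    return x @ A @ x, x  # Rayleigh quotient and eigenvector

A = np.diag([1.0, 3.0, 10.0])
val, vec = inverse_iteration(A, shift=2.5)  # converges to eigenvalue 3.0
```

    The shifted solve is also where parallelism pays off in practice: each iteration is a linear solve, which parallel libraries can distribute.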

  8. Research in Parallel Algorithms and Software for Computational Aerosciences

    NASA Technical Reports Server (NTRS)

    Domel, Neal D.

    1996-01-01

    Phase I is complete for the development of a Computational Fluid Dynamics parallel code with automatic grid generation and adaptation for the Euler analysis of flow over complex geometries. SPLITFLOW, an unstructured Cartesian grid code developed at Lockheed Martin Tactical Aircraft Systems, has been modified for a distributed memory/massively parallel computing environment. The parallel code is operational on an SGI network, Cray J90 and C90 vector machines, SGI Power Challenge, and Cray T3D and IBM SP2 massively parallel machines. Parallel Virtual Machine (PVM) is the message passing protocol for portability to various architectures. A domain decomposition technique was developed which enforces dynamic load balancing to improve solution speed and memory requirements. A host/node algorithm distributes the tasks. The solver parallelizes very well, and scales with the number of processors. Partially parallelized and non-parallelized tasks consume most of the wall clock time in a very fine grain environment. Timing comparisons on a Cray C90 demonstrate that Parallel SPLITFLOW runs 2.4 times faster on 8 processors than its non-parallel counterpart autotasked over 8 processors.

  10. Parallel line analysis: multifunctional software for the biomedical sciences

    NASA Technical Reports Server (NTRS)

    Swank, P. R.; Lewis, M. L.; Damron, K. L.; Morrison, D. R.

    1990-01-01

    An easy to use, interactive FORTRAN program for analyzing the results of parallel line assays is described. The program is menu driven and consists of five major components: data entry, data editing, manual analysis, manual plotting, and automatic analysis and plotting. Data can be entered from the terminal or from previously created data files. The data editing portion of the program is used to inspect and modify data and to statistically identify outliers. The manual analysis component is used to test the assumptions necessary for parallel line assays using analysis of covariance techniques and to determine potency ratios with confidence limits. The manual plotting component provides a graphic display of the data on the terminal screen or on a standard line printer. The automatic portion runs through multiple analyses without operator input. Data may be saved in a special file to expedite input at a future time.
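
    The core parallel-line computation, fitting the standard and test dose-response lines with a common slope (the parallelism assumption) and reading the potency ratio off the horizontal shift, can be sketched in Python rather than the program's FORTRAN; the function name and synthetic data are illustrative.

```python
import numpy as np

def parallel_line_potency(logdose_std, y_std, logdose_test, y_test):
    """Fit both dose-response lines with separate intercepts but one common
    slope, then return the potency ratio implied by the horizontal shift
    between the lines (doses assumed on a log10 scale)."""
    n_s = len(y_std)
    X = np.zeros((n_s + len(y_test), 3))
    X[:n_s, 0] = 1.0                                       # standard intercept
    X[n_s:, 1] = 1.0                                       # test intercept
    X[:, 2] = np.concatenate([logdose_std, logdose_test])  # common slope
    a_s, a_t, b = np.linalg.lstsq(X, np.concatenate([y_std, y_test]),
                                  rcond=None)[0]
    return 10 ** ((a_t - a_s) / b)

# Synthetic assay in which the test preparation is twice as potent:
# its line is shifted by log10(2) relative to the standard.
d = np.array([0.0, 0.5, 1.0, 1.5])
potency = parallel_line_potency(d, 2 + 3 * d, d, 2 + 3 * (d + np.log10(2)))
```

    A full parallel-line analysis, as the abstract notes, also tests the parallelism and linearity assumptions (analysis of covariance) and attaches confidence limits to the potency ratio; this sketch covers only the point estimate.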

  11. Emergence of the Epidemic Methicillin-Resistant Staphylococcus aureus Strain USA300 Coincides with Horizontal Transfer of the Arginine Catabolic Mobile Element and speG-mediated Adaptations for Survival on Skin

    PubMed Central

    Planet, Paul J.; LaRussa, Samuel J.; Dana, Ali; Smith, Hannah; Xu, Amy; Ryan, Chanelle; Uhlemann, Anne-Catrin; Boundy, Sam; Goldberg, Julia; Narechania, Apurva; Kulkarni, Ritwij; Ratner, Adam J.; Geoghegan, Joan A.; Kolokotronis, Sergios-Orestis; Prince, Alice

    2013-01-01

    The arginine catabolic mobile element (ACME) is the largest genomic region distinguishing epidemic USA300 strains of methicillin-resistant Staphylococcus aureus (MRSA) from other S. aureus strains. However, the functional relevance of ACME to infection and disease has remained unclear. Using phylogenetic analysis, we have shown that the modular segments of ACME were assembled into a single genetic locus in Staphylococcus epidermidis and then horizontally transferred to the common ancestor of USA300 strains in an extremely recent event. Acquisition of one ACME gene, speG, allowed USA300 strains to withstand levels of polyamines (e.g., spermidine) produced in skin that are toxic to other closely related S. aureus strains. speG-mediated polyamine tolerance also enhanced biofilm formation, adherence to fibrinogen/fibronectin, and resistance to antibiotic and keratinocyte-mediated killing. We suggest that these properties gave USA300 a major selective advantage during skin infection and colonization, contributing to the extraordinary evolutionary success of this clone. PMID:24345744

  12. Thermal anomaly mapping from night MODIS imagery of USA, a tool for environmental assessment.

    PubMed

    Miliaresis, George Ch

    2013-02-01

A method is presented for elevation, latitude, and longitude decorrelation stretch of multi-temporal MODIS MYD11C3 imagery (monthly average night land surface temperature (LST) across the USA and Mexico). Multiple linear regression analysis of principal component (PCA) images quantifies the variance explained by elevation (H), latitude (LAT), and longitude (LON). The multi-temporal LST imagery is reconstructed from the residual images and selected PCAs, taking into account the portion of variance that is not related to H, LAT, and LON. The reconstructed imagery shows the magnitude by which the standardized LST value of each pixel deviates from the value predicted by H, LAT, and LON. An LST anomaly is defined as a region with either positive or negative reconstructed LST values. The environmental assessment of the USA indicated that LST is predicted by H, LAT, and LON for only 25% of the study area (the Mississippi drainage basin). Regions with a milder climatic pattern were identified on the West Coast, while the coldest climatic pattern is observed for the mid USA. Positive season-invariant LST anomalies are identified in the SW (Arizona, Sierra Nevada, etc.) and NE USA.
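The residual-reconstruction idea in the abstract above, removing the part of LST that H, LAT, and LON can predict and keeping the rest as the anomaly, can be sketched with ordinary least squares. This is a simplified stand-in that skips the PCA and decorrelation-stretch steps; the function name and the flat per-pixel arrays are illustrative assumptions.

```python
import numpy as np

def lst_anomaly(lst, h, lat, lon):
    """Residual LST after removing a linear elevation/latitude/longitude
    trend. Pixels where the residual is far from zero are the anomalies;
    pixels near zero are well predicted by H, LAT, LON."""
    X = np.column_stack([np.ones_like(h), h, lat, lon])
    coef, *_ = np.linalg.lstsq(X, lst, rcond=None)
    return lst - X @ coef
```

If the LST field is exactly a linear function of elevation, the residual is zero everywhere, i.e. no anomaly.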

  13. CFD Analysis and Design Optimization Using Parallel Computers

    NASA Technical Reports Server (NTRS)

    Martinelli, Luigi; Alonso, Juan Jose; Jameson, Antony; Reuther, James

    1997-01-01

    A versatile and efficient multi-block method is presented for the simulation of both steady and unsteady flow, as well as aerodynamic design optimization of complete aircraft configurations. The compressible Euler and Reynolds Averaged Navier-Stokes (RANS) equations are discretized using a high resolution scheme on body-fitted structured meshes. An efficient multigrid implicit scheme is implemented for time-accurate flow calculations. Optimum aerodynamic shape design is achieved at very low cost using an adjoint formulation. The method is implemented on parallel computing systems using the MPI message passing interface standard to ensure portability. The results demonstrate that, by combining highly efficient algorithms with parallel computing, it is possible to perform detailed steady and unsteady analysis as well as automatic design for complex configurations using the present generation of parallel computers.

  14. Vectorization and parallelization of the finite strip method for dynamic Mindlin plate problems

    NASA Technical Reports Server (NTRS)

    Chen, Hsin-Chu; He, Ai-Fang

    1993-01-01

    The finite strip method is a semi-analytical finite element process which allows for a discrete analysis of certain types of physical problems by discretizing the domain of the problem into finite strips. This method decomposes a single large problem into m smaller independent subproblems when m harmonic functions are employed, thus yielding natural parallelism at a very high level. In this paper we address vectorization and parallelization strategies for the dynamic analysis of simply-supported Mindlin plate bending problems and show how to prevent potential conflicts in memory access during the assemblage process. The vector and parallel implementations of this method and the performance results of a test problem under scalar, vector, and vector-concurrent execution modes on the Alliant FX/80 are also presented.

  15. Module Six: Parallel Circuits; Basic Electricity and Electronics Individualized Learning System.

    ERIC Educational Resources Information Center

    Bureau of Naval Personnel, Washington, DC.

    In this module the student will learn the rules that govern the characteristics of parallel circuits; the relationships between voltage, current, resistance and power; and the results of common troubles in parallel circuits. The module is divided into four lessons: rules of voltage and current, rules for resistance and power, variational analysis,…

  16. TECA: A Parallel Toolkit for Extreme Climate Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prabhat, Mr; Ruebel, Oliver; Byna, Surendra

    2012-03-12

    We present TECA, a parallel toolkit for detecting extreme events in large climate datasets. Modern climate datasets expose parallelism across a number of dimensions: spatial locations, timesteps and ensemble members. We design TECA to exploit these modes of parallelism and demonstrate a prototype implementation for detecting and tracking three classes of extreme events: tropical cyclones, extra-tropical cyclones and atmospheric rivers. We process a modern TB-sized CAM5 simulation dataset with TECA, and demonstrate good runtime performance for the three case studies.

  17. Kinematic Analysis and Performance Evaluation of Novel PRS Parallel Mechanism

    NASA Astrophysics Data System (ADS)

    Balaji, K.; Khan, B. Shahul Hamid

    2018-02-01

In this paper, a novel 3-DoF (degree of freedom) PRS (prismatic-revolute-spherical) parallel mechanism has been designed and presented. The combination of straight and arc type linkages for a 3-DoF parallel mechanism is introduced for the first time. The performance of the mechanisms is evaluated based on indices such as Minimum Singular Value (MSV), Condition Number (CN), Local Conditioning Index (LCI), Kinematic Configuration Index (KCI), and Global Conditioning Index (GCI). The overall reachable workspace of all mechanisms is presented. The kinematic measure, dexterity measure, and workspace analysis for all the mechanisms have been evaluated and compared.

  18. Applications and accuracy of the parallel diagonal dominant algorithm

    NASA Technical Reports Server (NTRS)

    Sun, Xian-He

    1993-01-01

The Parallel Diagonal Dominant (PDD) algorithm is a highly efficient, ideally scalable tridiagonal solver. In this paper, a detailed study of the PDD algorithm is given. First the PDD algorithm is introduced. Then the algorithm is extended to solve periodic tridiagonal systems. A variant, the reduced PDD algorithm, is also proposed. Accuracy analysis is provided for a class of tridiagonal systems, the symmetric and antisymmetric Toeplitz tridiagonal systems. Implementation results show that the analysis gives a good bound on the relative error, and that the algorithm is a good candidate for the emerging massively parallel machines.
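The per-partition building block of a solver like PDD is an ordinary serial tridiagonal solve. The Python sketch below shows the classic Thomas algorithm only; PDD itself (the partitioning, the small reduced interface system, and the off-block terms it drops under diagonal dominance, which is the source of both its scalability and its bounded error) is not reproduced here.

```python
def thomas(a, b, c, d):
    """Serial tridiagonal solve (Thomas algorithm).
    a: sub-diagonal (len n-1), b: diagonal (len n),
    c: super-diagonal (len n-1), d: right-hand side (len n).
    A PDD-style solver applies a solve like this independently to
    each partition of the system, then patches the interfaces."""
    n = len(b)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0] = c[0] / b[0] if n > 1 else 0.0
    dp[0] = d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i - 1] * cp[i - 1]       # eliminate sub-diagonal
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i - 1] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):            # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x
```

For the diagonally dominant system with diagonal 2 and off-diagonals 1, the right-hand side [3, 4, 3] yields the solution [1, 1, 1].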

  19. FPGA-Based Filterbank Implementation for Parallel Digital Signal Processing

    NASA Technical Reports Server (NTRS)

    Berner, Stephan; DeLeon, Phillip

    1999-01-01

One approach to parallel digital signal processing decomposes a high bandwidth signal into multiple lower bandwidth (rate) signals by an analysis bank. After processing, the subband signals are recombined into a fullband output signal by a synthesis bank. This paper describes an implementation of the analysis and synthesis banks using Field Programmable Gate Arrays (FPGAs).

  20. Parallel Analysis with Unidimensional Binary Data

    ERIC Educational Resources Information Center

    Weng, Li-Jen; Cheng, Chung-Ping

    2005-01-01

    The present simulation investigated the performance of parallel analysis for unidimensional binary data. Single-factor models with 8 and 20 indicators were examined, and sample size (50, 100, 200, 500, and 1,000), factor loading (.45, .70, and .90), response ratio on two categories (50/50, 60/40, 70/30, 80/20, and 90/10), and types of correlation…
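The method being simulated in the record above, Horn's parallel analysis, has a simple recipe: retain factors whose observed correlation-matrix eigenvalues exceed the eigenvalues expected from random data of the same shape. A minimal numpy sketch follows; it uses Pearson correlations and the mean random eigenvalue as the threshold, whereas studies of binary data (as here) may substitute tetrachoric correlations or a percentile criterion.

```python
import numpy as np

def parallel_analysis(data, n_sims=200, seed=0):
    """Horn's parallel analysis: number of components whose eigenvalues
    exceed the mean eigenvalues from random normal data of the same
    (n, p) shape. Returns (k, real_eigs, random_thresholds)."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    real = np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]
    sims = np.empty((n_sims, p))
    for s in range(n_sims):
        r = rng.standard_normal((n, p))
        sims[s] = np.sort(np.linalg.eigvalsh(np.corrcoef(r, rowvar=False)))[::-1]
    thresh = sims.mean(axis=0)
    return int(np.sum(real > thresh)), real, thresh
```

On single-factor data with strong loadings, the first eigenvalue dominates and exactly one component survives the comparison, which is the behavior the simulation study evaluates across loadings and response ratios.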

  1. Parallel versus Serial Processing Dependencies in the Perisylvian Speech Network: A Granger Analysis of Intracranial EEG Data

    ERIC Educational Resources Information Center

    Gow, David W., Jr.; Keller, Corey J.; Eskandar, Emad; Meng, Nate; Cash, Sydney S.

    2009-01-01

    In this work, we apply Granger causality analysis to high spatiotemporal resolution intracranial EEG (iEEG) data to examine how different components of the left perisylvian language network interact during spoken language perception. The specific focus is on the characterization of serial versus parallel processing dependencies in the dominant…

  2. An Analysis of the Role of ATC in the AILS Concept

    NASA Technical Reports Server (NTRS)

    Waller, Marvin C.; Doyle, Thomas M.; McGee, Frank G.

    2000-01-01

Airborne information for lateral spacing (AILS) is a concept for making approaches to closely spaced parallel runways in instrument meteorological conditions (IMC). Under the concept, each equipped aircraft will assume responsibility for accurately managing its flight path along the approach course and maintaining separation from aircraft on the parallel approach. This document presents the results of an analysis of the AILS concept from an Air Traffic Control (ATC) perspective. The process has been examined in a step-by-step manner to determine the ATC system support necessary to safely conduct closely spaced parallel approaches using the AILS concept. The analysis identified a number of issues related to integrating the process into the airspace system and proposes operating procedures.

  3. Quantifying scaling effects on satellite-derived forest area estimates for the conterminous USA

    Treesearch

    Daolan Zheng; L.S. Heath; M.J. Ducey; J.E. Smith

    2009-01-01

    We quantified the scaling effects on forest area estimates for the conterminous USA using regression analysis and the National Land Cover Dataset 30m satellite-derived maps in 2001 and 1992. The original data were aggregated to: (1) broad cover types (forest vs. non-forest); and (2) coarser resolutions (1km and 10 km). Standard errors of the model estimates were 2.3%...

  4. Environmental justice and U.S. Forest Service hazardous fuels reduction: A spatial method for impact assessment of federal resource management actions

    Treesearch

    Mark D.O. Adams; Susan Charnley

    2018-01-01

    Natural resource managers of federal lands in the USA are often tasked with various forms of social and economic impact analysis. Federal agencies in the USA also have a mandate to analyze the potential environmental justice consequences of their activities. Relatively little is known about the environmental justice impacts of natural resource management in rural areas...

  5. Family Size Preferences in Europe and USA: Ultimate Expected Number of Children. Comparative Studies Number 26: ECE Analyses of Surveys in Europe and USA.

    ERIC Educational Resources Information Center

    Berent, Jerzy

    This survey analysis compares fertility levels in the United States and European countries, discusses socioeconomic influences in ultimate expected family size, and examines birth rate trends. The average number of ultimately expected children varies from 2.13 children per woman in Bulgaria to 2.80 in Spain. Eighty to 90 percent of U.S. and…

  6. Public health, autonomous automobiles, and the rush to market.

    PubMed

    Kelley, Ben

    2017-05-01

    The USA has the worst motor vehicle safety problem among high-income countries and is pressing forward with the development of autonomous automobiles to address it. Government guidance and regulation, still inadequate, will be critical to the safety of the public. The analysis of this public health problem in the USA reveals the key factors that will determine the benefits and risks of autonomous vehicles around the world.

  7. CACDA JIFFY III War Game. Volume II. Methodology

    DTIC Science & Technology

    1980-09-01

Devens, MA 01433; Commandant, USA Air Defense School, ATTN: ATSA-CD-SC-S, Fort Bliss, TX 79916; Commandant, USA Intelligence Center and School, Fort Huachuca...RELEASE: DISTRIBUTION UNLIMITED. Technical Report TR 6-80, September 1980, US Army Combined Arms Studies and Analysis Activity, Fort ...manual war game developed and operated at the USATRADOC Combined Arms Combat Developments Activity (CACDA), Fort Leavenworth, Kansas, for scenario

  8. Characteristics of breast cancer in Central China, literature review and comparison with USA.

    PubMed

    Chen, Chuang; Sun, Si; Yuan, Jing-Ping; Wang, Yao-Huai; Cao, Tian-Ze; Zheng, Hong-Mei; Jiang, Xue-Qing; Gong, Yi-Ping; Tu, Yi; Yao, Feng; Hu, Ming-Bai; Li, Juan-Juan; Sun, Sheng-Rong; Wei, Wen

    2016-12-01

This work aimed to analyze the characteristics of breast cancer (BC) in Central China, summarize the main characteristics in China, and compare them with the USA. Main BC characteristics from four hospitals in Central China from 2002 to 2012 were collected and analyzed. All long-term, large-scale clinical reports covering at least ten years were selected and summarized to calculate the BC characteristics of China. BC characteristics in the USA were selected from the database of the Surveillance, Epidemiology, and End Results (SEER) Program. The age distribution in Central China was normal, with one age peak at 45-49 years, differing from the USA and Chinese Americans, which show two age peaks. BC characteristics in Central China displayed distinct features from the USA and Chinese Americans, including a significantly younger onset age and lower proportions of patients with stage I disease, negative lymph nodes, small tumor size, and ER positivity. A total of ten long-term, large-scale clinical reports were selected for the analysis of BC characteristics in Mainland China. A total of 53,571 BC patients were enrolled from 1995 to 2012. The main characteristics of BC in Mainland China were similar to those in Central China, but significantly different from developed regions of China (Hong Kong and Taiwan), the USA, and Chinese Americans. BC characteristics in Central China displayed representative patterns of Mainland China, while showing distinct patterns from Chinese patients in other developed areas and the USA. Copyright © 2016 Elsevier Ltd. All rights reserved.

  9. Parallel Hybrid Gas-Electric Geared Turbofan Engine Conceptual Design and Benefits Analysis

    NASA Technical Reports Server (NTRS)

    Lents, Charles; Hardin, Larry; Rheaume, Jonathan; Kohlman, Lee

    2016-01-01

    The conceptual design of a parallel gas-electric hybrid propulsion system for a conventional single aisle twin engine tube and wing vehicle has been developed. The study baseline vehicle and engine technology are discussed, followed by results of the hybrid propulsion system sizing and performance analysis. The weights analysis for the electric energy storage & conversion system and thermal management system is described. Finally, the potential system benefits are assessed.

  10. How pattern is selected in drift wave turbulence: Role of parallel flow shear

    NASA Astrophysics Data System (ADS)

    Kosuga, Y.

    2017-12-01

The role of parallel shear flow in the pattern selection problem in drift wave turbulence is discussed. Patterns of interest here are E × B convective cells, which include poloidally symmetric zonal flows and radially elongated streamers. The competition between zonal flow formation and streamer formation is analyzed in the context of a modulational instability analysis, with the parallel flow shear as a parameter. For drift wave turbulence with k⊥ρs ≲ O(1) and without parallel flow coupling, zonal flows are the preferred structures. As the magnitude of the parallel flow shear increases, streamer growth overcomes zonal flow growth. This is because the self-focusing effect of the modulational instability becomes more effective for streamers through density and parallel velocity modulation. As a consequence, a bursty release of free energy may result as the parallel flow shear increases.

  11. Is there a real-estate bubble in the US?

    NASA Astrophysics Data System (ADS)

    Zhou, Wei-Xing; Sornette, Didier

    2006-02-01

    Using a methodology developed in previous papers, we analyze the quarterly average sale prices of new houses sold in the USA as a whole, in the Northeast, Midwest, South, and West of the USA, in each of the 50 states and the District of Columbia of the USA, to determine whether they have grown at a faster-than-exponential rate which we take as the diagnostic of a bubble. We find that 22 states (mostly Northeast and West) exhibit clear-cut signatures of a fast-growing bubble. From the analysis of the S&P 500 Home Index, we conclude that the turning point of the bubble will probably occur around mid-2006.

  12. The age of fathers in the USA is rising: an analysis of 168 867 480 births from 1972 to 2015.

    PubMed

    Khandwala, Yash S; Zhang, Chiyuan A; Lu, Ying; Eisenberg, Michael L

    2017-10-01

    How has the mean paternal age in the USA changed over the past 4 decades? The age at which men are fathering children in the USA has been increasing over time, although it varies by race, geographic region and paternal education level. While the rise in mean maternal age and its implications for fertility, birth outcomes and public health have been well documented, little is known about paternal characteristics of births within the USA. A retrospective data analysis of paternal age and reporting patterns for 168 867 480 live births within the USA since 1972 was conducted. All live births within the USA collected through the National Vital Statistics System (NVSS) of the Centers for Disease Control and Prevention (CDC) were evaluated. Inverse probability weighting (IPW) was used to reduce bias due to missing paternal records. Mean paternal age has increased over the past 44 years from 27.4 to 30.9 years. College education and Northeastern birth states were associated with higher paternal age. Racial/ethnic differences were also identified, whereby Asian fathers were the oldest and Black fathers were the youngest. The parental age difference (paternal age minus maternal age) has decreased over the past 44 years. Births to Black and Native American mothers were most often lacking paternal data, implying low paternal reporting. Paternal reporting was higher for older and more educated women. Although we utilized IPW to reduce the impact of paternal reporting bias, our estimates may still be influenced by the missing data in the NVSS. Paternal age is rising within the USA among all regions, races and education levels. Given the implications for offspring health and demographic patterns, further research on this trend is warranted. No funding was received for this study and there are no competing interests. N/A. © The Author 2017. Published by Oxford University Press on behalf of the European Society of Human Reproduction and Embryology. All rights reserved. 
For Permissions, please e-mail: journals.permissions@oup.com
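The inverse probability weighting (IPW) used in the record above to reduce bias from missing paternal records can be illustrated with a small sketch: estimate the probability that a record is reported within each stratum, then weight each observed value by the inverse of that probability. The record format and function name below are hypothetical; the NVSS analysis models reporting probability far more richly than this per-stratum frequency.

```python
from collections import defaultdict

def ipw_mean(records):
    """IPW estimate of a mean when some values are missing.
    records: iterable of (stratum, value_or_None) pairs. Each observed
    value is weighted by 1 / P(reported | stratum), so strata with low
    reporting (and their typically different values) count more."""
    total, reported = defaultdict(int), defaultdict(int)
    for stratum, value in records:
        total[stratum] += 1
        if value is not None:
            reported[stratum] += 1
    num = den = 0.0
    for stratum, value in records:
        if value is None:
            continue
        w = total[stratum] / reported[stratum]  # inverse reporting probability
        num += w * value
        den += w
    return num / den
```

If stratum B reports only half its records, each observed B value gets weight 2, pulling the estimate toward what a complete-data mean would have been (35 in the toy data below, versus a naive observed mean of about 33.3).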

  13. Available Tools to Facilitate Early Patient Access to Medicines in the EU and the USA: Analysis of Conditional Approvals and the Implications for Personalized Medicine.

    PubMed

    Leyens, Lada; Richer, Étienne; Melien, Øyvind; Ballensiefen, Wolfgang; Brand, Angela

    2015-01-01

Until recently, the limits of scientific knowledge and of our understanding of the human body and diseases constrained any tailoring of treatment to the individual patient. The technological advances enabling the integration of various data sets (e.g. '-omics', microbiome, epigenetics and environmental exposure) have facilitated a greater understanding of the human body, the molecular basis of disease and all the factors influencing disease onset, progression and response to treatment, thereby ushering in the era of personalized medicine. We evaluate the regulatory approaches available to facilitate early patient access to efficacious and safe compounds in the EU and the USA in order to make more informed recommendations in the future as to the gaps in regulations for early patient access. An in-depth analysis of conditional approvals (EU) and accelerated approvals (USA) is performed based on publicly available information (European public assessment reports and a summary review of products approved under both programmes). The types of product, indications, time to approval and type of evidence submitted were analysed. Between 2007 and early 2015, 17 products were conditionally approved in the EU and 25 in the USA, most of them in the area of oncology and based on evidence from phase II clinical trial data. Early approval of promising products based on data from early phases of development is already possible in the EU and the USA. Some of the improvements could entail implementing a rolling assessment of evidence in Europe and extending the scope of early dialogues. © 2015 S. Karger AG, Basel.

  14. Genetic variation in Southern USA rice genotypes for seedling salinity tolerance

    PubMed Central

    De Leon, Teresa B.; Linscombe, Steven; Gregorio, Glenn; Subudhi, Prasanta K.

    2015-01-01

    The success of a rice breeding program in developing salt tolerant varieties depends on genetic variation and the salt stress response of adapted and donor rice germplasm. In this study, we used a combination of morphological and physiological traits in multivariate analyses to elucidate the phenotypic and genetic variation in salinity tolerance of 30 Southern USA rice genotypes, along with 19 donor genotypes with varying degree of tolerance. Significant genotypic variation and correlations were found among the salt injury score (SIS), ion leakage, chlorophyll reduction, shoot length reduction, shoot K+ concentration, and shoot Na+/K+ ratio. Using these parameters, the combined methods of cluster analysis and discriminant analysis validated the salinity response of known genotypes and classified most of the USA varieties into sensitive groups, except for three and seven varieties placed in the tolerant and moderately tolerant groups, respectively. Discriminant function and MANOVA delineated the differences in tolerance and suggested no differences between sensitive and highly sensitive (HS) groups. DNA profiling using simple sequence repeat markers showed narrow genetic diversity among USA genotypes. However, the overall genetic clustering was mostly due to subspecies and grain type differentiation and not by varietal grouping based on salinity tolerance. Among the donor genotypes, Nona Bokra, Pokkali, and its derived breeding lines remained the donors of choice for improving salinity tolerance during the seedling stage. However, due to undesirable agronomic attributes and photosensitivity of these donors, alternative genotypes such as TCCP266, Geumgangbyeo, and R609 are recommended as useful and novel sources of salinity tolerance for USA rice breeding programs. PMID:26074937

  15. Observations of large parallel electric fields in the auroral ionosphere

    NASA Technical Reports Server (NTRS)

    Mozer, F. S.

    1976-01-01

Rocket-borne measurements employing a double-probe technique were used to gather evidence for the existence of electric fields in the auroral ionosphere having components parallel to the magnetic field direction. An analysis of possible experimental errors leads to the conclusion that no known uncertainties can account for the roughly 10 mV/m parallel electric fields that are observed.

  16. Cloud parallel processing of tandem mass spectrometry based proteomics data.

    PubMed

    Mohammed, Yassene; Mostovenko, Ekaterina; Henneman, Alex A; Marissen, Rob J; Deelder, André M; Palmblad, Magnus

    2012-10-05

Data analysis in mass spectrometry-based proteomics struggles to keep pace with the advances in instrumentation and the increasing rate of data acquisition. Analyzing these data involves multiple steps requiring diverse software, using different algorithms and data formats. The speed and performance of mass spectral search engines are continuously improving, although not necessarily fast enough to meet the challenges of the big data being acquired. Improving and parallelizing the search algorithms is one possibility; data decomposition presents another, simpler strategy for introducing parallelism. We describe a general method for parallelizing the identification of tandem mass spectra using data decomposition that keeps the search engine intact and wraps the parallelization around it. We introduce two algorithms for decomposing mzXML files and recomposing the resulting pepXML files. This makes the approach applicable to different search engines, including those relying on sequence databases and those searching spectral libraries. We use cloud computing to deliver the computational power and scientific workflow engines to interface and automate the different processing steps. We show how to leverage these technologies to achieve faster data analysis in proteomics and present three scientific workflows for parallel database as well as spectral library search using our data decomposition programs, X!Tandem and SpectraST.
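The decompose/recompose pattern described above, splitting the input so an unmodified search engine can process parts in parallel and then merging the results, can be sketched schematically. This is not the paper's mzXML/pepXML tooling; spectra are represented as plain list items and the function names are illustrative.

```python
def decompose(spectra, n_parts):
    """Split a list of spectra into nearly equal, contiguous parts
    (schematic stand-in for mzXML decomposition). Each part can then
    be handed to a black-box search engine in parallel."""
    q, r = divmod(len(spectra), n_parts)
    parts, start = [], 0
    for i in range(n_parts):
        size = q + (1 if i < r else 0)   # first r parts get one extra item
        parts.append(spectra[start:start + size])
        start += size
    return parts

def recompose(results):
    """Concatenate per-part search results back into a single list
    (the pepXML recomposition step, schematically)."""
    return [hit for part in results for hit in part]
```

Because the search engine is treated as a black box, each part could be dispatched with, say, `concurrent.futures` or a workflow engine, and `recompose` restores the original ordering.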

  17. A visual parallel-BCI speller based on the time-frequency coding strategy.

    PubMed

    Xu, Minpeng; Chen, Long; Zhang, Lixin; Qi, Hongzhi; Ma, Lan; Tang, Jiabei; Wan, Baikun; Ming, Dong

    2014-04-01

Spelling is one of the most important issues in brain-computer interface (BCI) research. This paper develops a visual parallel-BCI speller system based on a time-frequency coding strategy, in which the sub-speller switching among four simultaneously presented sub-spellers and the character selection are identified in a parallel mode. The parallel-BCI speller was constituted by four independent P300+SSVEP-B (P300 plus SSVEP blocking) spellers with different flicker frequencies, so that every character had a specific time-frequency code. To verify its effectiveness, 11 subjects were involved in offline and online spellings. A classification strategy was designed to recognize the target character by jointly using canonical correlation analysis and stepwise linear discriminant analysis. Online spellings showed that the proposed parallel-BCI speller had a high performance, reaching a highest information transfer rate of 67.4 bit min(-1), with averages of 54.0 bit min(-1) and 43.0 bit min(-1) in the three-round and five-round conditions, respectively. The results indicated that the proposed parallel BCI could be effectively controlled by users, with attention shifting fluently among the sub-spellers, and substantially improved BCI spelling performance.
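The frequency-identification half of such a classifier can be sketched simply. For a single channel, canonical correlation analysis against a sine/cosine reference set reduces to a multiple correlation (the square root of the R² from regressing the signal on the references), so the sketch below picks the flicker frequency whose references best explain the signal. It is a hedged illustration, not the paper's pipeline, which combines multichannel CCA with stepwise linear discriminant analysis.

```python
import numpy as np

def ssvep_frequency(signal, freqs, fs, n_harm=2):
    """Return the candidate flicker frequency whose sine/cosine
    reference set (fundamental plus harmonics) best fits the signal,
    scored by the R^2 of a least-squares regression."""
    t = np.arange(len(signal)) / fs
    y = signal - signal.mean()
    best, best_r2 = None, -1.0
    for f in freqs:
        refs = [fn(2 * np.pi * h * f * t)
                for h in range(1, n_harm + 1)
                for fn in (np.sin, np.cos)]
        X = np.column_stack(refs)
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ coef
        r2 = 1 - (resid @ resid) / (y @ y)
        if r2 > best_r2:
            best, best_r2 = f, r2
    return best
```

One second of a pure 10 Hz oscillation sampled at 250 Hz is assigned to the 10 Hz candidate, since the competing references are nearly orthogonal to it over whole cycles.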

  18. Xyce™ Parallel Electronic Simulator Users' Guide, Version 6.5.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keiter, Eric R.; Aadithya, Karthik V.; Mei, Ting

This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator, and has been written to support the simulation needs of the Sandia National Laboratories electrical designers. This development has focused on improving capability over the current state of the art in the following areas: the capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors), including support for most popular parallel and serial computers; a differential-algebraic-equation (DAE) formulation, which better isolates the device model package from solver algorithms and allows one to develop new types of analysis without requiring the implementation of analysis-specific device models; device models that are specifically tailored to meet Sandia's needs, including some radiation-aware devices (for Sandia users only); and object-oriented code design and implementation using modern coding practices. Xyce is a parallel code in the most general sense of the phrase -- a message passing parallel implementation -- which allows it to run efficiently on a wide range of computing platforms. These include serial, shared-memory, and distributed-memory parallel platforms. Attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows. The information herein is subject to change without notice. Copyright © 2002-2016 Sandia Corporation. All rights reserved.

  19. GPU accelerated dynamic functional connectivity analysis for functional MRI data.

    PubMed

    Akgün, Devrim; Sakoğlu, Ünal; Esquivel, Johnny; Adinoff, Bryon; Mete, Mutlu

    2015-07-01

Recent advances in multi-core processors and graphics-card-based computational technologies have paved the way for an improved and dynamic utilization of parallel computing techniques. Numerous applications have been implemented for the acceleration of computationally intensive problems in various computational science fields, including bioinformatics, in which big data problems are prevalent. In neuroimaging, dynamic functional connectivity (DFC) analysis is a computationally demanding method used to investigate dynamic functional interactions among different brain regions or networks identified with functional magnetic resonance imaging (fMRI) data. In this study, we implemented and analyzed a parallel DFC algorithm based on thread-based and block-based approaches. The thread-based approach was designed to parallelize DFC computations and was implemented in both the Open Multi-Processing (OpenMP) and Compute Unified Device Architecture (CUDA) programming platforms. Another approach developed in this study to better utilize the CUDA architecture is the block-based approach, where parallelization involves smaller parts of fMRI time-courses obtained by sliding windows. Experimental results showed that the proposed parallel design solutions enabled by the GPUs significantly reduce the computation time for DFC analysis. The multicore implementation using OpenMP on an 8-core processor provides up to a 7.7× speed-up. The GPU implementation using CUDA yielded substantial accelerations, ranging from an 18.5× to a 157× speed-up once the thread-based and block-based approaches were combined in the analysis. The proposed parallel programming solutions showed that multi-core processor and CUDA-supported GPU implementations accelerate DFC analyses significantly, making them more practical for multi-subject studies and more dynamic analyses. Copyright © 2015 Elsevier Ltd. All rights reserved.
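The sliding-window computation at the heart of DFC analysis is worth seeing concretely: each window's correlation is independent of every other window, which is exactly why the thread- and block-based GPU mappings above work. The numpy sketch below is a serial reference for one region pair; the function name and parameters are illustrative, not the paper's API.

```python
import numpy as np

def sliding_window_fc(x, y, win, step=1):
    """Dynamic functional connectivity between two fMRI time-courses:
    Pearson correlation within each sliding window. Every window is
    independent, so the loop is embarrassingly parallel across windows
    (and across region pairs) on multicore CPUs or GPUs."""
    out = []
    for start in range(0, len(x) - win + 1, step):
        xs, ys = x[start:start + win], y[start:start + win]
        out.append(np.corrcoef(xs, ys)[0, 1])
    return np.array(out)
```

For two identical linearly increasing time-courses, every window yields a correlation of 1, and a length-T series with window `win` and unit step produces T − win + 1 values.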

  20. Analysis of multigrid methods on massively parallel computers: Architectural implications

    NASA Technical Reports Server (NTRS)

    Matheson, Lesley R.; Tarjan, Robert E.

    1993-01-01

We study the potential performance of multigrid algorithms running on massively parallel computers with the intent of discovering whether presently envisioned machines will provide an efficient platform for such algorithms. We consider the domain parallel version of the standard V cycle algorithm on model problems, discretized using finite difference techniques in two and three dimensions on block structured grids of size 10^6 and 10^9, respectively. Our models of parallel computation were developed to reflect the computing characteristics of the current generation of massively parallel multicomputers. These models are based on an interconnection network of 256 to 16,384 message passing, 'workstation size' processors executing in an SPMD mode. The first model accomplishes interprocessor communications through a multistage permutation network. The communication cost is a logarithmic function which is similar to the costs in a variety of different topologies. The second model allows single stage communication costs only. Both models were designed with information provided by machine developers and utilize implementation derived parameters. With the medium grain parallelism of the current generation and the high fixed cost of an interprocessor communication, our analysis suggests an efficient implementation requires the machine to support the efficient transmission of long messages (up to 1000 words), or the high initiation cost of a communication must be significantly reduced through an alternative optimization technique. Furthermore, with variable length message capability, our analysis suggests the low diameter multistage networks provide little or no advantage over a simple single stage communications network.

  1. Stability of tapered and parallel-walled dental implants: A systematic review and meta-analysis.

    PubMed

    Atieh, Momen A; Alsabeeha, Nabeel; Duncan, Warwick J

    2018-05-15

    Clinical trials have suggested that dental implants with a tapered configuration have improved stability at placement, allowing immediate placement and/or loading. The aim of this systematic review and meta-analysis was to evaluate the implant stability of tapered dental implants compared with standard parallel-walled dental implants. Applying the guidelines of the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement, randomized controlled trials (RCTs) were searched for in electronic databases, complemented by hand searching. The risk of bias was assessed using the Cochrane Collaboration's Risk of Bias tool, and data were analyzed using statistical software. A total of 1199 studies were identified, of which five trials were included, with 336 dental implants in 303 participants. Overall meta-analysis showed that tapered dental implants had higher implant stability values than parallel-walled dental implants at insertion and at 8 weeks, but the difference was not statistically significant. Tapered dental implants had significantly less marginal bone loss than parallel-walled dental implants. No significant differences in implant failure rate were found between tapered and parallel-walled dental implants. There is limited evidence to demonstrate the effectiveness of tapered dental implants in achieving greater implant stability compared with parallel-walled dental implants. Superior short-term results in maintaining peri-implant marginal bone with tapered dental implants are possible. Further properly designed RCTs are required to endorse the supposed advantages of tapered dental implants in immediate loading protocols and other complex clinical scenarios. © 2018 Wiley Periodicals, Inc.

  2. Fully Parallel MHD Stability Analysis Tool

    NASA Astrophysics Data System (ADS)

    Svidzinski, Vladimir; Galkin, Sergei; Kim, Jin-Soo; Liu, Yueqiang

    2015-11-01

    Progress on the full parallelization of the plasma stability code MARS will be reported. MARS calculates eigenmodes in 2D axisymmetric toroidal equilibria in MHD-kinetic plasma models. It is a powerful tool for studying MHD and MHD-kinetic instabilities and is widely used by the fusion community. The parallel version of MARS is intended for simulations on local parallel clusters. It will be an efficient tool for simulating MHD instabilities with low, intermediate and high toroidal mode numbers within both the fluid and kinetic plasma models already implemented in MARS. Parallelization of the code includes parallelization of the construction of the matrix for the eigenvalue problem and parallelization of the inverse-iteration algorithm implemented in MARS for the solution of the formulated eigenvalue problem. Construction of the matrix is parallelized by distributing the load among processors assigned to different magnetic surfaces. The solution of the eigenvalue problem is parallelized by repeating the steps of the present MARS algorithm using parallel libraries and procedures. Results of the MARS parallelization and of the development of a new fixed-boundary equilibrium code adapted for MARS input will be reported. Work is supported by the U.S. DOE SBIR program.

  3. Computer-Aided Parallelizer and Optimizer

    NASA Technical Reports Server (NTRS)

    Jin, Haoqiang

    2011-01-01

    The Computer-Aided Parallelizer and Optimizer (CAPO) automates the insertion of compiler directives (see figure) to facilitate parallel processing on Shared Memory Parallel (SMP) machines. While CAPO currently is integrated seamlessly into CAPTools (developed at the University of Greenwich, now marketed as ParaWise), CAPO was independently developed at Ames Research Center as one of the components for the Legacy Code Modernization (LCM) project. The current version takes serial FORTRAN programs, performs interprocedural data dependence analysis, and generates OpenMP directives. Due to the widely supported OpenMP standard, the generated OpenMP codes have the potential to run on a wide range of SMP machines. CAPO relies on accurate interprocedural data dependence information currently provided by CAPTools. Compiler directives are generated through identification of parallel loops in the outermost level, construction of parallel regions around parallel loops and optimization of parallel regions, and insertion of directives with automatic identification of private, reduction, induction, and shared variables. Attempts also have been made to identify potential pipeline parallelism (implemented with point-to-point synchronization). Although directives are generated automatically, user interaction with the tool is still important for producing good parallel codes. A comprehensive graphical user interface is included for users to interact with the parallelization process.

  4. A police education programme to integrate occupational safety and HIV prevention: protocol for a modified stepped-wedge study design with parallel prospective cohorts to assess behavioural outcomes

    PubMed Central

    Strathdee, Steffanie A; Arredondo, Jaime; Rocha, Teresita; Abramovitz, Daniela; Rolon, Maria Luisa; Patiño Mandujano, Efrain; Rangel, Maria Gudelia; Olivarria, Horcasitas Omar; Gaines, Tommi; Patterson, Thomas L; Beletsky, Leo

    2015-01-01

    Introduction Policing practices are key drivers of HIV among people who inject drugs (PWID). This paper describes the protocol for the first study to prospectively examine the impact of a police education programme (PEP) to align law enforcement and HIV prevention. PEPs incorporating HIV prevention (including harm reduction programmes like syringe exchange) have been successfully piloted in several countries but were limited to brief pre–post assessments; the impact of PEPs on policing behaviours and occupational safety is unknown. Objectives Proyecto ESCUDO (SHIELD) aims to evaluate the efficacy of the PEP on uptake of occupational safety procedures, as assessed through the incidence of needle stick injuries (NSIs) (primary outcome) and changes in knowledge of transmission, prevention and treatment of HIV and viral hepatitis; attitudes towards PWID; adverse behaviours that interfere with HIV prevention; and protective behaviours (secondary outcomes). Methods/analysis ESCUDO is a hybrid type I design that simultaneously tests an intervention and an implementation strategy. Using a modified stepped-wedge design involving all active-duty street-level police officers in Tijuana (N=∼1200), we will administer one 3 h PEP course to groups of 20–50 officers until the entire force is trained. NSI incidence and geocoded arrest data will be assessed from department-wide de-identified data. Of the consenting police officers, a subcohort (N=500) will be randomly sampled from each class to undergo pre-PEP and post-PEP surveys with semiannual follow-up for 2 years to assess self-reported NSIs, attitudes and behaviour changes. The impact on PWID will be externally validated through a parallel cohort of Tijuana PWID. Ethics/dissemination Research ethics approval was obtained from the USA and Mexico. Findings will be disseminated through open access to protocol materials through the Law Enforcement and HIV Network. Trial registration number NCT02444403. PMID:26260350

  5. Exploring the Sensitivity of Horn's Parallel Analysis to the Distributional Form of Random Data

    ERIC Educational Resources Information Center

    Dinno, Alexis

    2009-01-01

    Horn's parallel analysis (PA) is the method of consensus in the literature on empirical methods for deciding how many components/factors to retain. Different authors have proposed various implementations of PA. Horn's seminal 1965 article, a 1996 article by Thompson and Daniel, and a 2004 article by Hayton, Allen, and Scarpello all make assertions…
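    The core of Horn's procedure is compact; the sketch below assumes principal components of a correlation matrix and a 95th-percentile retention criterion, which are common but not universal implementation choices (the variation among such choices is exactly what the article discusses):

```python
import numpy as np

def parallel_analysis(data, n_iter=100, percentile=95, seed=0):
    """Horn's parallel analysis: keep components whose observed correlation-
    matrix eigenvalue exceeds the chosen percentile of eigenvalues obtained
    from random normal data of the same shape."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    obs = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]
    rand_eigs = np.empty((n_iter, p))
    for i in range(n_iter):
        r = rng.standard_normal((n, p))
        rand_eigs[i] = np.linalg.eigvalsh(np.corrcoef(r, rowvar=False))[::-1]
    thresh = np.percentile(rand_eigs, percentile, axis=0)
    return int(np.sum(obs > thresh)), obs, thresh

# Six variables generated from two underlying factors: PA should retain 2.
rng = np.random.default_rng(1)
f = rng.standard_normal((500, 2))
loadings = np.zeros((2, 6))
loadings[0, :3] = 0.9   # factor 1 drives variables 0-2
loadings[1, 3:] = 0.9   # factor 2 drives variables 3-5
x = f @ loadings + 0.5 * rng.standard_normal((500, 6))
k, obs, thresh = parallel_analysis(x)
```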

  6. Accuracy of Revised and Traditional Parallel Analyses for Assessing Dimensionality with Binary Data

    ERIC Educational Resources Information Center

    Green, Samuel B.; Redell, Nickalus; Thompson, Marilyn S.; Levy, Roy

    2016-01-01

    Parallel analysis (PA) is a useful empirical tool for assessing the number of factors in exploratory factor analysis. On conceptual and empirical grounds, we argue for a revision to PA that makes it more consistent with hypothesis testing. Using Monte Carlo methods, we evaluated the relative accuracy of the revised PA (R-PA) and traditional PA…

  7. Parallel peak pruning for scalable SMP contour tree computation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carr, Hamish A.; Weber, Gunther H.; Sewell, Christopher M.

    As data sets grow to exascale, automated data analysis and visualisation are increasingly important, to intermediate human understanding and to reduce demands on disk storage via in situ analysis. Trends in the architecture of high performance computing systems necessitate analysis algorithms that make effective use of combinations of massively multicore and distributed systems. One of the principal analytic tools is the contour tree, which analyses relationships between contours to identify features of more than local importance. Unfortunately, the predominant algorithms for computing the contour tree are explicitly serial, and founded on serial metaphors, which has limited the scalability of this form of analysis. While there is some work on distributed contour tree computation, and separately on hybrid GPU-CPU computation, there is no efficient algorithm with strong formal guarantees on performance allied with fast practical performance. In this paper, we report the first shared-memory (SMP) algorithm for fully parallel contour tree computation, with formal guarantees of O(lg n lg t) parallel steps and O(n lg n) work, and implementations with up to 10× parallel speed-up in OpenMP and up to 50× speed-up in NVIDIA Thrust.

  8. Regional-scale yield simulations using crop and climate models: assessing uncertainties, sensitivity to temperature and adaptation options

    NASA Astrophysics Data System (ADS)

    Challinor, A. J.

    2010-12-01

    Recent progress in assessing the impacts of climate variability and change on crops using multiple regional-scale simulations of crop and climate (i.e. ensembles) is presented. Simulations for India and China used perturbed responses to elevated carbon dioxide constrained using observations from FACE studies and controlled environments. Simulations with crop parameter sets representing existing and potential future adapted varieties were also carried out. The results for India are compared to sensitivity tests on two other crop models. For China, a parallel approach used socio-economic data to account for autonomous farmer adaptation. For the USA, cardinal temperatures were analysed under a range of local warming scenarios for 2711 varieties of spring wheat. The results are as follows: 1. Quantifying and reducing uncertainty. The relative contribution of uncertainty in crop and climate simulation to the total uncertainty in projected yield changes is examined. The observational constraints from FACE and controlled environment studies are shown to be the likely critical factor in maintaining relatively low crop parameter uncertainty. Without these constraints, crop simulation uncertainty in a doubled CO2 environment would likely be greater than uncertainty in simulating climate. However, consensus across crop models in India varied across different biophysical processes. 2. The response of yield to changes in local mean temperature was examined and compared to that found in the literature. No consistent response to temperature change was found across studies. 3. Implications for adaptation. China. The simulations of spring wheat in China show the relative importance of tolerance to water and heat stress in avoiding future crop failures. 
The greatest potential for reducing the number of harvests less than one standard deviation below the baseline mean yield value comes from alleviating water stress; the greatest potential for reducing harvests less than two standard deviations below the mean comes from alleviation of heat stress. The socio-economic analysis suggests that adaptation is also possible through measures such as greater investment. India. The simulations of groundnut in India identified regions where heat stress will play an increasing role in limiting crop yields, and other regions where crops with greater thermal time requirements will be needed. The simulations were used, together with an observed dataset and a simple analysis of crop cardinal temperatures and thermal time, to estimate the potential for adaptation using existing cultivars. USA. Analysis of spring wheat in the USA showed that at +2°C of local warming, 87% of the 2711 varieties examined, and all of the five most common varieties, could be used to maintain the crop duration of the current climate (i.e. successful adaptation to mean warming). At +4°C this fell to 54% of all varieties, and two of the top five. 4. Future research. The results, and the limitations of the study, suggest directions for research to link climate and crop models, socio-economic analyses and crop variety trial data in order to prioritise adaptation options such as capacity building, plant breeding and biotechnology.

  9. Paramedir: A Tool for Programmable Performance Analysis

    NASA Technical Reports Server (NTRS)

    Jost, Gabriele; Labarta, Jesus; Gimenez, Judit

    2004-01-01

    Performance analysis of parallel scientific applications is time consuming and requires great expertise in areas such as programming paradigms, system software, and computer hardware architectures. In this paper we describe a tool that facilitates the programmability of performance metric calculations thereby allowing the automation of the analysis and reducing the application development time. We demonstrate how the system can be used to capture knowledge and intuition acquired by advanced parallel programmers in order to be transferred to novice users.

  10. State-plane analysis of parallel resonant converter

    NASA Technical Reports Server (NTRS)

    Oruganti, R.; Lee, F. C.

    1985-01-01

    A method for analyzing the complex operation of a parallel resonant converter is developed, utilizing graphical state-plane techniques. The comprehensive mode analysis uncovers, for the first time, the presence of other complex modes besides the continuous conduction mode and the discontinuous conduction mode and determines their theoretical boundaries. Based on the insight gained from the analysis, a novel, high-frequency resonant buck converter is proposed. The voltage conversion ratio of the new converter is almost independent of load.

  11. Analysis of the longitudinal space charge impedance of a round uniform beam inside parallel plates and rectangular chambers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, L.; Li, Y.

    2015-02-03

    This paper analyzes the longitudinal space charge impedances of a round uniform beam inside rectangular and parallel-plate chambers using the image charge method. The analysis is valid for arbitrary wavelengths, and the calculations converge rapidly. The research shows that only a few of the image beams are needed to obtain a relative error of less than 0.1%. The beam offset effect is also discussed in the analysis.

  12. Comparison of the pharmacokinetics and safety of three formulations of infliximab (CT-P13, EU-approved reference infliximab and the US-licensed reference infliximab) in healthy subjects: a randomized, double-blind, three-arm, parallel-group, single-dose, Phase I study.

    PubMed

    Park, Won; Lee, Sang Joon; Yun, Jihye; Yoo, Dae Hyun

    2015-01-01

    To compare the pharmacokinetics (PK), safety and tolerability of biosimilar infliximab (CT-P13 [Remsima(®), Inflectra(®)]) with two formulations of the reference medicinal product (RMP) (Remicade(®)) from either Europe (EU-RMP) or the USA (US-RMP). This was a double-blind, three-arm, parallel-group study (EudraCT number: 2013-003173-10). Healthy subjects received single doses (5 mg/kg) of CT-P13 (n = 71), EU-RMP (n = 71) or US-RMP (n = 71). The primary objective was to compare the PK profiles for the three formulations. Assessments of comparative safety and tolerability were secondary objectives. Baseline demographics were well balanced across the three groups. Primary end points (Cmax, AUClast and AUCinf) were equivalent between all formulations (CT-P13 vs EU-RMP; CT-P13 vs US-RMP; EU-RMP vs US-RMP). All other PK end points supported the high similarity of the three treatments. Tolerability profiles of the formulations were similar. The PK profile of CT-P13 is highly similar to EU-RMP and US-RMP. All three formulations were equally well tolerated.
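    The primary endpoints can be computed from a concentration-time profile with a few lines; a minimal sketch using the linear trapezoidal rule on invented numbers (the study's actual noncompartmental analysis settings are not specified here):

```python
def pk_endpoints(times, conc):
    """Cmax and AUC(0-last) by the linear trapezoidal rule."""
    cmax = max(conc)
    auc = sum((conc[i] + conc[i + 1]) / 2 * (times[i + 1] - times[i])
              for i in range(len(times) - 1))
    return cmax, auc

times = [0, 1, 2, 4, 8]                 # hours (invented sampling times)
conc = [0.0, 100.0, 80.0, 40.0, 10.0]   # concentrations (invented values)
cmax, auc = pk_endpoints(times, conc)
```

    Bioequivalence assessments then compare ratios of such endpoints between formulations against predefined margins.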

  13. Comparing forest fragmentation and its drivers in China and the USA with Globcover v2.2

    USGS Publications Warehouse

    Chen, Mingshi; Mao, Lijun; Zhou, Chunguo; Vogelmann, James E.; Zhu, Zhiliang

    2010-01-01

    Forest loss and fragmentation are of major concern to the international community, in large part because they impact so many important environmental processes. The main objective of this study was to assess the differences in forest fragmentation patterns and drivers between China and the conterminous United States (USA). Using the latest 300-m resolution global land cover product, Globcover v2.2, a comparative analysis of forest fragmentation patterns and drivers was made. The fragmentation patterns were characterized by using a forest fragmentation model built on the sliding window analysis technique in association with landscape indices. Results showed that China’s forests were substantially more fragmented than those of the USA. This was evidenced by a large difference in the amount of interior forest area share, with China having 48% interior forest versus the 66% for the USA. China’s forest fragmentation was primarily attributed to anthropogenic disturbances, driven particularly by agricultural expansion from an increasing and large population, as well as poor forest management practices. In contrast, USA forests were principally fragmented by natural land cover types. However, USA urban sprawl contributed more to forest fragmentation than in China. This is closely tied to the USA’s economy, lifestyle and institutional processes. Fragmentation maps were generated from this study, which provide valuable insights and implications regarding habitat planning for rare and endangered species. Such maps enable development of strategic plans for sustainable forest management by identifying areas with high amounts of human-induced fragmentation, which improve risk assessments and enable better targeting for protection and remediation efforts. 
Because forest fragmentation is a long-term, complex process that is highly related to political, institutional, economic and philosophical arenas, both nations need to take effective and comprehensive measures to mitigate the negative effects of forest loss and fragmentation on the existing forest ecosystems.
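    The sliding-window fragmentation model can be illustrated on a toy binary forest raster; the 3×3 window and the "fully forested neighbourhood" interior rule below are illustrative assumptions, not the exact model used in the study:

```python
def interior_forest_share(grid, window=3, threshold=1.0):
    """Share of forest pixels classified as 'interior': the forested
    proportion of the surrounding window x window neighbourhood (clipped
    at the raster edge) must meet the threshold."""
    n = len(grid)
    r = window // 2
    forest = interior = 0
    for i in range(n):
        for j in range(n):
            if not grid[i][j]:
                continue
            forest += 1
            cells = [grid[a][b]
                     for a in range(max(0, i - r), min(n, i + r + 1))
                     for b in range(max(0, j - r), min(n, j + r + 1))]
            if sum(cells) / len(cells) >= threshold:
                interior += 1
    return interior / forest if forest else 0.0

# 5x5 toy raster: a solid forest block with one cleared cell in the middle.
grid = [[1] * 5 for _ in range(5)]
grid[2][2] = 0
share = interior_forest_share(grid)
```

    A single cleared cell disqualifies all eight of its forested neighbours from interior status, which is how small disturbances produce disproportionate fragmentation in such indices.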

  14. Geospatial Analysis of Drug Poisoning Deaths Involving Heroin in the USA, 2000-2014.

    PubMed

    Stewart, Kathleen; Cao, Yanjia; Hsu, Margaret H; Artigiani, Eleanor; Wish, Eric

    2017-08-01

    We investigate the geographic patterns of drug poisoning deaths involving heroin by county for the USA from 2000 to 2014. The county-level patterns of mortality are examined with respect to age-adjusted rates of death for different classes of urbanization and racial and ethnic groups, while rates based on raw counts of drug poisoning deaths involving heroin are estimated for different age groups and by gender. To account for possible underestimation in these rates due to small areas or small numbers, spatial empirical Bayes estimation techniques have been used to smooth the rates of death and alleviate underestimation when analyzing spatial patterns for these different groups. The geographic pattern of poisoning deaths involving heroin has shifted from the west coast of the USA in the year 2000 to New England, the Mid-Atlantic region, and the Great Lakes and central Ohio Valley by 2014. The evolution over space and time of clusters of drug poisoning deaths involving heroin is confirmed through SaTScan analysis. For this period, White males were found to be the most impacted population group overall; however, Blacks and Hispanics are highly impacted in counties where significant populations of these two groups reside. Our results show that while 35-54-year-olds were the most highly impacted age group by county from 2000 to 2010, by 2014 the trend had changed, with an increasing number of counties experiencing higher death rates for individuals 25-34 years of age. The percentage of counties across the USA classified as large metro with deaths involving heroin is estimated to have decreased from approximately 73% in 2010 to just under 56% in 2014, with a shift to small metro and non-metro counties. 
Understanding the geographic variations in impact on different population groups in the USA has become particularly necessary in light of the extreme increase in the use and misuse of street drugs including heroin and the subsequent rise in opioid-related deaths in the USA.
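    Empirical Bayes smoothing shrinks unstable small-population rates toward a reference rate; below is a minimal sketch of a global method-of-moments smoother on invented counts (the study's spatial estimator may differ, e.g. by shrinking toward local neighbourhood means rather than the global mean):

```python
def eb_smooth(deaths, pops):
    """Global empirical Bayes (method-of-moments) smoothing: each raw rate
    is shrunk toward the overall mean rate, with more shrinkage applied to
    areas with small populations."""
    n = len(deaths)
    rates = [d / p for d, p in zip(deaths, pops)]
    total_pop = sum(pops)
    m = sum(deaths) / total_pop                       # global mean rate
    var = sum(p * (r - m) ** 2 for p, r in zip(pops, rates)) / total_pop
    a = max(var - m / (total_pop / n), 0.0)           # between-area variance
    out = []
    for r, p in zip(rates, pops):
        w = a / (a + m / p)                           # shrinkage weight
        out.append(w * r + (1 - w) * m)
    return out

# Invented counts: a tiny county with an extreme rate vs. larger counties.
smoothed = eb_smooth([1, 50, 200], [100, 10000, 100000])
```

    The tiny county's extreme raw rate is pulled strongly toward the global mean, while the large county's rate barely moves.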

  15. Comparison between dispersive solid-phase and dispersive liquid-liquid microextraction combined with spectrophotometric determination of malachite green in water samples based on ultrasound-assisted and preconcentration under multi-variable experimental design optimization.

    PubMed

    Alipanahpour Dil, Ebrahim; Ghaedi, Mehrorang; Asfaram, Arash; Zare, Fahimeh; Mehrabi, Fatemeh; Sadeghfar, Fardin

    2017-11-01

    An ultrasound-assisted dispersive solid-phase microextraction (USA-DSPME) method and an ultrasound-assisted dispersive liquid-liquid microextraction (USA-DLLME) method were developed as ultra-preconcentration techniques for the determination of malachite green (MG) in water samples. A central composite design, based on analysis of variance and a desirability function, guided the search for the best operational conditions and showed that the volumes of extraction, eluent and disperser solvent, pH, adsorbent mass and ultrasonication time have a significant influence on the efficiency of the methods. Optimum conditions for USA-DSPME were: 1 mg CNTs/Zn:ZnO@Ni2P-NCs, 4 min sonication time and 130 μL eluent at pH 6.0. Optimum conditions for USA-DLLME were fixed at pH 6.0, 4 min sonication time and 130 μL, 650 μL and 10 mL of extraction solvent (CHCl3), disperser solvent (ethanol) and sample volume, respectively. Under these best operational conditions, the enrichment factors for USA-DSPME and USA-DLLME were 88.89 and 147.30, respectively. The methods have a linear response in the range of 20.0 to 4000.0 ng mL-1 with correlation coefficients (r) between 0.9980 and 0.9995, while reasonable detection limits of 1.386 to 2.348 ng mL-1 and good relative standard deviations from 1.1% to 2.8% (n=10) make these methods candidates for successful monitoring of the analyte in various media. The relative recoveries of the MG dye from water samples at a spiking level of 500 ng mL-1 were in the range of 94.50% to 98.86%. The proposed methods have been successfully applied to the analysis of the MG dye in water samples, and satisfactory results were obtained. Copyright © 2017. Published by Elsevier B.V.

  16. Hepatitis E seroprevalence in the Americas: A systematic review and meta-analysis.

    PubMed

    Horvatits, Thomas; Ozga, Ann-Kathrin; Westhölter, Dirk; Hartl, Johannes; Manthey, Carolin F; Lütgehetmann, Marc; Rauch, Geraldine; Kriston, Levente; Lohse, Ansgar W; Bendall, Richard; Wedemeyer, Heiner; Dalton, Harry R; Pischke, Sven

    2018-04-16

    While hepatitis E virus infections are a relevant topic in Europe, knowledge about the epidemiology of hepatitis E virus infections in the USA and Latin America is still limited. The aim of this study was to estimate anti-hepatitis E virus IgG seroprevalence in the Americas and to assess whether low socioeconomic status is associated with hepatitis E virus exposure. We performed a systematic review and meta-analysis. A literature search was performed in PubMed for articles published 01/1994-12/2016. Prevalence was estimated using a mixed-effects model and reported in line with PRISMA reporting guidelines. Seroprevalence was significantly higher in the USA than in Latin America, independently of assay, patient cohort, methodological quality or study year (OR: 1.82 (1.06-3.08), P = .03). Patients in the USA had a more than twofold higher estimated seroprevalence (up to 9%, confidence interval 5%-15.6%) than those in Brazil (up to 4.2%, confidence interval 2.4%-7.1%; OR: 2.27 (1.25-4.13); P = .007) and the Mixed Caribbean (up to 1%, OR: 8.33 (1.15-81.61); P = .04). A comparison with published data from Europe demonstrated that anti-hepatitis E virus seroprevalence in the USA and Europe did not differ significantly (OR: 1.33 (0.81-2.19), P = .25), while the rate in South America was significantly lower than that in Europe (OR: 0.67 (0.45-0.98), P = .04). Hepatitis E virus is common in the USA. Surprisingly, the risk of hepatitis E virus exposure was low in many South American countries. Seroprevalence did not differ significantly between Europe and the USA. Hence, hepatitis E virus is not limited to countries with low sanitary standards, and a higher socioeconomic status does not protect populations from hepatitis E virus exposure. © 2018 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
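    Pooling of this kind is commonly done with inverse-variance weights; the sketch below uses the classical DerSimonian-Laird estimator on invented effects as a stand-in for the mixed-effects model actually used in the study:

```python
import math

def dersimonian_laird(effects, variances):
    """Random-effects pooling: inverse-variance weights with between-study
    variance tau^2 estimated from Cochran's Q (DerSimonian-Laird)."""
    w = [1.0 / v for v in variances]
    sw = sum(w)
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sw
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max((q - (len(effects) - 1)) / c, 0.0)
    wr = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(wr, effects)) / sum(wr)
    return pooled, tau2, math.sqrt(1.0 / sum(wr))

# Invented study effects (e.g. logit seroprevalences) and their variances.
effects = [0.08, 0.15, 0.25]
variances = [0.01, 0.02, 0.015]
pooled, tau2, se = dersimonian_laird(effects, variances)
```

    When the studies are homogeneous, tau^2 collapses to zero and the estimator reduces to a fixed-effect inverse-variance average.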

  17. Colorectal cancer survival in the USA and Europe: a CONCORD high-resolution study

    PubMed Central

    Allemani, Claudia; Rachet, Bernard; Weir, Hannah K; Richardson, Lisa C; Lepage, Côme; Faivre, Jean; Gatta, Gemma; Capocaccia, Riccardo; Sant, Milena; Baili, Paolo; Lombardo, Claudio; Aareleid, Tiiu; Ardanaz, Eva; Bielska-Lasota, Magdalena; Bolick, Susan; Cress, Rosemary; Elferink, Marloes; Fulton, John P; Galceran, Jaume; Góźdź, Stanisław; Hakulinen, Timo; Primic-Žakelj, Maja; Rachtan, Jadwiga; Diba, Chakameh Safaei; Sánchez, Maria-José; Schymura, Maria J; Shen, Tiefu; Tagliabue, Giovanna; Tumino, Rosario; Vercelli, Marina; Wolf, Holly J; Wu, Xiao-Cheng; Coleman, Michel P

    2013-01-01

    Objectives To assess the extent to which stage at diagnosis and adherence to treatment guidelines may explain the persistent differences in colorectal cancer survival between the USA and Europe. Design A high-resolution study using detailed clinical data on Dukes’ stage, diagnostic procedures, treatment and follow-up, collected directly from medical records by trained abstractors under a single protocol, with standardised quality control and central statistical analysis. Setting and participants 21 population-based registries in seven US states and nine European countries provided data for random samples comprising 12 523 adults (15–99 years) diagnosed with colorectal cancer during 1996–1998. Outcome measures Logistic regression models were used to compare adherence to ‘standard care’ in the USA and Europe. Net survival and excess risk of death were estimated with flexible parametric models. Results The proportion of Dukes’ A and B tumours was similar in the USA and Europe, while that of Dukes’ C was more frequent in the USA (38% vs 21%) and of Dukes’ D more frequent in Europe (22% vs 10%). Resection with curative intent was more frequent in the USA (85% vs 75%). Elderly patients (75–99 years) were 70–90% less likely to receive radiotherapy and chemotherapy. Age-standardised 5-year net survival was similar in the USA (58%) and Northern and Western Europe (54–56%) and lowest in Eastern Europe (42%). The mean excess hazard up to 5 years after diagnosis was highest in Eastern Europe, especially among elderly patients and those with Dukes’ D tumours. Conclusions The wide differences in colorectal cancer survival between Europe and the USA in the late 1990s are probably attributable to earlier stage and more extensive use of surgery and adjuvant treatment in the USA. Elderly patients with colorectal cancer received surgery, chemotherapy or radiotherapy less often than younger patients, despite evidence that they could also have benefited. PMID:24022388

  18. Colorectal cancer survival in the USA and Europe: a CONCORD high-resolution study.

    PubMed

    Allemani, Claudia; Rachet, Bernard; Weir, Hannah K; Richardson, Lisa C; Lepage, Côme; Faivre, Jean; Gatta, Gemma; Capocaccia, Riccardo; Sant, Milena; Baili, Paolo; Lombardo, Claudio; Aareleid, Tiiu; Ardanaz, Eva; Bielska-Lasota, Magdalena; Bolick, Susan; Cress, Rosemary; Elferink, Marloes; Fulton, John P; Galceran, Jaume; Gózdz, Stanislaw; Hakulinen, Timo; Primic-Zakelj, Maja; Rachtan, Jadwiga; Diba, Chakameh Safaei; Sánchez, Maria-José; Schymura, Maria J; Shen, Tiefu; Tagliabue, Giovanna; Tumino, Rosario; Vercelli, Marina; Wolf, Holly J; Wu, Xiao-Cheng; Coleman, Michel P

    2013-09-10

    To assess the extent to which stage at diagnosis and adherence to treatment guidelines may explain the persistent differences in colorectal cancer survival between the USA and Europe. A high-resolution study using detailed clinical data on Dukes' stage, diagnostic procedures, treatment and follow-up, collected directly from medical records by trained abstractors under a single protocol, with standardised quality control and central statistical analysis. 21 population-based registries in seven US states and nine European countries provided data for random samples comprising 12 523 adults (15-99 years) diagnosed with colorectal cancer during 1996-1998. Logistic regression models were used to compare adherence to 'standard care' in the USA and Europe. Net survival and excess risk of death were estimated with flexible parametric models. The proportion of Dukes' A and B tumours was similar in the USA and Europe, while that of Dukes' C was more frequent in the USA (38% vs 21%) and of Dukes' D more frequent in Europe (22% vs 10%). Resection with curative intent was more frequent in the USA (85% vs 75%). Elderly patients (75-99 years) were 70-90% less likely to receive radiotherapy and chemotherapy. Age-standardised 5-year net survival was similar in the USA (58%) and Northern and Western Europe (54-56%) and lowest in Eastern Europe (42%). The mean excess hazard up to 5 years after diagnosis was highest in Eastern Europe, especially among elderly patients and those with Dukes' D tumours. The wide differences in colorectal cancer survival between Europe and the USA in the late 1990s are probably attributable to earlier stage and more extensive use of surgery and adjuvant treatment in the USA. Elderly patients with colorectal cancer received surgery, chemotherapy or radiotherapy less often than younger patients, despite evidence that they could also have benefited.

  19. An ecological study of prostate cancer mortality in the USA and UK, 1975-2004: are divergent trends a consequence of treatment, screening or artefact?

    PubMed Central

    Collin, Simon M; Martin, Richard M; Metcalfe, Chris; Gunnell, David; Albertsen, Peter; Neal, David; Hamdy, Freddie; Stephens, Peter; Lane, J Athene; Moore, Rollo; Donovan, Jenny

    2009-01-01

    Background There is no conclusive evidence that screening based on prostate-specific antigen (PSA) tests reduces prostate cancer mortality. In the USA uptake of PSA testing has been rapid, but is much less common in the UK. Purpose To investigate trends in prostate cancer mortality and incidence in the USA and UK from 1975-2004, contrasting these with trends in screening and treatment. Methods Joinpoint regression analysis of cancer mortality statistics from Cancer Research UK and the USA National Cancer Institute Surveillance Epidemiology and End Results (SEER) program was used to estimate the annual percentage change in prostate cancer mortality in each country and the points in time when trends changed. Results Age-specific and age-adjusted prostate cancer mortality peaked in the early 1990s at almost identical rates in both countries, but age-adjusted mortality in the USA subsequently declined by 4.2% (95% CI 4.0-4.3%) per annum, four times the rate of decline in the UK (1.1%; 0.8-1.4%). The mortality decline in the USA was greatest and most sustained in those ≥75 years, whereas death rates had plateaued in this age group in the UK by 2000. Conclusion The striking decline in prostate cancer mortality in the USA compared with the UK between 1994-2004 coincided with much higher uptake of PSA screening in the USA. Explanations for the different trends in mortality include the possibility of an early impact of initial screening rounds on men with more aggressive asymptomatic disease in the USA, different approaches to treatment in the two countries, and bias related to the misattribution of cause of death. Speculation over the role of screening will continue until evidence from randomised controlled trials is published. PMID:18424233
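    The annual percentage change reported by joinpoint regression comes from log-linear segments; a minimal single-segment sketch follows (the change-point search that gives joinpoint analysis its name is omitted, and the numbers are illustrative):

```python
import math

def annual_percent_change(years, rates):
    """APC from an ordinary least-squares fit of ln(rate) = b0 + b1*year;
    APC = (exp(b1) - 1) * 100."""
    n = len(years)
    logs = [math.log(r) for r in rates]
    xm = sum(years) / n
    ym = sum(logs) / n
    b1 = (sum((x - xm) * (y - ym) for x, y in zip(years, logs))
          / sum((x - xm) ** 2 for x in years))
    return (math.exp(b1) - 1.0) * 100.0

# A series declining by exactly 4.2% per year recovers APC = -4.2.
years = list(range(1994, 2005))
rates = [30.0 * (1 - 0.042) ** (y - 1994) for y in years]
apc = annual_percent_change(years, rates)
```

    Joinpoint software fits several such segments and places the break years where the slope changes significantly.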

  20. Parallel digital forensics infrastructure.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liebrock, Lorie M.; Duggan, David Patrick

    2009-10-01

    This report documents the architecture and implementation of a Parallel Digital Forensics infrastructure. This infrastructure is necessary for supporting the design, implementation, and testing of new classes of parallel digital forensics tools. Digital forensics has become extremely difficult with data sets of one terabyte and larger. The only way to overcome the processing time of these large sets is to identify and develop new parallel algorithms for performing the analysis. To support algorithm research, a flexible base infrastructure is required. A candidate architecture for this base infrastructure was designed, instantiated, and tested by this project, in collaboration with New Mexico Tech. Previous infrastructures were not designed and built specifically for the development and testing of parallel algorithms. With the size of forensics data sets expected to increase significantly, this type of infrastructure support is necessary for continued research in parallel digital forensics.

  1. A model for optimizing file access patterns using spatio-temporal parallelism

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boonthanome, Nouanesengsy; Patchett, John; Geveci, Berk

    2013-01-01

    For many years now, I/O read time has been recognized as the primary bottleneck for parallel visualization and analysis of large-scale data. In this paper, we introduce a model that can estimate the read time for a file stored in a parallel filesystem when given the file access pattern. Read times ultimately depend on how the file is stored and the access pattern used to read the file. The file access pattern is dictated by the type of parallel decomposition used. We employ spatio-temporal parallelism, which combines both spatial and temporal parallelism, to provide greater flexibility in possible file access patterns. Using our model, we were able to configure the spatio-temporal parallelism to design optimized read access patterns that resulted in a speedup factor of approximately 400 over traditional file access patterns.
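    The abstract's actual cost model is not given, but the core idea can be sketched with a toy model: charge one latency cost per contiguous run of blocks read, plus a bandwidth cost per byte. All numbers and function names below are assumptions for illustration.

```python
# Hypothetical sketch of a read-time cost model (not the paper's actual model):
# each contiguous run of blocks pays one seek/latency charge, and every byte
# pays a bandwidth cost. Fewer, larger contiguous reads are therefore faster.

def contiguous_runs(block_ids):
    """Count maximal runs of consecutive block indices in an access pattern."""
    runs = 0
    prev = None
    for b in sorted(block_ids):
        if prev is None or b != prev + 1:
            runs += 1
        prev = b
    return runs

def estimated_read_time(block_ids, block_bytes, latency_s, bandwidth_bps):
    """Estimate read time: one latency charge per contiguous run,
    plus total bytes divided by sustained bandwidth."""
    runs = contiguous_runs(block_ids)
    total_bytes = len(block_ids) * block_bytes
    return runs * latency_s + total_bytes / bandwidth_bps

# A strided pattern touches the same number of bytes as a contiguous one
# but pays many more latency charges.
contig = list(range(0, 64))         # 64 consecutive 4 MiB blocks
strided = list(range(0, 128, 2))    # 64 blocks, every other one
t_contig = estimated_read_time(contig, 4 << 20, 5e-3, 1e9)
t_strided = estimated_read_time(strided, 4 << 20, 5e-3, 1e9)
```

Under this toy model, choosing a decomposition that yields contiguous access is faster purely because of the per-run latency term, which mirrors the paper's motivation for tuning the spatio-temporal decomposition.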

  2. Exploiting parallel computing with limited program changes using a network of microcomputers

    NASA Technical Reports Server (NTRS)

    Rogers, J. L., Jr.; Sobieszczanski-Sobieski, J.

    1985-01-01

    Network computing and multiprocessor computers are two discernible trends in parallel processing. The computational behavior of an iterative distributed process in which some subtasks are completed later than others because of an imbalance in computational requirements is of significant interest. The effects of asynchronous processing were studied. A small existing program was converted to perform finite element analysis by distributing substructure analysis over a network of four Apple IIe microcomputers connected to a shared disk, simulating a parallel computer. The substructure analysis uses an iterative, fully stressed, structural resizing procedure. A framework of beams divided into three substructures is used as the finite element model. The effects of asynchronous processing on the convergence of the design variables are determined by not resizing particular substructures on various iterations.

  3. Validation of neoclassical bootstrap current models in the edge of an H-mode plasma.

    PubMed

    Wade, M R; Murakami, M; Politzer, P A

    2004-06-11

    Analysis of the parallel electric field E(parallel) evolution following an L-H transition in the DIII-D tokamak indicates the generation of a large negative pulse near the edge which propagates inward, indicative of the generation of a noninductive edge current. Modeling indicates that the observed E(parallel) evolution is consistent with a narrow current density peak generated in the plasma edge. Very good quantitative agreement is found between the measured E(parallel) evolution and that expected from neoclassical theory predictions of the bootstrap current.

  4. Backtracking and Re-execution in the Automatic Debugging of Parallelized Programs

    NASA Technical Reports Server (NTRS)

    Matthews, Gregory; Hood, Robert; Johnson, Stephen; Leggett, Peter; Biegel, Bryan (Technical Monitor)

    2002-01-01

    In this work we describe a new approach that uses relative debugging to find differences in computation between a serial program and a parallel version of that program. We use a combination of re-execution and backtracking in order to find the first difference in computation that may ultimately lead to an incorrect value that the user has indicated. In our prototype implementation we use static analysis information from a parallelization tool in order to perform the backtracking as well as the mapping required between serial and parallel computations.
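    The core comparison behind relative debugging can be illustrated with a toy sketch (this is not the authors' tool): record the history of a variable's values in the serial and parallel runs, then locate the first step at which the two computations diverge. All names and values below are hypothetical.

```python
# Toy sketch of relative debugging's core comparison (illustrative only):
# given the recorded history of a variable's values from a serial run and a
# parallel run, find the first step at which the two computations diverge.

def first_divergence(serial_trace, parallel_trace, tol=1e-9):
    """Return the index of the first differing value, or None if traces agree."""
    for i, (s, p) in enumerate(zip(serial_trace, parallel_trace)):
        if abs(s - p) > tol:
            return i
    if len(serial_trace) != len(parallel_trace):
        # One run ended early; divergence is at the shorter trace's end.
        return min(len(serial_trace), len(parallel_trace))
    return None

serial   = [1.0, 2.0, 4.0, 8.0, 16.0]
parallel = [1.0, 2.0, 4.0, 8.5, 17.0]   # error introduced at step 3
idx = first_divergence(serial, parallel)
```

In the real approach, locating this first divergence is what drives the backtracking: once the earliest bad value is known, its computation can be re-executed and its inputs compared in turn.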

  5. SPSS and SAS programs for determining the number of components using parallel analysis and Velicer's MAP test.

    PubMed

    O'Connor, B P

    2000-08-01

    Popular statistical software packages do not have the proper procedures for determining the number of components in factor and principal components analyses. Parallel analysis and Velicer's minimum average partial (MAP) test are validated procedures, recommended widely by statisticians. However, many researchers continue to use alternative, simpler, but flawed procedures, such as the eigenvalues-greater-than-one rule. Use of the proper procedures might be increased if these procedures could be conducted within familiar software environments. This paper describes brief and efficient programs for using SPSS and SAS to conduct parallel analyses and the MAP test.
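    The abstract's programs are written for SPSS and SAS; as an illustrative analogue, Horn's parallel analysis can be sketched in a few lines of NumPy: retain components whose observed correlation-matrix eigenvalues exceed the mean eigenvalues obtained from random data of the same shape. The simulated data below are assumptions for demonstration.

```python
import numpy as np

def parallel_analysis(data, n_iter=100, seed=0):
    """Horn's parallel analysis for principal components (illustrative sketch).

    Retain components whose correlation-matrix eigenvalues exceed the mean
    eigenvalues obtained from random normal data of the same shape.
    """
    rng = np.random.default_rng(seed)
    n, p = data.shape
    obs = np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]
    rand = np.zeros(p)
    for _ in range(n_iter):
        r = rng.standard_normal((n, p))
        rand += np.sort(np.linalg.eigvalsh(np.corrcoef(r, rowvar=False)))[::-1]
    rand /= n_iter
    return int(np.sum(obs > rand))

# Example: 6 variables driven by a single latent factor plus noise,
# so parallel analysis should retain exactly one component.
rng = np.random.default_rng(1)
factor = rng.standard_normal((500, 1))
X = factor @ np.ones((1, 6)) + 0.5 * rng.standard_normal((500, 6))
n_components = parallel_analysis(X)
```

This mirrors the logic of the SPSS/SAS programs described in the abstract, in contrast to the eigenvalues-greater-than-one rule, which compares eigenvalues against a fixed threshold of 1 rather than against a random-data baseline.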

  6. Binocular optical axis parallelism detection precision analysis based on Monte Carlo method

    NASA Astrophysics Data System (ADS)

    Ying, Jiaju; Liu, Bingqi

    2018-02-01

    Starting from the working principle of the digital calibration instrument for the optical-axis parallelism of binocular photoelectric instruments, the factors in each of the instrument's components that affect system precision are analyzed, and a precision-analysis model is established. Based on the error distribution, the Monte Carlo method is used to analyze the relationship between the comprehensive error and changes in the center coordinate of the circular target image. The method can further guide error allocation, tightly control the factors that most strongly influence the comprehensive error, and improve the measurement accuracy of the optical-axis parallelism digital calibration instrument.
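    The instrument's actual error model is not given in the abstract, but the Monte Carlo approach it describes can be sketched generically: sample each component's error from an assumed distribution, combine them into the comprehensive error, and take the spread of the result as the precision estimate. The component sigmas below are assumed values.

```python
import math
import random

# Generic Monte Carlo precision-analysis sketch (the instrument's actual error
# model is not given in the abstract; the components and sigmas are assumed).
# Each trial samples the individual error sources and combines them into the
# comprehensive error; the spread of the result estimates system precision.

def monte_carlo_precision(component_sigmas, n_trials=100_000, seed=42):
    """Estimate the standard deviation of the combined (summed) error."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n_trials):
        total = sum(rng.gauss(0.0, s) for s in component_sigmas)
        samples.append(total)
    mean = sum(samples) / n_trials
    var = sum((x - mean) ** 2 for x in samples) / (n_trials - 1)
    return math.sqrt(var)

# For independent Gaussian sources the Monte Carlo estimate should approach
# the analytic root-sum-square of the individual sigmas.
sigmas = [0.02, 0.05, 0.01]          # assumed per-component errors
estimated = monte_carlo_precision(sigmas)
analytic = math.sqrt(sum(s * s for s in sigmas))
```

The advantage of the Monte Carlo route over the closed-form root-sum-square is that it still works when the combination of errors is nonlinear or the distributions are non-Gaussian, which is the situation the abstract's precision-analysis model addresses.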

  7. Introduction and Committees

    NASA Astrophysics Data System (ADS)

    Angelova, Maia; Zakrzewski, Wojciech; Hussin, Véronique; Piette, Bernard

    2011-03-01

    This volume contains contributions to the XXVIIIth International Colloquium on Group-Theoretical Methods in Physics, the GROUP 28 conference, which took place in Newcastle upon Tyne from 26-30 July 2010. All plenary and contributed papers have undergone an independent review; as a result of this review and the decisions of the Editorial Board, most but not all of the contributions were accepted. The volume is organised as follows: it starts with notes in memory of Marcos Moshinsky, followed by contributions related to the Wigner Medal and Hermann Weyl prize. Then the invited talks at the plenary sessions and the public lecture are published, followed by contributions in the parallel and poster sessions in alphabetical order. The Editors: Maia Angelova, Wojciech Zakrzewski, Véronique Hussin and Bernard Piette. International Advisory Committee: Michael Baake (University of Bielefeld, Germany); Gerald Dunne (University of Connecticut, USA); J F (Frank) Gomes (UNESP, Sao Paolo, Brazil); Peter Hanggi (University of Augsburg, Germany); Jeffrey C Lagarias (University of Michigan, USA); Michael Mackey (McGill University, Canada); Nicholas Manton (Cambridge University, UK); Alexei Morozov (ITEP, Moscow, Russia); Valery Rubakov (INR, Moscow, Russia); Barry Sanders (University of Calgary, Canada); Allan Solomon (Open University, Milton Keynes, UK); Christoph Schweigert (University of Hamburg, Germany). Standing Committee: Twareque Ali (Concordia University, Canada); Luis Boya (Salamanca University, Spain); Enrico Celeghini (Firenze University, Italy); Vladimir Dobrev (Bulgarian Academy of Sciences, Bulgaria); Heinz-Dietrich Doebner (Honorary Member, Clausthal University, Germany); Jean-Pierre Gazeau (Chairman, Paris Diderot University, France); Mo-Lin Ge (Nankai University, China); Gerald Goldin (Rutgers University, USA); Francesco Iachello (Yale University, USA); Joris Van der Jeugt (Ghent University, Belgium); Richard Kerner (Pierre et Marie Curie University, France); Piotr Kielanowski (CINVESTAV, Mexico); Alan Kostelecky (Indiana University, USA); Mariano del Olmo (Valladolid University, Spain); George Pogosyan (UNAM, Mexico, and JINR, Dubna, Russia); Christoph Schweigert (University of Hamburg, Germany); Reidun Twarock (York University, UK); Luc Vinet (Montréal University, Canada); Apostolos Vourdas (Bradford University, UK); Kurt Wolf (UNAM, Mexico). Local Organising Committee: Maia Angelova, Chair (Northumbria University, Newcastle); Wojtek Zakrzewski, Chair (Durham University, Durham); Sarah Howells, Secretary (Northumbria University, Newcastle); Jeremy Ellman, Web (Northumbria University, Newcastle); Véronique Hussin (Northumbria, Durham and University of Montréal); Safwat Mansi (Northumbria University, Newcastle); James McLaughlin (Northumbria University, Newcastle); Bernard Piette (Durham University, Durham); Ghanim Putrus (Northumbria University, Newcastle); Sarah Rees (Newcastle University, Newcastle); Petia Sice (Northumbria University, Newcastle); Anne Taormina (Durham University, Durham); Rosemary Zakrzewski (accompanying persons programme). Lighthouse photograph by Bernard Piette: Souter Lighthouse, Marsden, Tyne and Wear, England.

  8. Characterizing parallel file-access patterns on a large-scale multiprocessor

    NASA Technical Reports Server (NTRS)

    Purakayastha, A.; Ellis, Carla; Kotz, David; Nieuwejaar, Nils; Best, Michael L.

    1995-01-01

    High-performance parallel file systems are needed to satisfy tremendous I/O requirements of parallel scientific applications. The design of such high-performance parallel file systems depends on a comprehensive understanding of the expected workload, but so far there have been very few usage studies of multiprocessor file systems. This paper is part of the CHARISMA project, which intends to fill this void by measuring real file-system workloads on various production parallel machines. In particular, we present results from the CM-5 at the National Center for Supercomputing Applications. Our results are unique because we collect information about nearly every individual I/O request from the mix of jobs running on the machine. Analysis of the traces leads to various recommendations for parallel file-system design.

  9. Parallel imports and the pricing of pharmaceutical products: evidence from the European Union.

    PubMed

    Ganslandt, Mattias; Maskus, Keith E

    2004-09-01

    We consider policy issues regarding parallel imports (PIs) of brand-name pharmaceuticals in the European Union, where such trade is permitted. We develop a simple model in which an original manufacturer competes in its home market with PI firms. The model suggests that for small trade costs the original manufacturer will accommodate the import decisions of parallel traders and that the price in the home market falls as the volume of parallel imports rises. Using data from Sweden we find that the prices of drugs subject to competition from parallel imports fell relative to other drugs over the period 1994-1999. Econometric analysis finds that parallel imports significantly reduced manufacturing prices, by 12-19%. There is evidence that this effect increases with multiple PI entrants.

  10. Intrahost Evolution of Methicillin-Resistant Staphylococcus aureus USA300 Among Individuals With Reoccurring Skin and Soft-Tissue Infections

    PubMed Central

    Azarian, Taj; Daum, Robert S.; Petty, Lindsay A.; Steinbeck, Jenny L.; Yin, Zachary; Nolan, David; Boyle-Vavra, Susan; Hanage, W. P.; Salemi, Marco; David, Michael Z.

    2016-01-01

    Background. Methicillin-resistant Staphylococcus aureus (MRSA) USA300 is the leading cause of MRSA infections in the United States and has caused an epidemic of skin and soft-tissue infections. Recurrent infections with USA300 MRSA are common, yet intrahost evolution during persistence on an individual has not been studied. This gap hinders the ability to clinically manage recurrent infections and reconstruct transmission networks. Methods. To characterize bacterial intrahost evolution, we examined the clinical courses of 4 subjects with 3–6 recurrent USA300 MRSA infections, using patient clinical data, including antibiotic exposure history, and whole-genome sequencing and phylogenetic analysis of all available MRSA isolates (n = 29). Results. Among sequential isolates, we found variability in diversity, accumulation of mutations, and mobile genetic elements. Selection for antimicrobial-resistant populations was observed through both an increase in the number of plasmids conferring multidrug resistance and strain replacement by a resistant population. Two of 4 subjects had strain replacement with a genetically distinct USA300 MRSA population. Discussion. During a 5-year period in 4 subjects, we identified development of antimicrobial resistance, intrahost evolution, and strain replacement among isolates from patients with recurrent MRSA infections. This calls into question the efficacy of decolonization to prevent recurrent infections and highlights the adaptive potential of USA300 and the need for effective sampling. PMID:27288537

  11. Synthesizing parallel imaging applications using the CAP (computer-aided parallelization) tool

    NASA Astrophysics Data System (ADS)

    Gennart, Benoit A.; Mazzariol, Marc; Messerli, Vincent; Hersch, Roger D.

    1997-12-01

    Imaging applications such as filtering, image transforms and compression/decompression require vast amounts of computing power when applied to large data sets. These applications would potentially benefit from the use of parallel processing. However, dedicated parallel computers are expensive and their processing power per node lags behind that of the most recent commodity components. Furthermore, developing parallel applications remains a difficult task: writing and debugging the application is difficult (deadlocks), programs may not be portable from one parallel architecture to the other, and performance often comes short of expectations. In order to facilitate the development of parallel applications, we propose the CAP computer-aided parallelization tool which enables application programmers to specify at a high-level of abstraction the flow of data between pipelined-parallel operations. In addition, the CAP tool supports the programmer in developing parallel imaging and storage operations. CAP enables combining efficiently parallel storage access routines and image processing sequential operations. This paper shows how processing and I/O intensive imaging applications must be implemented to take advantage of parallelism and pipelining between data access and processing. This paper's contribution is (1) to show how such implementations can be compactly specified in CAP, and (2) to demonstrate that CAP specified applications achieve the performance of custom parallel code. The paper analyzes theoretically the performance of CAP specified applications and demonstrates the accuracy of the theoretical analysis through experimental measurements.
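    The pipelining idea behind tools like CAP, overlapping parallel storage access with sequential processing, can be sketched in a few lines (this is not CAP itself; `read_tile` and `process_tile` are hypothetical stand-ins): prefetch the next data tile while the current one is being processed.

```python
# Illustrative sketch of pipelining between data access and processing, the
# pattern the CAP tool lets programmers specify at a high level (this is not
# CAP itself; read_tile and process_tile are hypothetical stand-ins).

from concurrent.futures import ThreadPoolExecutor

def read_tile(i):
    """Stand-in for a parallel storage access routine."""
    return list(range(i * 4, i * 4 + 4))

def process_tile(tile):
    """Stand-in for a sequential image-processing operation."""
    return sum(tile)

def pipelined(n_tiles):
    """Read tile i+1 concurrently while tile i is being processed."""
    results = []
    with ThreadPoolExecutor(max_workers=1) as pool:
        future = pool.submit(read_tile, 0)
        for i in range(n_tiles):
            tile = future.result()            # wait for the current tile
            if i + 1 < n_tiles:
                future = pool.submit(read_tile, i + 1)  # prefetch next tile
            results.append(process_tile(tile))
    return results

out = pipelined(4)
```

When read time and processing time are comparable, this overlap roughly halves the wall-clock time of the naive read-then-process loop, which is the effect CAP's pipelined-parallel operation specifications aim to capture.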

  12. Robust variance estimation with dependent effect sizes: practical considerations including a software tutorial in Stata and SPSS.

    PubMed

    Tanner-Smith, Emily E; Tipton, Elizabeth

    2014-03-01

    Methodologists have recently proposed robust variance estimation as one way to handle dependent effect sizes in meta-analysis. Software macros for robust variance estimation in meta-analysis are currently available for Stata (StataCorp LP, College Station, TX, USA) and SPSS (IBM, Armonk, NY, USA), yet there is little guidance for authors regarding the practical application and implementation of those macros. This paper provides a brief tutorial on the implementation of the Stata and SPSS macros and discusses practical issues meta-analysts should consider when estimating meta-regression models with robust variance estimates. Two example databases are used in the tutorial to illustrate the use of meta-analysis with robust variance estimates. Copyright © 2013 John Wiley & Sons, Ltd.
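    The cluster-robust "sandwich" idea underlying robust variance estimation can be hedged with a minimal NumPy sketch (this is not the Stata/SPSS macro, which implements small-sample refinements not shown here): residuals are aggregated by study, so dependence among effect sizes within a study does not invalidate the standard errors. The toy data are assumed.

```python
import numpy as np

# Minimal sketch of the cluster-robust "sandwich" variance idea behind robust
# variance estimation (illustrative; the Stata/SPSS macros add refinements
# such as working weights and small-sample corrections). Effect sizes within
# the same study (cluster) may be correlated, so residuals are aggregated by
# cluster in the "meat" of the estimator.

def cluster_robust_se(X, y, clusters):
    """OLS meta-regression coefficients with cluster-robust standard errors."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    bread = np.linalg.inv(X.T @ X)
    meat = np.zeros((X.shape[1], X.shape[1]))
    for c in np.unique(clusters):
        Xc, ec = X[clusters == c], resid[clusters == c]
        s = Xc.T @ ec                 # cluster-level score
        meat += np.outer(s, s)
    V = bread @ meat @ bread          # the sandwich
    return beta, np.sqrt(np.diag(V))

# Toy data: 6 effect sizes from 3 studies (2 dependent effects per study),
# regressed on an intercept and one moderator.
X = np.column_stack([np.ones(6), np.array([0., 0., 1., 1., 2., 2.])])
y = np.array([0.1, 0.2, 0.35, 0.45, 0.55, 0.70])
clusters = np.array([0, 0, 1, 1, 2, 2])
beta, se = cluster_robust_se(X, y, clusters)
```

The point estimates are the usual meta-regression coefficients; only the variance calculation changes, which is why the macros can be layered on top of familiar regression workflows.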

  13. SAPNEW: Parallel finite element code for thin shell structures on the Alliant FX-80

    NASA Astrophysics Data System (ADS)

    Kamat, Manohar P.; Watson, Brian C.

    1992-11-01

    The finite element method has proven to be an invaluable tool for analysis and design of complex, high performance systems, such as bladed-disk assemblies in aircraft turbofan engines. However, as the problem size increases, the computation time required by conventional computers can be prohibitively high. Parallel processing computers provide the means to overcome these computation time limits. This report summarizes the results of a research activity aimed at providing a finite element capability for analyzing turbomachinery bladed-disk assemblies in a vector/parallel processing environment. A special purpose code, named with the acronym SAPNEW, has been developed to perform static and eigen analysis of multi-degree-of-freedom blade models built up from flat thin shell elements. SAPNEW provides a stand-alone capability for static and eigen analysis on the Alliant FX/80, a parallel processing computer. A preprocessor, named with the acronym NTOS, has been developed to accept NASTRAN input decks and convert them to the SAPNEW format to make SAPNEW more readily usable by researchers at NASA Lewis Research Center.

  14. Substructure analysis using NICE/SPAR and applications of force to linear and nonlinear structures. [spacecraft masts

    NASA Technical Reports Server (NTRS)

    Razzaq, Zia; Prasad, Venkatesh; Darbhamulla, Siva Prasad; Bhati, Ravinder; Lin, Cai

    1987-01-01

    Parallel computing studies are presented for a variety of structural analysis problems. Included are the substructure planar analysis of rectangular panels with and without a hole, the static analysis of a space mast using NICE/SPAR and FORCE, and the substructure analysis of plane rigid-jointed frames using FORCE. The computations are carried out on the Flex/32 MultiComputer using one to eighteen processors. The NICE/SPAR runstream samples are documented for the panel problem. For the substructure analysis of plane frames, a computer program is developed to demonstrate the effectiveness of the substructuring technique when FORCE is employed. Ongoing research activities for an elasto-plastic stability analysis problem using FORCE, and for stability analysis of the focus problem using NICE/SPAR, are briefly summarized. Speedup curves for the panel, the mast, and the frame problems provide a basic understanding of the effectiveness of the parallel computing procedures utilized or developed, within the domain of the parameters considered. Although the speedup curves obtained exhibit various levels of computational efficiency, they clearly demonstrate the excellent promise which parallel computing holds for the structural analysis problem. Source code is given for the elasto-plastic stability problem and the FORCE program.

  15. A Novel Design of 4-Class BCI Using Two Binary Classifiers and Parallel Mental Tasks

    PubMed Central

    Geng, Tao; Gan, John Q.; Dyson, Matthew; Tsui, Chun SL; Sepulveda, Francisco

    2008-01-01

    A novel 4-class single-trial brain computer interface (BCI) based on two (rather than four or more) binary linear discriminant analysis (LDA) classifiers is proposed, which is called a “parallel BCI.” Unlike other BCIs where mental tasks are executed and classified in a serial way one after another, the parallel BCI uses properly designed parallel mental tasks that are executed on both sides of the subject body simultaneously, which is the main novelty of the BCI paradigm used in our experiments. Each of the two binary classifiers only classifies the mental tasks executed on one side of the subject body, and the results of the two binary classifiers are combined to give the result of the 4-class BCI. Data was recorded in experiments with both real movement and motor imagery in 3 able-bodied subjects. Artifacts were not detected or removed. Offline analysis has shown that, in some subjects, the parallel BCI can generate a higher accuracy than a conventional 4-class BCI, although both of them have used the same feature selection and classification algorithms. PMID:18584040

  16. On nonlinear finite element analysis in single-, multi- and parallel-processors

    NASA Technical Reports Server (NTRS)

    Utku, S.; Melosh, R.; Islam, M.; Salama, M.

    1982-01-01

    Numerical solution of nonlinear equilibrium problems of structures by means of Newton-Raphson type iterations is reviewed. Each step of the iteration is shown to correspond to the solution of a linear problem, therefore the feasibility of the finite element method for nonlinear analysis is established. Organization and flow of data for various types of digital computers, such as single-processor/single-level memory, single-processor/two-level-memory, vector-processor/two-level-memory, and parallel-processors, with and without sub-structuring (i.e. partitioning) are given. The effect of the relative costs of computation, memory and data transfer on substructuring is shown. The idea of assigning comparable size substructures to parallel processors is exploited. Under Cholesky type factorization schemes, the efficiency of parallel processing is shown to decrease due to the occasional shared data, just as that due to the shared facilities.
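    The review's central observation, that each Newton-Raphson step reduces a nonlinear equilibrium problem to a linear solve, can be sketched on a one-degree-of-freedom problem (the stiffening-spring law below is hypothetical; in a real finite element code the scalar division becomes a linear system solve with the tangent stiffness matrix).

```python
# One-degree-of-freedom sketch of the Newton-Raphson iteration described
# above: each step solves the linearized problem K_t * du = residual.
# The stiffening-spring constitutive law used here is hypothetical.

def internal_force(u):
    return 10.0 * u + 4.0 * u ** 3        # nonlinear spring: k*u + c*u^3

def tangent_stiffness(u):
    return 10.0 + 12.0 * u ** 2           # derivative of internal_force

def newton_equilibrium(load, u0=0.0, tol=1e-10, max_iter=50):
    """Solve internal_force(u) = load by Newton-Raphson iteration."""
    u = u0
    for _ in range(max_iter):
        residual = load - internal_force(u)
        if abs(residual) < tol:
            break
        # In a multi-DOF finite element model this division is the linear
        # solve K_t(u) * du = residual that each iteration requires.
        u += residual / tangent_stiffness(u)
    return u

u_star = newton_equilibrium(30.0)
```

Because every iteration is itself a linear finite element problem, the data-flow organizations the paper discusses for linear analysis (single-processor, vector, and parallel, with or without substructuring) carry over directly to the nonlinear case.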

  17. Discrete sensitivity derivatives of the Navier-Stokes equations with a parallel Krylov solver

    NASA Technical Reports Server (NTRS)

    Ajmani, Kumud; Taylor, Arthur C., III

    1994-01-01

    This paper solves an 'incremental' form of the sensitivity equations derived by differentiating the discretized thin-layer Navier Stokes equations with respect to certain design variables of interest. The equations are solved with a parallel, preconditioned Generalized Minimal RESidual (GMRES) solver on a distributed-memory architecture. The 'serial' sensitivity analysis code is parallelized by using the Single Program Multiple Data (SPMD) programming model, domain decomposition techniques, and message-passing tools. Sensitivity derivatives are computed for low and high Reynolds number flows over a NACA 1406 airfoil on a 32-processor Intel Hypercube, and found to be identical to those computed on a single-processor Cray Y-MP. It is estimated that the parallel sensitivity analysis code has to be run on 40-50 processors of the Intel Hypercube in order to match the single-processor processing time of a Cray Y-MP.

  18. Visualizing Parallel Computer System Performance

    NASA Technical Reports Server (NTRS)

    Malony, Allen D.; Reed, Daniel A.

    1988-01-01

    Parallel computer systems are among the most complex of man's creations, making satisfactory performance characterization difficult. Despite this complexity, there are strong, indeed almost irresistible, incentives to quantify parallel system performance using a single metric. The fallacy lies in succumbing to such temptations. A complete performance characterization requires not only an analysis of the system's constituent levels; it also requires both static and dynamic characterizations. Static or average-behavior analysis may mask transients that dramatically alter system performance. Although the human visual system is remarkably adept at interpreting and identifying anomalies in false-color data, the importance of dynamic, visual scientific data presentation has only recently been recognized. Large, complex parallel systems pose equally vexing performance-interpretation problems. Data from hardware and software performance monitors must be presented in ways that emphasize important events while eliding irrelevant details. Design approaches and tools for performance visualization are the subject of this paper.

  19. PFLOTRAN-E4D: A parallel open source PFLOTRAN module for simulating time-lapse electrical resistivity data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Timothy C.; Hammond, Glenn E.; Chen, Xingyuan

    Time-lapse electrical resistivity tomography (ERT) is finding increased application for remotely monitoring processes occurring in the near subsurface in three dimensions (i.e., 4D monitoring). However, there are few codes capable of simulating the evolution of subsurface resistivity and the corresponding tomographic measurements arising from a particular process, particularly in parallel and with an open source license. Herein we describe and demonstrate an electrical resistivity tomography module for the PFLOTRAN subsurface simulation code, named PFLOTRAN-E4D. The PFLOTRAN-E4D module operates in parallel using a dedicated set of compute cores in a master-slave configuration. At each time step, the master process receives subsurface states from PFLOTRAN, converts those states to bulk electrical conductivity, and instructs the slave processes to simulate a tomographic data set. The resulting multi-physics simulation capability enables accurate feasibility studies for ERT imaging, the identification of the ERT signatures that are unique to a given process, and facilitates the joint inversion of ERT data with hydrogeological data for subsurface characterization. PFLOTRAN-E4D is demonstrated herein using a field study of stage-driven groundwater/river water interaction ERT monitoring along the Columbia River, Washington, USA. Results demonstrate the complex nature of changes in subsurface electrical conductivity, in both the saturated and unsaturated zones, arising from water-table changes and from river water intrusion into the aquifer. The results also demonstrate the sensitivity of surface-based ERT measurements to those changes over time. PFLOTRAN-E4D is available with the PFLOTRAN development version with an open-source license at https://bitbucket.org/pflotran/pflotran-dev.

  20. PFLOTRAN-E4D: A parallel open source PFLOTRAN module for simulating time-lapse electrical resistivity data

    NASA Astrophysics Data System (ADS)

    Johnson, Timothy C.; Hammond, Glenn E.; Chen, Xingyuan

    2017-02-01

    Time-lapse electrical resistivity tomography (ERT) is finding increased application for remotely monitoring processes occurring in the near subsurface in three dimensions (i.e., 4D monitoring). However, there are few codes capable of simulating the evolution of subsurface resistivity and the corresponding tomographic measurements arising from a particular process, particularly in parallel and with an open source license. Herein we describe and demonstrate an electrical resistivity tomography module for the PFLOTRAN subsurface flow and reactive transport simulation code, named PFLOTRAN-E4D. The PFLOTRAN-E4D module operates in parallel using a dedicated set of compute cores in a master-slave configuration. At each time step, the master process receives subsurface states from PFLOTRAN, converts those states to bulk electrical conductivity, and instructs the slave processes to simulate a tomographic data set. The resulting multi-physics simulation capability enables accurate feasibility studies for ERT imaging, the identification of the ERT signatures that are unique to a given process, and facilitates the joint inversion of ERT data with hydrogeological data for subsurface characterization. PFLOTRAN-E4D is demonstrated herein using a field study of stage-driven groundwater/river water interaction ERT monitoring along the Columbia River, Washington, USA. Results demonstrate the complex nature of subsurface electrical conductivity changes, in both the saturated and unsaturated zones, arising from river stage fluctuations and associated river water intrusion into the aquifer. The results also demonstrate the sensitivity of surface-based ERT measurements to those changes over time. PFLOTRAN-E4D is available with the PFLOTRAN development version with an open-source license at https://bitbucket.org/pflotran/pflotran-dev.
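    The "states to bulk electrical conductivity" step performed by the master process can be sketched with Archie's law, a standard petrophysical relation; the transform actually used by PFLOTRAN-E4D is not specified in the abstract, and the parameter values below are assumed.

```python
# Sketch of the master process's "states -> bulk electrical conductivity"
# step using Archie's law, a standard petrophysical relation (the actual
# transform used by PFLOTRAN-E4D is not specified in the abstract; the
# porosity, saturation, and exponent values below are assumed).

def archie_bulk_conductivity(porosity, saturation, fluid_conductivity,
                             m=1.8, n=2.0):
    """Bulk conductivity via Archie's law: sigma_b = sigma_w * phi^m * S^n."""
    return fluid_conductivity * porosity ** m * saturation ** n

# A change in saturation or fluid conductivity (e.g. during river-stage
# fluctuations) changes the bulk conductivity seen by the ERT simulation,
# in both the saturated and unsaturated zones.
before = archie_bulk_conductivity(0.30, 0.60, 0.05)
after = archie_bulk_conductivity(0.30, 0.95, 0.05)
```

Evaluating such a relation cell by cell is what lets the coupled simulation translate PFLOTRAN's hydrologic states into the resistivity field from which the slave processes compute tomographic measurements.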

  1. Causes of sinks near Tucson, Arizona, USA

    USGS Publications Warehouse

    Hoffmann, J.P.; Pool, D.R.; Konieczki, A.D.; Carpenter, M.C.

    1998-01-01

    Land subsidence in the form of sinks has occurred on and near farmlands near Tucson, Pima County, Arizona, USA. The sinks occur in alluvial deposits along the flood plain of the Santa Cruz River, and have made farmlands dangerous and unsuitable for farming. More than 1700 sinks are confined to the flood plain of the Santa Cruz River and are grouped along two north-northwestward-trending bands that are approximately parallel to the river and other flood-plain drainages. An estimated 17,000 m3 of sediment have been removed in the formation of the sinks. Thirteen trenches were dug to depths of 4-6 m to characterize near-surface sediments in sink and nonsink areas. Sediments below about 2 m included a large percentage of dispersive clays in sink areas. Sediments in nonsink areas contain a large component of medium- to coarse-grained, moderately to well sorted sand that probably fills a paleochannel. Electromagnetic surveys support the association of silts and clays in sink areas that are highly electrically conductive relative to sand in nonsink areas. Sinks probably are caused by the near-surface process of subsurface erosion of dispersive sediments along pre-existing cracks in predominantly silt and clay sediments. The pre-existing cracks probably result from desiccation or tension that developed during periods of water-table decline and channel incision during the past 100 years or in earlier periods.

  2. Counter rotating fans — An aircraft propulsion for the future?

    NASA Astrophysics Data System (ADS)

    Schimming, Peter

    2003-05-01

    In the mid-seventies a new propulsor for aircraft was designed and investigated: the so-called PROPFAN. With regard to total pressure increase, it ranges between a conventional propeller and a turbofan with a very high bypass ratio. This new propulsion system promised a reduction in fuel consumption of 15 to 25% compared with engines of that time. A lot of propfans (Hamilton Standard, USA) with different numbers of blades and blade shapes were designed and tested in wind tunnels in order to find an optimum in efficiency (Fig. 1). Parallel to this development, GE, USA, designed a counter-rotating unducted propfan, the so-called UDF (Fig. 2). A prototype engine was manufactured and investigated on an in-flight test bed mounted on the MD82 and the B727. Since that time there has not been any further development of propfans (except the AN 70 with the NK 90 engine, Ukraine, which is more or less a propeller design) due to relatively low fuel prices and technical obstacles. However, technical programs in different countries are still going on in order to prepare a database for designing counter-rotating fans in terms of aeroacoustics, aerodynamics and aeroelasticity. At DLR, Germany, a lot of experimental and numerical work has been undertaken to understand the physical behaviour of the unsteady flow in a counter-rotating fan.

  3. Life as an early career researcher: interview with Catherine Martel.

    PubMed

    Martel, Catherine

    2016-03-01

Catherine Martel speaks to Francesca Lake, Managing Commissioning Editor: Catherine Martel obtained her PhD from the Université de Montréal and pursued a postdoctoral fellowship first at Mount Sinai School of Medicine in New York (NY, USA), then at Washington University School of Medicine in St Louis (MO, USA), and obtained the Junior Investigator Award for Women from the Arteriosclerosis, Thrombosis and Vascular Biology council of the American Heart Association. Her postdoctoral work is certainly groundbreaking and brings forward new considerations in the field: she discovered that the lymphatic vessel route, the network that runs in parallel with the blood vessels, is critical for removing cholesterol from multiple tissues, including the aortic wall. In 2013, she joined the Arteriosclerosis, Thrombosis and Vascular Biology Early Career Committee, eager to bring a Canadian perspective to the group and get involved in council activities. Since 2014, she has been an Assistant Professor in the Department of Medicine at the Université de Montréal, and a research scientist at the Montreal Heart Institute. Her research program now focuses on characterizing the physiopathologic role of the lymphatics in the initiation, progression and regression of atherosclerosis. Basic and translational research will allow her team to identify the causes of lymphatic dysfunction, and eventually target potential therapeutic strategies aimed at improving lymphatic function at the different levels of atherothrombotic disease. You can follow her laboratory at @LaboMartel_ICM.

  4. What is Catholic about Catholic Charities?

    PubMed

    Degeneffe, Charles Edmund

    2003-07-01

Sectarian social services agencies play an important and increasing role in contemporary social welfare. Among sectarian social welfare organizations, Catholic Charities USA has emerged as the largest private provider of social welfare services. This article reviews the history, services, and practice controversies of Catholic Charities USA and examines issues regarding the ability of sectarian social services organizations to provide nonbiased and fair services. Through an analysis of this organization, the author raises and discusses questions of accountability and philosophical approaches.

  5. Lichen-based indices to quantify responses to climate and air pollution across northeastern U.S.A.

    Treesearch

    Susan Will-Wolf; Sarah Jovan; Peter Neitlich; JeriLynn E. Peck; Roger Rosentreter

    2015-01-01

    Lichens are known to be indicators for air quality; they also respond to climate. We developed indices for lichen response to climate and air quality in forests across the northeastern United States of America (U.S.A.), using 218–250 plot surveys with 145–161 macrolichen taxa from the Forest Inventory and Analysis (FIA) Program of the U.S. Department of Agriculture,...

  6. Adulteration and cultivation region identification of American ginseng using HPLC coupled with multivariate analysis

    PubMed Central

    Yu, Chunhao; Wang, Chong-Zhi; Zhou, Chun-Jie; Wang, Bin; Han, Lide; Zhang, Chun-Feng; Wu, Xiao-Hui; Yuan, Chun-Su

    2014-01-01

American ginseng (Panax quinquefolius) is native to North America. Due to price differences and supply shortages, American ginseng has recently been cultivated in northern China. Furthermore, in the market, some Asian ginseng is labeled as American ginseng. In this study, forty-three American ginseng samples cultivated in the USA, Canada or China were collected and 14 ginseng saponins were determined using HPLC. HPLC coupled with hierarchical cluster analysis and principal component analysis was developed to identify the species. Subsequently, an HPLC-linear discriminant analysis was established to discriminate the cultivation regions of American ginseng. This method was successfully applied to identify the sources of 6 commercial American ginseng samples. Two of them were identified as Asian ginseng, while the 4 others were identified as American ginseng, cultivated in the USA (3) and China (1). Our newly developed method can be used to identify American ginseng from different cultivation regions. PMID:25044150

  7. Color stability of ceramic brackets immersed in potentially staining solutions

    PubMed Central

    Guignone, Bruna Coser; Silva, Ludimila Karsbergen; Soares, Rodrigo Villamarim; Akaki, Emilio; Goiato, Marcelo Coelho; Pithon, Matheus Melo; Oliveira, Dauro Douglas

    2015-01-01

OBJECTIVE: To assess the color stability of five types of ceramic brackets after immersion in potentially staining solutions. METHODS: Ninety brackets were divided into 5 groups (n = 18) according to their commercial brands and the solutions in which they were immersed (coffee, red wine, coke and artificial saliva). The brackets assessed were Transcend (3M/Unitek, Monrovia, CA, USA), Radiance (American Orthodontics, Sheboygan, WI, USA), Mystique (GAC International Inc., Bohemia, NY, USA) and Luxi II (Rocky Mountain Orthodontics, Denver, CO, USA). Chromatic changes were analyzed with the aid of a reflectance spectrophotometer and by visual inspection at five specific time intervals: as received from the manufacturer (T0), and after 24 hours (T1), 72 hours (T2), 7 days (T3) and 14 days (T4) of immersion in the aforementioned solutions. Results were submitted to statistical analysis with ANOVA and Bonferroni correction, as well as to a multivariate profile analysis for independent and paired samples, with the significance level set at 5%. RESULTS: The duration of the immersion period influenced the color alteration of all tested brackets, even though these changes could not always be visually observed. Different behaviors were observed for each immersion solution; however, brackets immersed in a given solution progressed similarly despite minor variations. CONCLUSIONS: Staining became more intense over time, and all brackets underwent color alterations when immersed in the aforementioned solutions. PMID:26352842

  8. Color stability of ceramic brackets immersed in potentially staining solutions.

    PubMed

    Guignone, Bruna Coser; Silva, Ludimila Karsbergen; Soares, Rodrigo Villamarim; Akaki, Emilio; Goiato, Marcelo Coelho; Pithon, Matheus Melo; Oliveira, Dauro Douglas

    2015-01-01

To assess the color stability of five types of ceramic brackets after immersion in potentially staining solutions. Ninety brackets were divided into 5 groups (n = 18) according to their commercial brands and the solutions in which they were immersed (coffee, red wine, coke and artificial saliva). The brackets assessed were Transcend (3M/Unitek, Monrovia, CA, USA), Radiance (American Orthodontics, Sheboygan, WI, USA), Mystique (GAC International Inc., Bohemia, NY, USA) and Luxi II (Rocky Mountain Orthodontics, Denver, CO, USA). Chromatic changes were analyzed with the aid of a reflectance spectrophotometer and by visual inspection at five specific time intervals: as received from the manufacturer (T0), and after 24 hours (T1), 72 hours (T2), 7 days (T3) and 14 days (T4) of immersion in the aforementioned solutions. Results were submitted to statistical analysis with ANOVA and Bonferroni correction, as well as to a multivariate profile analysis for independent and paired samples, with the significance level set at 5%. The duration of the immersion period influenced the color alteration of all tested brackets, even though these changes could not always be visually observed. Different behaviors were observed for each immersion solution; however, brackets immersed in a given solution progressed similarly despite minor variations. Staining became more intense over time, and all brackets underwent color alterations when immersed in the aforementioned solutions.

  9. Computer architecture evaluation for structural dynamics computations: Project summary

    NASA Technical Reports Server (NTRS)

    Standley, Hilda M.

    1989-01-01

    The intent of the proposed effort is the examination of the impact of the elements of parallel architectures on the performance realized in a parallel computation. To this end, three major projects are developed: a language for the expression of high level parallelism, a statistical technique for the synthesis of multicomputer interconnection networks based upon performance prediction, and a queueing model for the analysis of shared memory hierarchies.

  10. Surface topography study of prepared 3D printed moulds via 3D printer for silicone elastomer based nasal prosthesis

    NASA Astrophysics Data System (ADS)

    Abdullah, Abdul Manaf; Din, Tengku Noor Daimah Tengku; Mohamad, Dasmawati; Rahim, Tuan Noraihan Azila Tuan; Akil, Hazizan Md; Rajion, Zainul Ahmad

    2016-12-01

Conventional prosthesis fabrication depends heavily on the manual skill of the laboratory technologist. Developments in 3D printing technology offer great help in fabricating affordable and fast, yet esthetically acceptable, prostheses. This study was conducted to discover the potential of 3D printed moulds for indirect silicone elastomer based nasal prosthesis fabrication. Moulds were designed using computer aided design (CAD) software (Solidworks, USA) and converted into the standard tessellation language (STL) file format. Three moulds with layer thicknesses of 0.1, 0.2 and 0.3 mm were printed utilizing a polymer filament based 3D printer (Makerbot Replicator 2X, Makerbot, USA). One additional mould was printed utilizing a liquid resin based 3D printer (Objet 30 Scholar, Stratasys, USA) as a control. The printed moulds were then used to fabricate maxillofacial silicone specimens (n = 10 per mould). A surface profilometer (Surfcom Flex, Accretech, Japan), digital microscope (KH77000, Hirox, USA) and scanning electron microscope (Quanta FEG 450, Fei, USA) were used to measure the surface roughness as well as the topological properties of the fabricated silicone. One-way ANOVA was employed to compare the surface roughness of the fabricated silicone elastomer. The results demonstrated significant differences in the surface roughness of the fabricated silicone (p<0.01). Further post hoc analysis also revealed significant differences among silicones fabricated using different 3D printed moulds (p<0.01). A 3D printed mould was successfully prepared and characterized. With surface topography that can be enhanced, and with inexpensive and rapid mould fabrication, the polymer filament based 3D printer has potential for indirect silicone elastomer based nasal prosthesis fabrication.

  11. Independent and Parallel Evolution of New Genes by Gene Duplication in Two Origins of C4 Photosynthesis Provides New Insight into the Mechanism of Phloem Loading in C4 Species.

    PubMed

    Emms, David M; Covshoff, Sarah; Hibberd, Julian M; Kelly, Steven

    2016-07-01

C4 photosynthesis is considered one of the most remarkable examples of evolutionary convergence in eukaryotes. However, it is unknown whether the evolution of C4 photosynthesis required the evolution of new genes. Genome-wide gene-tree species-tree reconciliation of seven monocot species that span two origins of C4 photosynthesis revealed that there was significant parallelism in the duplication and retention of genes coincident with the evolution of C4 photosynthesis in these lineages. Specifically, 21 orthologous genes were duplicated and retained independently in parallel at both C4 origins. Analysis of this gene cohort revealed that the set of parallel duplicated and retained genes is enriched for genes that are preferentially expressed in bundle sheath cells, the cell type in which photosynthesis was activated during C4 evolution. Furthermore, functional analysis of the cohort of parallel duplicated genes identified SWEET-13 as a potential key transporter in the evolution of C4 photosynthesis in grasses, and provides new insight into the mechanism of phloem loading in these C4 species. Key words: C4 photosynthesis, gene duplication, gene families, parallel evolution. © The Author 2016. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.

  12. Parallel eigenanalysis of finite element models in a completely connected architecture

    NASA Technical Reports Server (NTRS)

    Akl, F. A.; Morel, M. R.

    1989-01-01

A parallel algorithm is presented for the solution of the generalized eigenproblem in linear elastic finite element analysis, (K)(phi) = (M)(phi)(omega), where (K) and (M) are of order N, and (omega) is of order q. The concurrent solution of the eigenproblem is based on the multifrontal/modified subspace method and is achieved in a completely connected parallel architecture in which each processor is allowed to communicate with all other processors. The algorithm was successfully implemented on a tightly coupled multiple-instruction multiple-data parallel processing machine, the Cray X-MP. A finite element model is divided into m domains, each of which is assumed to contain n elements. Each domain is then assigned to a processor, or to a logical processor (task) if the number of domains exceeds the number of physical processors. The macrotasking library routines are used in mapping each domain to a user task. Computational speed-up and efficiency are used to determine the effectiveness of the algorithm. The effects of the number of domains, the number of degrees-of-freedom located along the global fronts, and the dimension of the subspace on the performance of the algorithm are investigated. A parallel finite element dynamic analysis program, p-feda, is documented and the performance of its subroutines in a parallel environment is analyzed.
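
The Rayleigh-Ritz core of a subspace method for (K)(phi) = (M)(phi)(omega) can be sketched briefly. The serial NumPy code below is a generic illustration of inverse subspace iteration, not the multifrontal p-feda implementation; the spring-chain test matrices are invented for the example.

```python
import numpy as np

def subspace_iteration(K, M, q, iters=50, seed=0):
    """Approximate the q lowest eigenpairs of the generalized problem
    K @ phi = M @ phi * omega by inverse subspace iteration."""
    n = K.shape[0]
    Phi = np.random.default_rng(seed).standard_normal((n, q))
    for _ in range(iters):
        X = np.linalg.solve(K, M @ Phi)          # inverse-iteration step
        Kr, Mr = X.T @ K @ X, X.T @ M @ X        # Rayleigh-Ritz projection
        L = np.linalg.cholesky(Mr)               # reduce small problem to standard form
        Linv = np.linalg.inv(L)
        w, V = np.linalg.eigh(Linv @ Kr @ Linv.T)
        Phi = X @ (Linv.T @ V)                   # rotate basis toward the modes
    return w, Phi

# Invented test problem: a 1-D spring chain (tridiagonal K) with unit masses.
n = 20
K = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
M = np.eye(n)
w, Phi = subspace_iteration(K, M, q=3)
exact = np.sort(np.linalg.eigvalsh(K))[:3]
print(np.allclose(np.sort(w), exact, atol=1e-8))  # True
```

In the parallel setting described by the abstract, the factorization and matrix products would be distributed across the m domains; the small q-by-q projected problem stays cheap and serial.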

  13. A visual parallel-BCI speller based on the time-frequency coding strategy

    NASA Astrophysics Data System (ADS)

    Xu, Minpeng; Chen, Long; Zhang, Lixin; Qi, Hongzhi; Ma, Lan; Tang, Jiabei; Wan, Baikun; Ming, Dong

    2014-04-01

Objective. Spelling is one of the most important issues in brain-computer interface (BCI) research. This paper develops a visual parallel-BCI speller system based on the time-frequency coding strategy, in which the sub-speller switching among four simultaneously presented sub-spellers and the character selection are identified in a parallel mode. Approach. The parallel-BCI speller was constituted by four independent P300+SSVEP-B (P300 plus SSVEP blocking) spellers with different flicker frequencies, so that every character had a specific time-frequency code. To verify its effectiveness, 11 subjects were involved in the offline and online spellings. A classification strategy was designed to recognize the target character by jointly using canonical correlation analysis and stepwise linear discriminant analysis. Main results. Online spellings showed that the proposed parallel-BCI speller had a high performance, reaching a highest information transfer rate of 67.4 bit min-1, with averages of 54.0 bit min-1 and 43.0 bit min-1 for three rounds and five rounds, respectively. Significance. The results indicated that the proposed parallel-BCI could be effectively controlled by users, with attention shifting fluently among the sub-spellers, and markedly improved the BCI spelling performance.
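
The canonical correlation analysis half of the classification strategy is the standard SSVEP frequency-recognition technique; a minimal sketch on invented synthetic data (the channel count, sampling rate, candidate frequencies and noise model are illustrative assumptions, not the authors' protocol) is:

```python
import numpy as np

def cca_corr(X, Y):
    """Largest canonical correlation between data matrix X and reference set Y."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    Qx, _ = np.linalg.qr(X)
    Qy, _ = np.linalg.qr(Y)
    # Canonical correlations are the singular values of Qx.T @ Qy.
    return np.linalg.svd(Qx.T @ Qy, compute_uv=False)[0]

def recognize_frequency(eeg, fs, freqs, n_harmonics=2):
    """Pick the flicker frequency whose sine/cosine references best match the EEG."""
    t = np.arange(eeg.shape[0]) / fs
    scores = [cca_corr(eeg, np.column_stack(
                  [fn(2 * np.pi * f * h * t)
                   for h in range(1, n_harmonics + 1) for fn in (np.sin, np.cos)]))
              for f in freqs]
    return freqs[int(np.argmax(scores))]

# Invented synthetic data: 8 channels dominated by a 12 Hz SSVEP plus noise.
rng = np.random.default_rng(1)
fs = 250
t = np.arange(2 * fs) / fs
eeg = (np.sin(2 * np.pi * 12 * t)[:, None] * rng.uniform(0.5, 1.0, 8)
       + 0.5 * rng.standard_normal((len(t), 8)))
print(recognize_frequency(eeg, fs, [8.0, 10.0, 12.0, 15.0]))  # 12.0
```

In the paper's pipeline this CCA score would be combined with a stepwise linear discriminant analysis stage on the P300 features; that stage is omitted here.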

  14. Independent and Parallel Evolution of New Genes by Gene Duplication in Two Origins of C4 Photosynthesis Provides New Insight into the Mechanism of Phloem Loading in C4 Species

    PubMed Central

    Emms, David M.; Covshoff, Sarah; Hibberd, Julian M.; Kelly, Steven

    2016-01-01

    C4 photosynthesis is considered one of the most remarkable examples of evolutionary convergence in eukaryotes. However, it is unknown whether the evolution of C4 photosynthesis required the evolution of new genes. Genome-wide gene-tree species-tree reconciliation of seven monocot species that span two origins of C4 photosynthesis revealed that there was significant parallelism in the duplication and retention of genes coincident with the evolution of C4 photosynthesis in these lineages. Specifically, 21 orthologous genes were duplicated and retained independently in parallel at both C4 origins. Analysis of this gene cohort revealed that the set of parallel duplicated and retained genes is enriched for genes that are preferentially expressed in bundle sheath cells, the cell type in which photosynthesis was activated during C4 evolution. Furthermore, functional analysis of the cohort of parallel duplicated genes identified SWEET-13 as a potential key transporter in the evolution of C4 photosynthesis in grasses, and provides new insight into the mechanism of phloem loading in these C4 species. Key words: C4 photosynthesis, gene duplication, gene families, parallel evolution. PMID:27016024

  15. Accuracy analysis and design of A3 parallel spindle head

    NASA Astrophysics Data System (ADS)

    Ni, Yanbing; Zhang, Biao; Sun, Yupeng; Zhang, Yuan

    2016-03-01

As functional components of machine tools, parallel mechanisms are widely used in high efficiency machining of aviation components, and accuracy is one of their critical technical indexes. Many researchers have focused on the accuracy problem of parallel mechanisms, but in terms of controlling the errors and improving the accuracy at the design and manufacturing stage, further efforts are required. Aiming at the accuracy design of a 3-DOF parallel spindle head (A3 head), its error model, sensitivity analysis and tolerance allocation are investigated. Based on the inverse kinematic analysis, the error model of the A3 head is established using first-order perturbation theory and the vector chain method. According to the mapping property of the motion and constraint Jacobian matrices, the compensatable and uncompensatable error sources which affect the accuracy of the end-effector are separated. Furthermore, sensitivity analysis is performed on the uncompensatable error sources. A sensitivity probabilistic model is established and a global sensitivity index is proposed to analyze the influence of the uncompensatable error sources on the accuracy of the end-effector of the mechanism. The results show that orientation error sources have a greater effect on the accuracy of the end-effector. Based on the sensitivity analysis results, the tolerance design is converted into a nonlinearly constrained optimization problem with minimum manufacturing cost as the objective. By utilizing a genetic algorithm, the allocation of the tolerances on each component is finally determined. According to the tolerance allocation results, the tolerance ranges of ten kinds of geometric error sources are obtained. These research achievements can provide fundamental guidelines for component manufacturing and assembly of this kind of parallel mechanism.
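
The final tolerance-allocation step, a constrained cost minimization solved with a genetic algorithm, can be illustrated with a toy model. The sketch below uses an invented cost model (cost proportional to 1/tolerance), invented sensitivities and accuracy budget, and a stripped-down evolutionary loop (selection plus mutation, no crossover), so it indicates the idea rather than reproducing the A3-head procedure.

```python
import numpy as np

def allocate_tolerances(sens, cost_coef, budget, pop=200, gens=100, seed=0):
    """GA-style search: minimize cost = sum(c_i / t_i) subject to sens @ t <= budget,
    with the accuracy constraint handled by a penalty term."""
    rng = np.random.default_rng(seed)
    lo, hi = 1e-4, budget / sens.max()

    def fitness(T):
        cost = (cost_coef / T).sum(axis=1)                  # looser tolerances are cheaper
        penalty = 1e6 * np.maximum(T @ sens - budget, 0.0)  # punish accuracy violations
        return cost + penalty

    P = rng.uniform(lo, hi, (pop, len(sens)))
    for _ in range(gens):
        parents = P[np.argsort(fitness(P))[: pop // 2]]     # keep the fitter half
        children = parents + rng.normal(0.0, 0.02 * (hi - lo), parents.shape)
        P = np.vstack([parents, np.clip(children, lo, hi)])
    return P[np.argmin(fitness(P))]

# Invented stand-in numbers: per-source sensitivities, unit costs, accuracy budget.
sens = np.array([2.0, 1.0, 0.5])
cost = np.array([1.0, 1.0, 1.0])
budget = 0.1
t = allocate_tolerances(sens, cost, budget)
print(t, float(t @ sens))
```

The real design problem has ten geometric error sources and a calibrated sensitivity model; the structure (cost objective, accuracy constraint, evolutionary search) is the same.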

  16. Constraint treatment techniques and parallel algorithms for multibody dynamic analysis. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Chiou, Jin-Chern

    1990-01-01

Computational procedures for the kinematic and dynamic analysis of three-dimensional multibody dynamic (MBD) systems are developed from the differential-algebraic equations (DAEs) viewpoint. Constraint violations during the time integration process are minimized, and penalty constraint stabilization techniques and partitioning schemes are developed. The governing equations of motion are treated with a two-stage staggered explicit-implicit numerical algorithm that takes advantage of a partitioned solution procedure. A robust and parallelizable integration algorithm is developed. This algorithm uses a two-stage staggered central difference algorithm to integrate the translational coordinates and the angular velocities. The angular orientations of bodies in MBD systems are then obtained by an implicit algorithm via the kinematic relationship between Euler parameters and angular velocities. It is shown that the combination of the present solution procedures yields a computationally more accurate solution. To speed up the computational procedures, parallel implementation of the present constraint treatment techniques and the two-stage staggered explicit-implicit numerical algorithm was efficiently carried out. The DAEs and the constraint treatment techniques were transformed into arrowhead matrices, from which the Schur complement form was derived. By fully exploiting sparse matrix structural analysis techniques, a parallel preconditioned conjugate gradient numerical algorithm is used to solve the system equations written in Schur complement form. A software testbed was designed and implemented on both sequential and parallel computers. This testbed was used to demonstrate the robustness and efficiency of the constraint treatment techniques, the accuracy of the two-stage staggered explicit-implicit numerical algorithm, and the speedup of the Schur-complement-based parallel preconditioned conjugate gradient algorithm on a parallel computer.
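
The Schur-complement solve relies on a preconditioned conjugate gradient kernel. A minimal serial PCG with a Jacobi (diagonal) preconditioner on an invented SPD test system sketches that kernel; the parallelization and the actual arrowhead/Schur structure are omitted.

```python
import numpy as np

def pcg(A, b, M_inv, tol=1e-10, maxiter=200):
    """Preconditioned conjugate gradients for an SPD matrix A.
    M_inv is a callable applying the preconditioner to a residual vector."""
    x = np.zeros_like(b)
    r = b - A @ x
    z = M_inv(r)
    p = z.copy()
    rz = r @ z
    for _ in range(maxiter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p          # conjugate search direction update
        rz = rz_new
    return x

# Invented SPD test system with a simple Jacobi (diagonal) preconditioner.
rng = np.random.default_rng(0)
B = rng.standard_normal((50, 50))
A = B @ B.T + 50 * np.eye(50)
b = rng.standard_normal(50)
d = np.diag(A)
x = pcg(A, b, lambda r: r / d)
print(np.allclose(A @ x, b, atol=1e-8))  # True
```

In the testbed described above, the matrix-vector product and preconditioner application are the operations distributed across processors; the scalar recurrences are unchanged.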

  17. Study of Solid State Drives performance in PROOF distributed analysis system

    NASA Astrophysics Data System (ADS)

    Panitkin, S. Y.; Ernst, M.; Petkus, R.; Rind, O.; Wenaus, T.

    2010-04-01

Solid state drives (SSDs) are a promising storage technology for High Energy Physics parallel analysis farms. Their combination of low random access time and relatively high read speed is very well suited for situations where multiple jobs concurrently access data located on the same drive. They also have lower energy consumption and higher vibration tolerance than hard disk drives (HDDs), which makes them an attractive choice in many applications ranging from personal laptops to large analysis farms. The Parallel ROOT Facility (PROOF) is a distributed analysis system which makes it possible to exploit the inherent event-level parallelism of high energy physics data. PROOF is especially efficient together with distributed local storage systems like Xrootd, when data are distributed over computing nodes. In such an architecture the local disk subsystem I/O performance becomes a critical factor, especially when computing nodes use multi-core CPUs. We will discuss our experience with SSDs in the PROOF environment. We will compare the performance of HDDs with SSDs in I/O intensive analysis scenarios. In particular we will discuss PROOF system performance scaling with the number of simultaneously running analysis jobs.

  18. A Parallel Framework with Block Matrices of a Discrete Fourier Transform for Vector-Valued Discrete-Time Signals.

    PubMed

    Soto-Quiros, Pablo

    2015-01-01

This paper presents a parallel implementation of a kind of discrete Fourier transform (DFT): the vector-valued DFT. The vector-valued DFT is a novel tool to analyze the spectra of vector-valued discrete-time signals. This parallel implementation is developed in terms of a mathematical framework with a set of block matrix operations. These block matrix operations contribute to the analysis, design, and implementation of parallel algorithms on multicore processors. In this work, an implementation and experimental investigation of the mathematical framework are performed using MATLAB with the Parallel Computing Toolbox. We found that there is an advantage to using multicore processors and a parallel computing environment to reduce the high execution time. Additionally, speedup increases when the number of logical processors and the length of the signal increase.
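
Because each component of a vector-valued signal transforms independently, the block structure parallelizes naturally. The sketch below uses NumPy and a thread pool rather than the authors' MATLAB Parallel Computing Toolbox framework, so it illustrates the idea, not their implementation.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def vector_valued_dft(signal, workers=4):
    """DFT of an (N, m) vector-valued discrete-time signal.

    Each of the m components is an ordinary length-N DFT, so the transform
    splits into independent column blocks that can be computed concurrently.
    """
    with ThreadPoolExecutor(max_workers=workers) as pool:
        cols = list(pool.map(np.fft.fft, signal.T))  # one block (column) per task
    return np.column_stack(cols)

# Invented example: a 2-component signal, one pure 5-cycle tone per component.
n = 64
t = np.arange(n)
x = np.column_stack([np.cos(2 * np.pi * 5 * t / n),
                     np.sin(2 * np.pi * 5 * t / n)])
X = vector_valued_dft(x)
print(X.shape, int(np.argmax(np.abs(X[:, 0]))))  # (64, 2) 5
```

The result matches applying the ordinary DFT column by column; the point of the block formulation is that the columns (or larger blocks of them) can be dispatched to separate cores.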

  19. AC losses in horizontally parallel HTS tapes for possible wireless power transfer applications

    NASA Astrophysics Data System (ADS)

    Shen, Boyang; Geng, Jianzhao; Zhang, Xiuchang; Fu, Lin; Li, Chao; Zhang, Heng; Dong, Qihuan; Ma, Jun; Gawith, James; Coombs, T. A.

    2017-12-01

This paper presents the concept of using horizontally parallel HTS tapes, together with a study of their AC losses and an investigation of possible wireless power transfer (WPT) applications. An example of three parallel HTS tapes was proposed, whose AC losses were studied both experimentally, using the electrical method, and numerically, using a 2D H-formulation on the FEM platform of COMSOL Multiphysics. The electromagnetic induction around the three parallel tapes was monitored using the COMSOL simulation. The electromagnetic induction and AC losses generated by a conventional three-turn coil were simulated as well, and then compared to the case of three parallel tapes carrying the same AC transport current. The analysis demonstrates that HTS parallel tapes could potentially be used in wireless power transfer systems, with lower total AC losses than conventional HTS coils.

  20. An Expert System for the Development of Efficient Parallel Code

    NASA Technical Reports Server (NTRS)

    Jost, Gabriele; Chun, Robert; Jin, Hao-Qiang; Labarta, Jesus; Gimenez, Judit

    2004-01-01

    We have built the prototype of an expert system to assist the user in the development of efficient parallel code. The system was integrated into the parallel programming environment that is currently being developed at NASA Ames. The expert system interfaces to tools for automatic parallelization and performance analysis. It uses static program structure information and performance data in order to automatically determine causes of poor performance and to make suggestions for improvements. In this paper we give an overview of our programming environment, describe the prototype implementation of our expert system, and demonstrate its usefulness with several case studies.

  1. Parallel log structured file system collective buffering to achieve a compact representation of scientific and/or dimensional data

    DOEpatents

    Grider, Gary A.; Poole, Stephen W.

    2015-09-01

    Collective buffering and data pattern solutions are provided for storage, retrieval, and/or analysis of data in a collective parallel processing environment. For example, a method can be provided for data storage in a collective parallel processing environment. The method comprises receiving data to be written for a plurality of collective processes within a collective parallel processing environment, extracting a data pattern for the data to be written for the plurality of collective processes, generating a representation describing the data pattern, and saving the data and the representation.
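
The pattern-extraction idea can be illustrated with a toy sketch: detect that a set of per-process (offset, length) writes forms a fixed-stride pattern and replace the full offset list with a compact (start, stride, count, length) tuple. This is an invented illustration of the concept, not the patented implementation.

```python
def extract_pattern(writes):
    """Compress (offset, length) write records into (start, stride, count, length)
    when they form a regular strided pattern; return None otherwise."""
    offsets = sorted(o for o, _ in writes)
    lengths = {length for _, length in writes}
    if len(lengths) != 1 or len(offsets) < 2:
        return None                       # irregular sizes: no compact representation
    stride = offsets[1] - offsets[0]
    if any(b - a != stride for a, b in zip(offsets, offsets[1:])):
        return None                       # offsets are not equally spaced
    return (offsets[0], stride, len(offsets), lengths.pop())

# Four ranks each writing a 1 MiB block at a fixed 4 MiB stride (arrival order shuffled).
writes = [(i * 4 * 2**20, 2**20) for i in (2, 0, 3, 1)]
print(extract_pattern(writes))  # (0, 4194304, 4, 1048576)
```

Storing the four-tuple instead of every per-process offset is the "compact representation" of the abstract: the pattern can later regenerate any individual write location for retrieval or analysis.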

  2. Efficient computation of aerodynamic influence coefficients for aeroelastic analysis on a transputer network

    NASA Technical Reports Server (NTRS)

    Janetzke, David C.; Murthy, Durbha V.

    1991-01-01

    Aeroelastic analysis is multi-disciplinary and computationally expensive. Hence, it can greatly benefit from parallel processing. As part of an effort to develop an aeroelastic capability on a distributed memory transputer network, a parallel algorithm for the computation of aerodynamic influence coefficients is implemented on a network of 32 transputers. The aerodynamic influence coefficients are calculated using a 3-D unsteady aerodynamic model and a parallel discretization. Efficiencies up to 85 percent were demonstrated using 32 processors. The effect of subtask ordering, problem size, and network topology are presented. A comparison to results on a shared memory computer indicates that higher speedup is achieved on the distributed memory system.

  3. An Expert Assistant for Computer Aided Parallelization

    NASA Technical Reports Server (NTRS)

    Jost, Gabriele; Chun, Robert; Jin, Haoqiang; Labarta, Jesus; Gimenez, Judit

    2004-01-01

The prototype implementation of an expert system was developed to assist the user in the computer aided parallelization process. The system interfaces to tools for automatic parallelization and performance analysis. By fusing static program structure information and dynamic performance analysis data, the expert system can help the user to filter, correlate, and interpret the data gathered by the existing tools. Sections of the code that show poor performance and require further attention are rapidly identified, and suggestions for improvements are presented to the user. In this paper we describe the components of the expert system and discuss its interface to the existing tools. We present a case study to demonstrate its successful use in full-scale scientific applications.


  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bennett, Janine Camille; Thompson, David; Pebay, Philippe Pierre

Statistical analysis is typically used to reduce the dimensionality of and infer meaning from data. A key challenge of any statistical analysis package aimed at large-scale, distributed data is to address the orthogonal issues of parallel scalability and numerical stability. Many statistical techniques, e.g., descriptive statistics or principal component analysis, are based on moments and co-moments and, using robust online update formulas, can be computed in an embarrassingly parallel manner, amenable to a map-reduce style implementation. In this paper we focus on contingency tables, through which numerous derived statistics such as joint and marginal probability, point-wise mutual information, information entropy, and chi-square independence statistics can be directly obtained. However, contingency tables can become large as data size increases, requiring a correspondingly large amount of communication between processors. This potential increase in communication prevents optimal parallel speedup and is the main difference with moment-based statistics (which we discussed in [1]) where the amount of inter-processor communication is independent of data size. Here we present the design trade-offs which we made to implement the computation of contingency tables in parallel. We also study the parallel speedup and scalability properties of our open source implementation. In particular, we observe optimal speed-up and scalability when the contingency statistics are used in their appropriate context, namely, when the data input is not quasi-diffuse.
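
The map-reduce structure described above, where each processor tallies a local contingency table and the partial tables are then merged, can be sketched as follows; the data and its chunking are invented for illustration, and real shards would live on separate processors.

```python
from collections import Counter
from functools import reduce

def map_chunk(pairs):
    """'Map' step: each worker tallies its local shard into a partial table."""
    return Counter(pairs)

def merge_tables(a, b):
    """'Reduce' step: Counter addition merges partial contingency tables."""
    return a + b

# Invented categorical data, split into shards standing in for per-process chunks.
data = [("sunny", "yes"), ("rain", "no"), ("sunny", "yes"),
        ("rain", "yes"), ("sunny", "no"), ("rain", "no")]
chunks = [data[:3], data[3:]]
table = reduce(merge_tables, map(map_chunk, chunks))
n = sum(table.values())

# Derived statistics come straight from the merged table.
joint = {k: v / n for k, v in table.items()}
print(table[("sunny", "yes")], round(joint[("rain", "no")], 3))  # 2 0.333
```

The communication cost the paper discusses corresponds to shipping these partial Counter objects between processors: unlike fixed-size moment accumulators, the tables grow with the number of distinct category pairs.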

  5. Parallel labeling experiments and metabolic flux analysis: Past, present and future methodologies.

    PubMed

    Crown, Scott B; Antoniewicz, Maciek R

    2013-03-01

Radioactive and stable isotopes have been applied for decades to elucidate metabolic pathways and quantify carbon flow in cellular systems using mass and isotope balancing approaches. Isotope-labeling experiments can be conducted as a single tracer experiment or as parallel labeling experiments. In the latter case, several experiments are performed under identical conditions except for the choice of substrate labeling. In this review, we highlight robust approaches for probing metabolism and addressing metabolically related questions through parallel labeling experiments. In the first part, we provide a brief historical perspective on parallel labeling experiments, from the early metabolic studies when radioisotopes were predominant to present-day applications based on stable isotopes. We also elaborate on important technical and theoretical advances that have facilitated the transition from radioisotopes to stable isotopes. In the second part of the review, we focus on parallel labeling experiments for (13)C-metabolic flux analysis ((13)C-MFA). Parallel experiments offer several advantages that include: tailoring experiments to resolve specific fluxes with high precision; reducing the length of labeling experiments by introducing multiple entry points of isotopes; validating biochemical network models; and improving the performance of (13)C-MFA in systems where the number of measurements is limited. We conclude by discussing some challenges facing the use of parallel labeling experiments for (13)C-MFA and highlight the need to address issues related to biological variability, data integration, and rational tracer selection. Copyright © 2012 Elsevier Inc. All rights reserved.

  6. Stream Dissolved Organic Matter Quantity and Quality Along a Wetland-Cropland Catchment Gradient

    NASA Astrophysics Data System (ADS)

    McDonough, O.; Hosen, J. D.; Lang, M. W.; Oesterling, R.; Palmer, M.

    2012-12-01

    Wetlands may be critical sources of dissolved organic matter (DOM) to stream networks. Yet, more than half of wetlands in the continental United States have been lost since European settlement, with the majority of loss attributed to agriculture. The degree to which agricultural loss of wetlands impacts stream DOM is largely unknown and may have important ecological implications. Using twenty headwater catchments on the Delmarva Peninsula (Maryland, USA), we investigated the seasonal influence of wetland and cropland coverage on downstream DOM quantity and quality. In addition to quantifying bulk downstream dissolved organic carbon (DOC) concentration, we used a suite of DOM UV-absorbance metrics and parallel factor analysis (PARAFAC) modeling of excitation-emission fluorescence spectra (EEMs) to characterize DOM composition. Percent bioavailable DOC (%BDOC) was measured during the Spring sampling using a 28-day incubation. Percent wetland coverage and % cropland within the watersheds were significantly negatively correlated (r = -0.93, p < 0.001). Results show that % wetland coverage was positively correlated with stream DOM concentration, molecular weight, aromaticity, humic-like fluorescence, and allochthonous origin. Conversely, increased wetland coverage was negatively correlated with stream DOM protein-like fluorescence. Percent BDOC decreased with DOM humic-like fluorescence and increased with protein-like fluorescence. We observed minimal seasonal interaction between % wetland coverage and DOM concentration and composition across Spring, Fall, and Winter sampling seasons. However, principal component analysis suggested more pronounced seasonal differences exist in stream DOM. This study highlights the influence of wetlands on downstream DOM in agriculturally impacted landscapes where loss of wetlands to cultivation may significantly alter stream DOM quantity and quality.

  7. DESCRIPTION OF VIRIDILOBUS MARINUS (GEN. ET SP. NOV.), A NEW RAPHIDOPHYTE FROM DELAWARE'S INLAND BAYS.

    PubMed

    Demir-Hilton, Elif; Hutchins, David A; Czymmek, Kirk J; Coyne, Kathryn J

    2012-10-01

    Delaware's Inland Bays (DIB), USA, are subject to blooms of potentially harmful raphidophytes, including Heterosigma akashiwo. In 2004, a dense bloom was observed in a low salinity tributary of the DIB. Light microscopy initially suggested that the species was H. akashiwo; however, the cells were smaller than anticipated. 18S rDNA sequences of isolated cultures differed substantially from all raphidophyte sequences in GenBank. Phylogenetic analysis placed it approximately equidistant from Chattonella and Heterosigma with only ~96% sequence homology with either group. Here, we describe this marine raphidophyte as a novel genus and species, Viridilobus marinus (gen. et sp. nov.). We also compared this species with H. akashiwo, because both species are superficially similar with respect to morphology and their ecological niches overlap. V. marinus cells are ovoid to spherical (11.4 × 9.4 μm), and the average number of chloroplasts (4 per cell) is lower than in H. akashiwo (15 per cell). Pigment analysis of V. marinus revealed the presence of fucoxanthin, violaxanthin, and zeaxanthin, which are characteristic of marine raphidophytes within the family Chattonellaceae of the Raphidophyceae. TEM and confocal microscopy, however, revealed diagnostic microscopic and ultrastructural characteristics that distinguish it from other raphidophytes. Chloroplasts were in close association with the nucleus and thylakoids were arranged either parallel or perpendicular to the cell surface. Putative mucocysts were identified, but trichocysts were not observed. These features, along with DNA sequence data, distinguish this species from all other raphidophyte genera within the family Chattonellaceae of the Raphidophyceae. © 2012 Phycological Society of America.

  8. Gut microbiota in early pediatric multiple sclerosis: a case-control study.

    PubMed

    Tremlett, Helen; Fadrosh, Douglas W; Faruqi, Ali A; Zhu, Feng; Hart, Janace; Roalstad, Shelly; Graves, Jennifer; Lynch, Susan; Waubant, Emmanuelle

    2016-08-01

    Alterations in the gut microbial community composition may be influential in neurological disease. Microbial community profiles were compared between early onset pediatric multiple sclerosis (MS) and control children similar for age and sex. Children ≤18 years old within 2 years of MS onset or controls without autoimmune disorders attending a University of California, San Francisco, USA, pediatric clinic were examined for fecal bacterial community composition and predicted function by 16S ribosomal RNA sequencing and phylogenetic reconstruction of unobserved states (PICRUSt) analysis. Associations between subject characteristics and the microbiota, including beta diversity and taxa abundance, were identified using non-parametric tests, permutational multivariate analysis of variance and negative binomial regression. Eighteen relapsing-remitting MS cases and 17 controls (mean age 13 years; range 4-18) were studied. Cases had a short disease duration (mean 11 months; range 2-24) and half were immunomodulatory drug (IMD) naïve. Whilst overall gut bacterial beta diversity was not significantly related to MS status, IMD exposure was (Canberra, P < 0.02). However, relative to controls, MS cases had a significant enrichment in relative abundance for members of the Desulfovibrionaceae (Bilophila, Desulfovibrio and Christensenellaceae) and depletion in Lachnospiraceae and Ruminococcaceae (all P and q < 0.000005). Microbial genes predicted as enriched in MS versus controls included those involved in glutathione metabolism (Mann-Whitney, P = 0.017), findings that were consistent regardless of IMD exposure. In recent onset pediatric MS, perturbations in the gut microbiome composition were observed, in parallel with predicted enrichment of metabolic pathways associated with neurodegeneration. Findings were suggestive of a pro-inflammatory milieu. © 2016 EAN.

  9. HIV-1 transgenic rats display mitochondrial abnormalities consistent with abnormal energy generation and distribution.

    PubMed

    Villeneuve, Lance M; Purnell, Phillip R; Stauch, Kelly L; Callen, Shannon E; Buch, Shilpa J; Fox, Howard S

    2016-10-01

    With the advent of the combination antiretroviral therapy era (cART), the development of AIDS has been largely limited in the USA. Unfortunately, despite the development of efficacious treatments, HIV-1-associated neurocognitive disorders (HAND) can still develop, and as many HIV-1 positive individuals age, the prevalence of HAND is likely to rise because HAND manifests in the brain with very low levels of virus. However, the mechanism producing this viral disorder is still debated. Interestingly, HIV-1 infection exposes neurons to proteins including Tat, Nef, and Vpr which can drastically alter mitochondrial properties. Mitochondrial dysfunction has been posited to be a cornerstone of the development of numerous neurodegenerative diseases. Therefore, we investigated mitochondria in an animal model of HAND. Using an HIV-1 transgenic rat model expressing seven of the nine HIV-1 viral proteins, mitochondrial functional and proteomic analysis were performed on a subset of mitochondria that are particularly sensitive to cellular changes, the neuronal synaptic mitochondria. Quantitative mass spectroscopic studies followed by statistical analysis revealed extensive proteome alteration in this model paralleling mitochondrial abnormalities identified in HIV-1 animal models and HIV-1-infected humans. Novel mitochondrial protein changes were discovered in the electron transport chain (ETC), the glycolytic pathways, mitochondrial trafficking proteins, and proteins involved in various energy pathways, and these findings correlated well with the function of the mitochondria as assessed by a mitochondrial coupling and flux assay. By targeting these proteins and proteins upstream in the same pathway, we may be able to limit the development of HAND.

  10. Transplacental Passage of Acetaminophen in Term Pregnancy.

    PubMed

    Nitsche, Joshua F; Patil, Avinash S; Langman, Loralie J; Penn, Hannah J; Derleth, Douglas; Watson, William J; Brost, Brian C

    2017-05-01

    Objective: The objective of this study was to determine the maternal and fetal pharmacokinetic (PK) profiles of acetaminophen after administration of a therapeutic oral dose. Study Design: After obtaining Institutional Review Board approval and written informed consent, pregnant women were given a single oral dose (1,000 mg) of acetaminophen upon admission for scheduled cesarean delivery. Maternal venous blood and fetal cord blood were obtained at the time of delivery and acetaminophen levels were measured using gas chromatography-mass spectrometry. PK parameters were calculated by noncompartmental analysis. Nonparametric correlations of maternal/fetal acetaminophen levels and PK curves were calculated. Results: In this study, 34 subjects were enrolled (median age, 32 years; range, 25-39 years). The median maternal weight was 82 kg (range, 62-100 kg). All but two subjects were delivered beyond 39 weeks' gestation. The median newborn birth weight was 3,590 g (interquartile range, 3,403-3,848 g). Noncompartmental analysis described similar PK parameters in the maternal (T1/2, 84 minutes; apparent clearance [Cl/F], 28.8 L/h; apparent volume of distribution [Vd/F], 57.5 L) and fetal compartments (T1/2, 82 minutes; Cl/F, 31.2 L/h; Vd/F, 61.2 L). Paired maternal/fetal acetaminophen levels were highly correlated (p < 0.0001). Conclusion: Acetaminophen PKs in the fetus parallel those in the mother, suggesting that placental transfer is flow limited. Maternal acetaminophen levels can be used as a surrogate for fetal exposure.
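    The noncompartmental parameters reported above (T1/2 from the terminal log-linear slope, apparent clearance as Dose/AUC) can be sketched as follows. This is a generic illustration with exact, made-up one-compartment data, not the study's analysis; the function names are hypothetical.

```python
import math

def terminal_half_life(times, concs):
    # lambda_z from a log-linear least-squares fit of the terminal points
    logc = [math.log(c) for c in concs]
    n = len(times)
    mt, mc = sum(times) / n, sum(logc) / n
    slope = (sum((t - mt) * (y - mc) for t, y in zip(times, logc))
             / sum((t - mt) ** 2 for t in times))
    lam = -slope
    return math.log(2) / lam, lam

def auc_to_infinity(times, concs, lam):
    # linear trapezoidal AUC to the last sample, plus the extrapolated tail
    auc = sum((t2 - t1) * (c1 + c2) / 2.0
              for t1, t2, c1, c2 in zip(times, times[1:], concs, concs[1:]))
    return auc + concs[-1] / lam

# usage with exact one-compartment data, C(t) = 10 * exp(-0.5 t)
times = [1.0, 2.0, 3.0, 4.0, 6.0, 8.0]
concs = [10.0 * math.exp(-0.5 * t) for t in times]
t_half, lam = terminal_half_life(times, concs)    # t_half = ln 2 / 0.5
dose = 1000.0
cl_f = dose / auc_to_infinity(times, concs, lam)  # apparent clearance, Dose/AUC
```

    On these exact exponential data the fitted rate constant recovers 0.5/h, so the half-life is ln 2 / 0.5 ≈ 1.39 h; real samples would of course scatter around the fit.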

  11. Sierra Structural Dynamics User's Notes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reese, Garth M.

    2015-10-19

    Sierra/SD provides a massively parallel implementation of structural dynamics finite element analysis, required for high fidelity, validated models used in modal, vibration, static and shock analysis of weapons systems. This document provides a users guide to the input for Sierra/SD. Details of input specifications for the different solution types, output options, element types and parameters are included. The appendices contain detailed examples, and instructions for running the software on parallel platforms.

  12. Sierra/SD User's Notes.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Munday, Lynn Brendon; Day, David M.; Bunting, Gregory

    Sierra/SD provides a massively parallel implementation of structural dynamics finite element analysis, required for high fidelity, validated models used in modal, vibration, static and shock analysis of weapons systems. This document provides a users guide to the input for Sierra/SD. Details of input specifications for the different solution types, output options, element types and parameters are included. The appendices contain detailed examples, and instructions for running the software on parallel platforms.

  13. High-Performance Parallel Analysis of Coupled Problems for Aircraft Propulsion

    NASA Technical Reports Server (NTRS)

    Felippa, C. A.; Farhat, C.; Park, K. C.; Gumaste, U.; Chen, P.-S.; Lesoinne, M.; Stern, P.

    1997-01-01

    Applications are described of high-performance computing methods to the numerical simulation of complete jet engines. The methodology focuses on the partitioned analysis of the interaction of the gas flow with a flexible structure and with the fluid mesh motion driven by structural displacements. The latter is treated by an arbitrary Lagrangian-Eulerian (ALE) technique that models the fluid mesh motion as that of a fictitious mechanical network laid along the edges of near-field elements. New partitioned analysis procedures to treat this coupled three-component problem were developed. These procedures involve delayed corrections and subcycling, and have been successfully tested on several massively parallel computers, including the iPSC/860, Paragon XP/S and the IBM SP2. The NASA-sponsored ENG10 program was used for the global steady state analysis of the whole engine. This program uses a regular FV-multiblock-grid discretization in conjunction with circumferential averaging to include effects of blade forces, loss, combustor heat addition, blockage, bleeds and convective mixing. A load-balancing preprocessor for parallel versions of ENG10 was developed as well as the capability for the first full 3D aeroelastic simulation of a multirow engine stage. This capability was tested on the IBM SP2 parallel supercomputer at NASA Ames.
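    The staggered, subcycled time stepping described above can be illustrated with a toy two-field problem: the fast "fluid" field is advanced with several small substeps while the structural state is frozen, then the structure takes one full step using the end-of-step fluid load. This is a hedged sketch for intuition only; the model equations, parameters, and function name are invented and bear no relation to the ENG10 formulation.

```python
def subcycled_step(u, v, p, dt, m, tau, k, c):
    # Fluid field p is subcycled with m small steps while the structural
    # state (u, v) is frozen: a simple staggered partitioned scheme.
    dts = dt / m
    for _ in range(m):
        p += (dts / tau) * (v - p)   # toy "fluid": relaxation toward v
    # Structure then advances one full step using the end-of-step fluid load.
    a = -k * u + c * p               # toy structural acceleration
    v += dt * a
    u += dt * v
    return u, v, p

# usage: constant-velocity structure (k = c = 0); the subcycled fluid
# field should relax onto the structural velocity
u, v, p = 0.0, 1.0, 0.0
for _ in range(10):
    u, v, p = subcycled_step(u, v, p, dt=0.1, m=10, tau=0.05, k=0.0, c=0.0)
```

    Subcycling lets the fast field keep a stable small step (here dt/m = 0.01 against a relaxation time of 0.05) while the slow field, and the inter-field data exchange, advance only once per large step.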

  14. Microfluidic Platform for Parallel Single Cell Analysis for Diagnostic Applications.

    PubMed

    Le Gac, Séverine

    2017-01-01

    Cell populations are heterogeneous: they can comprise different cell types or even cells at different stages of the cell cycle and/or of biological processes. Furthermore, molecular processes taking place in cells are stochastic in nature. Therefore, cellular analysis must be brought down to the single cell level to get useful insight into biological processes, and to access essential molecular information that would be lost when using a cell population analysis approach. Furthermore, to fully characterize a cell population, ideally, information both at the single cell level and on the whole cell population is required, which calls for analyzing each individual cell in a population in a parallel manner. This single cell level analysis approach is particularly important for diagnostic applications to unravel molecular perturbations at the onset of a disease, to identify biomarkers, and for personalized medicine, not only because of the heterogeneity of the cell sample, but also due to the availability of a reduced amount of cells, or even unique cells. This chapter presents a versatile platform meant for the parallel analysis of individual cells, with a particular focus on diagnostic applications and the analysis of cancer cells. We first describe one essential step of this parallel single cell analysis protocol, which is the trapping of individual cells in dedicated structures. Following this, we report different steps of a whole analytical process, including on-chip cell staining and imaging, cell membrane permeabilization and/or lysis using either chemical or physical means, and retrieval of the cell molecular content in dedicated channels for further analysis. This series of experiments illustrates the versatility of the herein-presented platform and its suitability for various analysis schemes and different analytical purposes.

  15. A Study of Gaps in Attack Analysis

    DTIC Science & Technology

    2016-01-21

    www.threatstream.com/blog/shockpot-exploitation-analysis, September 2014. [86] Shobha Venkataraman, David Brumley, Subhabrata Sen, and Oliver... NDSS 2013, San Diego, California, USA, February 24-27, 2013. [87] Shobha Venkataraman, Subhabrata Sen, Oliver Spatscheck, Patrick Haffner, and

  16. Dragonfly: an implementation of the expand-maximize-compress algorithm for single-particle imaging.

    PubMed

    Ayyer, Kartik; Lan, Ti-Yen; Elser, Veit; Loh, N Duane

    2016-08-01

    Single-particle imaging (SPI) with X-ray free-electron lasers has the potential to change fundamentally how biomacromolecules are imaged. The structure would be derived from millions of diffraction patterns, each from a different copy of the macromolecule before it is torn apart by radiation damage. The challenges posed by the resultant data stream are staggering: millions of incomplete, noisy and un-oriented patterns have to be computationally assembled into a three-dimensional intensity map and then phase reconstructed. In this paper, the Dragonfly software package is described, based on a parallel implementation of the expand-maximize-compress reconstruction algorithm that is well suited for this task. Auxiliary modules to simulate SPI data streams are also included to assess the feasibility of proposed SPI experiments at the Linac Coherent Light Source, Stanford, California, USA.
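    A toy version of the expand-maximize-compress loop can make the idea concrete: here cyclic shifts of a 1-D Poisson signal stand in for the unknown 3-D orientations of diffraction patterns. This is a sketch for intuition only, assuming a shift-invariant model; the actual Dragonfly implementation (3-D orientations, sparse photon data, MPI parallelism) is far more involved.

```python
import numpy as np

def emc_iteration(model, patterns):
    """One expand-maximize-compress step for shift-invariant 1-D Poisson data."""
    n = model.size
    # Expand: candidate "orientations" are the n cyclic shifts of the model.
    views = np.maximum(np.stack([np.roll(model, r) for r in range(n)]), 1e-12)
    # Maximize: Poisson log-likelihood of each pattern under each view,
    # normalized into per-pattern orientation probabilities.
    logL = patterns @ np.log(views).T - views.sum(axis=1)
    logL -= logL.max(axis=1, keepdims=True)
    prob = np.exp(logL)
    prob /= prob.sum(axis=1, keepdims=True)
    # Compress: average the patterns rotated back into the model frame,
    # weighted by their orientation probabilities.
    new_model = np.zeros(n)
    for r in range(n):
        new_model += (prob[:, r, None] * np.roll(patterns, -r, axis=1)).sum(axis=0)
    return new_model / patterns.shape[0]

# usage on simulated data: noisy, un-oriented copies of one hidden signal
rng = np.random.default_rng(0)
true_signal = np.array([40.0, 10.0, 2.0, 2.0, 10.0, 2.0, 2.0, 2.0])
shifts = rng.integers(0, 8, size=200)
patterns = np.stack([rng.poisson(np.roll(true_signal, s))
                     for s in shifts]).astype(float)
model = rng.uniform(5.0, 15.0, size=8)   # random start to break symmetry
for _ in range(10):
    model = emc_iteration(model, patterns)
```

    Note the expectation-maximization structure: orientations are never assigned hard labels; each pattern contributes to every orientation in proportion to its likelihood, and the compress step conserves the mean photon count per pattern exactly.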

  17. Ellipsis Reconsidered

    ERIC Educational Resources Information Center

    Kertz, Laura

    2010-01-01

    I present an analysis of antecedent mismatch effects under ellipsis based on information structure, in which apparent syntactic parallelism effects are explained as a consequence of an information structural constraint requiring topic/comment parallelism for contrastive topics. Experimental findings in support of this hypothesis demonstrate first…

  18. A Parallel Particle Swarm Optimization Algorithm Accelerated by Asynchronous Evaluations

    NASA Technical Reports Server (NTRS)

    Venter, Gerhard; Sobieszczanski-Sobieski, Jaroslaw

    2005-01-01

    A parallel Particle Swarm Optimization (PSO) algorithm is presented. Particle swarm optimization is a fairly recent addition to the family of non-gradient based, probabilistic search algorithms that is based on a simplified social model and is closely tied to swarming theory. Although PSO algorithms present several attractive properties to the designer, they are plagued by high computational cost as measured by elapsed time. One approach to reduce the elapsed time is to make use of coarse-grained parallelization to evaluate the design points. Previous parallel PSO algorithms were mostly implemented in a synchronous manner, where all design points within a design iteration are evaluated before the next iteration is started. This approach leads to poor parallel speedup in cases where a heterogeneous parallel environment is used and/or where the analysis time depends on the design point being analyzed. This paper introduces an asynchronous parallel PSO algorithm that greatly improves the parallel efficiency. The asynchronous algorithm is benchmarked on a cluster assembled of Apple Macintosh G5 desktop computers, using the multi-disciplinary optimization of a typical transport aircraft wing as an example.
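    The synchronous/asynchronous distinction can be sketched with a thread pool: instead of waiting for all particles in an iteration, each particle is updated and resubmitted the moment its evaluation returns, so fast workers never idle behind slow ones. A minimal sketch, not the paper's implementation; the objective, coefficients, and function names are illustrative stand-ins for an expensive analysis code.

```python
import random
from concurrent.futures import ThreadPoolExecutor, wait, FIRST_COMPLETED

def sphere(x):
    # toy objective standing in for an expensive analysis code
    return sum(xi * xi for xi in x)

def async_pso(f, dim=2, particles=8, evals=200, w=0.7, c1=1.4, c2=1.4, seed=1):
    rng = random.Random(seed)
    X = [[rng.uniform(-5.0, 5.0) for _ in range(dim)] for _ in range(particles)]
    V = [[0.0] * dim for _ in range(particles)]
    pbest = [list(x) for x in X]
    pval = [float("inf")] * particles
    gbest, gval = None, float("inf")
    budget = evals - particles
    with ThreadPoolExecutor(max_workers=4) as pool:
        pending = {pool.submit(f, list(X[i])): i for i in range(particles)}
        while pending:
            done, _ = wait(pending, return_when=FIRST_COMPLETED)
            for fut in done:
                i = pending.pop(fut)
                val = fut.result()
                if val < pval[i]:
                    pval[i], pbest[i] = val, list(X[i])
                if val < gval:
                    gval, gbest = val, list(X[i])
                if budget > 0:
                    # no iteration barrier: update and resubmit this particle
                    # as soon as its evaluation returns
                    for d in range(dim):
                        V[i][d] = (w * V[i][d]
                                   + c1 * rng.random() * (pbest[i][d] - X[i][d])
                                   + c2 * rng.random() * (gbest[d] - X[i][d]))
                        X[i][d] += V[i][d]
                    pending[pool.submit(f, list(X[i]))] = i
                    budget -= 1
    return gbest, gval

best, val = async_pso(sphere)  # global best position and its objective value
```

    The price of asynchrony is that each velocity update may use a slightly stale global best; the paper's result is that this loss is outweighed by the improved processor utilization on heterogeneous clusters.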

  19. On the Optimality of Serial and Parallel Processing in the Psychological Refractory Period Paradigm: Effects of the Distribution of Stimulus Onset Asynchronies

    ERIC Educational Resources Information Center

    Miller, Jeff; Ulrich, Rolf; Rolke, Bettina

    2009-01-01

    Within the context of the psychological refractory period (PRP) paradigm, we developed a general theoretical framework for deciding when it is more efficient to process two tasks in serial and when it is more efficient to process them in parallel. This analysis suggests that a serial mode is more efficient than a parallel mode under a wide variety…

  20. Regression Analysis and the Sociological Imagination

    ERIC Educational Resources Information Center

    De Maio, Fernando

    2014-01-01

    Regression analysis is an important aspect of most introductory statistics courses in sociology but is often presented in contexts divorced from the central concerns that bring students into the discipline. Consequently, we present five lesson ideas that emerge from a regression analysis of income inequality and mortality in the USA and Canada.

  1. Risk Assessment of Hepatocellular Carcinoma in Patients with Hepatitis C in China and the USA.

    PubMed

    Parikh, Neehar D; Fu, Sherry; Rao, Huiying; Yang, Ming; Li, Yumeng; Powell, Corey; Wu, Elizabeth; Lin, Andy; Xing, Baocai; Wei, Lai; Lok, Anna S F

    2017-11-01

    Hepatitis C (HCV) infection is an increasingly common cause of hepatocellular carcinoma (HCC) in China. We aimed to determine differences in demographic and behavioral profiles associated with HCC in HCV+ patients in China and the USA. Consecutive HCV+ patients were recruited from centers in China and the USA. Clinical data and lifestyle profiles were obtained through standardized questionnaires. Multivariable analysis was conducted to determine factors associated with HCC diagnosis within groups. We included 41 HCC patients from China and 71 from the USA, and 931 non-HCC patients in China and 859 in the USA. Chinese patients with HCC were significantly younger, less likely to be male and to be obese than US patients with HCC (all p < 0.001). Chinese patients with HCC had a significantly lower rate of cirrhosis diagnosis (36.6 vs. 78.9%, p < 0.001); however, they also had a higher rate of hepatitis B core antibody positivity (63.4 vs. 36.8%, p = 0.007). In a multivariable analysis of the entire Chinese cohort, age > 55, male sex, the presence of diabetes, and time from maximum weight were associated with HCC, while tea consumption was associated with a decreased HCC risk (OR 0.37, 95% CI 0.16-0.88). In the US cohort, age > 55, male sex, and cirrhosis were associated with HCC on multivariable analysis. With the aging Chinese population and increasing rates of diabetes, there will likely be a continued increase in the incidence of HCV-related HCC in China. The protective effect of tea consumption on HCC development deserves further validation.

  2. Comparative Study of Middle School Students' Attitudes towards Science: Rasch Analysis of Entire TIMSS 2011 Attitudinal Data for England, Singapore and the U.S.A. as Well as Psychometric Properties of Attitudes Scale

    ERIC Educational Resources Information Center

    Oon, Pey Tee; Subramaniam, R.

    2018-01-01

    We report here on a comparative study of middle school students' attitudes towards science involving three countries: England, Singapore and the U.S.A. Complete attitudinal data sets from TIMSS (Trends in International Mathematics and Science Study) 2011 were used, thus giving a very large sample size (N = 20,246), compared to other studies in the…

  3. Disseminated histoplasmosis in a domestic cat imported from the USA to Austria

    PubMed Central

    Klang, Andrea; Loncaric, Igor; Spergser, Joachim; Eigelsreiter, Sabine; Weissenböck, Herbert

    2013-01-01

    We present a case of disseminated histoplasmosis in a domestic cat imported from the USA to Austria. Histopathological examination revealed a systemic mycosis with most severe involvement of the lungs, suggestive of Histoplasma (H.) capsulatum infection. Molecular confirmation was based on polymerase chain reaction (PCR) and sequence analysis of a fungal culture from liver samples. This is the first case of feline histoplasmosis proven by molecular diagnostic techniques in Europe and the first reported in Austria. PMID:24432230

  4. Homeland security in the USA: past, present, and future.

    PubMed

    Kemp, Roger L

    2012-01-01

    This paper examines the evolving and dynamic field of homeland security in the USA. Included in this analysis is the evolution of the creation of the Department of Homeland Security, an overview of the National Warning System, a summary of citizen support groups, and how the field of homeland security has had an impact on the location and architecture of public buildings and facilities. Also included are website directories of citizen support groups and federal agencies related to the field of homeland security.

  5. Isotope hydrology and baseflow geochemistry in natural and human-altered watersheds in the Inland Pacific Northwest, USA

    Treesearch

    Ricardo Sanchez-Murillo; Erin S. Brooks; William J. Elliot; Jan Boll

    2015-01-01

    This study presents a stable isotope hydrology and geochemical analysis in the inland Pacific Northwest (PNW) of the USA. Isotope ratios were used to estimate mean transit times (MTTs) in natural and human-altered watersheds using the FLOWPC program. Isotope ratios in precipitation resulted in a regional meteoric water line of δ²H = 7.42·δ¹⁸O + 0.88 (n = 316; r² = 0.97...
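    A meteoric water line like the one reported above is an ordinary least-squares fit of δ²H against δ¹⁸O across precipitation samples. The fit itself can be sketched as follows, using synthetic values placed exactly on the reported line rather than the study's 316 samples; the function name is hypothetical.

```python
def fit_lmwl(d18o, d2h):
    # ordinary least-squares fit: d2H = slope * d18O + intercept
    n = len(d18o)
    mx = sum(d18o) / n
    my = sum(d2h) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(d18o, d2h))
             / sum((x - mx) ** 2 for x in d18o))
    intercept = my - slope * mx
    return slope, intercept

# usage with synthetic delta values (in permil) lying exactly on the
# reported regional line
xs = [-18.0, -15.0, -12.0, -9.0]
ys = [7.42 * x + 0.88 for x in xs]
slope, intercept = fit_lmwl(xs, ys)   # recovers 7.42 and 0.88
```

    Comparing the fitted local slope and intercept against the global meteoric water line (slope 8, intercept 10) is the usual way such regional lines are interpreted, e.g. for evaporative enrichment.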

  6. Human Processes in Intelligence Analysis: Phase I Overview

    DTIC Science & Technology

    1979-12-01

    Several operating definitions were adopted. A basic definition was that... the model, and is based on field observations made from... Similarly, the IMINT analyst who understands the problems of the reconnaissance pilot has... Computer data bases, such as those of...

  7. Use Computer-Aided Tools to Parallelize Large CFD Applications

    NASA Technical Reports Server (NTRS)

    Jin, H.; Frumkin, M.; Yan, J.

    2000-01-01

    Porting applications to high performance parallel computers is always a challenging task. It is time consuming and costly. With rapid progress in hardware architectures and increasing complexity of real applications in recent years, the problem becomes even more severe. Today, scalability and high performance mostly involve handwritten parallel programs using message-passing libraries (e.g. MPI). However, this process is very difficult and often error-prone. The recent reemergence of shared memory parallel (SMP) architectures, such as the cache coherent Non-Uniform Memory Access (ccNUMA) architecture used in the SGI Origin 2000, shows good prospects for scaling beyond hundreds of processors. Programming on an SMP is simplified by working in a globally accessible address space. The user can supply compiler directives, such as OpenMP, to parallelize the code. As an industry standard for portable implementation of parallel programs for SMPs, OpenMP is a set of compiler directives and callable runtime library routines that extend Fortran, C and C++ to express shared memory parallelism. It promises an incremental path for parallel conversion of existing software, as well as scalability and performance for a complete rewrite or an entirely new development. Perhaps the main disadvantage of programming with directives is that inserted directives may not necessarily enhance performance. In the worst cases, they can create erroneous results. While vendors have provided tools to perform error-checking and profiling, automation in directive insertion is very limited and often fails on large programs, primarily due to the lack of a thorough enough data dependence analysis. To overcome this deficiency, we have developed a toolkit, CAPO, to automatically insert OpenMP directives in Fortran programs and apply certain degrees of optimization.
CAPO is aimed at taking advantage of the detailed inter-procedural dependence analysis provided by CAPTools, developed by the University of Greenwich, to reduce potential errors made by users. Earlier tests on the NAS Benchmarks and ARC3D have demonstrated good success of this tool. In this study, we have applied CAPO to parallelize three large applications in the area of computational fluid dynamics (CFD): OVERFLOW, TLNS3D and INS3D. These codes are widely used for solving the Navier-Stokes equations with complicated boundary conditions and turbulence models in multiple zones. Each one comprises 50K to 100K lines of FORTRAN77. As an example, CAPO took 77 hours to complete the data dependence analysis of OVERFLOW on a workstation (SGI, 175 MHz R10K processor). A fair amount of effort was spent on correcting false dependencies due to lack of necessary knowledge during the analysis. Even so, CAPO provides an easy way for the user to interact with the parallelization process. The OpenMP version was generated within a day after the analysis was completed. Due to the sequential algorithms involved, code sections in TLNS3D and INS3D needed to be restructured by hand to produce more efficient parallel codes. An included figure shows preliminary test results of the generated OVERFLOW with several test cases in a single zone. The MPI data points for the small test case were taken from a hand-coded MPI version. As we can see, CAPO's version achieved an 18-fold speedup on 32 nodes of the SGI O2K. For the small test case, it outperformed the MPI version. These results are very encouraging, but further work is needed. For example, although CAPO attempts to place directives on the outermost parallel loops in an interprocedural framework, it does not insert directives based on the best manual strategy. In particular, it lacks support for parallelization at the multi-zone level.
Future work will emphasize the development of methodology to work at the multi-zone level and with a hybrid approach. Development of tools to perform more complicated code transformations is also needed.

  8. Combined slope ratio analysis and linear-subtraction: An extension of the Pearce ratio method

    NASA Astrophysics Data System (ADS)

    De Waal, Sybrand A.

    1996-07-01

    A new technique, called combined slope ratio analysis, has been developed by extending the Pearce element ratio or conserved-denominator method (Pearce, 1968) to its logical conclusions. If two stoichiometric substances are mixed and certain chemical components are uniquely contained in either one of the two mixing substances, then by treating these unique components as conserved, the composition of the substance not containing the relevant component can be accurately calculated within the limits allowed by analytical and geological error. The calculated composition can then be subjected to rigorous statistical testing using the linear-subtraction method recently advanced by Woronow (1994). Application of combined slope ratio analysis to the rocks of the Uwekahuna Laccolith, Hawaii, USA, and the lavas of the 1959 summit eruption of Kilauea Volcano, Hawaii, USA, yields results that are consistent with field observations.
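    The conserved-component logic can be sketched numerically: if a component occurs in only one of two mixing end-members, its abundance in the mixture fixes the mixing fraction directly, and the unknown end-member follows by linear subtraction. This is a minimal sketch of the general idea with hypothetical compositions, not data or code from the cited study.

```python
def unmix(mixture, end_a, unique_comp):
    # unique_comp occurs only in end-member A, so its abundance in the
    # mixture fixes the mass fraction f of A directly
    f = mixture[unique_comp] / end_a[unique_comp]
    # linear subtraction then recovers the composition of end-member B
    end_b = {c: (mixture[c] - f * end_a.get(c, 0.0)) / (1.0 - f)
             for c in mixture}
    return f, end_b

# usage: end-member B contains none of component "u"; mix 40% A with 60% B
a = {"u": 10.0, "x": 50.0, "y": 40.0}
b_true = {"u": 0.0, "x": 20.0, "y": 80.0}
mix = {c: 0.4 * a[c] + 0.6 * b_true[c] for c in a}
f, b_est = unmix(mix, a, "u")   # f = 0.4; b_est recovers b_true
```

    With real analyses, analytical error propagates through the subtraction, which is why the abstract pairs this calculation with Woronow's statistical testing of the recovered composition.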

  9. Dietary intakes among South Asian adults differ by length of residence in the USA.

    PubMed

    Talegawkar, Sameera A; Kandula, Namratha R; Gadgil, Meghana D; Desai, Dipika; Kanaya, Alka M

    2016-02-01

    To examine whether nutrient and food intakes among South Asian adult immigrants differ by length of residence in the USA. Cross-sectional analysis to examine differences in nutrient and food intakes by length of residence in the USA. Dietary data were collected using an interviewer-administered, culturally appropriate FFQ, while self-reported length of residence was assessed using a questionnaire and modelled as tertiles. The Mediators of Atherosclerosis in South Asians Living in America (MASALA) study. Eight hundred and seventy-four South Asians (mean age=55 (sd 9) years; 47 % women; range of length of residence in the USA=2-58 years), part of the baseline examination of the MASALA study. Intakes of fat, including saturated and trans fats, dietary cholesterol and n-6 fatty acids, were directly associated with length of residence, while intakes of energy, carbohydrate, glycaemic index and load, protein, dietary fibre, folate and K were inversely associated with length of residence (P trend <0·05). A longer length of residence in the USA was also associated with higher intakes of alcoholic beverages, mixed dishes including pizza and pasta, fats and oils, and lower intakes of beans and lentils, breads, grains and flour products, milk and dairy products, rice, starchy vegetables and sugar, candy and jam (P for differences across groups <0·05). Length of residence in the USA influences diet and nutrient intakes among South Asian adult immigrants and should be considered when investigating and planning dietary interventions to mitigate chronic disease risk.

  10. Intrahost Evolution of Methicillin-Resistant Staphylococcus aureus USA300 Among Individuals With Reoccurring Skin and Soft-Tissue Infections.

    PubMed

    Azarian, Taj; Daum, Robert S; Petty, Lindsay A; Steinbeck, Jenny L; Yin, Zachary; Nolan, David; Boyle-Vavra, Susan; Hanage, W P; Salemi, Marco; David, Michael Z

    2016-09-15

    Methicillin-resistant Staphylococcus aureus (MRSA) USA300 is the leading cause of MRSA infections in the United States and has caused an epidemic of skin and soft-tissue infections. Recurrent infections with USA300 MRSA are common, yet intrahost evolution during persistence on an individual has not been studied. This gap hinders the ability to clinically manage recurrent infections and reconstruct transmission networks. To characterize bacterial intrahost evolution, we examined the clinical courses of 4 subjects with 3-6 recurrent USA300 MRSA infections, using patient clinical data, including antibiotic exposure history, and whole-genome sequencing and phylogenetic analysis of all available MRSA isolates (n = 29). Among sequential isolates, we found variability in diversity, accumulation of mutations, and mobile genetic elements. Selection for antimicrobial-resistant populations was observed through both an increase in the number of plasmids conferring multidrug resistance and strain replacement by a resistant population. Two of 4 subjects had strain replacement with a genetically distinct USA300 MRSA population. During a 5-year period in 4 subjects, we identified development of antimicrobial resistance, intrahost evolution, and strain replacement among isolates from patients with recurrent MRSA infections. This calls into question the efficacy of decolonization to prevent recurrent infections and highlights the adaptive potential of USA300 and the need for effective sampling. © The Author 2016. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail journals.permissions@oup.com.

  11. The USA-NPN Information Management System: A tool in support of phenological assessments

    NASA Astrophysics Data System (ADS)

    Rosemartin, A.; Vazquez, R.; Wilson, B. E.; Denny, E. G.

    2009-12-01

    The USA National Phenology Network (USA-NPN) serves science and society by promoting a broad understanding of plant and animal phenology and the relationships among phenological patterns and all aspects of environmental change. Data management and information sharing are central to the USA-NPN mission. The USA-NPN develops, implements, and maintains a comprehensive Information Management System (IMS) to serve the needs of the network, including the collection, storage and dissemination of phenology data, access to phenology-related information, tools for data interpretation, and communication among partners of the USA-NPN. The IMS includes components for data storage, such as the National Phenology Database (NPD), and several online user interfaces to accommodate data entry, data download, data visualization and catalog searches for phenology-related information. The IMS is governed by a set of standards to ensure security, privacy, data access, and data quality. The National Phenology Database is designed to efficiently accommodate large quantities of phenology data, to be flexible to the changing needs of the network, and to provide for quality control. The database stores phenology data from multiple sources (e.g., partner organizations, researchers and citizen observers), and provides for integration with legacy datasets. Several services will be created to provide access to the data, including reports, visualization interfaces, and web services. These services will provide integrated access to phenology and related information for scientists, decision-makers and general audiences. Phenological assessments at any scale will rely on secure and flexible information management systems for the organization and analysis of phenology data. The USA-NPN’s IMS can serve phenology assessments directly, through data management and indirectly as a model for large-scale integrated data management.

  12. Bootstrap rolling window estimation approach to analysis of the Environment Kuznets Curve hypothesis: evidence from the USA.

    PubMed

    Aslan, Alper; Destek, Mehmet Akif; Okumus, Ilyas

    2018-01-01

    This study examines the validity of the inverted U-shaped Environmental Kuznets Curve by investigating the relationship between economic growth and environmental pollution in the USA over the period 1966 to 2013. Previous studies rested on the assumption of parameter stability, i.e. that the estimated parameters do not change over the full sample. This study uses a bootstrap rolling window estimation method to detect possible changes in causal relations and to obtain the parameters for sub-sample periods. The results show that the parameter on economic growth has an increasing trend over the 1982-1996 sub-sample periods and a decreasing trend over the 1996-2013 sub-sample periods. Therefore, the existence of an inverted U-shaped Environmental Kuznets Curve is confirmed for the USA.
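
    The rolling-window idea behind this record can be illustrated with a minimal sketch (plain OLS on synthetic data, not the authors' bootstrap causality procedure): re-estimating a parameter on overlapping sub-samples reveals whether it drifts over time.

```python
import numpy as np

def rolling_ols_slopes(x, y, window):
    """Estimate the OLS slope of y on x over each rolling sub-sample.

    Illustrative only: a parameter that is unstable over the full sample
    shows up as a drifting slope across the windows.
    """
    slopes = []
    for start in range(len(x) - window + 1):
        xs, ys = x[start:start + window], y[start:start + window]
        # slope of the least-squares line fitted to this sub-sample
        slopes.append(np.polyfit(xs, ys, 1)[0])
    return np.array(slopes)

# Synthetic data whose true slope flips sign halfway through the sample,
# mimicking an inverted-U (EKC-style) relationship across sub-samples.
rng = np.random.default_rng(0)
t = np.arange(100, dtype=float)
y = np.cumsum(np.where(t < 50, 1.0, -1.0)) + rng.normal(0, 0.1, 100)
s = rolling_ols_slopes(t, y, window=20)
# early windows have a positive slope, late windows a negative one
```

A full-sample fit would average these regimes away; the rolling windows expose the structural change.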

  13. Computational aspects of helicopter trim analysis and damping levels from Floquet theory

    NASA Technical Reports Server (NTRS)

    Gaonkar, Gopal H.; Achar, N. S.

    1992-01-01

    Helicopter trim settings of periodic initial state and control inputs are investigated for convergence of Newton iteration in computing the settings sequentially and in parallel. The trim analysis uses a shooting method and a weak version of two temporal finite element methods with displacement formulation and with mixed formulation of displacements and momenta. These three methods broadly represent two main approaches of trim analysis: adaptation of initial-value and finite element boundary-value codes to periodic boundary conditions, particularly for unstable and marginally stable systems. In each method, both the sequential and in-parallel schemes are used and the resulting nonlinear algebraic equations are solved by damped Newton iteration with an optimally selected damping parameter. The impact of damped Newton iteration, including earlier-observed divergence problems in trim analysis, is demonstrated by the maximum condition number of the Jacobian matrices of the iterative scheme and by virtual elimination of divergence. The advantages of the in-parallel scheme over the conventional sequential scheme are also demonstrated.
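
    The damped Newton iteration described above can be sketched in a few lines (a minimal fixed-damping version; the trim analysis selects the damping parameter optimally, which this sketch does not do):

```python
import numpy as np

def damped_newton(f, jac, x0, damping=0.5, tol=1e-10, max_iter=100):
    """Damped Newton iteration x <- x - t * J(x)^(-1) f(x).

    A damping factor t in (0, 1] tames the divergence that a full Newton
    step can cause when the initial guess is poor, at the cost of slower
    asymptotic convergence.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        fx = f(x)
        if np.linalg.norm(fx) < tol:
            break
        # solve J(x) dx = f(x) and take a damped step
        x = x - damping * np.linalg.solve(jac(x), fx)
    return x

# demo on a scalar problem: the root of x^2 - 2 = 0 is sqrt(2)
root = damped_newton(lambda x: np.array([x[0] ** 2 - 2.0]),
                     lambda x: np.array([[2.0 * x[0]]]),
                     x0=[1.0])
```

With damping = 1 this reduces to ordinary Newton iteration; smaller values trade speed for robustness, which is the trade-off the abstract's condition-number study quantifies.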

  14. Binary tree eigen solver in finite element analysis

    NASA Technical Reports Server (NTRS)

    Akl, F. A.; Janetzke, D. C.; Kiraly, L. J.

    1993-01-01

    This paper presents a transputer-based binary tree eigensolver for the solution of the generalized eigenproblem in linear elastic finite element analysis. The algorithm is based on the method of recursive doubling, in which the parallel implementation of a number of associative operations on an arbitrary set of N elements takes on the order of O(log2 N) steps, compared to (N-1) steps if implemented sequentially. The hardware used in the implementation of the binary tree consists of 32 transputers. The algorithm is written in OCCAM, a high-level language developed with the transputer to address parallel programming constructs and to provide communication between processors. The algorithm can be replicated to match the size of the binary tree transputer network. Parallel and sequential finite element analysis programs have been developed to solve for the set of the least-order eigenpairs using the modified subspace method. The speed-up obtained for a typical analysis problem is in close agreement with the theoretical prediction given by the method of recursive doubling.
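
    The O(log2 N) behaviour of recursive doubling can be sketched as a serial simulation of the combining steps (on the transputer tree, all combines within a step run concurrently):

```python
def recursive_doubling_sum(values):
    """Combine N values with an associative operation (here, +) in
    ceil(log2 N) steps: at each step, pairs a distance `stride` apart
    are combined, and all combines within a step are independent, so a
    binary tree of processors can execute them in parallel.
    """
    vals = list(values)
    n = len(vals)
    stride, steps = 1, 0
    while stride < n:
        for i in range(0, n, 2 * stride):   # independent within a step
            if i + stride < n:
                vals[i] += vals[i + stride]
        stride *= 2
        steps += 1
    return vals[0], steps

total, steps = recursive_doubling_sum(range(1, 9))   # 8 values
# steps == 3 (= log2 8), versus 7 combines done sequentially
```

The same schedule applies to any associative operation, which is why the paper can use it inside an eigensolver rather than a plain summation.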

  15. Ultra-scale Visualization Climate Data Analysis Tools (UV-CDAT)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Dean N.

    2011-07-20

    This report summarizes work carried out by the Ultra-scale Visualization Climate Data Analysis Tools (UV-CDAT) Team for the period of January 1, 2011 through June 30, 2011. It discusses highlights, overall progress, period goals, and collaborations and lists papers and presentations. To learn more about our project, please visit our UV-CDAT website (URL: http://uv-cdat.org). This report will be forwarded to the program manager for the Department of Energy (DOE) Office of Biological and Environmental Research (BER), national and international collaborators and stakeholders, and to researchers working on a wide range of other climate model, reanalysis, and observation evaluation activities. The UV-CDAT executive committee consists of Dean N. Williams of Lawrence Livermore National Laboratory (LLNL); Dave Bader and Galen Shipman of Oak Ridge National Laboratory (ORNL); Phil Jones and James Ahrens of Los Alamos National Laboratory (LANL); Claudio Silva of Polytechnic Institute of New York University (NYU-Poly); and Berk Geveci of Kitware, Inc. The UV-CDAT team consists of researchers and scientists with diverse domain knowledge whose home institutions also include the National Aeronautics and Space Administration (NASA) and the University of Utah. All work is accomplished under DOE open-source guidelines and in close collaboration with the project's stakeholders, domain researchers, and scientists. Working directly with BER climate science analysis projects, this consortium will develop and deploy data and computational resources useful to a wide variety of stakeholders, including scientists, policymakers, and the general public. Members of this consortium already collaborate with other institutions and universities in researching data discovery, management, visualization, workflow analysis, and provenance.
The UV-CDAT team will address the following high-level visualization requirements: (1) Alternative parallel streaming statistics and analysis pipelines - Data parallelism, Task parallelism, Visualization parallelism; (2) Optimized parallel input/output (I/O); (3) Remote interactive execution; (4) Advanced intercomparison visualization; (5) Data provenance processing and capture; and (6) Interfaces for scientists - Workflow data analysis and visualization construction tools, and Visualization interfaces.

  16. Development of parallel line analysis criteria for recombinant adenovirus potency assay and definition of a unit of potency.

    PubMed

    Ogawa, Yasushi; Fawaz, Farah; Reyes, Candice; Lai, Julie; Pungor, Erno

    2007-01-01

    Parameter settings of a parallel line analysis procedure were defined by applying statistical analysis procedures to the absorbance data from a cell-based potency bioassay for a recombinant adenovirus, Adenovirus 5 Fibroblast Growth Factor-4 (Ad5FGF-4). The parallel line analysis was performed with the commercially available software PLA 1.2. The software performs a Dixon outlier test on replicates of the absorbance data, performs linear regression analysis to define the linear region of the absorbance data, and tests parallelism between the linear regions of standard and sample. The width of the Fiducial limit, expressed as a percentage of the measured potency, was developed as a criterion for rejection of assay data and was found to significantly improve the reliability of the assay results. With the linear range-finding criteria of the software set to a minimum of 5 consecutive dilutions and best statistical outcome, and in combination with the Fiducial limit width acceptance criterion of <135%, 13% of the assay results were rejected. With these criteria applied, the assay was found to be linear over the range of 0.25 to 4 relative potency units, defined as the potency of the sample normalized to the potency of Ad5FGF-4 standard containing 6 × 10^6 adenovirus particles/mL. The overall precision of the assay was estimated to be 52%. Without the application of the Fiducial limit width criterion, the assay results were not linear over the range, and an overall precision of 76% was calculated from the data. An absolute unit of potency for the assay was defined, using the parallel line analysis procedure, as the amount of Ad5FGF-4 that results in an absorbance value that is 121% of the average absorbance readings of the wells containing cells not infected with the adenovirus.
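
    The core of a parallel line analysis can be sketched with a common-slope least-squares fit (an illustrative simplification; PLA 1.2 additionally performs outlier testing, linear-range finding, a parallelism test, and Fiducial limit computation):

```python
import numpy as np

def parallel_line_potency(log_dose, resp_std, resp_test):
    """Fit standard and test responses with a common slope and separate
    intercepts (parallel lines); the relative potency is the horizontal
    offset between the two lines, back-transformed from the log scale.
    """
    x = np.asarray(log_dose, dtype=float)
    n = len(x)
    # design matrix: [common slope | standard intercept | test intercept]
    X = np.zeros((2 * n, 3))
    X[:n, 0] = X[n:, 0] = x
    X[:n, 1] = 1.0
    X[n:, 2] = 1.0
    y = np.concatenate([resp_std, resp_test])
    slope, a_std, a_test = np.linalg.lstsq(X, y, rcond=None)[0]
    # log10(relative potency) = (a_test - a_std) / slope
    return 10.0 ** ((a_test - a_std) / slope)

# demo: a sample behaving like a 2-fold concentrated standard
x = np.log10([1.0, 2.0, 4.0, 8.0])
std = 2.0 * x + 1.0                      # standard dose-response, linear region
test = 2.0 * (x + np.log10(2.0)) + 1.0   # shifted curve: twice as potent
```

If the two lines are not statistically parallel, the horizontal offset is not constant and the relative potency is undefined, which is why parallelism testing precedes the potency estimate.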

  17. PEM-PCA: a parallel expectation-maximization PCA face recognition architecture.

    PubMed

    Rujirakul, Kanokmon; So-In, Chakchai; Arnonkijpanich, Banchar

    2014-01-01

    Principal component analysis, or PCA, has traditionally been used as one of the feature extraction techniques in face recognition systems, yielding high accuracy while requiring a small number of features. However, the covariance matrix and eigenvalue decomposition stages cause high computational complexity, especially for a large database. Thus, this research presents an alternative approach utilizing an Expectation-Maximization algorithm to reduce the determinant matrix manipulation, thereby reducing the stages' complexity. To improve the computational time, a novel parallel architecture was employed to exploit the parallelization of matrix computation during the feature extraction and classification stages, including parallel preprocessing and their combinations, in a so-called Parallel Expectation-Maximization PCA (PEM-PCA) architecture. Compared to traditional PCA and its derivatives, the results indicate lower complexity with an insignificant difference in recognition precision, leading to high-speed face recognition systems with speed-ups of over nine times and three times relative to PCA and Parallel PCA, respectively.
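
    The Expectation-Maximization route to PCA can be sketched as below (a minimal serial version in the style of Roweis' EM-PCA; the paper's contribution is parallelizing the matrix operations, which this sketch does not attempt):

```python
import numpy as np

def em_pca(X, k, iters=100, seed=0):
    """Find an orthonormal basis for the top-k principal subspace of
    centered data X (shape d x n) by EM, avoiding the explicit covariance
    matrix and its eigendecomposition.
    """
    d, _ = X.shape
    W = np.random.default_rng(seed).normal(size=(d, k))
    for _ in range(iters):
        Z = np.linalg.solve(W.T @ W, W.T @ X)   # E-step: latent coordinates
        W = X @ Z.T @ np.linalg.inv(Z @ Z.T)    # M-step: updated basis
    Q, _ = np.linalg.qr(W)                      # orthonormalize
    return Q

# demo: recover a 2-D signal subspace from noisy 5-D data
rng = np.random.default_rng(1)
X = rng.normal(size=(5, 2)) @ rng.normal(size=(2, 200))
X += 0.01 * rng.normal(size=X.shape)
X -= X.mean(axis=1, keepdims=True)
Q = em_pca(X, k=2)
```

Each iteration costs only small (k × k) solves plus matrix products, and those products are exactly the operations a parallel architecture can distribute.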

  18. An economic evaluation of alternative biofuel deployment scenarios in the USA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oladosu, Gbadebo

    Energy market conditions have shifted dramatically since the USA renewable fuel standards (RFS1 in 2005; RFS2 in 2007) were enacted. The USA has transitioned from an increasing dependence on oil imports to abundant domestic oil production. In addition, growth in the use of ethanol, the main biofuel currently produced in the USA, is now limited by the blend wall constraint. Given this, the current study evaluates alternative biofuel deployment scenarios in the USA, accounting for changes in market conditions. The analysis is performed with a general equilibrium model that reflects the structure of the USA biofuel market as the transition to advanced biofuel begins. Results suggest that ethanol consumption would increase, albeit slowly, if current biofuel deployment rates of about 10% are maintained as persistently lower oil prices lead to a gradual increase in the consumption of liquid transportation fuels. Without the blend wall constraint, this study finds that the overall economic impact of a full implementation of the USA RFS2 policy is largely neutral before 2022. However, the economic impacts become slightly negative under the blend wall constraint since more expensive bio-hydrocarbons are needed to meet the RFS2 mandates. Results for a scenario with reduced advanced biofuel deployment based on current policy plans show near neutral economic impacts up to 2027. This scenario is also consistent with another scenario where the volume of bio-hydrocarbons deployed is reduced to adjust for its higher cost and energy content relative to deploying the mandated RFS2 advanced biofuel volumes as ethanol. The important role of technological change is demonstrated under pioneer and accelerated technology scenarios, with the latter leading to neutral or positive economic effects up to 2023 under most blend wall scenarios. All scenarios evaluated in this study are found to have positive long-term economic benefits for the USA economy.

  20. Demography and Intercontinental Spread of the USA300 Community-Acquired Methicillin-Resistant Staphylococcus aureus Lineage

    PubMed Central

    Glaser, Philippe; Martins-Simões, Patrícia; Villain, Adrien; Barbier, Maxime; Tristan, Anne; Bouchier, Christiane; Ma, Laurence; Bes, Michele; Laurent, Frederic; Guillemot, Didier; Wirth, Thierry

    2016-01-01

    Community-acquired methicillin-resistant Staphylococcus aureus (CA-MRSA) was recognized worldwide during the 1990s; in less than a decade, several genetically distinct CA-MRSA lineages carrying Panton-Valentine leukocidin genes have emerged on every continent. Most notably, in the United States, the sequence type 8-IV (ST8-IV) clone known as USA300 has become highly prevalent, outcompeting methicillin-susceptible S. aureus (MSSA) and other MRSA strains in both community and hospital settings. CA-MRSA bacteria are much less prevalent in Europe, where the European ST80-IV CA-MRSA clone, USA300 CA-MRSA strains, and other lineages, such as ST22-IV, coexist. The question that arises is whether the USA300 CA-MRSA present in Europe (i) was imported once or on very few occasions, followed by a broad geographic spread, anticipating an increased prevalence in the future, or (ii) derived from multiple importations with limited spreading success. In the present study, we applied whole-genome sequencing to a collection of French USA300 CA-MRSA strains responsible for sporadic cases and micro-outbreaks over the past decade and United States ST8 MSSA and MRSA isolates. Genome-wide phylogenetic analysis demonstrated that the population structure of the French isolates is the product of multiple introductions dating back to the onset of the USA300 CA-MRSA clone in North America. Coalescent-based demography of the USA300 lineage shows that a strong expansion occurred during the 1990s concomitant with the acquisition of the arginine catabolic mobile element and antibiotic resistance, followed by a sharp decline initiated around 2008, reminiscent of the rise-and-fall pattern previously observed in the ST80 lineage. A future expansion of the USA300 lineage in Europe is therefore very unlikely. PMID:26884428

  1. Hip and knee replacement in Germany and the USA: analysis of individual inpatient data from German and US hospitals for the years 2005 to 2011.

    PubMed

    Wengler, Annelene; Nimptsch, Ulrike; Mansky, Thomas

    2014-06-09

    The number of hip and knee replacement operations is rising in many industrialized countries. To evaluate the current situation in Germany, we analyzed the frequency of procedures in Germany compared to the USA, with the aid of similar case definitions and taking demographic differences into account. We used individual inpatient data from Germany (DRG statistics) and the USA (Nationwide Inpatient Sample) to study differences in the age- and sex-adjusted rates of hip and knee replacement surgery and the determinants of trends in case numbers over the years 2005 to 2011. In 2011, hip replacement surgery was performed 1.4 times as frequently in Germany as in the USA (284 vs. 204 cases per 100,000 population per year; the American figures have been adjusted to the age and sex structure of the German population). On the other hand, knee replacement surgery was performed 1.5 times as frequently in the USA as in Germany (304 [standardized] vs. 206 cases per 100,000 population per year). Over the period of observation, the rates of both procedures increased in both countries. The number of elective primary hip replacement operations in Germany grew by 11%, from 140,000 to 155,300 (from 170 to 190 per 100,000 persons); after correction for demographic changes, a 3% increase remained. At the same time, the rate of elective primary hip replacement surgery in the USA rose by 28%, from 79 to 96 per 100,000 population, with a 13% increase remaining after correction for demographic changes. There are major differences between Germany and the USA in the frequency of these operations. The observed upward trend in elective primary hip replacement operations was mostly due to demographic changes in Germany; non-demographic factors exerted a stronger influence in the USA than in Germany. With respect to primary knee replacement surgery, non-demographic factors exerted a comparably strong influence in both countries.

  2. Real-time implementations of image segmentation algorithms on shared memory multicore architecture: a survey (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Akil, Mohamed

    2017-05-01

    Real-time processing is becoming more and more important in many image processing applications. Image segmentation is one of the most fundamental tasks in image analysis, and as a consequence many different approaches for image segmentation have been proposed. The watershed transform is a well-known image segmentation tool and a very data-intensive task. To achieve acceleration and obtain real-time processing of watershed algorithms, parallel architectures and programming models for multicore computing have been developed. This paper presents a survey of approaches for parallel implementation of sequential watershed algorithms on multicore general-purpose CPUs: homogeneous multicore processors with shared memory. To achieve an efficient parallel implementation, it is necessary to explore different strategies (parallelization/distribution/distributed scheduling) combined with different acceleration and optimization techniques to enhance parallelism. We compare various parallelizations of sequential watershed algorithms on shared memory multicore architectures, and analyze the performance measurements of each parallel implementation and the impact of the different sources of overhead on its performance. In this comparison study, we also discuss the advantages and disadvantages of the parallel programming models, comparing OpenMP (an application programming interface for multiprocessing) with Pthreads (POSIX Threads) to illustrate the impact of each parallel programming model on the performance of the parallel implementations.

  3. Distributed Contour Trees

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morozov, Dmitriy; Weber, Gunther H.

    2014-03-31

    Topological techniques provide robust tools for data analysis. They are used, for example, for feature extraction, for data de-noising, and for comparison of data sets. This chapter concerns contour trees, a topological descriptor that records the connectivity of the isosurfaces of scalar functions. These trees are fundamental to analysis and visualization of physical phenomena modeled by real-valued measurements. We study the parallel analysis of contour trees. After describing a particular representation of a contour tree, called the local-global representation, we illustrate how different problems that rely on contour trees can be solved in parallel with minimal communication.

  4. Performance Analysis of Multilevel Parallel Applications on Shared Memory Architectures

    NASA Technical Reports Server (NTRS)

    Jost, Gabriele; Jin, Haoqiang; Labarta, Jesus; Gimenez, Judit; Caubet, Jordi; Biegel, Bryan A. (Technical Monitor)

    2002-01-01

    In this paper we describe how to apply powerful performance analysis techniques to understand the behavior of multilevel parallel applications. We use the Paraver/OMPItrace performance analysis system for our study. This system consists of two major components: the OMPItrace dynamic instrumentation mechanism, which allows the tracing of processes and threads, and the Paraver graphical user interface for inspection and analysis of the generated traces. We describe how to use the system to conduct a detailed comparative study of a benchmark code implemented in five different programming paradigms applicable for shared memory architectures.

  5. Hypercube Expert System Shell - Applying Production Parallelism.

    DTIC Science & Technology

    1989-12-01

    possible processor organizations, or interconnection methods, for parallel architectures. The following are examples of commonly used interconnection...this timing analysis because match speed-up available from production parallelism is proportional to the average number of affected productions (11:5
  6. Reliability and mass analysis of dynamic power conversion systems with parallel or standby redundancy

    NASA Technical Reports Server (NTRS)

    Juhasz, A. J.; Bloomfield, H. S.

    1985-01-01

    A combinatorial reliability approach is used to identify potential dynamic power conversion systems for space mission applications. A reliability and mass analysis is also performed, specifically for a 100 kWe nuclear Brayton power conversion system with parallel redundancy. Although this study is done for a reactor outlet temperature of 1100 K, preliminary system mass estimates are also included for reactor outlet temperatures ranging up to 1500 K.
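
    The textbook reliability formulas behind such a combinatorial analysis can be written down directly (a generic sketch, not the paper's specific system models): n identical units in active parallel survive if any one unit survives, while a cold-standby arrangement with perfect switching and exponential unit lifetimes survives as long as fewer than n units have failed.

```python
import math

def parallel_reliability(r, n):
    """System of n active-parallel units, each with reliability r."""
    return 1.0 - (1.0 - r) ** n

def standby_reliability(lam, t, n):
    """Cold standby: n units, failure rate lam, perfect switching.
    Unit failures over [0, t] form a Poisson process, so the system
    survives if fewer than n failures occur.
    """
    return math.exp(-lam * t) * sum((lam * t) ** k / math.factorial(k)
                                    for k in range(n))

# demo: with one spare, cold standby beats active parallel for the
# same exponential units, since the spare does not age while idle
r_unit = math.exp(-0.5)                      # unit reliability at lam*t = 0.5
r_parallel = parallel_reliability(r_unit, 2)
r_standby = standby_reliability(0.5, 1.0, 2)
```

Mass enters the trade-off because each redundant unit adds weight; the paper weighs exactly this reliability gain against the mass penalty.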

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sewell, Christopher Meyer

    This is a set of slides from a guest lecture for a class at the University of Texas, El Paso on visualization and data analysis for high-performance computing. The topics covered are the following: trends in high-performance computing; scientific visualization, such as OpenGL, ray tracing and volume rendering, VTK, and ParaView; data science at scale, such as in-situ visualization, image databases, distributed memory parallelism, shared memory parallelism, VTK-m, "big data", and then an analysis example.

  8. On the equivalence of Gaussian elimination and Gauss-Jordan reduction in solving linear equations

    NASA Technical Reports Server (NTRS)

    Tsao, Nai-Kuan

    1989-01-01

    A novel general approach to round-off error analysis using the error complexity concepts is described. This is applied to the analysis of the Gaussian Elimination and Gauss-Jordan scheme for solving linear equations. The results show that the two algorithms are equivalent in terms of our error complexity measures. Thus the inherently parallel Gauss-Jordan scheme can be implemented with confidence if parallel computers are available.
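
    The parallel appeal of Gauss-Jordan noted in the abstract is easy to see in code: once the pivot row is scaled, every other row's update in that column is independent of the rest. A minimal serial sketch with partial pivoting:

```python
import numpy as np

def gauss_jordan_solve(A, b):
    """Solve Ax = b by Gauss-Jordan reduction with partial pivoting:
    eliminating both above and below each pivot leaves no back-substitution,
    and the per-row updates in each column are mutually independent, which
    is what makes the scheme inherently parallel.
    """
    A = np.asarray(A, dtype=float).copy()
    b = np.asarray(b, dtype=float).copy()
    n = len(b)
    for col in range(n):
        p = col + np.argmax(np.abs(A[col:, col]))       # partial pivot
        A[[col, p]], b[[col, p]] = A[[p, col]], b[[p, col]]
        b[col] /= A[col, col]
        A[col] /= A[col, col]
        for row in range(n):                            # independent updates
            if row != col:
                b[row] -= A[row, col] * b[col]
                A[row] -= A[row, col] * A[col]
    return b

x = gauss_jordan_solve([[2.0, 1.0], [1.0, 3.0]], [3.0, 4.0])  # x == [1.0, 1.0]
```

Gaussian elimination does fewer arithmetic operations overall, but its back-substitution phase is sequential; the paper's point is that the two schemes are equivalent in error complexity, so the parallel-friendly one loses nothing in accuracy.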

  9. Dependability analysis of parallel systems using a simulation-based approach. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Sawyer, Darren Charles

    1994-01-01

    The analysis of dependability in large, complex, parallel systems executing real applications or workloads is examined in this thesis. To effectively demonstrate the wide range of dependability problems that can be analyzed through simulation, the analysis of three case studies is presented. For each case, the organization of the simulation model used is outlined, and the results from simulated fault injection experiments are explained, showing the usefulness of this method in dependability modeling of large parallel systems. The simulation models are constructed using DEPEND and C++. Where possible, methods to increase dependability are derived from the experimental results. Another interesting facet of all three cases is the presence of some kind of workload or application executing in the simulation while faults are injected. This provides a completely new dimension to this type of study, not possible to model accurately with analytical approaches.

  10. Conceptual design and kinematic analysis of a novel parallel robot for high-speed pick-and-place operations

    NASA Astrophysics Data System (ADS)

    Meng, Qizhi; Xie, Fugui; Liu, Xin-Jun

    2018-06-01

    This paper deals with the conceptual design, kinematic analysis and workspace identification of a novel four degrees-of-freedom (DOFs) high-speed spatial parallel robot for pick-and-place operations. The proposed spatial parallel robot consists of a base, four arms and a 1½ mobile platform. The mobile platform is a major innovation that avoids output singularity and offers the advantages of both single and double platforms. To investigate the characteristics of the robot's DOFs, a line graph method based on Grassmann line geometry is adopted in mobility analysis. In addition, the inverse kinematics is derived, and the constraint conditions to identify the correct solution are also provided. On the basis of the proposed concept, the workspace of the robot is identified using a set of presupposed parameters by taking input and output transmission index as the performance evaluation criteria.

  11. Neoplastic diseases in the domestic ferret (Mustela putorius furo) in Italy: classification and tissue distribution of 856 cases (2000-2010).

    PubMed

    Avallone, Giancarlo; Forlani, Annalisa; Tecilla, Marco; Riccardi, Elena; Belluco, Sara; Santagostino, Sara Francesca; Grilli, Guido; Khadivi, Kiumars; Roccabianca, Paola

    2016-12-05

    The aim of this study was to describe the prevalence and tissue distribution of neoplasms in Italian ferrets, compared to the epidemiological data previously reported in the USA and Japan. Signalment and diagnoses of pathological submissions received between 2000 and 2010 were searched; cases with a diagnosis of neoplasm were selected, and the original sections were reviewed to confirm the diagnosis. Nine hundred and ten samples were retrieved, 690 of which included at least one tumour, for a total of 856 tumours. One hundred and thirty-four ferrets (19.4%) had multiple neoplasms. Median age was 5 years, and the F/M ratio was 0.99. Endocrine neoplasms were the most common. Other frequent tumours were cutaneous mast cell tumours, sebaceous tumours, and lymphomas. Cutaneous squamous cell carcinomas (SCC) were consistently associated with sebaceous tumours. Twenty-four abdominal spindle cell tumours of undefined origin were observed. Lymphomas and islet cell tumours had a lower incidence compared with previous extra-European studies. Epidemiological information on ferret tumours derives from extra-European countries, mostly the USA and Japan, where similar distributions with minor discrepancies have been reported. Compared to previous reports, adrenal tumours were more frequent than pancreatic islet cell neoplasms, and a higher number of mesenchymal neoplasms arising from the adrenal capsule was noted. An unusual association between SCC and sebaceous gland neoplasms and a high number of intra-abdominal spindle cell neoplasms with unclear primary origin were noted and warrant further investigation. The tissue distribution of tumours recorded in this study paralleled previous findings in ferrets from the USA and Japan. Some differences were noted in the frequency of lymphoma, adrenal mesenchymal tumours, and cutaneous tumours. Some tumours that are among the most common in other species appear to be uncommon in ferrets and are characterized by distinctive predilection sites.

  12. Route 20, Autobahn 7, and Slime Mold: Approximating the Longest Roads in USA and Germany With Slime Mold on 3-D Terrains.

    PubMed

    Adamatzky, Andrew I

    2014-01-01

    A cellular slime mould, Physarum polycephalum, is a monstrously large single cell visible to the unaided eye. The slime mold explores space in parallel, is guided by gradients of chemoattractants, and propagates toward sources of nutrients along nearly shortest paths. The slime mold is a living prototype of amorphous biological computers and robotic devices capable of solving a range of tasks in graph optimization and computational geometry. When presented with a distribution of nutrients, the slime mold spans the sources of nutrients with a network of protoplasmic tubes. This protoplasmic network matches a network of major transport routes of a country when the configuration of major urban areas is represented by nutrients. A transport route connecting two cities should ideally be a shortest path, and this is usually the case in computer simulations and laboratory experiments with flat substrates. What searching strategies does the slime mold adopt when exploring 3-D terrains? How are optimal transport routes approximated by protoplasmic tubes? Do the routes built by the slime mold on 3-D terrain match real-world transport routes? To answer these questions, we conducted pioneering laboratory experiments with Nylon terrains of the USA and Germany. We used the slime mold to approximate Route 20, the longest road in the USA, and Autobahn 7, the longest national motorway in Europe. We found that the slime mold builds longer transport routes on 3-D terrains than on flat substrates, yet still adequately approximates the man-made transport routes studied. We demonstrate that nutrients placed in destination sites affect the performance of the slime mold, and show how the mold navigates around elevations. In cellular automaton models of the slime mold, we have shown that the variability of the protoplasmic routes might depend on the physiological state of the slime mold. The results presented will contribute toward the development of novel algorithms for sensorial fusion, information processing, and decision making, and will provide inspiration for the design of bioinspired amorphous robotic devices.

  13. Stiffness modeling of compliant parallel mechanisms and applications in the performance analysis of a decoupled parallel compliant stage

    NASA Astrophysics Data System (ADS)

    Jiang, Yao; Li, Tie-Min; Wang, Li-Ping

    2015-09-01

This paper investigates the stiffness modeling of compliant parallel mechanisms (CPMs) based on the matrix method. First, the general compliance matrix of a serial flexure chain is derived. The stiffness modeling of CPMs is then discussed in detail, considering the relative positions of the applied load and the selected displacement output point. The derived stiffness models have simple and explicit forms, and the input, output, and coupling stiffness matrices of a CPM can easily be obtained. The proposed analytical model is applied to the stiffness modeling and performance analysis of an XY parallel compliant stage with input and output decoupling characteristics. The key geometrical parameters of the stage are then optimized to minimize the input decoupling degree. Finally, a prototype of the compliant stage is developed and its input axial stiffness, coupling characteristics, positioning resolution, and circular contouring performance are tested. The results demonstrate the excellent performance of the compliant stage and verify the effectiveness of the proposed theoretical model. The general stiffness models provided in this paper will be helpful for performance analysis, especially in determining coupling characteristics, and for structure optimization of CPMs.
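The core of such matrix-method stiffness models can be summarized in one relation; the following is a generic sketch for a serial flexure chain (the notation is mine, not necessarily the paper's):

```latex
% Compliance of a serial flexure chain at the output frame (generic
% matrix-method form): C_i is the local compliance matrix of the i-th
% flexure and J_i the adjoint/coordinate transform mapping it to the
% output frame; the chain stiffness is the inverse of the chain compliance.
C_{\text{chain}} \;=\; \sum_{i=1}^{n} J_i \, C_i \, J_i^{\mathsf{T}},
\qquad
K_{\text{chain}} \;=\; C_{\text{chain}}^{-1}.
```

For a parallel mechanism, the limb stiffnesses (rather than compliances) add at the shared output platform, which is what makes the input, output, and coupling stiffness matrices of a CPM separable in such models.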

  14. Independent and parallel evolution of new genes by gene duplication in two origins of C4 photosynthesis provides new insight into the mechanism of phloem loading in C4 species

    DOE PAGES

    Emms, David M.; Covshoff, Sarah; Hibberd, Julian M.; ...

    2016-03-24

C4 photosynthesis is considered one of the most remarkable examples of evolutionary convergence in eukaryotes. However, it is unknown whether the evolution of C4 photosynthesis required the evolution of new genes. Genome-wide gene-tree species-tree reconciliation of seven monocot species that span two origins of C4 photosynthesis revealed that there was significant parallelism in the duplication and retention of genes coincident with the evolution of C4 photosynthesis in these lineages. Specifically, 21 orthologous genes were duplicated and retained independently in parallel at both C4 origins. Analysis of this gene cohort revealed that the set of parallel duplicated and retained genes is enriched for genes that are preferentially expressed in bundle sheath cells, the cell type in which photosynthesis was activated during C4 evolution. Moreover, functional analysis of the cohort of parallel duplicated genes identified SWEET-13 as a potential key transporter in the evolution of C4 photosynthesis in grasses, and provides new insight into the mechanism of phloem loading in these C4 species.

  15. A search for genetic diversity among Italian Greyhounds from Continental Europe and the USA and the effect of inbreeding on susceptibility to autoimmune disease.

    PubMed

    Pedersen, Niels C; Liu, Hongwei; Leonard, Angela; Griffioen, Layle

    2015-01-01

    Previous studies documented the problem of inbreeding among Italian Greyhounds (IG) from the USA and its possible role in a multiple autoimmune disease syndrome. The present study is an extension of these earlier experiments and had two objectives: 1) to identify pockets of additional genetic diversity that might still exist among IG from the USA and Continental Europe, and 2) to determine how loss of genetic diversity within the genome and in the dog leukocyte antigen (DLA) complex relates to the problem of autoimmune disease in IG from the USA. Genetic testing was conducted using 33 short tandem repeat (STR) loci across 25 chromosomes and 7 STR loci that associated with specific dog leukocyte antigen (DLA) class I and II haplotypes. Standard genetic assessment tests based on allele frequencies and internal relatedness (IR) were used as measures of breed-wide and individual heterozygosity. The results of these tests demonstrated that IG from the USA and Continental Europe belonged to a single breed but were genetically distinguishable by genomic allele frequencies, DLA class I and II haplotypes, and principal coordinate analysis (PCoA). In the second part of the study, 85 IG from the USA that had suffered various autoimmune disorders (case) and 104 healthy dogs (control) of comparable age were studied for genetic associations with disease. Case dogs were found to be significantly more homozygous in the DLA regions than control dogs. Principal coordinate analysis did not differentiate case from control populations. No specific STR-associated DLA-class I or II haplotype was associated with increased autoimmune disease risks. Reasons for the loss of genetic diversity and increased homozygosity among IG from the USA were studied using registration data and deep pedigrees. 
The breed in the USA started from a small number of founders from Europe and has remained relatively isolated and small in numbers, limiting breeding choices, especially in the period before modern transportation and artificial insemination. An additional cause of lost diversity and increased homozygosity has been the influence of famous sires and their show-winning progeny. The most influential of these sires was Ch. Dasa's King of the Mountain (King), born in 1978. Virtually all contemporary IG from the USA have King at least once in their 10-generation pedigrees, and 18% of the genome of contemporary IG from the USA is shared with King. It was concluded that artificial genetic bottlenecks have concentrated numerous genetic polymorphisms responsible for autoimmune disease and that these risk factors did not originate in a specific individual or bloodline of the breed. Rather, they were of ancestral origin in both purebred and random bred dogs and inherited by descent. Italian Greyhound breeders in the USA have several options to improve breed health: 1) breed against homozygosity within the genome and in the DLA region, 2) avoid breeding dogs that have suffered an autoimmune disorder, 3) increase diversity by incorporating the genetic differences that exist in IG from Continental Europe, or 4) outcross to other small sighthound breeds. The latter two approaches must be undertaken with care to avoid introduction of new deleterious traits and to maximize retention and dissemination of new genetic diversity.

  16. Emission of sound from turbulence convected by a parallel flow in the presence of solid boundaries

    NASA Technical Reports Server (NTRS)

    Goldstein, M. E.; Rosenbaum, B. M.

    1973-01-01

    A theoretical description is given of the sound emitted from an arbitrary point in a parallel or nearly parallel turbulent shear flow confined to a region near solid boundaries. The analysis begins with Lighthill's formulation of aerodynamic noise and assumes that the turbulence is axisymmetric. Specific results are obtained for the sound emitted from an arbitrary point in a turbulent flow within a semi-infinite, open-ended duct.

  17. Stochastic injection-strategy optimization for the preliminary assessment of candidate geological storage sites

    NASA Astrophysics Data System (ADS)

    Cody, Brent M.; Baù, Domenico; González-Nicolás, Ana

    2015-09-01

Geological carbon sequestration (GCS) has been identified as having the potential to reduce increasing atmospheric concentrations of carbon dioxide (CO2). However, a global impact will only be achieved if GCS is cost-effectively and safely implemented on a massive scale. This work presents a computationally efficient methodology for identifying optimal injection strategies at candidate GCS sites having uncertainty associated with caprock permeability, effective compressibility, and aquifer permeability. A multi-objective evolutionary optimization algorithm is used to heuristically determine non-dominated solutions between the following two competing objectives: (1) maximize the mass of CO2 sequestered and (2) minimize project cost. A semi-analytical algorithm is used to estimate CO2 leakage mass rather than a numerical model, enabling the study of GCS sites having vastly different domain characteristics. The stochastic optimization framework presented herein is applied to a feasibility study of GCS in a brine aquifer in the Michigan Basin (MB), USA. Eight optimization test cases are performed to investigate the impact of decision-maker (DM) preferences on Pareto-optimal objective-function values and carbon-injection strategies. This analysis shows that the feasibility of GCS at the MB test site is highly dependent upon the DM's risk-aversion preference and the degree of uncertainty associated with caprock integrity. Finally, large gains in computational efficiency achieved using parallel processing and archiving are discussed.
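The two competing objectives above lend themselves to a compact illustration of non-domination; the following minimal Python sketch (candidate strategies and numbers are made up for illustration, not from the study) filters a set of (CO2 mass sequestered, project cost) pairs down to the Pareto front:

```python
# Hypothetical candidate injection strategies as (CO2 mass sequestered,
# project cost) pairs; the numbers are illustrative only. A strategy is
# dominated if another sequesters at least as much CO2 for no greater
# cost and is strictly better in at least one objective.
def non_dominated(candidates):
    front = []
    for i, (mass_i, cost_i) in enumerate(candidates):
        dominated = any(
            mass_j >= mass_i and cost_j <= cost_i and
            (mass_j > mass_i or cost_j < cost_i)
            for j, (mass_j, cost_j) in enumerate(candidates) if j != i
        )
        if not dominated:
            front.append((mass_i, cost_i))
    return front

strategies = [(10.0, 5.0), (8.0, 3.0), (6.0, 4.0), (12.0, 9.0)]
print(non_dominated(strategies))  # → [(10.0, 5.0), (8.0, 3.0), (12.0, 9.0)]
```

An evolutionary algorithm such as the one in the study applies this kind of dominance test generation after generation rather than once over a fixed candidate list.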

  18. European Paediatric Formulation Initiative (EuPFI)-Formulating Ideas for Better Medicines for Children.

    PubMed

    Salunke, Smita; Liu, Fang; Batchelor, Hannah; Walsh, Jenny; Turner, Roy; Ju, Tzuchi Rob; Tuleu, Catherine

    2017-02-01

The European Paediatric Formulation Initiative (EuPFI), founded in 2007, aims to promote and facilitate the preparation of better and safer medicines for children by linking research and information dissemination. It brings together the capabilities of industry, academia, hospitals, and regulators within a common platform in order to build a solid understanding of the major issues that underpin progress towards the paediatric medicines we want. The EuPFI was formed in parallel to the adoption of paediatric regulations within the EU and USA and has served as a community that drives research and dissemination through publications and the organisation of annual conferences. The membership and reach of the group have grown since its inception in 2007 and continue to develop and evolve to meet the continuing needs and ambitions of research into, and development of, age-appropriate medicines. Five workstreams (Age-Appropriate Medicines, Biopharmaceutics, Administration Devices, Excipients, and Taste Assessment & Taste Masking (TATM)) direct specific work packages on behalf of the EuPFI. Furthermore, the EuPFI interacts with diverse professional groups across the globe to ensure efficient working in the area of paediatric medicines. The strong commitment and active involvement of all EuPFI stakeholders have proved vital to effectively addressing knowledge gaps related to paediatric medicines, discussing potential areas for further research, and identifying issues that need more attention and analysis in the future.

  19. AirSWOT Measurements of Water Surface Elevations and Hydraulic Gradients over the Yukon Flats, Alaska

    NASA Astrophysics Data System (ADS)

    Pitcher, L. H.; Pavelsky, T.; Smith, L. C.; Moller, D.; Altenau, E. H.; Lion, C.; Bertram, M.; Cooley, S. W.

    2017-12-01

AirSWOT is an airborne, Ka-band synthetic aperture radar interferometer (InSAR) intended to quantify surface water fluxes by mapping water surface elevations (WSE). AirSWOT will also serve as a calibration/validation tool for the Surface Water and Ocean Topography (SWOT) satellite mission (scheduled for launch in 2021). The hydrology objectives for AirSWOT and SWOT are to measure WSE with accuracies sufficient to estimate hydrologic fluxes in lakes, wetlands, and rivers. However, current understanding of the performance of these related though not identical instruments when applied to complex river-lake-wetland fluvial environments remains predominantly theoretical. We present AirSWOT data acquired 15 June 2015 over the Yukon Flats, Alaska, USA, together with in situ field surveys, to assess the accuracy of AirSWOT WSE measurements in lakes and rivers. We use these data to demonstrate that AirSWOT can be used to estimate large-scale hydraulic gradients across wetland complexes. Finally, we present key lessons learned from this AirSWOT analysis for consideration in future campaigns, including maximizing swath overlap for spatial averaging to minimize uncertainty, and orienting flight paths parallel to river flow directions to reduce along-track aircraft drift between neighboring flight paths. We conclude that spatially dense AirSWOT measurements of river and lake WSEs can improve geospatial understanding of surface water hydrology and fluvial processes.

  20. The Effect of Wearing White Coats on Patients' Appreciation of Physician Communication during Postpartum Rounds: A Randomized Controlled Trial.

    PubMed

    La Rosa, Mauricio; Spencer, Nicholas; Abdelwahab, Mahmoud; Zambrano, Gabriela; Saoud, Fawzi; Jelliffe, Katherine; Olson, Gayle; Munn, Mary; Saade, George R; Costantine, Maged

    2018-06-08

Wearing a white coat (WC) has been associated with a risk of colonization and transmission of resistant pathogens. Studies have also shown that physicians' attire affects patients' confidence in their physician and the patient-physician relationship. Our objective was to evaluate the hypothesis that not wearing a WC during physician postpartum rounds does not affect patient-physician communication scores. This is an unblinded, randomized, parallel-arms, controlled trial of postpartum women at a single university hospital. Women were randomly assigned to having their postpartum physicians' team wear a WC or not (no-WC) during rounds. Our primary outcome was the "patient-physician communication" score. Univariable and multivariable analyses were used where appropriate. One hundred and seventy-eight patients were enrolled (87 in the WC and 91 in the no-WC groups). Note that 40.4% of patients did not remember whether the physicians wore a WC or not. There was no difference in the primary outcome (p = 0.64), even after adjusting for possible confounders. Not wearing a WC during postpartum rounds did not affect the patient-physician communication or patient satisfaction scores. In the setting of prior reports showing a risk of WC pathogen transmission between patients, our findings cannot support the routine wearing of WCs during postpartum rounds.

  1. MOLECULAR EVOLUTION OF WEST NILE VIRUS IN A NORTHERN TEMPERATE REGION: CONNECTICUT, USA 1999–2008

    PubMed Central

    Armstrong, Philip M.; Vossbrinck, Charles R.; Andreadis, Theodore G.; Anderson, John F.; Pesko, Kendra N.; Newman, Ruchi M.; Lennon, Niall J.; Birren, Bruce W.; Ebel, Gregory D.; Henn, Mathew R.

    2011-01-01

West Nile virus (WNV) has become firmly established in the northeastern U.S., reemerging every summer since its introduction into North America in 1999. To determine whether WNV overwinters locally or is reseeded annually, we examined the patterns of viral lineage persistence and replacement in Connecticut over 10 consecutive transmission seasons by phylogenetic analysis. In addition, we compared the full protein-coding sequence among WNV isolates to search for evidence of convergent and adaptive evolution. Viruses sampled from Connecticut segregated into a number of well-supported subclades by year of isolation, with few clades persisting ≥2 years. Similar viral strains were dispersed in different locations across the state, and divergent strains appeared within a single location during a single transmission season, implying widespread movement and rapid colonization of virus. Numerous amino acid substitutions arose in the population, but only one change, V→A at position 159 of the envelope protein, became permanently fixed. Several instances of parallel evolution were identified in independent lineages, including one amino acid change in the NS4A protein that appears to be positively selected. Our results suggest that annual reemergence of WNV is driven by both reintroduction and local overwintering of virus. Despite ongoing evolution of WNV, most amino acid variants occurred at low frequencies and were transient in the virus population. PMID:21723580

  2. Evaluation of the Precision ID Ancestry Panel for crime case work: A SNP typing assay developed for typing of 165 ancestral informative markers.

    PubMed

    Pereira, Vania; Mogensen, Helle S; Børsting, Claus; Morling, Niels

    2017-05-01

The application of massively parallel sequencing (MPS) methodologies in forensic genetics is promising, and MPS is gradually being implemented in forensic genetic case work. One of the major advantages of these technologies is that several traditional electrophoresis assays can be combined into one single MPS assay, reducing both the amount of sample used and the time of the investigations. This study assessed the utility of the Precision ID Ancestry Panel (Thermo Fisher Scientific, Waltham, USA) in forensic genetics. The assay was developed for the Ion Torrent PGM™ System and genotypes 165 ancestry-informative SNPs. The performance of the assay and the accompanying software solution for ancestry inference was assessed by typing 142 Danes and 98 Somalis. Locus balance, heterozygote balance, and noise levels were calculated, and future analysis criteria for crime case work were estimated. Overall, the Precision ID Ancestry Panel performed well, and only minor changes to the recommended protocol were implemented. Three of the 165 loci (rs459920, rs7251928, and rs7722456) had consistently poor performance, mainly due to misalignment of homopolymeric stretches. We suggest that these loci be excluded from the analyses. The different statistical methods for reporting ancestry in forensic genetic case work are discussed. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. Transport and acceleration of plasma in the magnetospheres of Earth and Jupiter and expectations for Saturn

    NASA Astrophysics Data System (ADS)

    Kivelson, M. G.

    The first comparative magnetospheres conference was held in Frascati, Italy thirty years ago this summer, less than half a year after the first spacecraft encounter with Jupiter's magnetosphere (Formisano, V. (Ed.), The Magnetospheres of the Earth and Jupiter, Proceedings of the Neil Brice Memorial Symposium held in Frascati, Italy, May 28-June 1, 1974. D. Reidel Publishing Co., Boston, USA, 1975). Disputes highlighted various issues still being investigated, such as how plasma transport at Jupiter deviates from the prototypical form of transport at Earth and the role of substorms in Jupiter's dynamics. Today there is a wealth of data on which to base the analysis, data gathered by seven missions that culminated with Galileo's 8-year orbital tour. We are still debating how magnetic flux is returned to the inner magnetosphere following its outward transport by iogenic plasma. We are still uncertain about the nature of sporadic dynamical disturbances at Jupiter and their relation to terrestrial substorms. At Saturn, the centrifugal stresses are not effective in distorting the magnetic field, so in some ways the magnetosphere appears Earthlike. Yet the presence of plasma sources in the close-in equatorial magnetosphere parallels conditions at Jupiter. This suggests that we need to study both Jupiter and Earth when thinking about what to anticipate from Cassini's exploration of Saturn's magnetosphere. This paper addresses issues relevant to plasma transport and acceleration in all three magnetospheres.

  4. Dimensional synthesis of a 3-DOF parallel manipulator with full circle rotation

    NASA Astrophysics Data System (ADS)

    Ni, Yanbing; Wu, Nan; Zhong, Xueyong; Zhang, Biao

    2015-07-01

Parallel robots are widely used in academic and industrial fields. In spite of the numerous achievements in the design and dimensional synthesis of low-mobility parallel robots, few research efforts are directed towards asymmetric 3-DOF parallel robots whose end-effector can realize 2 translational and 1 rotational (2T1R) motions. In order to develop a manipulator capable of full-circle rotation to enlarge the workspace, a new 2T1R parallel mechanism is proposed. The modeling approach and kinematic analysis of the proposed mechanism are investigated. Using the method of vector analysis, the inverse kinematic equations are established. This is followed by a rigorous proof that the mechanism attains an annular workspace through its circular rotation and 2-dimensional translations. Taking the first-order perturbation of the kinematic equations, the error Jacobian matrix, which represents the mapping between the error sources of the geometric parameters and the end-effector position errors, is derived. With consideration of the constraint conditions on pressure angles and the feasible workspace, dimensional synthesis is conducted with the goal of minimizing a global comprehensive performance index. The dimension parameters that give the mechanism optimal error mapping and kinematic performance are obtained through an optimization algorithm. These research achievements lay the foundation for prototype building of this kind of parallel robot.

  5. Comparison of prostate cancer survival in Germany and the USA: can differences be attributed to differences in stage distributions?

    PubMed

    Winter, Alexander; Sirri, Eunice; Jansen, Lina; Wawroschek, Friedhelm; Kieschke, Joachim; Castro, Felipe A; Krilaviciute, Agne; Holleczek, Bernd; Emrich, Katharina; Waldmann, Annika; Brenner, Hermann

    2017-04-01

    To better understand the influence of prostate-specific antigen (PSA) screening and other health system determinants on prognosis of prostate cancer, up-to-date relative survival (RS), stage distributions, and trends in survival and incidence in Germany were evaluated and compared with the United States of America (USA). Incidence and mortality rates for Germany and the USA for the period 1999-2010 were obtained from the Centre for Cancer Registry Data at the Robert Koch Institute and the USA Surveillance Epidemiology and End Results (SEER) database. For analyses on stage and survival, data from 12 population-based cancer registries in Germany and from the SEER-13 database were analysed. Patients (aged ≥ 15 years) diagnosed with prostate cancer (1997-2010) and mortality follow-up to December 2010 were included. The 5- and 10-year RS and survival trends (2002-2010) were calculated using standard and model-based period analysis. Between 1999 and 2010, prostate cancer incidence decreased in the USA but increased in Germany. Nevertheless, incidence remained higher in the USA throughout the study period (99.8 vs 76.0 per 100,000 in 2010). The proportion of localised disease significantly increased from 51.9% (1998-2000) to 69.6% (2007-2010) in Germany and from 80.5% (1998-2000) to 82.6% (2007-2010) in the USA. Mortality slightly decreased in both countries (1999-2010). Overall, 5- and 10-year RS was lower in Germany (93.3%; 90.7%) than in the USA (99.4%; 99.6%) but comparable after adjustment for stage. The same patterns were seen in age-specific analyses. Improvements seen in prostate cancer survival between 2002-2004 and 2008-2010 (5-year RS: 87.4% and 91.2%; +3.8% units) in Germany disappeared after adjustment for stage (P = 0.8). The survival increase in Germany and the survival advantage in the USA might be explained by differences in incidence and stage distributions over time and across countries. 
Effects of early detection or a lead-time bias due to the more widespread utilisation and earlier introduction of PSA testing in the USA are likely to explain the observed patterns. © 2016 The Authors BJU International © 2016 BJU International Published by John Wiley & Sons Ltd.

  6. Analysis and selection of optimal function implementations in massively parallel computer

    DOEpatents

    Archer, Charles Jens [Rochester, MN; Peters, Amanda [Rochester, MN; Ratterman, Joseph D [Rochester, MN

    2011-05-31

    An apparatus, program product and method optimize the operation of a parallel computer system by, in part, collecting performance data for a set of implementations of a function capable of being executed on the parallel computer system based upon the execution of the set of implementations under varying input parameters in a plurality of input dimensions. The collected performance data may be used to generate selection program code that is configured to call selected implementations of the function in response to a call to the function under varying input parameters. The collected performance data may be used to perform more detailed analysis to ascertain the comparative performance of the set of implementations of the function under the varying input parameters.
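The idea the patent abstract describes, benchmarking a set of implementations across input dimensions and then generating selection code, can be sketched compactly. The following Python sketch is illustrative only (the function names and the nearest-benchmarked-size heuristic are mine, not from the patent):

```python
import time

# Illustrative sketch: time several implementations of the same function
# across input sizes, then build a dispatcher that routes each call to
# the implementation that performed best at the nearest benchmarked size.
def sum_loop(xs):
    total = 0
    for x in xs:
        total += x
    return total

def sum_builtin(xs):
    return sum(xs)

def benchmark(impls, sizes):
    best_by_size = {}
    for n in sizes:
        data = list(range(n))
        timings = {}
        for f in impls:
            t0 = time.perf_counter()
            f(data)
            timings[f] = time.perf_counter() - t0
        best_by_size[n] = min(timings, key=timings.get)
    return best_by_size

def make_dispatcher(best_by_size):
    sizes = sorted(best_by_size)
    def dispatch(xs):
        nearest = min(sizes, key=lambda s: abs(s - len(xs)))
        return best_by_size[nearest](xs)
    return dispatch

dispatch = make_dispatcher(benchmark([sum_loop, sum_builtin], [10, 1000]))
print(dispatch(list(range(100))))  # → 4950
```

A production system of the kind described would benchmark over multiple input dimensions (not just one size axis) and emit the selection logic as generated code rather than a closure.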

  7. High-performance parallel analysis of coupled problems for aircraft propulsion

    NASA Technical Reports Server (NTRS)

    Felippa, C. A.; Farhat, C.; Lanteri, S.; Gumaste, U.; Ronaghi, M.

    1994-01-01

Applications of high-performance parallel computation are described for the analysis of complete jet engines, treated as a multidiscipline coupled problem. The coupled problem involves the interaction of structures with gas dynamics, heat conduction, and heat transfer in aircraft engines. The methodology issues addressed include: consistent discrete formulation of coupled problems with emphasis on coupling phenomena; the effect of partitioning strategies, augmentation, and temporal solution procedures; sensitivity of the response to problem parameters; and methods for interfacing multiscale discretizations in different single fields. The computer implementation issues addressed include: parallel treatment of coupled systems; domain decomposition and mesh partitioning strategies; data representation in object-oriented form and mapping to hardware-driven representation; and trade-off studies between partitioning schemes and fully coupled treatment.

  8. Parallel-Connected Photovoltaic Inverters: Zero Frequency Sequence Harmonic Analysis and Solution

    NASA Astrophysics Data System (ADS)

    Carmeli, Maria Stefania; Mauri, Marco; Frosio, Luisa; Bezzolato, Alberto; Marchegiani, Gabriele

    2013-05-01

High-power photovoltaic (PV) plants usually consist of the connection of different PV subfields, each with its own interface transformer. Different solutions have been studied to improve the efficiency of the whole generation system; in particular, transformerless configurations are the most attractive from the efficiency and cost points of view. This paper focuses on transformerless PV configurations characterised by the parallel connection of interface inverters. The problem of zero-sequence current, due to both the parallel connection and the presence of undesirable parasitic earth capacitances, is considered, and a solution consisting of the synchronisation of the pulse-width-modulation triangular carriers is proposed and theoretically analysed. The theoretical analysis has been validated through simulation and experimental results.

  9. SBML-PET-MPI: a parallel parameter estimation tool for Systems Biology Markup Language based models.

    PubMed

    Zi, Zhike

    2011-04-01

    Parameter estimation is crucial for the modeling and dynamic analysis of biological systems. However, implementing parameter estimation is time consuming and computationally demanding. Here, we introduced a parallel parameter estimation tool for Systems Biology Markup Language (SBML)-based models (SBML-PET-MPI). SBML-PET-MPI allows the user to perform parameter estimation and parameter uncertainty analysis by collectively fitting multiple experimental datasets. The tool is developed and parallelized using the message passing interface (MPI) protocol, which provides good scalability with the number of processors. SBML-PET-MPI is freely available for non-commercial use at http://www.bioss.uni-freiburg.de/cms/sbml-pet-mpi.html or http://sites.google.com/site/sbmlpetmpi/.
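The collective fitting of multiple experimental datasets that SBML-PET-MPI distributes over MPI ranks can be illustrated with a much smaller stand-in. This Python sketch is illustrative only (a toy linear model and a thread pool in place of an SBML ODE system and MPI): per-dataset squared errors are evaluated in parallel and summed into one collective objective.

```python
from concurrent.futures import ThreadPoolExecutor

# Illustrative sketch, not the SBML-PET-MPI API: each worker scores one
# experimental dataset against a candidate parameter value, and the
# collective objective is the sum of per-dataset squared errors. A real
# tool would distribute the work over MPI ranks rather than threads.
def model(k, t):
    return k * t  # toy model standing in for an SBML ODE system

def dataset_error(k, dataset):
    return sum((model(k, t) - y) ** 2 for t, y in dataset)

def collective_objective(k, datasets):
    # Fit all datasets collectively: evaluate each in parallel and sum.
    with ThreadPoolExecutor() as pool:
        return sum(pool.map(lambda d: dataset_error(k, d), datasets))

# Two hypothetical datasets sampled from roughly y = 2 * t.
datasets = [[(1, 2.0), (2, 4.1)], [(1, 1.9), (3, 6.2)]]
# Crude grid search standing in for the tool's optimizer.
best_k = min((k / 10 for k in range(1, 50)),
             key=lambda k: collective_objective(k, datasets))
print(best_k)  # → 2.0
```

Because each dataset's error is independent, this objective parallelizes with essentially no communication beyond the final sum, which is why the MPI version scales well with the number of processors.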

  10. A Model for Speedup of Parallel Programs

    DTIC Science & Technology

    1997-01-01

    Sanjeev K. Setia. The interaction between memory allocation and adaptive partitioning in message-passing multicomputers. In IPPS '95 Workshop on Job Scheduling Strategies for Parallel Processing, pages 89-99, 1995. [15] Sanjeev K. Setia and Satish K. Tripathi. A comparative analysis of static
  11. Highly scalable parallel processing of extracellular recordings of Multielectrode Arrays.

    PubMed

    Gehring, Tiago V; Vasilaki, Eleni; Giugliano, Michele

    2015-01-01

Technological advances in Multielectrode Arrays (MEAs) used for multisite, parallel electrophysiological recordings lead to an ever-increasing amount of raw data being generated. Arrays with hundreds up to a few thousand electrodes are slowly seeing widespread use, and the expectation is that more sophisticated arrays will become available in the near future. In order to process the large data volumes resulting from MEA recordings, there is a pressing need for new software tools able to process many data channels in parallel. Here we present a new tool for processing MEA data recordings that makes use of new programming paradigms and recent technology developments to unleash the power of modern highly parallel hardware, such as multi-core CPUs with vector instruction sets or GPGPUs. Our tool builds on and complements existing MEA data analysis packages. It shows high scalability and can be used to speed up performance-critical pre-processing steps such as data filtering and spike detection, helping to make the analysis of larger data sets tractable.
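The pre-processing pipeline named in the abstract (filtering, then spike detection) is naturally data-parallel across channels. A minimal Python sketch follows; the smoothing filter, threshold, and data are illustrative only, not the tool's actual pipeline:

```python
from concurrent.futures import ThreadPoolExecutor

# Sketch of channel-parallel pre-processing (illustrative filter and
# threshold): each MEA channel is smoothed and thresholded independently,
# so channels map directly onto parallel hardware.
def moving_average(trace, width=3):
    half = width // 2
    out = []
    for i in range(len(trace)):
        window = trace[max(0, i - half): i + half + 1]
        out.append(sum(window) / len(window))
    return out

def detect_spikes(trace, threshold=4.0):
    smoothed = moving_average(trace)
    return [i for i, v in enumerate(smoothed) if abs(v) > threshold]

def process_channels(channels):
    # One task per channel; a GPGPU version would batch channels instead.
    with ThreadPoolExecutor() as pool:
        return list(pool.map(detect_spikes, channels))

channels = [
    [0.1, 0.0, 9.0, 9.5, 0.2, 0.1],  # a spike around samples 2-3
    [0.0, 0.1, 0.0, 0.1, 0.0, 0.2],  # background noise only
]
print(process_channels(channels))  # → [[2, 3], []]
```

Because no state is shared between channels, the same structure scales from a thread pool to vectorized or GPU kernels, which is the scalability argument the abstract makes.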

  12. Block-Parallel Data Analysis with DIY2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morozov, Dmitriy; Peterka, Tom

    DIY2 is a programming model and runtime for block-parallel analytics on distributed-memory machines. Its main abstraction is block-structured data parallelism: data are decomposed into blocks; blocks are assigned to processing elements (processes or threads); computation is described as iterations over these blocks, and communication between blocks is defined by reusable patterns. By expressing computation in this general form, the DIY2 runtime is free to optimize the movement of blocks between slow and fast memories (disk and flash vs. DRAM) and to concurrently execute blocks residing in memory with multiple threads. This enables the same program to execute in-core, out-of-core, serial, parallel, single-threaded, multithreaded, or combinations thereof. This paper describes the implementation of the main features of the DIY2 programming model and optimizations to improve performance. DIY2 is evaluated on benchmark test cases to establish baseline performance for several common patterns and on larger complete analysis codes running on large-scale HPC machines.
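The block-structured pattern described here can be sketched in miniature: decompose, assign, iterate over local blocks, then exchange along a fixed pattern. This is a Python stand-in for the concept, not the DIY2 C++ API:

```python
# Minimal sketch of block-structured data parallelism: decompose data into
# blocks, assign blocks to processing elements, compute per block, then
# communicate along a reusable pattern (a ring exchange here).
data = list(range(16))
NBLOCKS, NPROCS = 4, 2

# Decomposition and round-robin assignment of blocks to processing elements.
blocks = [data[i::NBLOCKS] for i in range(NBLOCKS)]
owner = {b: b % NPROCS for b in range(NBLOCKS)}

# Local computation: each block reduces its own piece independently.
partial = [sum(block) for block in blocks]

# Communication pattern: each block receives its ring neighbor's partial
# result and combines it with its own.
received = [partial[(b + 1) % NBLOCKS] for b in range(NBLOCKS)]
combined = [partial[b] + received[b] for b in range(NBLOCKS)]
print(sum(partial))  # global reduction over all blocks: 120
```

Because the program only ever talks about blocks, a runtime is free to decide which blocks live in memory at once and how many threads execute them, which is the point of the abstraction.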

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Emms, David M.; Covshoff, Sarah; Hibberd, Julian M.

    C4 photosynthesis is considered one of the most remarkable examples of evolutionary convergence in eukaryotes. However, it is unknown whether the evolution of C4 photosynthesis required the evolution of new genes. Genome-wide gene-tree species-tree reconciliation of seven monocot species that span two origins of C4 photosynthesis revealed that there was significant parallelism in the duplication and retention of genes coincident with the evolution of C4 photosynthesis in these lineages. Specifically, 21 orthologous genes were duplicated and retained independently in parallel at both C4 origins. Analysis of this gene cohort revealed that the set of parallel duplicated and retained genes is enriched for genes that are preferentially expressed in bundle sheath cells, the cell type in which photosynthesis was activated during C4 evolution. Moreover, functional analysis of the cohort of parallel duplicated genes identified SWEET-13 as a potential key transporter in the evolution of C4 photosynthesis in grasses, and provides new insight into the mechanism of phloem loading in these C4 species.

  14. Statistical Analysis of NAS Parallel Benchmarks and LINPACK Results

    NASA Technical Reports Server (NTRS)

    Meuer, Hans-Werner; Simon, Horst D.; Strohmeier, Erich; Lasinski, T. A. (Technical Monitor)

    1994-01-01

    In the last three years extensive performance data have been reported for parallel machines, based both on the NAS Parallel Benchmarks and on LINPACK. In this study we have used the reported benchmark results and performed a number of statistical experiments using factor, cluster, and regression analyses. In addition to the performance results of LINPACK and the eight NAS parallel benchmarks, we have also included peak performance of the machine, and the LINPACK n and n(sub 1/2) values. Some of the results and observations can be summarized as follows: 1) All benchmarks are strongly correlated with peak performance. 2) LINPACK and EP each have a unique signature. 3) The remaining NPB can be grouped into three groups as follows: (CG and IS), (LU and SP), and (MG, FT, and BT). Hence three (or four with EP) benchmarks are sufficient to characterize the overall NPB performance. Our poster presentation will follow a standard poster format, and will present the data of our statistical analysis in detail.
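The correlation finding above (benchmarks strongly correlated with peak performance) rests on ordinary Pearson correlation across machines. A minimal sketch with invented numbers, not the study's data:

```python
# Sketch of the correlation analysis: Pearson correlation between
# per-machine peak performance and a benchmark result. All numbers
# below are made up for illustration.
import math

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

peak = [1.0, 2.0, 4.0, 8.0]   # hypothetical peak Gflop/s per machine
bench = [0.6, 1.3, 2.4, 5.1]  # hypothetical benchmark Gflop/s
r = pearson(peak, bench)
print(round(r, 3))
```

Factor and cluster analyses, used in the study to group the benchmarks, start from exactly this kind of correlation matrix.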

  15. High-performance parallel analysis of coupled problems for aircraft propulsion

    NASA Technical Reports Server (NTRS)

    Felippa, C. A.; Farhat, C.; Chen, P.-S.; Gumaste, U.; Lesoinne, M.; Stern, P.

    1995-01-01

    This research program deals with the application of high-performance computing methods to the numerical simulation of complete jet engines. The program was initiated in 1993 by applying two-dimensional parallel aeroelastic codes to the interior gas flow problem of a by-pass jet engine. The fluid mesh generation, domain decomposition and solution capabilities were successfully tested. Attention was then focused on methodology for the partitioned analysis of the interaction of the gas flow with a flexible structure and with the fluid mesh motion driven by these structural displacements. The latter is treated by an ALE technique that models the fluid mesh motion as that of a fictitious mechanical network laid along the edges of near-field fluid elements. New partitioned analysis procedures to treat this coupled 3-component problem were developed in 1994. These procedures involved delayed corrections and subcycling, and have been successfully tested on several massively parallel computers. For the global steady-state axisymmetric analysis of a complete engine we have decided to use the NASA-sponsored ENG10 program, which uses a regular FV-multiblock-grid discretization in conjunction with circumferential averaging to include effects of blade forces, loss, combustor heat addition, blockage, bleeds and convective mixing. A load-balancing preprocessor for parallel versions of ENG10 has been developed. It is planned to use the steady-state global solution provided by ENG10 as input to a localized three-dimensional FSI analysis for engine regions where aeroelastic effects may be important.

  16. Mapping trace element distribution in fossil teeth and bone with LA-ICP-MS

    NASA Astrophysics Data System (ADS)

    Hinz, E. A.; Kohn, M. J.

    2009-12-01

    Trace element profiles were measured in fossil bones and teeth from the late Pleistocene (c. 25 ka) Merrell locality, Montana, USA, by using laser-ablation ICP-MS. Laser-ablation ICP-MS can collect element counts along predefined tracks on a sample’s surface using a constant ablation speed allowing for rapid spatial sampling of element distribution. Key elements analyzed included common divalent cations (e.g. Sr, Zn, Ba), a suite of REE (La, Ce, Nd, Sm, Eu, Yb), and U, in addition to Ca for composition normalization and standardization. In teeth, characteristic diffusion penetration distances for all trace elements are at least a factor of 4 greater in traverses parallel to the dentine-enamel interface (parallel to the growth axis of the tooth) than perpendicular to the interface. Multiple parallel traverses in sections parallel and perpendicular to the tooth growth axis were transformed into trace element maps, and illustrate greater uptake of all trace elements along the central axis of dentine compared to areas closer to enamel, or within the enamel itself. Traverses in bone extending from the external surface, through the thickness of cortical bone and several mm into trabecular bone show major differences in trace element uptake compared to teeth: U and Sr are homogeneous, whereas all REE show a kinked profile with high concentrations on outer surfaces that decrease by several orders of magnitude within a few mm inward. The Eu anomaly increases uniformly from the outer edge of bone inward, whereas the Ce anomaly decreases slightly. These observations point to major structural anisotropies in trace element transport and uptake during fossilization, yet transport and uptake of U and REE are not resolvably different. In contrast, transport and uptake of U in bone must proceed orders of magnitude faster than REE as U is homogeneous whereas REE exhibit strong gradients. 
The kinked REE profiles in bone unequivocally indicate differential transport rates, consistent with a double-medium diffusion model in which microdomains with slow diffusivities are bounded by fast-diffusing pathways.

  17. A Parallel Independent Component Analysis Approach to Investigate Genomic Influence on Brain Function

    PubMed Central

    Liu, Jingyu; Demirci, Oguz; Calhoun, Vince D.

    2009-01-01

    Relationships between genomic data and functional brain images are of great interest but require new analysis approaches to integrate the high-dimensional data types. This letter presents an extension of a technique called parallel independent component analysis (paraICA), which enables the joint analysis of multiple modalities including interconnections between them. We extend our earlier work by allowing for multiple interconnections and by providing important overfitting controls. Performance was assessed by simulations under different conditions, and indicated reliable results can be extracted by properly balancing overfitting and underfitting. An application to functional magnetic resonance images and single nucleotide polymorphism array produced interesting findings. PMID:19834575

  18. A Parallel Independent Component Analysis Approach to Investigate Genomic Influence on Brain Function.

    PubMed

    Liu, Jingyu; Demirci, Oguz; Calhoun, Vince D

    2008-01-01

    Relationships between genomic data and functional brain images are of great interest but require new analysis approaches to integrate the high-dimensional data types. This letter presents an extension of a technique called parallel independent component analysis (paraICA), which enables the joint analysis of multiple modalities including interconnections between them. We extend our earlier work by allowing for multiple interconnections and by providing important overfitting controls. Performance was assessed by simulations under different conditions, and indicated reliable results can be extracted by properly balancing overfitting and underfitting. An application to functional magnetic resonance images and single nucleotide polymorphism array produced interesting findings.

  19. Improving Earth/Prediction Models to Improve Network Processing

    NASA Astrophysics Data System (ADS)

    Wagner, G. S.

    2017-12-01

    The United States Atomic Energy Detection System (USAEDS) primary seismic network consists of a relatively small number of arrays and three-component stations. The relatively small number of stations in the USAEDS primary network makes it both necessary and feasible to optimize both station and network processing. Station processing improvements include detector tuning efforts that use Receiver Operating Characteristic (ROC) curves to help judiciously set acceptable Type 1 (false) vs. Type 2 (miss) error rates. Other station processing improvements include the use of empirical/historical observations and continuous background noise measurements to compute time-varying, maximum likelihood probability of detection thresholds. The USAEDS network processing software makes extensive use of the azimuth and slowness information provided by frequency-wavenumber analysis at array sites, and polarization analysis at three-component sites. Most of the improvements in USAEDS network processing are due to improvements in the models used to predict azimuth, slowness, and probability of detection. Kriged travel-time, azimuth, and slowness corrections, and their associated uncertainties, are computed using a ground truth database. Improvements in station processing and the use of improved models for azimuth, slowness, and probability of detection have led to significant improvements in USAEDS network processing.

  20. High-speed reacting flow simulation using USA-series codes

    NASA Astrophysics Data System (ADS)

    Chakravarthy, S. R.; Palaniswamy, S.

    In this paper, the finite-rate chemistry (FRC) formulation for the USA-series of codes and three sets of validations are presented. USA-series computational fluid dynamics (CFD) codes are based on Unified Solution Algorithms including explicit and implicit formulations, factorization and relaxation approaches, time-marching and space-marching methodologies, etc., in order to be able to solve a very wide class of CFD problems using a single framework. Euler or Navier-Stokes equations are solved using a finite-volume treatment with upwind Total Variation Diminishing discretization for the inviscid terms. Perfect and real gas options are available including equilibrium and nonequilibrium chemistry. This capability has been widely used to study various problems including Space Shuttle exhaust plumes, National Aerospace Plane (NASP) designs, etc. (1) Numerical solutions are presented showing the full range of possible solutions to steady detonation wave problems. (2) Comparison between the solution obtained by the USA code and Generalized Kinetics Analysis Program (GKAP) is shown for supersonic combustion in a duct. (3) Simulation of combustion in a supersonic shear layer is shown to have reasonable agreement with experimental observations.

  1. SPATIAL ANALYSIS OF VOLATILE ORGANIC COMPOUNDS FROM A COMMUNITY-BASED AIR TOXICS MONITORING NETWORK IN DEER PARK, TEXAS, USA

    EPA Science Inventory

    This RARE Project with EPA Region 6 was a spatial analysis study of select volatile organic compounds (VOC) collected using passive air monitors at outdoor residential locations in the Deer Park, Texas area near the Houston Ship Channel. Correlation analysis of VOC species confi...

  2. WPC Surface Analysis Archive

    Science.gov Websites

    WPC's Surface Analysis overlaid with IR satellite imagery; the latest image and 3- or 7-day loops are available.

  3. Parallel confocal detection of single biomolecules using diffractive optics and integrated detector units.

    PubMed

    Blom, H; Gösch, M

    2004-04-01

    Over the past few years we have witnessed a tremendous surge of interest in so-called array-based miniaturised analytical systems due to their value as extremely powerful tools for high-throughput sequence analysis, drug discovery and development, and diagnostic tests in medicine (see articles in Issue 1). Terminologies that have been used to describe these array-based bioscience systems include (but are not limited to): DNA-chip, microarrays, microchip, biochip, DNA-microarrays and genome chip. Potential technological benefits of introducing these miniaturised analytical systems include improved accuracy, multiplexing, lower sample and reagent consumption, disposability, and decreased analysis times, just to mention a few examples. Among the many alternative principles of detection-analysis (e.g. chemiluminescence, electroluminescence and conductivity), fluorescence-based techniques are widely used, examples being fluorescence resonance energy transfer, fluorescence quenching, fluorescence polarisation, time-resolved fluorescence, and fluorescence fluctuation spectroscopy (see articles in Issue 11). Time-dependent fluctuations of fluorescent biomolecules with different molecular properties, like molecular weight, translational and rotational diffusion time, colour and lifetime, potentially provide all the kinetic and thermodynamic information required in analysing complex interactions. In this mini-review article, we present recent extensions aimed at implementing parallel laser excitation and parallel fluorescence detection that can lead to an even further increase in throughput in miniaturised array-based analytical systems. We also report on developments and characterisations of a multiplexing extension that allows multifocal laser excitation together with matched parallel fluorescence detection for parallel confocal dynamical fluorescence fluctuation studies at the single-biomolecule level.

  4. Parallel vigilance: parents' dual focus following diagnosis of Type 1 diabetes mellitus in their young child.

    PubMed

    Niedel, Selaine; Traynor, Michael; McKee, Martin; Grey, Margaret

    2013-05-01

    There is consensus that enabling patient self-care and expertise leads to better management of chronic illness. Clinicians are being encouraged to manage clinical encounters in ways that promote these outcomes rather than perpetuate hierarchical relationships. This article describes one part of a larger study of 55 outpatient consultations conducted within 14 months of the diagnosis of Type 1 diabetes mellitus in young children. Participants were parents and the specialist doctors, nurses, dieticians and social workers who oversee the child's secondary care. Consultations were audio-recorded and transcribed. Our analysis draws on aspects of conversation analysis (CA) to investigate how parents' talk enacts a growing confidence in the management of their child's disease in the face of questioning from professionals. Analysis reveals how this talk distinguishes a duality of focus that combines the normal watchfulness exhibited by all parents as they protect their children, with an additional intense, parallel watchfulness for signs of potentially serious manifestations of diabetes. We term this phenomenon parallel vigilance and illustrate its development using five representative extracts from consultations. The concept of parallel vigilance extends the chronic illness literature and informs our understanding of a process that contributes to parents' developing expertise and provides new and important insights into the way in which parents conceptualize and implement their evolving role in the care of their child. Moreover, parallel vigilance serves as an enabler of parental contributions to the specialist consultation.

  5. Multiprocessor smalltalk: Implementation, performance, and analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pallas, J.I.

    1990-01-01

    Multiprocessor Smalltalk demonstrates the value of object-oriented programming on a multiprocessor. Its implementation and analysis shed light on three areas: concurrent programming in an object-oriented language without special extensions, implementation techniques for adapting to multiprocessors, and performance factors in the resulting system. Adding parallelism to Smalltalk code is easy, because programs already use control abstractions like iterators. Smalltalk's basic control and concurrency primitives (lambda expressions, processes and semaphores) can be used to build parallel control abstractions, including parallel iterators, parallel objects, atomic objects, and futures. Language extensions for concurrency are not required. This implementation demonstrates that it is possible to build an efficient parallel object-oriented programming system and illustrates techniques for doing so. Three modification tools (serialization, replication, and reorganization) adapted the Berkeley Smalltalk interpreter to the Firefly multiprocessor. Multiprocessor Smalltalk's performance shows that the combination of multiprocessing and object-oriented programming can be effective: speedups (relative to the original serial version) exceed 2.0 for five processors on all the benchmarks; the median efficiency is 48%. Analysis shows both where performance is lost and how to improve and generalize the experimental results. Changes in the interpreter to support concurrency add at most 12% overhead; better access to per-process variables could eliminate much of that. Changes in the user code to express concurrency add as much as 70% overhead; this overhead could be reduced to 54% if blocks (lambda expressions) were reentrant. Performance is also lost when the program cannot keep all five processors busy.
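The key claim, that parallel control abstractions can be built from ordinary primitives (processes and semaphores) without language extensions, translates directly to other languages. A Python sketch of a parallel iterator built this way (names and cap are illustrative, not from the paper):

```python
# Python stand-in for a parallel control abstraction built from plain
# primitives: a "parallel collect" (parallel iterator) made of ordinary
# threads plus a semaphore, with no language extension.
import threading

def parallel_collect(items, fn, max_workers=4):
    """Apply fn to every item concurrently; a semaphore caps parallelism."""
    results = [None] * len(items)
    gate = threading.Semaphore(max_workers)

    def worker(i, item):
        with gate:                      # at most max_workers run at once
            results[i] = fn(item)

    threads = [threading.Thread(target=worker, args=(i, x))
               for i, x in enumerate(items)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results                      # same order as the input

print(parallel_collect([1, 2, 3, 4], lambda x: x * x))  # [1, 4, 9, 16]
```

This mirrors the paper's point: the iterator abstraction hides the concurrency plumbing, so user code stays as simple as a serial collect.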

  6. Novel hybrid GPU-CPU implementation of parallelized Monte Carlo parametric expectation maximization estimation method for population pharmacokinetic data analysis.

    PubMed

    Ng, C M

    2013-10-01

    The development of a population PK/PD model, an essential component for model-based drug development, is both time- and labor-intensive. Graphics processing unit (GPU) computing technology has been proposed and used to accelerate many scientific computations. The objective of this study was to develop a hybrid GPU-CPU implementation of the parallelized Monte Carlo parametric expectation maximization (MCPEM) estimation algorithm for population PK data analysis. A hybrid GPU-CPU implementation of the MCPEM algorithm (MCPEMGPU) and an identical algorithm designed for a single CPU (MCPEMCPU) were developed using MATLAB in a single computer equipped with dual Xeon 6-Core E5690 CPUs and a NVIDIA Tesla C2070 GPU parallel computing card that contained 448 stream processors. Two different PK models with rich/sparse sampling design schemes were used to simulate population data in assessing the performance of MCPEMCPU and MCPEMGPU. Results were analyzed by comparing the parameter estimation and model computation times. The speedup factor was used to assess the relative benefit of the parallelized MCPEMGPU over MCPEMCPU in shortening model computation time. The MCPEMGPU consistently achieved shorter computation time than the MCPEMCPU and can offer more than 48-fold speedup using a single GPU card. The novel hybrid GPU-CPU implementation of the parallelized MCPEM algorithm developed in this study holds great promise in serving as the core for the next generation of modeling software for population PK/PD analysis.

  7. Comparisons of disparities and risks of HIV infection in black and other men who have sex with men in Canada, UK, and USA: a meta-analysis.

    PubMed

    Millett, Gregorio A; Peterson, John L; Flores, Stephen A; Hart, Trevor A; Jeffries, William L; Wilson, Patrick A; Rourke, Sean B; Heilig, Charles M; Elford, Jonathan; Fenton, Kevin A; Remis, Robert S

    2012-07-28

    We did a meta-analysis to assess factors associated with disparities in HIV infection in black men who have sex with men (MSM) in Canada, the UK, and the USA. We searched Embase, Medline, Google Scholar, and online conference proceedings from Jan 1, 1981, to Dec 31, 2011, for racial comparative studies with quantitative outcomes associated with HIV risk or HIV infection. Key words and Medical Subject Headings (US National Library of Medicine) relevant to race were cross-referenced with citations pertinent to homosexuality in Canada, the UK, and the USA. Data were aggregated across studies for every outcome of interest to estimate overall effect sizes, which were converted into summary ORs for 106,148 black MSM relative to 581,577 other MSM. We analysed seven studies from Canada, 13 from the UK, and 174 from the USA. In every country, black MSM were as likely to engage in serodiscordant unprotected sex as other MSM. Black MSM in Canada and the USA were less likely than other MSM to have a history of substance use (odds ratio, OR, 0·53, 95% CI 0·38-0·75, for Canada and 0·67, 0·50-0·92, for the USA). Black MSM in the UK (1·86, 1·58-2·18) and the USA (3·00, 2·06-4·40) were more likely to be HIV positive than were other MSM, but HIV-positive black MSM in each country were less likely (22% in the UK and 60% in the USA) to initiate combination antiretroviral therapy (cART) than other HIV-positive MSM. US HIV-positive black MSM were also less likely to have health insurance, have a high CD4 count, adhere to cART, or be virally suppressed than were other US HIV-positive MSM. Notably, despite a two-fold greater odds of having any structural barrier that increases HIV risk (eg, unemployment, low income, previous incarceration, or less education) compared with other US MSM, US black MSM were more likely to report any preventive behaviour against HIV infection (1·39, 1·23-1·57). 
For outcomes associated with HIV infection, disparities were greatest for US black MSM versus other MSM for structural barriers, sex partner demographics (eg, age, race), and HIV care outcomes, whereas disparities were least for sexual risk outcomes. Similar racial disparities in HIV and sexually transmitted infections and cART initiation are seen in MSM in the UK and the USA. Elimination of disparities in HIV infection in black MSM cannot be accomplished without addressing structural barriers or differences in HIV clinical care access and outcomes. None. Copyright © 2012 Elsevier Ltd. All rights reserved.

  8. High-Performance Parallel Analysis of Coupled Problems for Aircraft Propulsion

    NASA Technical Reports Server (NTRS)

    Felippa, C. A.; Farhat, C.; Park, K. C.; Gumaste, U.; Chen, P.-S.; Lesoinne, M.; Stern, P.

    1996-01-01

    This research program dealt with the application of high-performance computing methods to the numerical simulation of complete jet engines. The program was initiated in January 1993 by applying two-dimensional parallel aeroelastic codes to the interior gas flow problem of a bypass jet engine. The fluid mesh generation, domain decomposition and solution capabilities were successfully tested. Attention was then focused on methodology for the partitioned analysis of the interaction of the gas flow with a flexible structure and with the fluid mesh motion driven by these structural displacements. The latter is treated by an ALE technique that models the fluid mesh motion as that of a fictitious mechanical network laid along the edges of near-field fluid elements. New partitioned analysis procedures to treat this coupled three-component problem were developed during 1994 and 1995. These procedures involved delayed corrections and subcycling, and have been successfully tested on several massively parallel computers, including the iPSC/860, Paragon XP/S and the IBM SP2. For the global steady-state axisymmetric analysis of a complete engine we have decided to use the NASA-sponsored ENG10 program, which uses a regular FV-multiblock-grid discretization in conjunction with circumferential averaging to include effects of blade forces, loss, combustor heat addition, blockage, bleeds and convective mixing. A load-balancing preprocessor for parallel versions of ENG10 was developed. During 1995 and 1996 we developed the capability for the first full 3D aeroelastic simulation of a multirow engine stage. This capability was tested on the IBM SP2 parallel supercomputer at NASA Ames. Benchmark results were presented at the 1996 Computational Aerosciences meeting.

  9. On the progressive enrichment of the oxygen isotopic composition of water along a leaf.

    PubMed

    Farquhar, G. D.; Gan, K. S.

    2003-06-01

    A model has been derived for the enrichment of heavy isotopes of water in leaves, including progressive enrichment along the leaf. In the model, lighter water is preferentially transpired leaving heavier water to diffuse back into the xylem and be carried further along the leaf. For this pattern to be pronounced, the ratio of advection to diffusion (Péclet number) has to be large in the longitudinal direction, and small in the radial direction. The progressive enrichment along the xylem is less than that occurring at the sites of evaporation in the mesophyll, depending on the isolation afforded by the radial Péclet number. There is an upper bound on enrichment, and effects of ground tissue associated with major veins are included. When transpiration rate is spatially nonuniform, averaging of enrichment occurs more naturally with transpiration weighting than with area-based weighting. This gives zero average enrichment of transpired water, the modified Craig-Gordon equation for average enrichment at the sites of evaporation and the Farquhar and Lloyd (In Stable Isotopes and Plant Carbon-Water Relations, pp. 47-70. Academic Press, New York, USA, 1993) prediction for mesophyll water. Earlier results on the isotopic composition of evolved oxygen and of retro-diffused carbon dioxide are preserved if these processes vary in parallel with transpiration rate. Parallel variation should be indicated approximately by uniform carbon isotope discrimination across the leaf.

  10. Basic research planning in mathematical pattern recognition and image analysis

    NASA Technical Reports Server (NTRS)

    Bryant, J.; Guseman, L. F., Jr.

    1981-01-01

    Fundamental problems encountered while attempting to develop automated techniques for applications of remote sensing are discussed under the following categories: (1) geometric and radiometric preprocessing; (2) spatial, spectral, temporal, syntactic, and ancillary digital image representation; (3) image partitioning, proportion estimation, and error models in object scene interference; (4) parallel processing and image data structures; and (5) continuing studies in polarization; computer architectures and parallel processing; and the applicability of "expert systems" to interactive analysis.

  11. Impact of new computing systems on computational mechanics and flight-vehicle structures technology

    NASA Technical Reports Server (NTRS)

    Noor, A. K.; Storaasli, O. O.; Fulton, R. E.

    1984-01-01

    Advances in computer technology which may have an impact on computational mechanics and flight vehicle structures technology were reviewed. The characteristics of supersystems, highly parallel systems, and small systems are summarized. The interrelations of numerical algorithms and software with parallel architectures are discussed. A scenario for future hardware/software environment and engineering analysis systems is presented. Research areas with potential for improving the effectiveness of analysis methods in the new environment are identified.

  12. Fabric analysis of quartzites with negative magnetic susceptibility - Does AMS provide information of SPO or CPO of quartz?

    NASA Astrophysics Data System (ADS)

    Renjith, A. R.; Mamtani, Manish A.; Urai, Janos L.

    2016-01-01

    We ask the question whether petrofabric data from anisotropy of magnetic susceptibility (AMS) analysis of deformed quartzites gives information about shape preferred orientation (SPO) or crystallographic preferred orientation (CPO) of quartz. Since quartz is diamagnetic and has a negative magnetic susceptibility, 11 samples of nearly pure quartzites with a negative magnetic susceptibility were chosen for this study. After performing AMS analysis, electron backscatter diffraction (EBSD) analysis was done in thin sections prepared parallel to the K1K3 plane of the AMS ellipsoid. Results show that in all the samples quartz SPO is sub-parallel to the orientation of the magnetic foliation. However, in most samples no clear correspondence is observed between quartz CPO and the K1 (magnetic lineation) direction. This is contrary to the parallelism observed between the K1 direction and the orientation of the quartz c-axis in the case of an undeformed single quartz crystal. Pole figures of quartz indicate that the quartz c-axis tends to be parallel to the K1 direction only in the case where intracrystalline deformation of quartz is accommodated by prism slip. It is therefore established that AMS investigation of quartz from deformed rocks gives information about SPO. Thus, it is concluded that petrofabric information of quartzite obtained from AMS is a manifestation of its shape anisotropy and not its crystallographic preferred orientation.

  13. Line-Focused Optical Excitation of Parallel Acoustic Focused Sample Streams for High Volumetric and Analytical Rate Flow Cytometry.

    PubMed

    Kalb, Daniel M; Fencl, Frank A; Woods, Travis A; Swanson, August; Maestas, Gian C; Juárez, Jaime J; Edwards, Bruce S; Shreve, Andrew P; Graves, Steven W

    2017-09-19

    Flow cytometry provides highly sensitive multiparameter analysis of cells and particles but has been largely limited to the use of a single focused sample stream. This limits the analytical rate to ∼50K particles/s and the volumetric rate to ∼250 μL/min. Despite the analytical prowess of flow cytometry, there are applications where these rates are insufficient, such as rare cell analysis in high cellular backgrounds (e.g., circulating tumor cells and fetal cells in maternal blood), detection of cells/particles in large dilute samples (e.g., water quality, urine analysis), or high-throughput screening applications. Here we report a highly parallel acoustic flow cytometer that uses an acoustic standing wave to focus particles into 16 parallel analysis points across a 2.3 mm wide optical flow cell. A line-focused laser and wide-field collection optics are used to excite and collect the fluorescence emission of these parallel streams onto a high-speed camera for analysis. With this instrument format and fluorescent microsphere standards, we obtain analysis rates of 100K/s and flow rates of 10 mL/min, while maintaining optical performance comparable to that of a commercial flow cytometer. The results with our initial prototype instrument demonstrate that the integration of key parallelizable components, including the line-focused laser, particle focusing using multinode acoustic standing waves, and a spatially arrayed detector, can increase analytical and volumetric throughputs by orders of magnitude in a compact, simple, and cost-effective platform. Such instruments will be of great value to applications in need of high-throughput yet sensitive flow cytometry analysis.
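The readout geometry described above, one camera line crossing 16 focused streams, reduces to splitting the line into windows and integrating each one. A toy sketch with a synthetic intensity line (window width and peak heights are invented):

```python
# Toy sketch of the parallel-stream readout: a camera line crossing 16
# acoustically focused streams is split into 16 windows, and each window's
# summed intensity is one analysis point. Data are synthetic.
NSTREAMS = 16
PIXELS_PER_STREAM = 8

# Synthetic intensity line: stream k carries one bright pixel of height k+1.
line = []
for k in range(NSTREAMS):
    window = [0] * PIXELS_PER_STREAM
    window[PIXELS_PER_STREAM // 2] = k + 1
    line.extend(window)

# Integrate each window independently -- the per-stream "detector" signal.
signals = [sum(line[k * PIXELS_PER_STREAM:(k + 1) * PIXELS_PER_STREAM])
           for k in range(NSTREAMS)]
print(signals)  # [1, 2, ..., 16]
```

Because every window is independent, the per-stream analysis parallelizes across camera columns exactly as the per-stream optics parallelize the excitation.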

  14. Finding Tropical Cyclones on a Cloud Computing Cluster: Using Parallel Virtualization for Large-Scale Climate Simulation Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hasenkamp, Daren; Sim, Alexander; Wehner, Michael

Extensive computing power has been used to tackle issues such as climate change, fusion energy, and other pressing scientific challenges. These computations produce a tremendous amount of data; however, many of the data analysis programs currently run on only a single processor. In this work, we explore the possibility of using the emerging cloud computing platform to parallelize such sequential data analysis tasks. As a proof of concept, we wrap a program for analyzing trends of tropical cyclones in a set of virtual machines (VMs). This approach allows the user to keep their familiar data analysis environment in the VMs, while we provide the coordination and data transfer services to ensure that the necessary input and output are directed to the desired locations. This work extensively exercises the networking capability of cloud computing systems and has revealed a number of weaknesses in the current cloud system software. In our tests, we are able to scale the parallel data analysis job to a modest number of VMs and achieve a speedup comparable to running the same analysis task using MPI. However, compared to MPI-based parallelization, the cloud-based approach has a number of advantages. It is more flexible, because the VMs can capture arbitrary software dependencies without requiring the user to rewrite their programs. It is also more resilient to failure: as long as a single VM is running it can make progress, whereas the whole MPI analysis job fails as soon as one node fails. In short, this initial work demonstrates that a cloud computing system is a viable platform for distributed scientific data analyses traditionally conducted on dedicated supercomputing systems.
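The fault-tolerance point in this abstract can be sketched in a few lines. The following is a hypothetical illustration, not the authors' system: a thread pool stands in for the independent VM workers, one simulated failure stands in for a crashed VM, and the job keeps making progress past it.

```python
# Independent workers keep making progress when one task fails, whereas an
# MPI job aborts as soon as any rank dies. A thread pool stands in for VMs.
from concurrent.futures import ThreadPoolExecutor

def analyze_chunk(chunk_id):
    """Hypothetical stand-in for the per-VM cyclone-analysis task."""
    if chunk_id == 3:                      # simulate one VM crashing
        raise RuntimeError(f"worker for chunk {chunk_id} crashed")
    return f"cyclone trends for chunk {chunk_id}"

results, failed = [], []
with ThreadPoolExecutor(max_workers=4) as pool:
    futures = {pool.submit(analyze_chunk, i): i for i in range(8)}
    for future, chunk_id in futures.items():
        try:
            results.append(future.result())
        except RuntimeError:
            failed.append(chunk_id)        # record the failure and keep going

print(f"{len(results)} chunks analyzed, {len(failed)} failed")  # 7 analyzed, 1 failed
```

An MPI analogue of this loop would instead terminate on the first raised error, which is the contrast the abstract draws.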

  15. Type synthesis for 4-DOF parallel press mechanism using GF set theory

    NASA Astrophysics Data System (ADS)

    He, Jun; Gao, Feng; Meng, Xiangdun; Guo, Weizhong

    2015-07-01

Parallel mechanisms are used in large-capacity servo presses to avoid the over-constraint of traditional redundant actuation. Current research mainly focuses on performance analysis of specific parallel press mechanisms; however, the type synthesis and evaluation of parallel press mechanisms are seldom studied, especially for four-degrees-of-freedom (DOF) press mechanisms. The type synthesis of 4-DOF parallel press mechanisms is carried out based on generalized function (GF) set theory. Five design criteria for 4-DOF parallel press mechanisms are first proposed. A general procedure for the type synthesis of parallel press mechanisms is obtained, which includes number synthesis, symmetrical synthesis of constraint GF sets, decomposition of motion GF sets, and design of limbs. Nine combinations of constraint GF sets of 4-DOF parallel press mechanisms, ten combinations of GF sets of active limbs, and eleven combinations of GF sets of passive limbs are synthesized. Thirty-eight kinds of press mechanisms are presented, and different structures of kinematic limbs are then designed. Finally, the geometrical constraint complexity (GCC), kinematic pair complexity (KPC), and type complexity (TC) are proposed to evaluate the press types, and the optimal press type is identified. General methodologies for the type synthesis and evaluation of parallel press mechanisms are suggested.

  16. Institute for Defense Analysis. Annual Report 1995.

    DTIC Science & Technology

    1995-01-01

staff have been involved in the community-wide development of MPI as well as in its application to specific NSA problems. Parallel Groebner Basis Code — Symbolic Computing on Parallel Machines: the Groebner basis method is a set of algorithms for reformulating very complex algebraic expres...

  17. Parallel algorithms for placement and routing in VLSI design. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Brouwer, Randall Jay

    1991-01-01

    The computational requirements for high quality synthesis, analysis, and verification of very large scale integration (VLSI) designs have rapidly increased with the fast growing complexity of these designs. Research in the past has focused on the development of heuristic algorithms, special purpose hardware accelerators, or parallel algorithms for the numerous design tasks to decrease the time required for solution. Two new parallel algorithms are proposed for two VLSI synthesis tasks, standard cell placement and global routing. The first algorithm, a parallel algorithm for global routing, uses hierarchical techniques to decompose the routing problem into independent routing subproblems that are solved in parallel. Results are then presented which compare the routing quality to the results of other published global routers and which evaluate the speedups attained. The second algorithm, a parallel algorithm for cell placement and global routing, hierarchically integrates a quadrisection placement algorithm, a bisection placement algorithm, and the previous global routing algorithm. Unique partitioning techniques are used to decompose the various stages of the algorithm into independent tasks which can be evaluated in parallel. Finally, results are presented which evaluate the various algorithm alternatives and compare the algorithm performance to other placement programs. Measurements are presented on the parallel speedups available.

  18. Code Parallelization with CAPO: A User Manual

    NASA Technical Reports Server (NTRS)

    Jin, Hao-Qiang; Frumkin, Michael; Yan, Jerry; Biegel, Bryan (Technical Monitor)

    2001-01-01

A software tool has been developed to assist in the parallelization of scientific codes. This tool, CAPO, extends an existing parallelization toolkit, CAPTools, developed at the University of Greenwich, to generate OpenMP parallel codes for shared-memory architectures. It is an interactive toolkit that transforms a serial Fortran application code into an equivalent parallel version of the software in a small fraction of the time normally required for a manual parallelization. We first discuss the way in which loop types are categorized and how efficient OpenMP directives can be defined and inserted into the existing code using in-depth interprocedural analysis. The use of the toolkit on a number of application codes, ranging from benchmarks to real-world applications, is then presented. This demonstrates the great potential of using the toolkit to quickly parallelize serial programs, as well as the good performance achievable on a large number of processors. The second part of the document gives a reference for the parameters and the graphical user interface implemented in the toolkit. Finally, a set of tutorials is included for hands-on experience with this toolkit.

  19. PREFACE Proceedings of the XIV International Conference on Small-Angle Scattering, SAS-2009

    NASA Astrophysics Data System (ADS)

    King, Stephen; Terrill, Nicholas

    2010-10-01

    The XIV International Conference on Small-Angle Scattering, SAS-2009, was held in Oxford UK, 13-18 September 2009, and was jointly organised under the auspices of the International Union of Crystallography Commission on SAS by a team from the Diamond Light Source and the ISIS Pulsed Neutron Source - their first such joint venture - with help from the UK Science and Technology Facilities Council. It was the first time that this long running and successful series of conferences on the application, science and technology of small-angle scattering techniques had been staged in the UK. The UK has a proud heritage in small-angle scattering: as home to one of the world's first SANS instruments (at AERE Harwell), as the site of the world's first 2nd generation X-ray Synchrotron (the SRS at Daresbury with its suite of SAXS beamlines), and latterly as the location of the world's most successful pulsed source SANS instrument. Indeed, 2009 also marked the 25th Anniversary of neutron operations at ISIS and the opening of a Second Target Station. Whilst the SRS ceased operations in 2008, its mantle has been inherited by the Diamond synchrotron. Many delegates took the opportunity to visit both Diamond and ISIS during a conference excursion. Despite the prevailing global economic downturn, we were delighted that 434 delegates from 32 different countries were able to attend SAS-2009; two-thirds were drawn from the UK, Germany, Japan, the USA and France, but there were also sizeable contingents from Australia, Korea, Taiwan and South America. In many ways this geographical spread reflects the present and emerging distribution, respectively, of 3rd generation X-ray synchrotrons and high-flux neutron sources, although the scope of the conference was not solely limited to these probes. Financial support from the IUCr enabled us to grant bursaries to attend SAS-2009 to 12 delegates from emerging countries (Algeria, Argentina, Brazil, India, Nepal, Romania, Russia and the Ukraine). 
The scientific heart of the conference comprised 10 plenary sessions, interspersed by 39 'themed' parallel sessions, 2 poster sessions, an afternoon tour of Diamond and ISIS, and a week-long exhibition. There were 144 contributed oral presentations and 308 poster presentations across a total of 21 themes. Over half of all presentations fell under 6 themes: biological systems, colloids and solutions, instrumentation, kinetic and time-resolved measurements, polymers, and surfaces and interfaces. The importance of SAS techniques to the study of biology, materials science and soft matter/nanoscience is clear. The plenary presentations, which covered topics as diverse as advanced analysis techniques, biology, green chemistry, materials science and surfaces, were delivered by Frank Bates, Minnesota, USA, Peter Fratzl, MPI Golm, Germany, Buxing Han, Beijing, China, Julia Kornfield, CIT, USA, Jan Skov Pedersen, Aarhus, Denmark, Moonhor Ree, Pohang, Korea, Mitsuhiro Shibayama, Tokyo, Japan, Robert Thomas, Oxford, UK, Jill Trewhella, Sydney, Australia, and Thomas Zemb, ICSM Bagnols, France. Instigated by representatives of the Belgian and Dutch SAS communities, one parallel session was dedicated to a tribute to Michel Koch, the pioneer of so many novel applications of SAXS, who retired after 30 years at the EMBL Hamburg in late 2006. With a supporting cast that included Wim Bras, ESRF, France, Tony Ryan, Sheffield, UK and Joe Zaccai, ILL, France, and watched by former colleague André Gabriel, Michel treated the audience to a fascinating - and at times light-hearted - retrospective of the evolution of synchrotron SAXS. Another parallel session was devoted to the work of the canSAS (Collective Action for Nomadic Small-Angle Scatterers) network of large-facility representatives and instrument scientists in areas such as data file formats, intensity calibration and software development. For further information see http://www.smallangles.net/wgwiki/index.php/canSAS_Working_Groups.
A total of nine awards were presented at the conference. The Lifetime Achievement, or 'Andre Guinier', Award, given to those who have made a sustained and recognised contribution to the development or application of Small-Angle Scattering, went to Vittorio Luzzati, Emeritus Research Scientist at the Centre de Génétique Moléculaire du CNRS, France. Dr Luzzati has had a long and distinguished career in X-ray scattering publishing over 170 research papers - 10 in Nature - which have so far accumulated over 3500 citations. The award for 'Excellence in SAS Technical/Instrumental Development' went to J Polte, BAM, Germany, for 'New insights into nucleation and growth processes of gold nanoparticles derived via coupled in-situ methods'. That for 'Excellence in the Theoretical Development of SAS' went to C Gommes, Liege, Belgium, for 'SAXS Data Analysis of Ordered and Disordered Morphologies with Gaussian Random Field Models'. B Pauw, Technical University, Denmark, received the award for 'Excellence in the Application of SAS' for work on 'Strain-induced Internal Fibrillation of Aramid Filaments'. And the award for 'Excellence in the Communication of SAS Science' went to J G Grossmann, Liverpool, UK, for his talk on 'Probing the Structure of Biological Macromolecules in the Gas Phase'. A Hexemer, LBNL, USA, won the prize for the 'Best Poster in Technical/Instrumental Development' for 'SAXS/WAXS using a Multilayer Monochromator'. The prize for 'Best Poster in Theoretical Development' went to S Haas, Helmholtz Centre Berlin, Germany, for 'Simultaneous structure and chemical nano-analysis of an efficient frequency upconversion glass-ceramic by ASAXS'. 
And in a remarkable 'double', the prizes for 'Best Poster for Application in Life Sciences' and 'Best Poster for Application in Physical Sciences' went to A Maerten and J Prass, respectively, both from MPI Golm, Germany, for their work on 'SAXS studies of human tooth dentine: analysis of a spatially inhomogeneous and varying bio-material' and 'Analysis of sorption strains in ordered mesoporous materials by in-situ small-angle x-ray diffraction'. The conference could not have been staged without the support and commitment of a large number of people and organizations, many of whom are listed separately within these Proceedings, and it is only right that we acknowledge their contribution. The generous financial support from our sponsors was particularly welcome given the economic climate. To all of them we offer our heartfelt and grateful thanks. We would also like to thank our Guest Editor, Goran Ungar, his team of Section Editors, and the several dozen anonymous referees, for so admirably managing the scientific review of the manuscripts in these Proceedings, and Graham Douglas and Johanna Schwarz at IoP Publishing for their assistance in bringing these Proceedings to fruition. Stephen King Nicholas Terrill SAS-2009 Co-Chairmen and Local Organisers SAS-2012 will be held in Sydney, Australia, 18-23 November 2012.

  20. Modelling and simulation of parallel triangular triple quantum dots (TTQD) by using SIMON 2.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fathany, Maulana Yusuf, E-mail: myfathany@gmail.com; Fuada, Syifaul, E-mail: fsyifaul@gmail.com; Lawu, Braham Lawas, E-mail: bram-labs@rocketmail.com

    2016-04-19

This research presents an analysis of the modeling of parallel triple quantum dots (TQD) using SIMON (SIMulation Of Nano-structures). The single-electron transistor (SET) is used as the basic concept of the modeling. We design the structure of the parallel TQD from metal with a triangular geometry model, called triangular triple quantum dots (TTQD). We simulate it under several scenarios using different parameters, such as different capacitance values, various gate voltages, and different thermal conditions.

  1. Parallel-vector out-of-core equation solver for computational mechanics

    NASA Technical Reports Server (NTRS)

    Qin, J.; Agarwal, T. K.; Storaasli, O. O.; Nguyen, D. T.; Baddourah, M. A.

    1993-01-01

A parallel/vector out-of-core equation solver is developed for shared-memory computers, such as the Cray Y-MP machine. The input/output (I/O) time is reduced by using asynchronous BUFFER IN and BUFFER OUT statements, which can be executed simultaneously with CPU instructions. The parallel and vector capabilities provided by the supercomputers are also exploited to enhance performance. Numerical applications in large-scale structural analysis are given to demonstrate the efficiency of the present out-of-core solver.
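The overlap of asynchronous buffered I/O with computation that this abstract describes can be sketched generically. The following Python double-buffering toy is an illustration only, not the Cray solver: the "disk" is an in-memory list with invented block sizes, and a background thread plays the role of BUFFER IN, fetching block k+1 while the CPU processes block k.

```python
# Double-buffering sketch: prefetch the next block in a background thread
# while computing on the current one, mimicking asynchronous BUFFER IN
# overlapped with CPU instructions.
import threading

DISK = [list(range(i, i + 4)) for i in range(0, 16, 4)]  # 4 blocks of "out-of-core" data

def read_block(k):
    return DISK[k]                     # stands in for a slow out-of-core read

def process(block):
    return sum(block)                  # stands in for the solver's CPU work

def out_of_core_sum(num_blocks):
    total = 0
    buf = {}
    t = threading.Thread(target=lambda: buf.update(nxt=read_block(0)))
    t.start()                          # kick off the first asynchronous read
    for k in range(num_blocks):
        t.join()                       # wait until block k has arrived
        current = buf["nxt"]
        if k + 1 < num_blocks:         # start fetching block k+1 ...
            t = threading.Thread(target=lambda k=k: buf.update(nxt=read_block(k + 1)))
            t.start()
        total += process(current)      # ... while computing on block k
    return total

print(out_of_core_sum(4))   # 120, the sum of 0..15
```

With real disk reads and nontrivial compute, the read of block k+1 and the processing of block k proceed concurrently, which is the source of the I/O time reduction the abstract reports.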

  2. Applications of Parallel Process HiMAP for Large Scale Multidisciplinary Problems

    NASA Technical Reports Server (NTRS)

    Guruswamy, Guru P.; Potsdam, Mark; Rodriguez, David; Kwak, Dochay (Technical Monitor)

    2000-01-01

    HiMAP is a three level parallel middleware that can be interfaced to a large scale global design environment for code independent, multidisciplinary analysis using high fidelity equations. Aerospace technology needs are rapidly changing. Computational tools compatible with the requirements of national programs such as space transportation are needed. Conventional computation tools are inadequate for modern aerospace design needs. Advanced, modular computational tools are needed, such as those that incorporate the technology of massively parallel processors (MPP).

  3. Performance evaluation of parallel electric field tunnel field-effect transistor by a distributed-element circuit model

    NASA Astrophysics Data System (ADS)

    Morita, Yukinori; Mori, Takahiro; Migita, Shinji; Mizubayashi, Wataru; Tanabe, Akihito; Fukuda, Koichi; Matsukawa, Takashi; Endo, Kazuhiko; O'uchi, Shin-ichi; Liu, Yongxun; Masahara, Meishoku; Ota, Hiroyuki

    2014-12-01

The performance of parallel-electric-field tunnel field-effect transistors (TFETs), in which band-to-band tunneling (BTBT) is initiated in line with the gate electric field, was evaluated. The TFET was fabricated by inserting an epitaxially grown parallel-plate tunnel capacitor between heavily doped source wells and the gate insulators. Analysis using a distributed-element circuit model indicated that there should be a limit on the drain current caused by the self-voltage-drop effect in the ultrathin channel layer.

  4. Thyroid insufficiency in developing rat brain: A genomic analysis.

    EPA Science Inventory

    Thyroid Insufficiency in the Developing Rat Brain: A Genomic Analysis. JE Royland and ME Gilbert, Neurotox. Div., U.S. EPA, RTP, NC, USA. Endocrine disruption (ED) is an area of major concern in environmental neurotoxicity. Severe deficits in thyroid hormone (TH) levels have bee...

  5. COMPUTER ANALYSIS OF PLANAR GAMMA CAMERA IMAGES

    EPA Science Inventory


    T Martonen1 and J Schroeter2

    1Experimental Toxicology Division, National Health and Environmental Effects Research Laboratory, U.S. EPA, Research Triangle Park, NC 27711 USA and 2Curriculum in Toxicology, Unive...

  6. Spatial analysis of volatile organic compounds in South Philadelphia using passive samplers

    EPA Science Inventory

    Select volatile organic compounds (VOCs) were measured in the vicinity of a petroleum refinery and related operations in South Philadelphia, Pennsylvania, USA, using passive air sampling and laboratory analysis methods. Two-week, time-integrated samplers were deployed at 17 sites...

  7. Dietary intakes among South Asian adults differ by length of residence in the USA

    PubMed Central

    Talegawkar, Sameera A.; Kandula, Namratha R.; Gadgil, Meghana D.; Desai, Dipika; Kanaya, Alka M.

    2015-01-01

    Objective To examine whether nutrient and food intakes among South Asian adult immigrants differ by length of residence in the USA. Design Cross-sectional analysis to examine differences in nutrient and food intakes by length of residence in the USA. Dietary data were collected using an interviewer-administered, culturally appropriate FFQ, while self-reported length of residence was assessed using a questionnaire and modelled as tertiles. Setting The Mediators of Atherosclerosis in South Asians Living in America (MASALA) study. Subjects Eight hundred and seventy-four South Asians (mean age = 55 (SD 9) years; 47 % women; range of length of residence in the USA = 2–58 years), part of the baseline examination of the MASALA study. Results Intakes of fat, including saturated and trans fats, dietary cholesterol and n-6 fatty acids, were directly associated with length of residence, while intakes of energy, carbohydrate, glycaemic index and load, protein, dietary fibre, folate and K were inversely associated with length of residence (P trend <0·05). A longer length of residence in the USA was also associated with higher intakes of alcoholic beverages, mixed dishes including pizza and pasta, fats and oils, and lower intakes of beans and lentils, breads, grains and flour products, milk and dairy products, rice, starchy vegetables and sugar, candy and jam (P for differences across groups < 0·05). Conclusions Length of residence in the USA influences diet and nutrient intakes among South Asian adult immigrants and should be considered when investigating and planning dietary interventions to mitigate chronic disease risk. PMID:25990446

  8. Trends in survival of chronic lymphocytic leukemia patients in Germany and the USA in the first decade of the twenty-first century.

    PubMed

    Pulte, Dianne; Castro, Felipe A; Jansen, Lina; Luttmann, Sabine; Holleczek, Bernd; Nennecke, Alice; Ressing, Meike; Katalinic, Alexander; Brenner, Hermann

    2016-03-22

Recent population-based studies in the United States of America (USA) and other countries have shown improvements in survival for patients with chronic lymphocytic leukemia (CLL) diagnosed in the early twenty-first century. Here, we examine survival for patients diagnosed with CLL in Germany in 1997-2011. Data were extracted from 12 cancer registries in Germany and compared to data from the USA. Period analysis was used to estimate 5- and 10-year relative survival (RS). Five- and 10-year RS estimates in 2009-2011 were 80.2 and 59.5%, respectively, in Germany and 82.4 and 64.7%, respectively, in the USA. Overall, 5-year RS increased significantly in Germany, and the difference compared with survival in the USA slightly decreased between 2003-2005 and 2009-2011. However, age-specific analyses showed persistently higher survival in the USA for all ages except 15-44. In general, survival decreased with age, but the age-related disparity was small for patients younger than 75. In both countries, 5-year RS was >80% for patients less than 75 years of age but <70% for those aged 75+. Overall, 5-year survival for patients with CLL is good, but 10-year survival is significantly lower, and survival was much lower for those aged 75+. Major differences in survival between countries were not observed. Further research into ways to increase survival for older CLL patients is needed to reduce the persistently large age-related survival disparity.

  9. An overview of EU and USA intestinal transplant current activity.

    PubMed

    Lauro, A; Panaro, F; Iyer, K R

    2017-04-01

To report the current activity of intestinal transplantation in Europe (EU) and the United States of America (USA), underlining outcomes in the last 5 years and discussing possible trends. Data review was performed through analysis of the ITR and UNOS registries, Eurotransplant and newsletter transplant reports, congress abstracts, the international published literature, personal communications and hospital web sites. The absence in Europe of a single organization collecting donors, and the presence of many low-volume centers (fewer than 5 cases/year), distinguishes it from the USA: in the last 5 years (2010-2014), 222 intestinal/multivisceral transplants were performed in EU countries (most of them in the UK), while in the USA the number of transplants reached 634 procedures over the same period. Waiting-list mortality remains unacceptable on both continents. Improved short-term results, with over 80% survival at 1 year, have been achieved in the busiest transplant centers, likely due to immune-induction agents and, more recently, to innovative cross-match strategies and optimized organ allocation, but long-term outcomes are still inferior to those of other organ transplants. Most long-term survivors were reintegrated into society with self-sustained socioeconomic status. The economic burden for society is high and the related costs differ between the USA and the EU (and, within Europe, between member states' health-care systems), but the cost-effectiveness of intestinal transplantation still needs to be proved. Overall, intestinal transplantation continues to develop in the EU and USA, together with the surgical and medical rehabilitation of patients affected by short gut syndrome. Copyright © 2017 Elsevier Masson SAS. All rights reserved.

  10. Disparity in dental attendance among older adult populations: a comparative analysis across selected European countries and the USA.

    PubMed

    Manski, Richard; Moeller, John; Chen, Haiyan; Widström, Eeva; Listl, Stefan

    2016-02-01

    The current study addresses the extent to which diversity in dental attendance across population subgroups exists within and between the USA and selected European countries. The analyses relied on 2006/2007 data from the Survey of Health, Ageing and Retirement in Europe (SHARE) and 2004-2006 data from the Health and Retirement Study (HRS) in the USA for respondents≥51 years of age. Logistic regression models were estimated to identify impacts of dental-care coverage, and of oral and general health status, on dental-care use. We were unable to discern significant differences in dental attendance across population subgroups in countries with and without social health insurance, between the USA and European countries, and between European countries classified according to social welfare regime. Patterns of diverse dental use were found, but they did not appear predominately in countries classified according to welfare state regime or according to the presence or absence of social health insurance. The findings of this study suggest that income and education have a stronger, and more persistent, correlation with dental use than the correlation between dental insurance and dental use across European countries. We conclude that: (i) higher overall rates of coverage in most European countries, compared with relatively lower rates in the USA, contribute to this finding; and that (ii) policies targeted to improving the income of older persons and their awareness of the importance of oral health care in both Europe and the USA can contribute to improving the use of dental services. © 2015 FDI World Dental Federation.

  11. Analysis and optimization of gyrokinetic toroidal simulations on homogenous and heterogenous platforms

    DOE PAGES

    Ibrahim, Khaled Z.; Madduri, Kamesh; Williams, Samuel; ...

    2013-07-18

The Gyrokinetic Toroidal Code (GTC) uses the particle-in-cell method to efficiently simulate plasma microturbulence. This paper presents novel analysis and optimization techniques to enhance the performance of GTC on large-scale machines. We introduce cell access analysis to better manage locality vs. synchronization tradeoffs on CPU- and GPU-based architectures. Our optimized hybrid parallel implementation of GTC uses MPI, OpenMP, and NVIDIA CUDA; it achieves up to a 2× speedup over the reference Fortran version on multiple parallel systems and scales efficiently to tens of thousands of cores.

  12. Parallel Curves: Getting There and Getting Back

    ERIC Educational Resources Information Center

    Agnew, A. F.; Mathews, J. H.

    2006-01-01

    This note takes up the issue of parallel curves while illustrating the utility of "Mathematica" in computations. This work complements results presented earlier. The presented treatment, considering the more general case of parametric curves, provides an analysis of the appearance of cusp singularities, and emphasizes the utility of symbolic…

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kargupta, H.; Stafford, B.; Hamzaoglu, I.

This paper describes an experimental parallel/distributed data mining system, PADMA (PArallel Data Mining Agents), that uses software agents for local data access and analysis and a web-based interface for interactive data visualization. It also presents results from applying PADMA to detect patterns in unstructured texts of postmortem reports and laboratory test data for hepatitis C patients.

  14. Deformation microstructures of Barre granite: An optical, Sem and Tem study

    USGS Publications Warehouse

    Schedl, A.; Kronenberg, A.K.; Tullis, J.

    1986-01-01

New scanning electron microscope techniques have been developed for characterizing ductile deformation microstructures in felsic rocks. In addition, the thermomechanical history of the macroscopically undeformed Barre granite (Vermont, U.S.A.) has been reconstructed based on examination of deformation microstructures using optical microscopy, scanning electron microscopy, and transmission electron microscopy. The microstructures reveal three distinct events: (1) a low-stress, high-temperature event that produced subgrains in feldspars, and subgrains and recrystallized grains in quartz; (2) a high-stress, low-temperature event that produced a high dislocation density in quartz and feldspars; and (3) a lowest-temperature event that produced cracks, oriented primarily along cleavage planes in feldspars, and parallel to the macroscopic rift in quartz. The first two events are believed to reflect various stages in the intrusion and cooling history of the pluton, and the last may be related to the final stages of cooling, or to later tectonism. © 1986.

  15. SToRM: A Model for 2D environmental hydraulics

    USGS Publications Warehouse

    Simões, Francisco J. M.

    2017-01-01

A two-dimensional (depth-averaged) finite volume Godunov-type shallow water model developed for flow over complex topography is presented. The model, SToRM, is based on an unstructured cell-centered finite volume formulation and on nonlinear strong stability preserving Runge-Kutta time stepping schemes. The numerical discretization is founded on the classical and well-established shallow water equations in hyperbolic conservative form, but the convective fluxes are calculated using auto-switching Riemann and diffusive numerical fluxes. Computational efficiency is achieved through a parallel implementation based on the OpenMP standard and the Fortran programming language. SToRM’s implementation within a graphical user interface is discussed. Field application of SToRM is illustrated by utilizing it to estimate peak flow discharges in a flooding event of the St. Vrain Creek in Colorado, U.S.A., in 2013, which reached 850 m³/s (~30,000 ft³/s) at the location of this study.

  16. Historical aspects of human presence in Space

    NASA Astrophysics Data System (ADS)

    Harsch, V.

    2007-02-01

Purpose: This paper presents the development of human presence in Space from its beginnings. Study hypotheses were based on historical findings on scientific, medical, cultural, and political aspects of manned Space flight, reflecting the different attitudes of Space-minded nations and organizations. Impacts of aerospace medicine on the advances of biomedical sciences are touched upon, the historical development of aviation and Space medical achievements is described briefly, and visions for future developments are given. Methods: An overview was gained by literature study, archive research and oral-history taking. Results: Aviation Medicine evolved in parallel with Man's ability to fly. War-triggered advancements in aviation brought mankind to the edge of space-equivalent conditions within a few decades of the first motor-flight, which took place in the USA in 1903 [V. Harsch, Aerospace medicine in Germany: from the very beginnings, Aviation and Space Environment Medicine 71 (2000) 447-450] [1]

  17. Mixed methods research in mental health nursing.

    PubMed

    Kettles, A M; Creswell, J W; Zhang, W

    2011-08-01

Mixed methods research is becoming more widely used to answer research questions and to investigate research problems in mental health and psychiatric nursing. However, two separate literature searches, one in Scotland and one in the USA, revealed that few mental health nursing studies identified mixed methods research in their titles. Many studies used the term 'embedded', but few studies identified in the literature were mixed methods embedded studies. The history, philosophical underpinnings, definition and types of mixed methods research, and the associated pragmatism, are discussed, as well as the need for mixed methods research. Examples of mental health nursing mixed methods research are used to illustrate the different types of mixed methods: convergent parallel, embedded, explanatory and exploratory, in their sequential and concurrent combinations. Implementing mixed methods research is also discussed briefly, and the problem of identifying mixed methods research in mental health and psychiatric nursing is discussed, with some possible solutions proposed. © 2011 Blackwell Publishing.

  18. AMS 14C dating of lime mortar

    NASA Astrophysics Data System (ADS)

    Heinemeier, Jan; Jungner, Högne; Lindroos, Alf; Ringbom, Åsa; von Konow, Thorborg; Rud, Niels

    1997-03-01

    A method for refining lime mortar samples for 14C dating has been developed. It includes mechanical and chemical separation of mortar carbonate with optical control of the purity of the samples. The method has been applied to a large series of AMS datings on lime mortar from three medieval churches on the Åland Islands, Finland. The datings show convincing internal consistency and confine the construction time of the churches to AD 1280-1380 with a most probable date just before AD 1300. We have also applied the method to the controversial Newport Tower, Rhode Island, USA. Our mortar datings confine the building to colonial time in the 17th century and thus refute claims of Viking origin of the tower. For the churches, a parallel series of datings of organic (charcoal) inclusions in the mortar show less reliable results than the mortar samples, which is ascribed to poor association with the construction time.
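The AMS measurements behind such datings yield a fraction of modern 14C activity, which converts to a conventional (uncalibrated) radiocarbon age via the standard Libby mean life of 8033 years. A minimal sketch of that conversion (the 0.92 fraction modern is a made-up illustrative value, not data from this study):

```python
import math

LIBBY_MEAN_LIFE = 8033.0  # years; conventional value used for radiocarbon ages

def conventional_14c_age(fraction_modern: float) -> float:
    """Conventional (uncalibrated) radiocarbon age in years BP."""
    if fraction_modern <= 0:
        raise ValueError("fraction modern must be positive")
    return -LIBBY_MEAN_LIFE * math.log(fraction_modern)

# A hypothetical mortar carbonate sample retaining ~92% of modern 14C activity
age = conventional_14c_age(0.92)
print(round(age))  # 670, i.e. roughly 670 14C yr BP
```

Calibration against a curve such as IntCal is then needed to obtain calendar dates like the AD 1280-1380 range reported above; that step is not shown here.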

  19. An aftermath analysis of the 2014 coal mine accident in Soma, Turkey: Use of risk performance indicators based on historical experience.

    PubMed

    Spada, Matteo; Burgherr, Peter

    2016-02-01

On 13 May 2014, a fire-related incident in the Soma coal mine in Turkey caused 301 fatalities and more than 80 injuries. It remains the largest coal mine accident in Turkey, and in the OECD country group, to date. This study investigated whether such a disastrous event should be expected, in a statistical sense, based on historical observations. For this purpose, PSI's ENSAD database was used to extract accident data for the period 1970-2014. Four different cases were analyzed, i.e., OECD, OECD w/o Turkey, Turkey and USA. Analysis of temporal trends for annual numbers of accidents and fatalities indicated a non-significant decreasing tendency for OECD and OECD w/o Turkey and a significant one for USA, whereas for Turkey both measures showed an increase over time. The expectation analysis revealed clearly that an event with the consequences of the Soma accident is rather unlikely for OECD, OECD w/o Turkey and USA. In contrast, such a severe accident has a substantially higher expectation for Turkey, i.e. it cannot be considered an extremely rare event, based on historical experience. This indicates a need for improved safety measures and stricter regulations in the Turkish coal mining sector in order to get closer to the rest of the OECD. Copyright © 2015 Elsevier Ltd. All rights reserved.
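The core of such an expectation analysis is an empirical exceedance probability: the fraction of recorded accidents whose death toll meets or exceeds a threshold. A minimal sketch with made-up fatality counts (NOT ENSAD data, which is proprietary to PSI):

```python
def exceedance_probability(fatalities, threshold):
    """Empirical probability that an accident's death toll meets or exceeds a threshold."""
    return sum(1 for f in fatalities if f >= threshold) / len(fatalities)

# Hypothetical severe-accident death tolls over an observation period
turkey_like = [5, 8, 12, 30, 45, 68, 263, 301]
oecd_like = [5, 5, 6, 7, 9, 11, 13, 18]

print(exceedance_probability(turkey_like, 300))  # 0.125
print(exceedance_probability(oecd_like, 300))    # 0.0
```

Real analyses of this kind typically fit tail models rather than relying on raw empirical frequencies, but the contrast illustrated here (a Soma-scale event plausible in one record, absent from the other) is the quantity being compared across the four cases.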

  20. Performance of GeantV EM Physics Models

    NASA Astrophysics Data System (ADS)

    Amadio, G.; Ananya, A.; Apostolakis, J.; Aurora, A.; Bandieramonte, M.; Bhattacharyya, A.; Bianchini, C.; Brun, R.; Canal, P.; Carminati, F.; Cosmo, G.; Duhem, L.; Elvira, D.; Folger, G.; Gheata, A.; Gheata, M.; Goulas, I.; Iope, R.; Jun, S. Y.; Lima, G.; Mohanty, A.; Nikitina, T.; Novak, M.; Pokorski, W.; Ribon, A.; Seghal, R.; Shadura, O.; Vallecorsa, S.; Wenzel, S.; Zhang, Y.

    2017-10-01

The recent progress in parallel hardware architectures with deeper vector pipelines or many-core technologies brings opportunities for HEP experiments to take advantage of SIMD and SIMT computing models. Launched in 2013, the GeantV project studies performance gains from propagating multiple particles in parallel, improving instruction throughput and data locality in HEP event simulation on modern parallel hardware architectures. Due to the complexity of the geometry description and physics algorithms of a typical HEP application, performance analysis is indispensable for identifying factors that limit parallel execution. In this report, we present design considerations and preliminary computing performance of GeantV physics models on coprocessors (Intel Xeon Phi and NVidia GPUs) as well as on mainstream CPUs.

  1. The engine design engine. A clustered computer platform for the aerodynamic inverse design and analysis of a full engine

    NASA Technical Reports Server (NTRS)

    Sanz, J.; Pischel, K.; Hubler, D.

    1992-01-01

An application for parallel computation on a combined cluster of powerful workstations and supercomputers was developed. Parallel Virtual Machine (PVM) is used as the message-passing system in a macro-tasking parallelization of the Aerodynamic Inverse Design and Analysis for a Full Engine computer code. The heterogeneous nature of the cluster is handled entirely by the controlling host machine. Communication is established via Ethernet with the TCP/IP protocol over an open network. Internode communication imposes only a modest overhead, yielding efficient utilization of the engaged processors. Perhaps the most interesting feature of the system is its versatility, which permits use of whichever computational resources are least heavily loaded at a given time.
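The host/worker decomposition described above can be sketched in miniature. The original code used PVM calls over the network; as a hedged stand-in (not the paper's code, and substituting Python multiprocessing for PVM), a master scatters per-section tasks to workers and gathers results:

```python
from multiprocessing import Process, Queue

def worker(task_q, result_q):
    # Each worker repeatedly receives a (section_id, data) message, runs a
    # stand-in "analysis" (here just a mean), and sends the result back.
    while True:
        msg = task_q.get()
        if msg is None:  # stop message from the host
            break
        section_id, data = msg
        result_q.put((section_id, sum(data) / len(data)))

def run_master(sections, n_workers=3):
    # The host scatters tasks and gathers results, in the spirit of a PVM
    # master/slave decomposition (PVM itself passes messages over the network).
    task_q, result_q = Queue(), Queue()
    procs = [Process(target=worker, args=(task_q, result_q)) for _ in range(n_workers)]
    for p in procs:
        p.start()
    for item in sections.items():
        task_q.put(item)
    for _ in procs:
        task_q.put(None)
    results = dict(result_q.get() for _ in sections)
    for p in procs:
        p.join()
    return results

if __name__ == "__main__":
    print(run_master({i: [float(i), float(i + 2)] for i in range(8)}))
```

Because workers pull tasks from a shared queue, a faster (less loaded) node simply processes more sections, mirroring the load-adaptive behavior the abstract highlights.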

  2. A high-performance spatial database based approach for pathology imaging algorithm evaluation

    PubMed Central

    Wang, Fusheng; Kong, Jun; Gao, Jingjing; Cooper, Lee A.D.; Kurc, Tahsin; Zhou, Zhengwen; Adler, David; Vergara-Niedermayr, Cristobal; Katigbak, Bryan; Brat, Daniel J.; Saltz, Joel H.

    2013-01-01

    Background: Algorithm evaluation provides a means to characterize variability across image analysis algorithms, validate algorithms by comparison with human annotations, combine results from multiple algorithms for performance improvement, and facilitate algorithm sensitivity studies. The sizes of images and image analysis results in pathology image analysis pose significant challenges in algorithm evaluation. We present an efficient parallel spatial database approach to model, normalize, manage, and query large volumes of analytical image result data. This provides an efficient platform for algorithm evaluation. Our experiments with a set of brain tumor images demonstrate the application, scalability, and effectiveness of the platform. Context: The paper describes an approach and platform for evaluation of pathology image analysis algorithms. The platform facilitates algorithm evaluation through a high-performance database built on the Pathology Analytic Imaging Standards (PAIS) data model. Aims: (1) Develop a framework to support algorithm evaluation by modeling and managing analytical results and human annotations from pathology images; (2) Create a robust data normalization tool for converting, validating, and fixing spatial data from algorithm or human annotations; (3) Develop a set of queries to support data sampling and result comparisons; (4) Achieve high performance computation capacity via a parallel data management infrastructure, parallel data loading and spatial indexing optimizations in this infrastructure. Materials and Methods: We have considered two scenarios for algorithm evaluation: (1) algorithm comparison where multiple result sets from different methods are compared and consolidated; and (2) algorithm validation where algorithm results are compared with human annotations. We have developed a spatial normalization toolkit to validate and normalize spatial boundaries produced by image analysis algorithms or human annotations. 
The validated data were formatted based on the PAIS data model and loaded into a spatial database. To support efficient data loading, we have implemented a parallel data loading tool that takes advantage of multi-core CPUs to accelerate data injection. The spatial database manages both geometric shapes and image features or classifications, and enables spatial sampling, result comparison, and result aggregation through expressive structured query language (SQL) queries with spatial extensions. To provide scalable and efficient query support, we have employed a shared-nothing parallel database architecture, which distributes data homogeneously across multiple database partitions to take advantage of parallel computation power and implements spatial indexing to achieve high I/O throughput. Results: Our work proposes a high-performance, parallel spatial database platform for algorithm validation and comparison. This platform was evaluated by storing, managing, and comparing analysis results from a set of brain tumor whole slide images. The tools we develop are open source and available to download. Conclusions: Pathology image algorithm validation and comparison are essential to iterative algorithm development and refinement. One critical component is the support for queries involving spatial predicates and comparisons. In our work, we develop an efficient data model and parallel database approach to model, normalize, manage and query large volumes of analytical image result data. Our experiments demonstrate that the data partitioning strategy and the grid-based indexing result in good data distribution across database nodes and reduce I/O overhead in spatial join queries through parallel retrieval of relevant data and quick subsetting of datasets. The set of tools in the framework provides a full pipeline to normalize, load, manage and query analytical results for algorithm evaluation. PMID:23599905
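The spatial-join queries at the heart of this comparison can be illustrated on a single node. A toy analogue (not the PAIS platform, which uses a parallel DBMS with true spatial types; table and column names here are hypothetical) using stdlib sqlite3, with nucleus boundaries reduced to bounding boxes and overlap as the join predicate:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE algo_markup  (id INTEGER, minx REAL, miny REAL, maxx REAL, maxy REAL);
CREATE TABLE human_markup (id INTEGER, minx REAL, miny REAL, maxx REAL, maxy REAL);
""")
# Toy nucleus boundaries summarized as bounding boxes
conn.executemany("INSERT INTO algo_markup VALUES (?,?,?,?,?)",
                 [(1, 0, 0, 10, 10), (2, 50, 50, 60, 60)])
conn.executemany("INSERT INTO human_markup VALUES (?,?,?,?,?)",
                 [(1, 2, 2, 12, 12), (2, 100, 100, 110, 110)])

# Spatial join on bounding-box overlap: candidate pairs a validation query
# would then refine with exact polygon intersection and area-overlap metrics.
pairs = conn.execute("""
SELECT a.id, h.id
FROM algo_markup a JOIN human_markup h
  ON a.minx <= h.maxx AND h.minx <= a.maxx
 AND a.miny <= h.maxy AND h.miny <= a.maxy
""").fetchall()
print(pairs)  # [(1, 1)]
```

In the paper's shared-nothing setting the same join runs independently on each partition, which is why grid-based partitioning of the slide keeps spatially nearby objects on the same node.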

  3. The dynamic response of Kennicott Glacier, Alaska, USA, to the Hidden Creek Lake outburst flood

    USGS Publications Warehouse

    Anderson, R. Scott; Walder, J.S.; Anderson, S.P.; Trabant, D.C.; Fountain, A.G.

    2005-01-01

Glacier sliding is commonly linked with elevated water pressure at the glacier bed. Ice surface motion during a 3 week period encompassing an outburst of ice-dammed Hidden Creek Lake (HCL) at Kennicott Glacier, Alaska, USA, showed enhanced sliding during the flood. Two stakes, 1.2 km from HCL, revealed increased speed in two episodes, both associated with uplift of the ice surface relative to the trajectory of bed-parallel motion. Uplift of the surface began 12 days before the flood, initially stabilizing at a value of 0.25 m. Two days after lake drainage began, further uplift (reaching 0.4 m) occurred while surface speed peaked at 1.2 m d-1. Maximum surface uplift coincided with peak discharge from HCL, high water level in a down-glacier ice-marginal basin, and low solute concentrations in the Kennicott River. Each of these records is consistent with high subglacial water pressure. We interpret the ice surface motion as arising from sliding up the backs of bumps on the bed, which enlarges cavities and produces bed separation. The outburst increased water pressure over a broad region, promoting sliding, inhibiting cavity closure, and blocking drainage of solute-rich water from the distributed system. The pressure drop upon termination of the outburst drained water from, and depressurized, the distributed system, reducing sliding speeds. Expanded cavities then collapsed on a 1 day timescale set by the local ice thickness.

  4. Lower Cretaceous paleo-Vertisols and sedimentary interrelationships in stacked alluvial sequences, Utah, USA

    NASA Astrophysics Data System (ADS)

    Joeckel, R. M.; Ludvigson, G. A.; Kirkland, J. I.

    2017-11-01

The Yellow Cat Member of the Cedar Mountain Formation in Poison Strip, Utah, USA, consists of stacked, erosionally bounded alluvial sequences dominated by massive mudstones (lithofacies Fm) with paleo-Vertisols. Sediment bodies within these sequences grade vertically and laterally into each other at pedogenic boundaries, across which color, texture, and structures (sedimentary vs. pedogenic) change. Slickensides, unfilled (sealed) cracks, carbonate-filled cracks, and deeper sandstone-filled cracks are present; the latter suggest thorough desiccation during aridification. Thin sandstones (Sms) in some sequences, as well as the laminated to massive mudstones (Flm) with which they are interbedded in some cases, are interpreted as avulsion deposits. The termini of many beds of these lithofacies curve upward, parallel to nearby pedogenic slickensides, forming the features we call "turnups." Turnups are overlain or surrounded by paleosols, but strata sheltered underneath beds with turnups retain primary sedimentary fabrics. Turnups were produced by movement along slickensides during pedogenesis, by differential compaction alongside pre-existing gilgai microhighs, or by a combination of both. Palustrine carbonates (lithofacies C) appear only in the highest or next-highest alluvial sequences, along with a deep paleo-Vertisol that exhibits partially preserved microrelief at the base of the overlying Poison Strip Member. The attributes of the Yellow Cat Member suggest comparatively low accommodation, slow accumulation, long hiatuses in clastic sedimentation, and substantial intervals of subaerial exposure and pedogenesis; it appears to be distinct among the members of the Cedar Mountain Formation in these respects.

  5. Random forests for classification in ecology

    USGS Publications Warehouse

    Cutler, D.R.; Edwards, T.C.; Beard, K.H.; Cutler, A.; Hess, K.T.; Gibson, J.; Lawler, J.J.

    2007-01-01

Classification procedures are some of the most widely used statistical methods in ecology. Random forests (RF) is a new and powerful statistical classifier that is well established in other disciplines but is relatively unknown in ecology. Advantages of RF compared to other statistical classifiers include (1) very high classification accuracy; (2) a novel method of determining variable importance; (3) ability to model complex interactions among predictor variables; (4) flexibility to perform several types of statistical data analysis, including regression, classification, survival analysis, and unsupervised learning; and (5) an algorithm for imputing missing values. We compared the accuracies of RF and four other commonly used statistical classifiers using data on invasive plant species presence in Lava Beds National Monument, California, USA, rare lichen species presence in the Pacific Northwest, USA, and nest sites for cavity nesting birds in the Uinta Mountains, Utah, USA. We observed high classification accuracy in all applications as measured by cross-validation and, in the case of the lichen data, by independent test data, when comparing RF to other common classification methods. We also observed that the variables that RF identified as most important for classifying invasive plant species coincided with expectations based on the literature. © 2007 by the Ecological Society of America.
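The bootstrap-aggregation idea underlying random forests can be sketched in a few lines. This is a hedged toy (decision stumps instead of full trees, no random feature subsetting, and made-up presence/absence data, nothing from the study above): each "tree" is fit on a bootstrap resample and the ensemble predicts by majority vote.

```python
import random

def fit_stump(X, y):
    """Pick the (feature, threshold, sign) split minimizing misclassifications."""
    best = None
    for f in range(len(X[0])):
        for t in sorted({row[f] for row in X}):
            for sign in (1, -1):
                pred = [1 if sign * (row[f] - t) > 0 else 0 for row in X]
                err = sum(p != yi for p, yi in zip(pred, y))
                if best is None or err < best[0]:
                    best = (err, f, t, sign)
    _, f, t, sign = best
    return lambda row: 1 if sign * (row[f] - t) > 0 else 0

def fit_forest(X, y, n_trees=25, seed=0):
    rng = random.Random(seed)
    stumps = []
    for _ in range(n_trees):
        idx = [rng.randrange(len(X)) for _ in X]  # bootstrap resample
        stumps.append(fit_stump([X[i] for i in idx], [y[i] for i in idx]))
    return lambda row: int(sum(s(row) for s in stumps) > n_trees / 2)  # majority vote

# Toy "presence/absence" data: class 1 iff the first feature is large
X = [[0.1, 2.0], [0.1, 1.0], [0.1, 3.0], [0.9, 2.5], [0.9, 0.5], [0.9, 1.5]]
y = [0, 0, 0, 1, 1, 1]
forest = fit_forest(X, y)
print([forest(row) for row in X])  # recovers [0, 0, 0, 1, 1, 1]
```

Real RF additionally randomizes the features considered at each split and derives variable importance from out-of-bag error, which is what advantages (1) and (2) above refer to.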

  6. A parallel algorithm for nonlinear convection-diffusion equations

    NASA Technical Reports Server (NTRS)

    Scroggs, Jeffrey S.

    1990-01-01

A parallel algorithm for the efficient solution of nonlinear time-dependent convection-diffusion equations with a small parameter on the diffusion term is presented. The method is based on a physically motivated domain decomposition that is dictated by singular perturbation analysis. The analysis is used to determine regions where certain reduced equations may be solved in place of the full equation. The method is suitable for the solution of problems arising in the simulation of fluid dynamics. Experimental results for a nonlinear equation in two dimensions are presented.
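The idea that reduced equations suffice away from thin layers can be seen on a model problem. A hedged sketch (a steady linear 1D problem, not the paper's nonlinear time-dependent equations): for u' = EPS*u'' on [0,1] with u(0)=0, u(1)=1, the exact solution equals the reduced (EPS=0) solution u=0 everywhere except an O(EPS) outflow boundary layer at x=1, so a decomposition can solve the cheap reduced equation on most of the domain.

```python
import math

EPS = 0.02  # small diffusion parameter

def full_solution(x):
    """Exact steady solution of u' = EPS*u'' with u(0)=0, u(1)=1 (numerically stable form)."""
    return (math.exp((x - 1) / EPS) - math.exp(-1 / EPS)) / (1 - math.exp(-1 / EPS))

def reduced_solution(x):
    """EPS -> 0 limit: u' = 0 with the inflow condition u(0) = 0."""
    return 0.0

# The reduced equation is accurate away from the layer; only near x = 1 do they differ.
for x in (0.25, 0.5, 0.75, 0.99):
    print(x, full_solution(x), abs(full_solution(x) - reduced_solution(x)))
```

Singular perturbation analysis locates the layer in advance, which is what lets the decomposition assign the full (expensive) equation only to the subdomain near x=1 and run the pieces in parallel.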

  7. A Multibody Formulation for Three Dimensional Brick Finite Element Based Parallel and Scalable Rotor Dynamic Analysis

    DTIC Science & Technology

    2010-05-01

connections near the hub end, and containing up to 0.48 million degrees of freedom. The models are analyzed for scalability and timing for hover and... will enable the modeling of critical couplings that occur in hingeless and bearingless hubs with advanced flex structures. Second, it will enable the...

  8. Phloretin derived from apple can reduce alpha-hemolysin expression in methicillin-resistant Staphylococcus aureus USA300.

    PubMed

    Zhou, Xuan; Liu, Shui; Li, Wenhua; Zhang, Bing; Liu, Bowen; Liu, Yan; Deng, Xuming; Peng, Liping

    2015-08-01

Methicillin-resistant Staphylococcus aureus (MRSA) has become increasingly important because it is the most common cause of hospital-acquired infections, which have become globally epidemic. Our study focused on the MRSA strain USA300, which as of 2014 was responsible for the most recent epidemic of highly virulent MRSA in the United States. We aimed to evaluate the in vitro effect of phloretin on USA300. Susceptibility testing, western blotting, hemolysis assays and real-time RT-PCR were employed to examine the in vitro effects of phloretin on alpha-hemolysin (Hla) production when the bacterium was co-cultured with phloretin. The protective effect of phloretin against USA300-mediated injury of human alveolar epithelial cells (A549) was tested using live/dead and cytotoxicity assays. We showed that sub-inhibitory concentrations of phloretin had no effect on bacterial viability; however, they markedly inhibited the production of Hla in culture supernatants and the transcription of hla (the gene encoding Hla) and agrA (the accessory gene regulator). Phloretin, at a final concentration of 16 µg/ml, protected A549 cells from injury caused by USA300 in the co-culture system. Our study suggests that phloretin may have potential in the development of treatments for MRSA infections.

  9. Structural barriers in access to medical marijuana in the USA-a systematic review protocol.

    PubMed

    Valencia, Celina I; Asaolu, Ibitola O; Ehiri, John E; Rosales, Cecilia

    2017-08-07

    There are 43 state medical marijuana programs in the USA, yet limited evidence is available on the demographic characteristics of the patient population accessing these programs. Moreover, insights into the social and structural barriers that inform patients' success in accessing medical marijuana are limited. A current gap in the scientific literature exists regarding generalizable data on the social, cultural, and structural mechanisms that hinder access to medical marijuana among qualifying patients. The goal of this systematic review, therefore, is to identify the aforementioned mechanisms that inform disparities in access to medical marijuana in the USA. This scoping review protocol outlines the proposed study design for the systematic review and evaluation of peer-reviewed scientific literature on structural barriers to medical marijuana access. The protocol follows the guidelines set forth by the Preferred Reporting Items for Systematic review and Meta-Analysis Protocols (PRISMA-P) checklist. The overarching goal of this study is to rigorously evaluate the existing peer-reviewed data on access to medical marijuana in the USA. Income, ethnic background, stigma, and physician preferences have been posited as the primary structural barriers influencing medical marijuana patient population demographics in the USA. Identification of structural barriers to accessing medical marijuana provides a framework for future policies and programs. Evidence-based policies and programs for increasing medical marijuana access help minimize the disparity of access among qualifying patients.

  10. Massively Parallel, Molecular Analysis Platform Developed Using a CMOS Integrated Circuit With Biological Nanopores

    PubMed Central

    Roever, Stefan

    2012-01-01

A massively parallel, low cost molecular analysis platform will dramatically change the nature of protein, molecular and genomics research, DNA sequencing, and ultimately, molecular diagnostics. An integrated circuit (IC) with 264 sensors was fabricated using standard CMOS semiconductor processing technology. Each of these sensors is individually controlled with precision analog circuitry and is capable of single molecule measurements. Under electronic and software control, the IC was used to demonstrate the feasibility of creating and detecting lipid bilayers and biological nanopores using wild-type α-hemolysin. The ability to dynamically create bilayers over each of the sensors will greatly accelerate pore development and pore mutation analysis. In addition, the noise performance of the IC was measured to be 30 fA (RMS). With this noise performance, single-base detection of DNA was demonstrated using α-hemolysin. The data show that a single molecule, electrical detection platform using biological nanopores can be operationalized and can ultimately scale to millions of sensors. Such a massively parallel platform will revolutionize molecular analysis and will completely change the field of molecular diagnostics in the future.

  11. Parallel group independent component analysis for massive fMRI data sets.

    PubMed

    Chen, Shaojie; Huang, Lei; Qiu, Huitong; Nebel, Mary Beth; Mostofsky, Stewart H; Pekar, James J; Lindquist, Martin A; Eloyan, Ani; Caffo, Brian S

    2017-01-01

    Independent component analysis (ICA) is widely used in the field of functional neuroimaging to decompose data into spatio-temporal patterns of co-activation. In particular, ICA has found wide usage in the analysis of resting state fMRI (rs-fMRI) data. Recently, a number of large-scale data sets have become publicly available that consist of rs-fMRI scans from thousands of subjects. As a result, efficient ICA algorithms that scale well to the increased number of subjects are required. To address this problem, we propose a two-stage likelihood-based algorithm for performing group ICA, which we denote Parallel Group Independent Component Analysis (PGICA). By utilizing the sequential nature of the algorithm and parallel computing techniques, we are able to efficiently analyze data sets from large numbers of subjects. We illustrate the efficacy of PGICA, which has been implemented in R and is freely available through the Comprehensive R Archive Network, through simulation studies and application to rs-fMRI data from two large multi-subject data sets, consisting of 301 and 779 subjects respectively.
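The two-stage structure described above can be sketched in miniature. This is a hedged structural sketch only: the per-subject reduction here is a top principal direction via power iteration, standing in for PGICA's actual reduction and ICA steps, and the "subjects" are synthetic.

```python
import random

def top_direction(rows, iters=200):
    """Leading principal direction of mean-centered data via power iteration on X^T X."""
    d = len(rows[0])
    means = [sum(r[j] for r in rows) / len(rows) for j in range(d)]
    X = [[r[j] - means[j] for j in range(d)] for r in rows]
    v = [1.0] * d
    for _ in range(iters):
        scores = [sum(x[j] * v[j] for j in range(d)) for x in X]   # X v
        w = [sum(s * x[j] for s, x in zip(scores, X)) for j in range(d)]  # X^T (X v)
        norm = sum(c * c for c in w) ** 0.5
        v = [c / norm for c in w]
    return v

def reduce_subject(rows):
    # Stage 1: per-subject reduction. Subjects are independent, so this map can
    # be swapped for multiprocessing.Pool.map to process subjects in parallel.
    return top_direction(rows)

rng = random.Random(1)
# Toy "subjects": variance concentrated along the first coordinate
subjects = [[[rng.gauss(0, 5), rng.gauss(0, 0.1), rng.gauss(0, 0.1)]
             for _ in range(200)] for _ in range(4)]
stage1 = list(map(reduce_subject, subjects))

# Stage 2: pool the per-subject summaries for the group-level decomposition
print([round(abs(v[0]), 2) for v in stage1])  # each near 1.0
```

The point mirrored from the paper is architectural: because stage 1 touches each subject's data exactly once and independently, memory and compute scale with subjects rather than with the pooled data set.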

  12. Analyzing Current Serials in Virginia: An Application of the Ulrich's Serials Analysis System

    ERIC Educational Resources Information Center

    Metz, Paul; Gasser, Sharon

    2006-01-01

    VIVA (the Virtual Library of Virginia) was one of the first subscribers to R. R. Bowker's Ulrich's Serials Analysis System (USAS). Creating a database that combined a union report of current serial subscriptions within most academic libraries in the state with the data elements present in Ulrich's made possible a comprehensive analysis designed…

  13. Local circulating clones of Staphylococcus aureus in Ecuador.

    PubMed

    Zurita, Jeannete; Barba, Pedro; Ortega-Paredes, David; Mora, Marcelo; Rivadeneira, Sebastián

The spread of pandemic Staphylococcus aureus clones, mainly methicillin-resistant S. aureus (MRSA), must be kept under surveillance to assemble an accurate, local epidemiological analysis. In Ecuador, the prevalence of the USA300 Latin American variant clone (USA300-LV) is well known; however, there is little information about other circulating clones. The aim of this work was to identify the sequence types (ST) using a 14-locus Multiple-Locus Variable-number Tandem Repeat Analysis (MLVA) genotyping approach. We analyzed 132 S. aureus strains that were recovered from 2005 to 2013 and isolated in several clinical settings in Quito, Ecuador. MRSA isolates comprised 46.97% (62/132) of the study population. Within MRSA, 37 isolates were related to the USA300-LV clone (ST8-MRSA-IV, Panton-Valentine Leukocidin [PVL] +) and 10 were related to the Brazilian clone (ST239-MRSA-III, PVL-). Additionally, two isolates (ST5-MRSA-II, PVL-) were related to the New York/Japan clone. One isolate was related to the Pediatric clone (ST5-MRSA-IV, PVL-), one isolate (ST45-MRSA-II, PVL-) was related to the USA600 clone, and one (ST22-MRSA-IV, PVL-) was related to the epidemic UK-EMRSA-15 clone. Moreover, the most prevalent MSSA sequence types were ST8 (11 isolates), ST45 (8 isolates), ST30 (8 isolates), ST5 (7 isolates) and ST22 (6 isolates). Additionally, we found one isolate that was related to the livestock-associated S. aureus clone ST398. We conclude that in addition to the high prevalence of clone LV-ST8-MRSA-IV, other epidemic clones are circulating in Quito, such as the Brazilian, Pediatric and New York/Japan clones. The USA600 and UK-EMRSA-15 clones, which were not previously described in Ecuador, were also found. Moreover, we found evidence of the presence of the livestock-associated clone ST398 in a hospital environment. Copyright © 2016 Sociedade Brasileira de Infectologia. Published by Elsevier Editora Ltda. All rights reserved.

  14. Clinical utility of the HEART score in patients admitted with chest pain to an inner-city hospital in the USA.

    PubMed

    Patnaik, Soumya; Shah, Mahek; Alhamshari, Yaser; Ram, Pradhum; Puri, Ritika; Lu, Marvin; Balderia, Percy; Imms, John B; Maludum, Obiora; Figueredo, Vincent M

    2017-06-01

    Chest pain is one of the most common presentations to a hospital, and appropriate triaging of these patients can be challenging. The HEART score has been used for such purposes in some countries and only a few validation studies from the USA are available. We aim to determine the utility of the HEART score in patients presenting with chest pain to an inner-city hospital in the USA. We retrospectively screened 417 consecutive patients admitted with chest pain to the observation/telemetry units at Einstein Medical Center Philadelphia. After applying inclusion and exclusion criteria, 299 patients were included in the analysis. Patients were divided into low-risk (0-3) and intermediate-high (≥4)-risk HEART score groups. Baseline characteristics, thrombolysis in myocardial infarction score, need for revascularization during index hospitalization, and major adverse cardiovascular events (MACE) at 6 weeks and 12 months were recorded. There were 98 and 201 patients in the low-score group and intermediate-high-score group, respectively. Compared with the low-score group, patients in the intermediate-high-risk group had a higher incidence of revascularization during the index hospital stay (16.4 vs. 0%; P=0.001), longer hospital stay, higher MACE at 6 weeks (9.5 vs. 0%) and 12 months (20.4 vs. 3.1%), and higher cardiac readmissions. HEART score of at least 4 independently predicted MACE at 12 months (odds ratio 7.456, 95% confidence interval: 2.175-25.56; P=0.001) after adjusting for other risk factors in regression analysis. HEART score of at least 4 was predictive of worse outcomes in patients with chest pain in an inner-city USA hospital. If validated in multicenter prospective studies, the HEART score could potentially be useful in risk-stratifying patients presenting with chest pain in the USA and could impact clinical decision-making.
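The dichotomization used in the study is simple to state in code. A minimal sketch (component scoring follows the published HEART convention of 0-2 points each for History, ECG, Age, Risk factors and Troponin; the example inputs are hypothetical patients, not study data):

```python
def heart_score(history, ecg, age, risk_factors, troponin):
    """Sum of the five HEART components, each scored 0-2."""
    components = (history, ecg, age, risk_factors, troponin)
    if any(c not in (0, 1, 2) for c in components):
        raise ValueError("each HEART component is scored 0, 1 or 2")
    return sum(components)

def risk_group(score):
    """Dichotomization used in the study: 0-3 low risk, >=4 intermediate-high risk."""
    return "low" if score <= 3 else "intermediate-high"

print(risk_group(heart_score(1, 0, 1, 1, 0)))  # low
print(risk_group(heart_score(2, 1, 2, 1, 1)))  # intermediate-high
```

The study's outcome comparisons (revascularization, MACE, readmissions) are then made between these two groups.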

  15. A Hybrid Parallel Strategy Based on String Graph Theory to Improve De Novo DNA Assembly on the TianHe-2 Supercomputer.

    PubMed

    Zhang, Feng; Liao, Xiangke; Peng, Shaoliang; Cui, Yingbo; Wang, Bingqiang; Zhu, Xiaoqian; Liu, Jie

    2016-06-01

The de novo assembly of DNA sequences is increasingly important for biological research in the genomic era. More than a decade after the Human Genome Project, challenges remain, and new solutions are being explored to improve the de novo assembly of genomes. The string graph assembler (SGA), based on string graph theory, is a method/tool developed to address these challenges. In this paper, based on an in-depth analysis of SGA, we prove that SGA-based sequence de novo assembly is an NP-complete problem. According to our analysis, SGA outperforms other similar methods/tools in memory consumption but costs much more time, of which 60-70% is spent on index construction. Building on this analysis, we introduce a hybrid parallel optimization algorithm and implement it in the TianHe-2 parallel framework. Simulations are performed with different datasets. For small data sets the optimized solution is 3.06 times faster than before, and for mid-sized data sets it is 1.60 times faster. The results demonstrate an evident performance improvement, with linear scalability for parallel FM-index construction. These results thus contribute significantly to improving the efficiency of de novo assembly of DNA sequences.
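The FM-index whose construction dominates SGA's runtime supports counting pattern occurrences via backward search over the Burrows-Wheeler transform. A hedged toy version (not SGA's code: it builds the BWT by sorting rotations and scans for ranks, fine only for tiny inputs, whereas real assemblers use linear-time construction and precomputed rank structures):

```python
def bwt(s):
    """Burrows-Wheeler transform via sorted rotations (fine for small inputs)."""
    s += "$"
    rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
    return "".join(rot[-1] for rot in rotations)

def fm_count(text, pattern):
    """Count occurrences of pattern in text with FM-index backward search."""
    b = bwt(text)
    # C[c]: number of characters in the text (incl. '$') lexicographically < c
    C = {c: sum(1 for x in b if x < c) for c in set(b)}
    occ = lambda c, i: b[:i].count(c)  # rank query; real indexes precompute this
    lo, hi = 0, len(b)
    for c in reversed(pattern):
        if c not in C:
            return 0
        lo = C[c] + occ(c, lo)
        hi = C[c] + occ(c, hi)
        if lo >= hi:
            return 0
    return hi - lo

print(fm_count("banana", "ana"))  # 2
print(fm_count("banana", "nab"))  # 0
```

In SGA, exactly these backward-search primitives drive overlap detection between reads, which is why parallelizing FM-index construction yields the end-to-end speedups reported above.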

  16. Development of parallel algorithms for electrical power management in space applications

    NASA Technical Reports Server (NTRS)

    Berry, Frederick C.

    1989-01-01

The application of parallel techniques for electrical power system analysis is discussed. The Newton-Raphson method of load flow analysis was used along with the decomposition-coordination technique to perform load flow analysis. The decomposition-coordination technique enables tasks to be performed in parallel by partitioning the electrical power system into independent local problems. Each independent local problem represents a portion of the total electrical power system on which a load flow analysis can be performed. The load flow analysis is performed on these partitioned elements using the Newton-Raphson load flow method. These independent local problems produce results for voltage and power, which are then passed to the coordinator portion of the solution procedure. The coordinator problem uses the results of the local problems to determine whether any correction is needed on the local problems. The coordinator problem is also solved by an iterative method much like the local problems; here, too, the Newton-Raphson method is used. Therefore, each iteration at the coordination level results in new values for the local problems, which must be solved again along with the coordinator problem until convergence conditions are met.
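The Newton-Raphson step at the heart of each local load flow solve can be shown on the smallest possible case. A hedged sketch (a hypothetical lossless two-bus system with made-up per-unit values, not the paper's partitioned network): solve the mismatch equation (V1*V2/X)*sin(d) = P for the voltage angle d.

```python
import math

V1, V2, X = 1.0, 1.0, 0.5   # per-unit voltages and line reactance (hypothetical)
P_target = 1.0              # scheduled active power transfer, per unit

def solve_angle(p, tol=1e-10, max_iter=50):
    """Newton-Raphson on the power-flow mismatch f(d) = (V1*V2/X)*sin(d) - p."""
    delta = 0.0
    for _ in range(max_iter):
        mismatch = (V1 * V2 / X) * math.sin(delta) - p
        if abs(mismatch) < tol:
            break
        jacobian = (V1 * V2 / X) * math.cos(delta)  # df/dd
        delta -= mismatch / jacobian
    return delta

delta = solve_angle(P_target)
print(math.degrees(delta))  # 30 degrees, since sin(d) = P*X/(V1*V2) = 0.5
```

In the decomposition-coordination scheme, each partition runs such a Newton solve on its own buses in parallel, and the coordinator's own Newton iteration adjusts the boundary quantities exchanged between partitions.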

  17. Pteros 2.0: Evolution of the fast parallel molecular analysis library for C++ and python.

    PubMed

    Yesylevskyy, Semen O

    2015-07-15

Pteros is a high-performance open-source library for molecular modeling and analysis of molecular dynamics trajectories. Starting from version 2.0, Pteros is available for the C++ and Python programming languages with very similar interfaces. This makes it suitable for writing complex reusable programs in C++ and simple interactive scripts in Python alike. The new version improves the facilities for asynchronous trajectory reading and parallel execution of analysis tasks by introducing analysis plugins, which can be written in either C++ or Python in a completely uniform way. The high level of abstraction provided by analysis plugins greatly simplifies prototyping and implementation of complex analysis algorithms. Pteros is available for free under the Artistic License from http://sourceforge.net/projects/pteros/. © 2015 Wiley Periodicals, Inc.
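The analysis-plugin pattern described above can be sketched generically. This is not the Pteros API (its class and method names differ); it is a hypothetical minimal version of the idea: plugins expose uniform per-frame hooks, and a single driver pass over the trajectory feeds all of them, so several analyses share one reading pass.

```python
class AnalysisPlugin:
    """Uniform interface: the driver calls process_frame once per trajectory frame."""
    def pre_process(self): pass
    def process_frame(self, frame): raise NotImplementedError
    def post_process(self): pass

class CenterOfGeometry(AnalysisPlugin):
    def __init__(self): self.results = []
    def process_frame(self, frame):
        n = len(frame)
        self.results.append(tuple(sum(p[i] for p in frame) / n for i in range(3)))

class MaxExtentX(AnalysisPlugin):
    def __init__(self): self.results = []
    def process_frame(self, frame):
        self.results.append(max(p[0] for p in frame) - min(p[0] for p in frame))

def run(trajectory, plugins):
    # One pass over the (possibly asynchronously read) trajectory feeds every
    # plugin; the per-frame work of each plugin is independent, so a driver
    # can dispatch plugins in parallel for each frame.
    for p in plugins: p.pre_process()
    for frame in trajectory:
        for p in plugins: p.process_frame(frame)
    for p in plugins: p.post_process()

trajectory = [[(0.0, 0.0, 0.0), (2.0, 2.0, 2.0)],
              [(1.0, 0.0, 0.0), (3.0, 2.0, 2.0)]]
cog, ext = CenterOfGeometry(), MaxExtentX()
run(trajectory, [cog, ext])
print(cog.results)  # [(1.0, 1.0, 1.0), (2.0, 1.0, 1.0)]
print(ext.results)  # [2.0, 2.0]
```

The uniformity is the point: because every plugin obeys the same hook contract, the driver neither knows nor cares whether a given plugin was written in C++ or Python, which is what the abstract credits with simplifying prototyping.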

  18. [Globalization of acupuncture technology innovation: a quantitative analysis based on acupuncture patents in the U.S.A].

    PubMed

    Pan, Wei; Hu, Yuan-Jia; Wang, Yi-Tao

    2011-08-01

The structure of the international flow of acupuncture knowledge is explored in this article in order to promote the globalization of acupuncture technology innovation. Statistical methods were adopted to reveal the geographical distribution of acupuncture patents in the U.S.A., the factors influencing the cumulative advantage of acupuncture techniques, and the innovative value of acupuncture patent applications. Social network analysis was also utilized to establish a global innovation network of acupuncture technology. The results show that cumulative strength in acupuncture technology correlates with the patent retention period, that the innovative value of an acupuncture invention correlates with the frequency of patent citation, and that the U.S.A. and Canada occupy central positions in the global acupuncture information and technology delivery system.

  19. The vulnerabilities of the power-grid system: renewable microgrids as an alternative source of energy.

    PubMed

    Meyer, Victor; Myres, Charles; Bakshi, Nitin

    2010-03-01

    The objective of this paper is to analyse the vulnerabilities of current power-grid systems and to propose alternatives to using fossil fuel power generation and infrastructure solutions in the form of microgrids, particularly those from renewable energy sources. One of the key potential benefits of microgrids, apart from their inherent sustainability and ecological advantages, is increased resilience. The analysis is targeted towards the context of business process outsourcing in India. However, much of the research on vulnerabilities has been derived from the USA and as such many of the examples cite vulnerabilities in the USA and other developed economies. Nevertheless, the vulnerabilities noted are to a degree common to all grid systems, and so the analysis may be more broadly applicable.

  20. Modelling health and output at business cycle horizons for the USA.

    PubMed

    Narayan, Paresh Kumar

    2010-07-01

    In this paper we employ a theoretical framework - a simple macro model augmented with health - that draws guidance from the Keynesian view of business cycles to examine the relative importance of permanent and transitory shocks in explaining variations in health expenditure and output at business cycle horizons for the USA. The variance decomposition analysis of shocks reveals that at business cycle horizons permanent shocks explain the bulk of the variations in output, while transitory shocks explain the bulk of the variations in health expenditures. We undertake a shock decomposition analysis for private health expenditures versus public health expenditures and interestingly find that while transitory shocks are more important for private sector expenditures, permanent shocks dominate public health expenditures. Copyright (c) 2009 John Wiley & Sons, Ltd.
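
    The mechanics of a forecast-error variance decomposition of this kind can be illustrated with a toy model in which a series is driven by a permanent shock (a random walk) and a transitory AR(1) shock. All parameter values below are illustrative assumptions, not estimates from the paper:

    ```python
    # Share of the h-step forecast-error variance attributable to each
    # shock type. Parameter values are illustrative assumptions only.
    def variance_shares(h, sigma_p=1.0, sigma_t=1.0, phi=0.5):
        # Random-walk (permanent) part: variance accumulates linearly in h.
        var_perm = h * sigma_p ** 2
        # AR(1) (transitory) part: geometric sum of squared MA weights.
        var_trans = sigma_t ** 2 * sum(phi ** (2 * i) for i in range(h))
        total = var_perm + var_trans
        return var_perm / total, var_trans / total

    for h in (1, 4, 8):
        p, t = variance_shares(h)
        print(f"h={h}: permanent {p:.2f}, transitory {t:.2f}")
    ```

    The sketch shows the generic property such decompositions exploit: the permanent shock's share of the forecast-error variance grows with the horizon, while the transitory shock's contribution is bounded.
    
    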
