Sample records for de facto computing platform

  1. Untangling outcomes of de jure and de facto community-based management of natural resources.

    PubMed

    Agarwala, Meghna; Ginsberg, Joshua R

    2017-12-01

    We systematically reviewed the literature on the tragedy of the commons and common-property resources. We segregated studies by legal management regimes (de jure regimes) and management that develops in practice (de facto regimes) to understand how the structure of regime formation affects the outcome of community management on sustainability of resource use. De facto regimes, developed within the community, are more likely to have positive impacts on the resource. However, de facto regimes are fragile and not resilient in the face of increased population pressure and unregulated markets, and de facto management regimes are less successful where physical exclusion of external agents from resources is more difficult. Yet, formalization or imposition of de jure management regimes can have complicated impacts on sustainability. The imposition of de jure regimes usually has a negative outcome when existing de facto regimes operate at larger scales than the imposed de jure regime. In contrast, de jure regimes have largely positive impacts when the de facto regimes operate at scales smaller than the overlying de jure regimes. Formalization may also be counterproductive because of elite capture and the resulting de facto privatization (that allows elites to effectively exclude others) or de facto open access (where the disenfranchised may resort to theft and elites cannot effectively exclude them). This underscores that although the global movement to formalize community-management regimes may address some forms of inequity and may produce better outcomes, it does not ensure resource sustainability and may lead to greater marginalization of users. Comparison of governance systems that differentiate between initiatives that legitimize existing de facto regimes and systems that create new de facto regimes, investigations of new top-down de jure regimes, and studies that further examine different approaches to changing de jure regimes to de facto regimes are avenues for further inquiry. © 2017 Society for Conservation Biology.

  2. 47 CFR 1.9003 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Authority § 1.9003 Definitions. De facto transfer leasing arrangement. A spectrum leasing arrangement in which a licensee retains de jure control of its license while transferring de facto control of the... electronically or manually. Long-term de facto transfer leasing arrangement. A long-term de facto transfer...

  3. 47 CFR 1.9010 - De facto control standard for spectrum leasing arrangements.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 1 2011-10-01 2011-10-01 false De facto control standard for spectrum leasing... PROCEDURE Spectrum Leasing General Policies and Procedures § 1.9010 De facto control standard for spectrum..., the following standard is applied for purposes of determining whether a licensee retains de facto...

  4. 47 CFR 1.9030 - Long-term de facto transfer leasing arrangements.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 1 2011-10-01 2011-10-01 false Long-term de facto transfer leasing... PROCEDURE Spectrum Leasing General Policies and Procedures § 1.9030 Long-term de facto transfer leasing...) and a spectrum lessee may enter into a long-term de facto transfer leasing arrangement in which the...

  5. 47 CFR 1.9035 - Short-term de facto transfer leasing arrangements.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 1 2011-10-01 2011-10-01 false Short-term de facto transfer leasing... PROCEDURE Spectrum Leasing General Policies and Procedures § 1.9035 Short-term de facto transfer leasing...) and a spectrum lessee may enter into a short-term de facto transfer leasing arrangement in which the...

  6. 47 CFR 1.9010 - De facto control standard for spectrum leasing arrangements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false De facto control standard for spectrum leasing... PROCEDURE Spectrum Leasing General Policies and Procedures § 1.9010 De facto control standard for spectrum..., the following standard is applied for purposes of determining whether a licensee retains de facto...

  7. Assessment of De Facto Wastewater Reuse across the US: trends between 1980 and 2008.

    PubMed

    Rice, Jacelyn; Wutich, Amber; Westerhoff, Paul

    2013-10-01

    De facto wastewater reuse is the incidental presence of treated wastewater in a water supply source. In 1980 the EPA identified drinking water treatment plants (DWTPs) impacted by upstream wastewater treatment plant (WWTP) discharges and found the top 25 most impacted DWTPs contained between 2% and 16% wastewater discharges from upstream locations (i.e., de facto reuse) under average streamflow conditions. This study is the first to provide an update to the 1980 EPA analysis. An ArcGIS model of DWTPs and WWTPs across the U.S. was created to quantify de facto reuse for the top 25 cities in the 1980 EPA study. From 1980 to 2008, de facto reuse increased for 17 of the 25 DWTPs, as municipal flows upstream of the sites increased by 68%. Under low streamflow conditions, de facto reuse in DWTP supplies ranged from 7% to 100%, illustrating the importance of wastewater in sustainable water supplies. Case studies were performed on four cities to analyze the reasons for changes in de facto reuse over time. Three of the four sites have greater than 20% treated wastewater effluent within their drinking water source for streamflow less than the 25th percentile historic flow.

  8. Modeled De Facto Reuse and Contaminants of Emerging Concern in Drinking Water Source Waters

    EPA Science Inventory

    De facto reuse is the percentage of drinking water treatment plant (DWTP) intake potentially composed of effluent discharged from upstream wastewater treatment plants (WWTPs). Results from grab samples and a De Facto Reuse in our Nation's Consumable Supply (DRINCS) geospatial wat...

  9. 47 CFR 1.9010 - De facto control standard for spectrum leasing arrangements.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 1 2012-10-01 2012-10-01 false De facto control standard for spectrum leasing... PROCEDURE Grants by Random Selection Spectrum Leasing General Policies and Procedures § 1.9010 De facto control standard for spectrum leasing arrangements. (a) Under the rules established for spectrum leasing...

  10. 47 CFR 1.9010 - De facto control standard for spectrum leasing arrangements.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 1 2014-10-01 2014-10-01 false De facto control standard for spectrum leasing... PROCEDURE Grants by Random Selection Spectrum Leasing General Policies and Procedures § 1.9010 De facto control standard for spectrum leasing arrangements. (a) Under the rules established for spectrum leasing...

  11. 47 CFR 1.9010 - De facto control standard for spectrum leasing arrangements.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 47 Telecommunication 1 2013-10-01 2013-10-01 false De facto control standard for spectrum leasing... PROCEDURE Grants by Random Selection Spectrum Leasing General Policies and Procedures § 1.9010 De facto control standard for spectrum leasing arrangements. (a) Under the rules established for spectrum leasing...

  12. "De facto" Privatisation of Education and the Poor: Implications of a Study from Sub-Saharan Africa and India

    ERIC Educational Resources Information Center

    Tooley, James; Dixon, Pauline

    2006-01-01

    Three types of privatisation are identified--involving demand-side financing, reforms to the educational supply-side and "de facto" privatisation, where responsibilities are transferred to the private sector, through the rapid growth of private schools, rather than through reform or legislation. Although "de facto"…

  13. sTeam--Providing Primary Media Functions for Web-Based Computer-Supported Cooperative Learning.

    ERIC Educational Resources Information Center

    Hampel, Thorsten

    The World Wide Web has developed as the de facto standard for computer-based learning. However, as a server-centered approach, it confines readers and learners to passive nonsequential reading. Authoring and Web-publishing systems aim at supporting the authors' design process. Consequently, learners' activities are confined to selecting and…

  14. Prospects for the Rule of Law in Cyberspace

    DTIC Science & Technology

    2017-01-01

    a de facto norm being outsourced to the private sector—since the U.S. Government and other countries have, in effect, delegated the task—cor...encountered less resistance from the cybersecurity community, but it is still under scrutiny by researchers who vehemently oppose any export...and reliable Internet; • Using the OSCE as a platform for dialogue, exchanging best practices, awareness-raising, and information on capacity

  15. The use of de jure to maintain a de facto status quo

    NASA Astrophysics Data System (ADS)

    Gallard Martínez, Alejandro José; Antrop-González, René

    2013-12-01

    The vignette we use as the introduction works to define and distinguish the concepts of de jure and de facto pedagogical actions, especially as related to Latin@ education and its relationship with STEM fields. The authors assert that de jure educational policies, which are often legal guidelines that mandate minimum levels of compliance, unfortunately become translated to mean the normative way to implement educational practice. Hence, going above and beyond the call of duty to educate Latin@ children and youth through culturally meaningful STEM practices while respecting, affirming, and utilizing decolonizing ways of viewing science and math is not viewed as standard. Thus, it is imperative that STEM educators who work with Latin@ learners demand that de jure education guidelines translated as de facto pedagogical actions are not enough. On the contrary, de jure and de facto ways of teaching and learning should always consist of a counterhegemonic normative.

  16. Spatial and temporal variation in de facto wastewater reuse in drinking water systems across the U.S.A.

    PubMed

    Rice, Jacelyn; Westerhoff, Paul

    2015-01-20

    De facto potable reuse occurs when treated wastewater is discharged into surface waters upstream of potable drinking water treatment plant (DWTP) intakes. Wastewater treatment plant (WWTP) discharges may pose water quality risks at the downstream DWTP, but the additional flow aids in providing a reliable water supply source. In this work de facto reuse is analyzed for 2056 surface water intakes serving 1210 DWTPs across the U.S.A. that serve greater than 10,000 people, covering approximately 82% of the nation's population. An ArcGIS model is developed to assess spatial relationships between DWTPs and WWTPs, with a Python script designed to perform a network analysis by hydrologic region. A high frequency of de facto reuse occurrence was observed; 50% of the DWTP intakes are potentially impacted by upstream WWTP discharges. However, the magnitude of de facto reuse was relatively low: 50% of the impacted intakes contained less than 1% treated municipal wastewater under average streamflow conditions. De facto reuse increased greatly under low streamflow conditions (modeled by Q95), with 32 of the 80 sites yielding at least 50% treated wastewater; this portion of the analysis was limited to sites where stream gauge data were readily available.
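The quantity modeled in this record reduces to a simple ratio: de facto reuse at an intake is the summed discharge of upstream WWTPs divided by streamflow at the intake, expressed as a percentage. A minimal Python sketch of that calculation follows; the flow values, function name, and 100% cap are illustrative assumptions, not the DRINCS/ArcGIS model itself:

```python
# Hedged sketch: de facto reuse as the fraction of streamflow at a
# drinking-water intake that is treated wastewater from upstream WWTPs.
# Flow values are hypothetical; the study uses an ArcGIS network model.

def de_facto_reuse_percent(upstream_wwtp_discharges_m3s, streamflow_m3s):
    """Percent of intake streamflow composed of upstream WWTP effluent."""
    if streamflow_m3s <= 0:
        raise ValueError("streamflow must be positive")
    effluent = sum(upstream_wwtp_discharges_m3s)
    # Effluent cannot exceed total flow; cap at 100%.
    return min(100.0, 100.0 * effluent / streamflow_m3s)

# Same hypothetical intake under average vs. low-flow (Q95) conditions:
avg = de_facto_reuse_percent([0.5, 1.2], streamflow_m3s=170.0)
low = de_facto_reuse_percent([0.5, 1.2], streamflow_m3s=3.4)
print(round(avg, 2), round(low, 1))  # → 1.0 50.0
```

The same effluent load that is negligible at average flow dominates the supply at low flow, which is why the record's low-streamflow estimates rise so sharply.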

  17. Twitmographics: Learning the Emergent Properties of the Twitter Community

    NASA Astrophysics Data System (ADS)

    Cheong, Marc; Lee, Vincent

    This paper presents a framework for discovery of the emergent properties of users of the Twitter microblogging platform. The novelty of our methodology is the use of machine-learning methods to deduce user demographic information and online usage patterns and habits not readily apparent from the raw messages posted on Twitter. This is different from existing social network analysis performed on de facto social networks such as Facebook, in the sense that we use publicly available metadata from Twitter messages to explore the inherent characteristics about different segments of the Twitter community, in a simple yet effective manner. Our framework is coupled with the self-organizing map visualization method, and tested on a corpus of messages which deal with issues of socio-political and economic impact, to gain insight into the properties of human interaction via Twitter as a medium for computer-mediated self-expression.

  18. IM-TORNADO: a tool for comparison of 16S reads from paired-end libraries.

    PubMed

    Jeraldo, Patricio; Kalari, Krishna; Chen, Xianfeng; Bhavsar, Jaysheel; Mangalam, Ashutosh; White, Bryan; Nelson, Heidi; Kocher, Jean-Pierre; Chia, Nicholas

    2014-01-01

    16S rDNA hypervariable tag sequencing has become the de facto method for accessing microbial diversity. Illumina paired-end sequencing, which produces two separate reads for each DNA fragment, has become the platform of choice for this application. However, when the two reads do not overlap, existing computational pipelines analyze data from each read separately and underutilize the information contained in the paired-end reads. We created a workflow known as Illinois Mayo Taxon Organization from RNA Dataset Operations (IM-TORNADO) for processing non-overlapping reads while retaining maximal information content. Using synthetic mock datasets, we show that the use of both reads produced answers with greater correlation to those from full-length 16S rDNA when looking at taxonomy, phylogeny, and beta-diversity. IM-TORNADO is freely available at http://sourceforge.net/projects/imtornado and produces BIOM format output for cross compatibility with other pipelines such as QIIME, mothur, and phyloseq.
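One common way to retain both mates of a non-overlapping pair, rather than discarding the second read, is to reverse-complement it and concatenate it to the first read with a spacer before downstream analysis. The Python sketch below illustrates that general technique under stated assumptions; the function names and spacer length are hypothetical, and IM-TORNADO's exact joining strategy may differ:

```python
# Hedged sketch: join a non-overlapping read pair by appending the
# reverse complement of R2 to R1 with an N spacer. Illustrative only;
# not necessarily IM-TORNADO's implementation.

COMPLEMENT = str.maketrans("ACGTN", "TGCAN")

def reverse_complement(seq):
    """Reverse complement of a DNA sequence (A<->T, C<->G, N unchanged)."""
    return seq.translate(COMPLEMENT)[::-1]

def join_pair(r1, r2, spacer_len=8):
    """Concatenate R1 and the reverse complement of R2 with an N spacer."""
    return r1 + "N" * spacer_len + reverse_complement(r2)

joined = join_pair("ACGTACGT", "AAACCC")
print(joined)  # → ACGTACGTNNNNNNNNGGGTTT
```

The joined sequence keeps the information from both mates in a single record, which is what lets both reads contribute to taxonomy and phylogeny assignments.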

  19. BioBlocks: Programming Protocols in Biology Made Easier.

    PubMed

    Gupta, Vishal; Irimia, Jesús; Pau, Iván; Rodríguez-Patón, Alfonso

    2017-07-21

    The methods to execute biological experiments are evolving. Affordable fluid handling robots and on-demand biology enterprises are making automating entire experiments a reality. Automation offers the benefit of high-throughput experimentation, rapid prototyping, and improved reproducibility of results. However, learning to automate and codify experiments is a difficult task as it requires programming expertise. Here, we present a web-based visual development environment called BioBlocks for describing experimental protocols in biology. It is based on Google's Blockly and Scratch, and requires little or no experience in computer programming to automate the execution of experiments. The experiments can be specified, saved, modified, and shared between multiple users in an easy manner. BioBlocks is open-source and can be customized to execute protocols on local robotic platforms or remotely, that is, in the cloud. It aims to serve as a de facto open standard for programming protocols in Biology.

  20. MIT Lincoln Laboratory Takes the Mystery Out of Supercomputing

    DTIC Science & Technology

    2017-01-18

    analysis, designing sensors, and developing algorithms. In 2008, the Lincoln demonstrated the largest single problem ever run on a computer using... computation. As we design and prototype these devices, the use of leading-edge engineering practices has become the de facto standard. This includes...MIT Lincoln Laboratory Takes the Mystery Out of Supercomputing By Dr. Jeremy Kepner 1 The introduction of multicore and manycore processors

  1. The Use of De Jure to Maintain a De Facto Status Quo

    ERIC Educational Resources Information Center

    Gallard Martínez, Alejandro José; Antrop-González, René

    2013-01-01

    The vignette we use as the introduction works to define and distinguish the concepts of de jure and de facto pedagogical actions, especially as related to Latin@ education and its relationship with STEM fields. The authors assert that de jure educational policies, which are often legal guidelines that mandate minimum levels of compliance,…

  2. Research on Interactive Acquisition and Use of Knowledge.

    DTIC Science & Technology

    1983-11-01

    complex and challenging endeavor. Computer scientists faced with the problem of managing software complexity have developed strict design disciplines...handle most, indeed probably all, phenomena in the syntax and semantics of natural language. It has also turned out to be well suited for the classes of...Semantics The previous grammar performs a de facto coordination of syntax and semantics by requiring that the (syntactically) preverbal NP play the

  3. 47 CFR 1.9003 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Spectrum Leasing Scope and Authority § 1.9003 Definitions. De facto transfer leasing arrangement. A spectrum leasing arrangement in which a licensee retains de jure control of its license while transferring de facto control of the leased spectrum to a spectrum lessee, pursuant to the spectrum leasing rules...

  4. 47 CFR 1.9003 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Spectrum Leasing Scope and Authority § 1.9003 Definitions. De facto transfer leasing arrangement. A spectrum leasing arrangement in which a licensee retains de jure control of its license while transferring de facto control of the leased spectrum to a spectrum lessee, pursuant to the spectrum leasing rules...

  5. 47 CFR 1.9003 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Spectrum Leasing Scope and Authority § 1.9003 Definitions. De facto transfer leasing arrangement. A spectrum leasing arrangement in which a licensee retains de jure control of its license while transferring de facto control of the leased spectrum to a spectrum lessee, pursuant to the spectrum leasing rules...

  6. Implications of Transitioning from De Facto to Engineered Water Reuse for Power Plant Cooling.

    PubMed

    Barker, Zachary A; Stillwell, Ashlynn S

    2016-05-17

    Thermoelectric power plants demand large quantities of cooling water and can use alternative sources such as treated wastewater (reclaimed water); however, such alternatives introduce many uncertainties. De facto water reuse, or the incidental presence of wastewater effluent in a water source, is common at power plants, representing baseline conditions. In many cases, power plants would retrofit open-loop systems to cooling towers to use reclaimed water. To evaluate the feasibility of reclaimed water use, we compared hydrologic and economic conditions at power plants under three scenarios: quantified de facto reuse, de facto reuse with cooling tower retrofits, and modeled engineered reuse conditions. We created a genetic algorithm to estimate costs and model optimal conditions. To assess power plant performance, we evaluated reliability metrics for thermal variances and generation capacity loss as a function of water temperature. Applying our analysis to the greater Chicago area, we observed high de facto reuse for some power plants and substantial costs for retrofitting to use reclaimed water. Conversely, the gains in reliability and performance through engineered reuse with cooling towers outweighed the energy investment in reclaimed water pumping. Our analysis yields quantitative results on reclaimed water feasibility and can inform sustainable management of water and energy.

  7. Modeled de facto reuse and contaminants of emerging concern in drinking water source waters

    USGS Publications Warehouse

    Nguyen, Thuy; Westerhoff, Paul; Furlong, Edward T.; Kolpin, Dana W.; Batt, Angela L.; Mash, Heath E.; Schenck, Kathleen M.; Boone, J. Scott; Rice, Jacelyn; Glassmeyer, Susan T.

    2018-01-01

    De facto reuse is the percentage of drinking water treatment plant (DWTP) intake potentially composed of effluent discharged from upstream wastewater treatment plants (WWTPs). Results from grab samples and a De Facto Reuse in our Nation's Consumable Supply (DRINCS) geospatial watershed model were used to quantify contaminants of emerging concern (CEC) concentrations at DWTP intakes to qualitatively compare exposure risks obtained by the two approaches. Between nine and 71 CECs were detected in grab samples. The number of upstream WWTP discharges ranged from 0 to >1,000; comparative de facto reuse results from DRINCS ranged from <0.1 to 13% during average flow and >80% during lower streamflows. Correlations between chemicals detected and DRINCS modeling results were observed, particularly for DWTPs withdrawing from midsize water bodies. This comparison advances the utility of DRINCS in identifying locations of DWTPs for future CEC sampling and treatment technology testing.

  8. Drafted: I Want You to Deliver E-Government

    ERIC Educational Resources Information Center

    Bertot, John Carlo; Jaeger, Paul T.; Langa, Lesley A.; McClure, Charles R.

    2006-01-01

    Public access to the Internet and computers is transforming public libraries into de facto e-government access points, for such disparate services as disaster relief, Medicare drug plans, and even benefits for children and families. This new role for public libraries is not just user-initiated. Government agencies now refer people to public…

  9. 78 FR 40430 - De Facto

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-05

    ... the government in making decisions regarding the selection of management; and (4) whether the... government; (2) export sales negotiations and prices; (3) composition of company management, the process... de facto Test With Requests for Additional Documentary Support and Additional Questions Regarding the...

  10. 47 CFR 1.9035 - Short-term de facto transfer leasing arrangements.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... PROCEDURE Grants by Random Selection Spectrum Leasing General Policies and Procedures § 1.9035 Short-term de... any of the included services) and a spectrum lessee may enter into a short-term de facto transfer... the leased spectrum is transferred to the spectrum lessee for the duration of the spectrum leasing...

  11. 47 CFR 1.9035 - Short-term de facto transfer leasing arrangements.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... PROCEDURE Grants by Random Selection Spectrum Leasing General Policies and Procedures § 1.9035 Short-term de... any of the included services) and a spectrum lessee may enter into a short-term de facto transfer... the leased spectrum is transferred to the spectrum lessee for the duration of the spectrum leasing...

  12. 47 CFR 1.9035 - Short-term de facto transfer leasing arrangements.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... PROCEDURE Grants by Random Selection Spectrum Leasing General Policies and Procedures § 1.9035 Short-term de... any of the included services) and a spectrum lessee may enter into a short-term de facto transfer... the leased spectrum is transferred to the spectrum lessee for the duration of the spectrum leasing...

  13. De Facto Versus de Jure Political Institutions in the Long-Run: A Multivariate Analysis, 1820-2000.

    PubMed

    Foldvari, Peter

    2017-01-01

    In this paper we use the components of the PolityIV project's polity2 and Vanhanen's Index of Democracy indicators to analyse the relationship between de jure and de facto political institutions from 1820 until 2000, with a canonical correlation method corrected for sample selection bias. We find considerable fluctuation in the relationship between the two measures. After a moderate positive correlation during the first half of the nineteenth century, the two measures become statistically unrelated until the 1940s. The relationship becomes strong and positive only in the second half of the twentieth century. The relationship between de jure and de facto political institutions can hence be described as a U-curve, reminiscent of an inverted Kuznets curve.

  14. CADBIT II - Computer-Aided Design for Built-In Test. Volume 1

    DTIC Science & Technology

    1993-06-01

    data provided in the CADBIT I Final Report, as indicated in Figure 1.2. CADBIT II IMPLEMENTS SYSTEM CONCEPT, REQUIREMENTS, AND DATA DEVELOPED DURING...CADBIT II software was developed using de facto computer standards including Unix, C, and the X Windows-based OSF/Motif graphical user interface... export connectivity information. Design Architect is a package for designers that includes schematic capture, VHDL editor, and libraries of digital

  15. Legally White, Socially "Mexican": The Politics of De Jure and De Facto School Segregation in the American Southwest

    ERIC Educational Resources Information Center

    Donato, Ruben; Hanson, Jarrod S.

    2012-01-01

    The history of Mexican American school segregation is complex, often misunderstood, and currently unresolved. The literature suggests that Mexican Americans experienced de facto segregation because it was local custom and never sanctioned at the state level in the American Southwest. However, the same literature suggests that Mexican Americans…

  16. IM-TORNADO: A Tool for Comparison of 16S Reads from Paired-End Libraries

    PubMed Central

    Jeraldo, Patricio; Kalari, Krishna; Chen, Xianfeng; Bhavsar, Jaysheel; Mangalam, Ashutosh; White, Bryan; Nelson, Heidi; Kocher, Jean-Pierre; Chia, Nicholas

    2014-01-01

    Motivation 16S rDNA hypervariable tag sequencing has become the de facto method for accessing microbial diversity. Illumina paired-end sequencing, which produces two separate reads for each DNA fragment, has become the platform of choice for this application. However, when the two reads do not overlap, existing computational pipelines analyze data from each read separately and underutilize the information contained in the paired-end reads. Results We created a workflow known as Illinois Mayo Taxon Organization from RNA Dataset Operations (IM-TORNADO) for processing non-overlapping reads while retaining maximal information content. Using synthetic mock datasets, we show that the use of both reads produced answers with greater correlation to those from full-length 16S rDNA when looking at taxonomy, phylogeny, and beta-diversity. Availability and Implementation IM-TORNADO is freely available at http://sourceforge.net/projects/imtornado and produces BIOM format output for cross compatibility with other pipelines such as QIIME, mothur, and phyloseq. PMID:25506826

  17. Optical media standards for industry

    NASA Technical Reports Server (NTRS)

    Hallam, Kenneth J.

    1993-01-01

    Optical storage is a new and growing area of technology that can serve to meet some of the mass storage needs of the computer industry. Optical storage is characterized by information being stored and retrieved by means of diode lasers. When most people refer to optical storage, they mean rotating disk media, but there are one or two products that use lasers to read and write to tape. Optical media also usually means removable media. Because of its removability, there is a recognized need for standardization, both of the media and of the recording method. Industry standards can come about in one or more ways. An industry-supported body can sanction and publish a formal standard. A company may ship enough of a product that it so dominates an application or industry that it acquires 'standard' status without an official sanction. Such de facto standards are almost always copied by other companies with varying degrees of success. A governmental body can issue a rule or law that requires conformance to a standard. The standard may have been created by the government, or adopted from among many proposed by industry. These are often known as de jure standards. Standards are either open or proprietary. If approved by a government or sanctioning body, the standard is open. A de facto standard may be either open or proprietary. Optical media is too new to have de facto standards accepted by the marketplace yet. The proliferation of non-compatible media types in the last five years of optical market development has convinced many of the need for recognized media standards.

  18. A Grassroots Solution to "De Facto" School Segregation.

    ERIC Educational Resources Information Center

    Broderick, Mary

    1997-01-01

    Connecticut is struggling to address the "de facto" desegregation that finds 80% of the state's minority schoolchildren enrolled in only 18 of its 166 school districts. In 1996, the state's supreme court ruled that Connecticut's two-tiered system was violating these students' rights. Southeastern Connecticut's improvement plan reflects…

  19. Transformation of personal computers and mobile phones into genetic diagnostic systems.

    PubMed

    Walker, Faye M; Ahmad, Kareem M; Eisenstein, Michael; Soh, H Tom

    2014-09-16

    Molecular diagnostics based on the polymerase chain reaction (PCR) offer rapid and sensitive means for detecting infectious disease, but prohibitive costs have impeded their use in resource-limited settings where such diseases are endemic. In this work, we report an innovative method for transforming a desktop computer and a mobile camera phone--devices that have become readily accessible in developing countries--into a highly sensitive DNA detection system. This transformation was achieved by converting a desktop computer into a de facto thermal cycler with software that controls the temperature of the central processing unit (CPU), allowing for highly efficient PCR. Next, we reconfigured the mobile phone into a fluorescence imager by adding a low-cost filter, which enabled us to quantitatively measure the resulting PCR amplicons. Our system is highly sensitive, achieving quantitative detection of as little as 9.6 attograms of target DNA, and we show that its performance is comparable to advanced laboratory instruments at approximately 1/500th of the cost. Finally, in order to demonstrate clinical utility, we have used our platform for the successful detection of genomic DNA from the parasite that causes Chagas disease, Trypanosoma cruzi, directly in whole, unprocessed human blood at concentrations 4-fold below the clinical titer of the parasite.
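The CPU-as-thermal-cycler idea described above is, at its core, a temperature control loop: load the processor to heat it toward a setpoint, idle it to cool. The toy bang-bang controller below illustrates only that control logic over a simulated temperature; the heating and cooling rates are invented, and real hardware sensing and PCR stage timing (as in the paper's software) are omitted:

```python
# Toy bang-bang control of a simulated CPU temperature. Assumptions:
# the sensor and heat/cool rates are simulated, not real hardware.

def bang_bang_step(current_temp, setpoint, hysteresis=0.5):
    """Return 'load' to heat, 'idle' to cool, or 'hold' inside the band."""
    if current_temp < setpoint - hysteresis:
        return "load"   # busy-loop the CPU to raise temperature
    if current_temp > setpoint + hysteresis:
        return "idle"   # sleep to let the CPU cool
    return "hold"

def simulate_to_setpoint(temp, setpoint, heat_rate=1.5, cool_rate=1.0, steps=200):
    """Drive a simulated temperature toward the setpoint via bang-bang control."""
    for _ in range(steps):
        action = bang_bang_step(temp, setpoint)
        if action == "load":
            temp += heat_rate
        elif action == "idle":
            temp -= cool_rate
    return temp

# Approach a denaturation-style setpoint of 95 C from 40 C:
final = simulate_to_setpoint(40.0, 95.0)
print(round(final, 1))  # settles within the 94.5-95.5 hysteresis band
```

A real cycler alternates several such setpoints (denaturation, annealing, extension) for a fixed number of cycles; the same step function would simply be driven with a schedule of setpoints.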

  20. Transformation of Personal Computers and Mobile Phones into Genetic Diagnostic Systems

    PubMed Central

    2014-01-01

    Molecular diagnostics based on the polymerase chain reaction (PCR) offer rapid and sensitive means for detecting infectious disease, but prohibitive costs have impeded their use in resource-limited settings where such diseases are endemic. In this work, we report an innovative method for transforming a desktop computer and a mobile camera phone—devices that have become readily accessible in developing countries—into a highly sensitive DNA detection system. This transformation was achieved by converting a desktop computer into a de facto thermal cycler with software that controls the temperature of the central processing unit (CPU), allowing for highly efficient PCR. Next, we reconfigured the mobile phone into a fluorescence imager by adding a low-cost filter, which enabled us to quantitatively measure the resulting PCR amplicons. Our system is highly sensitive, achieving quantitative detection of as little as 9.6 attograms of target DNA, and we show that its performance is comparable to advanced laboratory instruments at approximately 1/500th of the cost. Finally, in order to demonstrate clinical utility, we have used our platform for the successful detection of genomic DNA from the parasite that causes Chagas disease, Trypanosoma cruzi, directly in whole, unprocessed human blood at concentrations 4-fold below the clinical titer of the parasite. PMID:25223929

  1. Information Management Functional Economic Analysis for Finance Workstations to the Defense Information Technology Services Organization

    DTIC Science & Technology

    1993-03-01

    values themselves. The tools perform risk-adjusted present-value comparisons and compute the ROI using discount factors. The assessment of risk in a...developed X Window system, the de facto industry standard window system in the UNIX environment. An X-terminal’s use is limited to display. It has no...2.1 IT HARDWARE The DOS-based PC used in this analysis costs $2,060. It includes an ASL 486DX-33 Industry Standard Architecture (ISA) computer with 8

  2. School Integration Policies in Northern Cities.

    ERIC Educational Resources Information Center

    Glazer, Nathan

    It is pointed out that there is little expert research on the effects of de facto segregation in schools in the North and the West. Too often an oversimplified causal relationship is drawn to explain the educational gap between white and Negro students in de facto segregated schools. Other factors considered in analyzing educational status…

  3. Wastewater discharge impact on drinking water sources along the Yangtze River (China).

    PubMed

    Wang, Zhuomin; Shao, Dongguo; Westerhoff, Paul

    2017-12-01

    Unplanned indirect (de facto) wastewater reuse occurs when wastewater is discharged into surface waters upstream of potable drinking water treatment plant intakes. This paper aims to predict percentages and trends of de facto reuse throughout the Yangtze River watershed in order to understand the relative contribution of wastewater discharges into the river and its tributaries towards averting water scarcity concerns. The Yangtze River is the third longest in the world and supports more than 1/15 of the world's population, yet the importance of wastewater on the river remains ill-defined. Municipal wastewater produced in the Yangtze River Basin increased by 41% between 1998 and 2014, from 2580 m³/s to 3646 m³/s. Under low flow conditions in the Yangtze River near Shanghai, treated wastewater contributions to river flows increased from 8% in 1998 to 14% in 2014. The highest levels of de facto reuse appeared along a major tributary (Han River) of the Yangtze River, where de facto reuse can exceed 20%. While this initial analysis of de facto reuse used water supply and wastewater data from 110 cities in the basin and 11 gauging stations with >50 years of historic streamflow data, the outcome was limited by the lack of gauging stations at more locations (i.e., data had to be predicted using digital elevation mapping) and lack of precise geospatial location of drinking water intakes or wastewater discharges. This limited the predictive capability of the model relative to larger datasets available in other countries (e.g., USA). This assessment is the first analysis of de facto wastewater reuse in the Yangtze River Basin. It will help identify sections of the river at higher risk for wastewater-related pollutants due to presence of, and reliance on, wastewater discharge that could be the focus of field studies and model predictions of higher spatial and temporal resolution. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. 76 FR 39019 - Atlantic Highly Migratory Species; Atlantic Bluefin Tuna Quotas and Atlantic Tuna Fisheries...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-05

    ... for the more selective, directed fishing categories, and be a de facto reallocation of quota shares... allowing a de facto ``incidental catch'' fishery in the Gulf of Mexico, in violation of the ICCAT... regarding post-release mortality, makes it difficult to quantify now the effect of the weak hook requirement...

  5. 75 FR 34689 - Pure Magnesium From the People's Republic of China: Preliminary Results of the 2008-2009...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-18

    ..., the Department issued its antidumping duty questionnaire to TMI, TXR, and Pan Asia by FedEx. TXR.... Absence of De Facto Control Typically, the Department considers four factors in evaluating whether each... profits or financing of losses.\\32\\ The Department has determined that an analysis of de facto control is...

  6. BlueSNP: R package for highly scalable genome-wide association studies using Hadoop clusters.

    PubMed

    Huang, Hailiang; Tata, Sandeep; Prill, Robert J

    2013-01-01

    Computational workloads for genome-wide association studies (GWAS) are growing in scale and complexity, outpacing the capabilities of single-threaded software designed for personal computers. The BlueSNP R package implements GWAS statistical tests in the R programming language and executes the calculations across computer clusters configured with Apache Hadoop, a de facto standard framework for distributed data processing using the MapReduce formalism. BlueSNP makes computationally intensive analyses, such as estimating empirical p-values via data permutation, and searching for expression quantitative trait loci over thousands of genes, feasible for large genotype-phenotype datasets. http://github.com/ibm-bioinformatics/bluesnp
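
    The computationally intensive step the abstract highlights, estimating empirical p-values via data permutation, can be sketched on a single machine. This is a hedged Python illustration (BlueSNP itself is an R package running on Hadoop); `mean_diff` is a toy association statistic invented here for demonstration, and in BlueSNP each permutation batch would become a MapReduce task.

```python
import random

def empirical_p_value(genotypes, phenotypes, stat_fn, n_perm=1000, seed=0):
    """Estimate an empirical p-value by permuting phenotype labels.

    stat_fn(genotypes, phenotypes) returns a test statistic where larger
    values are more extreme; permutation breaks any true association.
    """
    rng = random.Random(seed)
    observed = stat_fn(genotypes, phenotypes)
    perm = list(phenotypes)
    exceed = 0
    for _ in range(n_perm):
        rng.shuffle(perm)
        if stat_fn(genotypes, perm) >= observed:
            exceed += 1
    # add-one correction avoids reporting an impossible p-value of zero
    return (exceed + 1) / (n_perm + 1)

def mean_diff(g, p):
    """Toy statistic: absolute mean-phenotype difference between allele groups."""
    a = [y for x, y in zip(g, p) if x == 1]
    b = [y for x, y in zip(g, p) if x == 0]
    return abs(sum(a) / len(a) - sum(b) / len(b))

genotypes  = [0, 0, 0, 0, 1, 1, 1, 1]                  # one biallelic SNP
phenotypes = [0.1, 0.2, 0.1, 0.2, 1.1, 1.2, 1.0, 1.3]  # strongly associated
p_value = empirical_p_value(genotypes, phenotypes, mean_diff, n_perm=200, seed=1)
```

    For a strongly associated SNP like this toy one, few permuted statistics reach the observed value, so the empirical p-value comes out small.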

  7. A knowledge management system for new drug submission by pharma-industries.

    PubMed

    Pinciroli, Francesco; Mottadelli, Sara; Vinci, Maurizio; Fabbro, Luigi; Gothager, Klas

    2004-01-01

    The pharma-industries face a number of crucial business issues in improving operational excellence in product time-to-market and broad regulatory compliance. These organizations own, produce, and manipulate a great deal of knowledge. The new regulations issued by Health Authorities (HA) to pharma-industries should make the content and format of new drug applications uniform worldwide. In this paper we suggest a novel approach for a pharma-industry to capture, process, and transmit clinical data electronically. The approach begins with an analysis of the knowledge generation points, some of which lie outside the company. Implementations are grounded in a de facto standard platform, Microsoft, with acceptable cost levels. The proposed infrastructure is integrated into the existing company environment and technological platform, minimizing cost and risk while improving the efficiency and efficacy of new drug dossier compilation.

  8. A Framework for Synthesizing the United States Code in Support of Cyberspace Operations

    DTIC Science & Technology

    2016-03-01

    inter-title operations, it is unclear whether this de facto shift in the application of U.S.C. statutes is necessary. The U.S.C. has a limited number of exclusionary distinctions de jure, which is...NATION STATE Though statecraft has progressed through numerous iterations over the course of human history, in the current post-colonial era, the

  9. Counternetwork: Countering the Expansion of Transnational Criminal Networks

    DTIC Science & Technology

    2017-01-01

    herein are protected by law. This representation of RAND intellectual property is provided for noncommercial use only. Unauthorized posting of this...capture and control of relevant aspects of the public domain—sometimes result in de facto governance.35 Whether this means that a counterinsurgency...p. 178. 35 Clark, 2009. Critics of the criminal insurgency concept argue that the cartels engage in de facto governance not for the sake of

  10. Frontier Security: The Case of Brazil

    DTIC Science & Technology

    2016-08-01

    locations close to indigenous and other settlements along the northern border. In concept, the military post would be augmented by civilian expertise...legal doctrine of uti possidetis de facto—possession takes precedence over legal title. The presence of some Brazilian settlers, particularly in...possidetis de facto to seize unoccupied territory. In the 1970s, the armed forces launched a colonization program along the nation’s western frontier called

  11. Lessons from Kosovo: The KFOR Experience

    DTIC Science & Technology

    2002-07-01

    There are wide extremes of weather and terrain, a mix of urban and rural, modern and primitive, and upscale and slum. Transportation routes are...systems for exchanging information and coordinating actions—it became the de facto formal messaging system. For the United States, the highly secure...education in their own language, and exposed to massive abuse of their human rights and civil liberties. Kosovo became a de facto Serbian colony where 90

  12. Drivers of Microbial Risk for Direct Potable Reuse and de Facto Reuse Treatment Schemes: The Impacts of Source Water Quality and Blending.

    PubMed

    Chaudhry, Rabia M; Hamilton, Kerry A; Haas, Charles N; Nelson, Kara L

    2017-06-13

    Although reclaimed water for potable applications has many potential benefits, it poses concerns for chemical and microbial risks to consumers. We present a quantitative microbial risk assessment (QMRA) Monte Carlo framework to compare a de facto water reuse scenario (treated wastewater-impacted surface water) with four hypothetical Direct Potable Reuse (DPR) scenarios for Norovirus, Cryptosporidium, and Salmonella. Consumer microbial risks of surface source water quality (impacted by 0-100% treated wastewater effluent) were assessed. Additionally, we assessed risks for different blending ratios (0-100% surface water blended into advanced-treated DPR water) when source surface water consisted of 50% wastewater effluent. De facto reuse risks exceeded the yearly 10⁻⁴ infections risk benchmark while all modeled DPR risks were significantly lower. Contamination with 1% or more wastewater effluent in the source water, and blending 1% or more wastewater-impacted surface water into the advanced-treated DPR water drove the risk closer to the 10⁻⁴ benchmark. We demonstrate that de facto reuse by itself, or as an input into DPR, drives microbial risks more so than the advanced-treated DPR water. When applied using location-specific inputs, this framework can contribute to project design and public awareness campaigns to build legitimacy for DPR.
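
    The QMRA Monte Carlo framework described above can be sketched in miniature. The Python fragment below is a single-pathogen toy, not the authors' model: the exponential dose-response form is standard in QMRA, but every numeric value here (concentration, ingestion volume, dose-response parameter r, the exponential spread of daily concentrations) is an illustrative assumption.

```python
import math
import random

def annual_infection_risk(mean_conc, volume_l, r, days=365, rng=None):
    """One Monte Carlo draw of annual infection risk for a single pathogen.

    mean_conc: mean pathogen concentration in finished water (organisms/L)
    volume_l:  daily unboiled tap-water ingestion (L/day)
    r:         exponential dose-response parameter, P_inf = 1 - exp(-r * dose)
    """
    rng = rng or random.Random()
    p_no_infection = 1.0
    for _ in range(days):
        # daily dose; an exponential spread stands in for the lognormal
        # concentration distributions typical of QMRA studies
        dose = rng.expovariate(1.0 / mean_conc) * volume_l
        p_no_infection *= math.exp(-r * dose)  # prob. of no infection that day
    return 1.0 - p_no_infection

rng = random.Random(42)
# all parameter values below are illustrative assumptions
risks = sorted(annual_infection_risk(1e-5, 1.0, 0.1, rng=rng) for _ in range(1000))
median_risk = risks[500]
above_benchmark = sum(x > 1e-4 for x in risks) / len(risks)
```

    A full assessment would repeat such draws per pathogen and per treatment-train scenario, with distributions fitted to monitoring data, and compare the resulting annual-risk distribution against the 10⁻⁴ benchmark.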

  13. Drivers of Microbial Risk for Direct Potable Reuse and de Facto Reuse Treatment Schemes: The Impacts of Source Water Quality and Blending

    PubMed Central

    Chaudhry, Rabia M.; Hamilton, Kerry A.; Haas, Charles N.; Nelson, Kara L.

    2017-01-01

    Although reclaimed water for potable applications has many potential benefits, it poses concerns for chemical and microbial risks to consumers. We present a quantitative microbial risk assessment (QMRA) Monte Carlo framework to compare a de facto water reuse scenario (treated wastewater-impacted surface water) with four hypothetical Direct Potable Reuse (DPR) scenarios for Norovirus, Cryptosporidium, and Salmonella. Consumer microbial risks of surface source water quality (impacted by 0–100% treated wastewater effluent) were assessed. Additionally, we assessed risks for different blending ratios (0–100% surface water blended into advanced-treated DPR water) when source surface water consisted of 50% wastewater effluent. De facto reuse risks exceeded the yearly 10⁻⁴ infections risk benchmark while all modeled DPR risks were significantly lower. Contamination with 1% or more wastewater effluent in the source water, and blending 1% or more wastewater-impacted surface water into the advanced-treated DPR water drove the risk closer to the 10⁻⁴ benchmark. We demonstrate that de facto reuse by itself, or as an input into DPR, drives microbial risks more so than the advanced-treated DPR water. When applied using location-specific inputs, this framework can contribute to project design and public awareness campaigns to build legitimacy for DPR. PMID:28608808

  14. A Comparison and Contrast in Alternative Learning versus Traditional Learning

    ERIC Educational Resources Information Center

    Simmons, Mia A.

    2013-01-01

    Preparation for today's technology-influenced professional world starts with the structure of primary and secondary learning environments. Student learning platforms should be aligned in some ways with professional working platforms. This quantitative correlational ex post facto study compared the effectiveness of learning modalities in…

  15. Vulnerability Analysis of HD Photo Image Viewer Applications

    DTIC Science & Technology

    2007-09-01

    renamed to HD Photo in November of 2006, is being touted as the successor to the ubiquitous JPEG image format, as well as the eventual de facto standard in the digital photography market. With massive efforts...associated state-of-the-art compression algorithm “specifically designed [for] all types of continuous tone photographic” images [HDPhotoFeatureSpec

  16. Leadership From the Centre: A New Foreign and Security Policy for Germany

    DTIC Science & Technology

    2016-03-01

    this period, a new Germany confident enough to declare leadership from the centre assumed de facto leadership in the European Union. The dichotomy of Germany’s past and ambitions in foreign...14 C. THE REALPOLITIK OF THE POST-WAR ORDER ........................ 17 III. REUNIFICATION AND THE RISE OF THE CIVILIAN POWER ............ 23 A

  17. Beating the Islamic State: Selecting a New Strategy for Iraq and Syria

    DTIC Science & Technology

    2017-01-01

    noncommercial use only. Unauthorized posting of this publication online is prohibited. Permission is given to duplicate this document for personal use only...These conditions fed the rise of the diffuse Sunni insurgency, a de facto sectarian civil war, and the eventual rise of IS. Nationalist groups led...and elevated his stature to de facto leader of the otherwise fractured Sunni Arab Iraqi insurgent movement. From 2004 through 2006 Zarqawi implemented

  18. JPRS Report, East Europe.

    DTIC Science & Technology

    1992-11-20

    Convention [DELO 6 Nov] 33 Transportation Agreement Talks Held With EC [DELO 2 Nov] 33 Government, Unions Sign Collective Contract [DELO 30 Oct] 34...the standpoint of current payments. Although it is possible that the forint’s de facto convertibility has expanded in a relative sense, it does not...They intend to continue this restriction for a while—and appropriately so, in my view. On the other hand, the forint’s de facto external

  19. Formal Analysis of BPMN Models Using Event-B

    NASA Astrophysics Data System (ADS)

    Bryans, Jeremy W.; Wei, Wei

    The use of business process models has gone far beyond documentation purposes. In the development of business applications, they can play the role of an artifact on which high-level properties can be verified and design errors can be revealed in an effort to reduce overhead at later software development and diagnosis stages. This paper demonstrates how formal verification may add value to the specification, design and development of business process models in an industrial setting. The analysis of these models is achieved via an algorithmic translation from the de-facto standard business process modeling language BPMN to Event-B, a widely used formal language supported by the Rodin platform, which offers a range of simulation and verification technologies.

  20. Southeast Asia’s Relations with Taiwan, 2000-2016: An Assessment of Vietnam and Singapore

    DTIC Science & Technology

    2017-09-01

    were considered good during Taiwan president Lee Teng-hui’s terms (1988–2000) with robust economic ties and de facto diplomatic relations. Studies of...the post-Lee period have described a state of deteriorated relations between Taiwan and Southeast Asian countries when Chen Shui-bian succeeded Lee...Taiwan set up its de facto embassy in Ho Chi Minh City in 1992, and Vietnam set up its own representative office in Taipei in 1993. Vietnam continued to

  1. De facto molecular weight distributions of glucans by size-exclusion chromatography combined with mass/molar-detection of fluorescence labeled terminal hemiacetals.

    PubMed

    Praznik, Werner; Huber, Anton

    2005-09-25

    A major capability of polysaccharides in aqueous media is their tendency toward aggregation and dynamic formation of supermolecular structures. Even extended dissolution processes will not eliminate these structures, which dominate many analytical approaches, in particular absolute molecular weight determinations based on light scattering data. An alternative approach for determining the de facto molecular weight of glucans with a free terminal hemiacetal functionality (reducing end group) has been adapted from carbohydrates for midrange and high-dp glucans: quantitative and stabilized labeling as aminopyridyl-derivatives (AP-glucans) and subsequent analysis of SEC-separated elution profiles based on simultaneously monitored mass and molar fractions by refractive index and fluorescence detection. SEC-DRI/FL of AP-glucans proved to be an appropriate approach for determining the de facto molecular weight of constituent glucan molecules even in the presence of supermolecular structures, for non-branched (pullulan), branched (dextran), narrowly and broadly distributed samples, and for mixes of compact and loosely packed polymer coils (starch glucan hydrolysate).

  2. Potable Water Reuse: What Are the Microbiological Risks?

    PubMed

    Nappier, Sharon P; Soller, Jeffrey A; Eftim, Sorina E

    2018-06-01

    With the increasing interest in recycling water for potable reuse purposes, it is important to understand the microbial risks associated with potable reuse. This review focuses on potable reuse systems that use high-level treatment and de facto reuse scenarios that include a quantifiable wastewater effluent component. In this article, we summarize the published human health studies related to potable reuse, including both epidemiology studies and quantitative microbial risk assessments (QMRA). Overall, there have been relatively few health-based studies evaluating the microbial risks associated with potable reuse. Several microbial risk assessments focused on risks associated with unplanned (or de facto) reuse, while others evaluated planned potable reuse, such as indirect potable reuse (IPR) or direct potable reuse (DPR). The reported QMRA-based risks for planned potable reuse varied substantially, indicating there is a need for risk assessors to use consistent input parameters and transparent assumptions, so that risk results are easily translated across studies. However, the current results overall indicate that predicted risks associated with planned potable reuse scenarios may be lower than those for de facto reuse scenarios. Overall, there is a clear need to carefully consider water treatment train choices when wastewater is a component of the drinking water supply (whether de facto, IPR, or DPR). More data from full-scale water treatment facilities would be helpful to quantify levels of viruses in raw sewage and reductions across unit treatment processes for both culturable and molecular detection methods.

  3. FPGA implementation of sparse matrix algorithm for information retrieval

    NASA Astrophysics Data System (ADS)

    Bojanic, Slobodan; Jevtic, Ruzica; Nieto-Taladriz, Octavio

    2005-06-01

    Information text data retrieval requires a tremendous amount of processing time because of the size of the data and the complexity of information retrieval algorithms. In this paper a solution to this problem is proposed via hardware-supported information retrieval algorithms. Reconfigurable computing can accommodate frequent hardware modifications through its tailorable hardware and exploits parallelism for a given application through reconfigurable and flexible hardware units. The degree of parallelism can be tuned for the data. In this work we implemented the standard BLAS (basic linear algebra subprogram) sparse matrix algorithm named Compressed Sparse Row (CSR), which is shown to be more efficient in terms of storage space requirements and query-processing time than other sparse matrix algorithms for information retrieval applications. Although the inverted index algorithm has been treated as the de facto standard for information retrieval for years, an alternative approach that stores the index of a text collection in a sparse matrix structure is gaining attention. This approach performs query processing using sparse matrix-vector multiplication and, due to parallelization, achieves substantial efficiency over the sequential inverted index. Parallel implementations of the information retrieval kernel are presented in this work targeting a Xilinx Virtex II Field Programmable Gate Array (FPGA) board. A recent development in scientific applications is the use of FPGAs to achieve high-performance results. Computational results are compared to implementations on other platforms. The design achieves a high level of parallelism for the overall function while retaining highly optimised hardware within the processing unit.
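
    The kernel this paper moves into hardware, query processing as a sparse matrix-vector product over a CSR term-document matrix, can be sketched in software. Below is a minimal Python illustration with a toy matrix (the weights, documents, and terms are invented for demonstration); the FPGA design parallelizes this same loop structure across processing units.

```python
def csr_matvec(values, col_idx, row_ptr, x):
    """y = A @ x for a matrix in Compressed Sparse Row (CSR) form.

    values  : nonzero entries, row by row
    col_idx : column index of each nonzero
    row_ptr : row i's nonzeros occupy values[row_ptr[i]:row_ptr[i+1]]
    """
    n_rows = len(row_ptr) - 1
    y = [0.0] * n_rows
    for i in range(n_rows):
        for k in range(row_ptr[i], row_ptr[i + 1]):
            y[i] += values[k] * x[col_idx[k]]
    return y

# toy term-document matrix: rows = documents, cols = terms (e.g. tf-idf weights)
#        t0   t1   t2
# d0  [ 2.0  0    1.0 ]
# d1  [ 0    3.0  0   ]
# d2  [ 1.0  0    4.0 ]
values  = [2.0, 1.0, 3.0, 1.0, 4.0]
col_idx = [0,   2,   1,   0,   2]
row_ptr = [0, 2, 3, 5]

query  = [1.0, 0.0, 1.0]  # query touches terms t0 and t2
scores = csr_matvec(values, col_idx, row_ptr, query)  # per-document scores
```

    Ranking the documents by `scores` then plays the role the posting-list traversal plays in an inverted index.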

  4. Chimera distribution amplitudes for the pion and the longitudinally polarized ρ-meson

    NASA Astrophysics Data System (ADS)

    Stefanis, N. G.; Pimikov, A. V.

    2016-01-01

    Using QCD sum rules with nonlocal condensates, we show that the distribution amplitude of the longitudinally polarized ρ-meson may have a short-tailed platykurtic profile in close analogy to our recently proposed platykurtic distribution amplitude for the pion. Such a chimera distribution de facto amalgamates the broad unimodal profile of the distribution amplitude, obtained with a Dyson-Schwinger equations-based computational scheme, with the suppressed tails characterizing the bimodal distribution amplitudes derived from QCD sum rules with nonlocal condensates. We argue that pattern formation, emerging from the collective synchronization of coupled oscillators, can provide a single theoretical scaffolding to study unimodal and bimodal distribution amplitudes of light mesons without recourse to particular computational schemes and the reasons for them.

  5. 75 FR 78676 - De Facto Criteria for Establishing a Separate Rate in Antidumping Proceedings Involving Non...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-16

    ...In antidumping proceedings involving non-market economy (``NME'') countries,\\1\\ the Department of Commerce (``the Department'') has a rebuttable presumption that the export activities of all companies within the country are subject to government control and, thus, should be assessed a single antidumping duty rate (i.e., the NME-Entity rate). It is the Department's policy to assign to all exporters of merchandise subject to investigation in an NME country this single rate unless an exporter can demonstrate that it is sufficiently independent so as to be entitled to a ``separate rate'' (i.e., a dumping margin separate from the margin assigned to the NME-Entity). Exporters can demonstrate this independence through the absence of both de jure and de facto governmental control over their export activities.

  6. AAL Platform with a "De Facto" Standard Communication Interface (TICO): Training in Home Control in Special Education.

    PubMed

    Guillomía San Bartolomé, Miguel A; Falcó Boudet, Jorge L; Artigas Maestre, José Ignacio; Sánchez Agustín, Ana

    2017-10-12

    Framed within a long-term cooperation between university and special education teachers, training in alternative communication skills and home control was realized using the "TICO" interface, a communication panel editor extensively used in special education schools. From a technological view we follow AAL technology trends by integrating a successful interface into a heterogeneous-services AAL platform, focusing on a functional view. Educationally, a very flexible interface aligned with communication training allows dynamic adjustment of complexity; it builds on an accessible mindset and on virtual elements whose significance is already familiar, offers specific interaction feedback, adapts to evolving needs and capacities, and improves the personal autonomy and self-confidence of children at school and home. TICO-home-control was installed during the last school year in the library of a special education school to study adaptations and training strategies that enhance the autonomy opportunities of its pupils. The methodology involved a case study with structured and semi-structured observations. Five children considered unable to use commercial home control systems were trained, with good results, in using an open home control system. Moreover, this AAL platform has proved efficient in training children in prerequisite cognitive steps such as virtual representation and cause-effect interaction.

  7. Twenty Years after Brown: Where Are We Now?

    ERIC Educational Resources Information Center

    Edelman, Marian Wright

    1974-01-01

    To resolve the basic question of what is unconstitutional school segregation, lawyer Edelman suggests bombarding the citadel of de facto segregation with a legal strategy that expands the definition of de jure segregation to include acts that produce segregation in fact. (Editor)

  8. Effectively incorporating selected multimedia content into medical publications

    PubMed Central

    2011-01-01

    Until fairly recently, medical publications have been handicapped by being restricted to non-electronic formats, effectively preventing the dissemination of complex audiovisual and three-dimensional data. However, authors and readers could significantly profit from advances in electronic publishing that permit the inclusion of multimedia content directly into an article. For the first time, the de facto gold standard for scientific publishing, the portable document format (PDF), is used here as a platform to embed a video and an audio sequence of patient data into a publication. Fully interactive three-dimensional models of a face and a schematic representation of a human brain are also part of this publication. We discuss the potential of this approach and its impact on the communication of scientific medical data, particularly with regard to electronic and open access publications. Finally, we emphasise how medical teaching can benefit from this new tool and comment on the future of medical publishing. PMID:21329532

  9. Effectively incorporating selected multimedia content into medical publications.

    PubMed

    Ziegler, Alexander; Mietchen, Daniel; Faber, Cornelius; von Hausen, Wolfram; Schöbel, Christoph; Sellerer, Markus; Ziegler, Andreas

    2011-02-17

    Until fairly recently, medical publications have been handicapped by being restricted to non-electronic formats, effectively preventing the dissemination of complex audiovisual and three-dimensional data. However, authors and readers could significantly profit from advances in electronic publishing that permit the inclusion of multimedia content directly into an article. For the first time, the de facto gold standard for scientific publishing, the portable document format (PDF), is used here as a platform to embed a video and an audio sequence of patient data into a publication. Fully interactive three-dimensional models of a face and a schematic representation of a human brain are also part of this publication. We discuss the potential of this approach and its impact on the communication of scientific medical data, particularly with regard to electronic and open access publications. Finally, we emphasise how medical teaching can benefit from this new tool and comment on the future of medical publishing.

  10. "Brown" and Black-White Achievement

    ERIC Educational Resources Information Center

    Armor, David J.

    2006-01-01

    "Brown v. Board of Education" only presumed to eliminate the "de jure" apartheid that existed in 1954. It was never intended to resolve the "de facto" gap in minority achievement that still faces education policymakers today. Sociologist David J. Armor goes beyond "Brown" to identify a set of definite risk…

  11. How Do the Approaches to Accountability Compare for Charities Working in International Development?

    PubMed Central

    Kirsch, David

    2014-01-01

    Approaches to accountability vary between charities working to reduce under-five mortality in underdeveloped countries, and healthcare workers and facilities in Canada. Comparison reveals key differences, similarities and trade-offs. For example, while health professionals are governed by legislation and healthcare facilities have a de facto obligation to be accredited, charities and other international organizations are not subject to mandatory international laws or guidelines or to de facto international standards. Charities have policy goals similar to those found in the Canadian substudies, including access, quality, cost control, cost-effectiveness and customer satisfaction. However, the relative absence of external policy tools means that these goals may not be realized. Accountability can be beneficial, but too much or the wrong kind of accountability can divert resources and diminish returns. PMID:25305397

  12. Project on National Security Reform: Forging a New Shield

    DTIC Science & Technology

    2008-11-01

    lead individual has the de jure or de facto authority to command independent departments and agencies. The lead agency approach thus usually means...Director George Tenet remarked about the absence of a Principals Committee meeting to consider the “de-Baathification of Iraqi society” following the...the move.” Tenet complained that senior U.S. officials in Baghdad announced the orders on de-Baathification “to Iraq and the world” but that the

  13. How do the approaches to accountability compare for charities working in international development?

    PubMed

    Kirsch, David

    2014-09-01

    Approaches to accountability vary between charities working to reduce under-five mortality in underdeveloped countries, and healthcare workers and facilities in Canada. Comparison reveals key differences, similarities and trade-offs. For example, while health professionals are governed by legislation and healthcare facilities have a de facto obligation to be accredited, charities and other international organizations are not subject to mandatory international laws or guidelines or to de facto international standards. Charities have policy goals similar to those found in the Canadian substudies, including access, quality, cost control, cost-effectiveness and customer satisfaction. However, the relative absence of external policy tools means that these goals may not be realized. Accountability can be beneficial, but too much or the wrong kind of accountability can divert resources and diminish returns. Copyright © 2014 Longwoods Publishing.

  14. The 3D Hough Transform for plane detection in point clouds: A review and a new accumulator design

    NASA Astrophysics Data System (ADS)

    Borrmann, Dorit; Elseberg, Jan; Lingemann, Kai; Nüchter, Andreas

    2011-03-01

    The Hough Transform is a well-known method for detecting parameterized objects. It is the de facto standard for detecting lines and circles in 2-dimensional data sets. For 3D it has received little attention so far. Even in the 2D case, high computational costs have led to the development of numerous variations of the Hough Transform. In this article we evaluate different variants of the Hough Transform with respect to their ability to detect planes in 3D point clouds reliably. Apart from computational costs, the main problem is the representation of the accumulator. Usual implementations favor geometrical objects with certain parameters due to uneven sampling of the parameter space. We present a novel accumulator design that achieves the same size for each cell and compare it to existing designs.
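
    A minimal sketch of the voting step, assuming the common (θ, φ, ρ) normal-form parameterization of a plane; note that the uniform angular grid below exhibits exactly the uneven sampling of the parameter space that the article's equal-size accumulator cells are designed to correct. All grid resolutions and the toy point cloud are illustrative assumptions.

```python
import math

def hough_planes(points, n_theta=18, n_phi=9, n_rho=20, rho_max=10.0):
    """Vote every point into a (theta, phi, rho) accumulator; return best cell.

    Plane model: rho = x*cos(t)*sin(p) + y*sin(t)*sin(p) + z*cos(p).
    Flaw on purpose: a uniform (theta, phi) grid like this oversamples
    directions near the poles, which biases the vote.
    """
    acc = {}
    for x, y, z in points:
        for it in range(n_theta):
            t = it * math.pi / n_theta
            for ip in range(n_phi):
                p = ip * math.pi / n_phi
                rho = (x * math.cos(t) * math.sin(p)
                       + y * math.sin(t) * math.sin(p)
                       + z * math.cos(p))
                ir = int((rho + rho_max) * n_rho / (2 * rho_max))
                if 0 <= ir < n_rho:
                    key = (it, ip, ir)
                    acc[key] = acc.get(key, 0) + 1
    return max(acc.items(), key=lambda kv: kv[1])

# 16 points on the plane z = 2 plus two off-plane outliers
points = [(float(x), float(y), 2.0) for x in range(4) for y in range(4)]
points += [(0.5, 1.5, 7.3), (3.2, 0.1, 5.8)]
(best_theta, best_phi, best_rho_bin), votes = hough_planes(points)
```

    The winning cell collects one vote from each of the 16 coplanar points; a randomized or equal-area variant would change only how (t, p) are enumerated.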

  15. Evolving a lingua franca and associated software infrastructure for computational systems biology: the Systems Biology Markup Language (SBML) project.

    PubMed

    Hucka, M; Finney, A; Bornstein, B J; Keating, S M; Shapiro, B E; Matthews, J; Kovitz, B L; Schilstra, M J; Funahashi, A; Doyle, J C; Kitano, H

    2004-06-01

    Biologists are increasingly recognising that computational modelling is crucial for making sense of the vast quantities of complex experimental data that are now being collected. The systems biology field needs agreed-upon information standards if models are to be shared, evaluated and developed cooperatively. Over the last four years, our team has been developing the Systems Biology Markup Language (SBML) in collaboration with an international community of modellers and software developers. SBML has become a de facto standard format for representing formal, quantitative and qualitative models at the level of biochemical reactions and regulatory networks. In this article, we summarise the current and upcoming versions of SBML and our efforts at developing software infrastructure for supporting and broadening its use. We also provide a brief overview of the many SBML-compatible software tools available today.
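    The kind of model SBML encodes can be illustrated with a toy document. The element names below follow SBML Level 2 conventions (sbml, model, listOfSpecies, speciesReference, ...), but the model itself is invented and the fragment is a sketch, not a validated SBML file.

```python
import xml.etree.ElementTree as ET

SBML_NS = "http://www.sbml.org/sbml/level2"

def minimal_sbml_model():
    # A toy model: one compartment, two species, one reaction S1 -> S2.
    sbml = ET.Element("sbml", xmlns=SBML_NS, level="2", version="1")
    model = ET.SubElement(sbml, "model", id="toy")
    comps = ET.SubElement(model, "listOfCompartments")
    ET.SubElement(comps, "compartment", id="cell", size="1")
    species = ET.SubElement(model, "listOfSpecies")
    ET.SubElement(species, "species", id="S1", compartment="cell", initialAmount="10")
    ET.SubElement(species, "species", id="S2", compartment="cell", initialAmount="0")
    reactions = ET.SubElement(model, "listOfReactions")
    rxn = ET.SubElement(reactions, "reaction", id="R1")
    ET.SubElement(ET.SubElement(rxn, "listOfReactants"), "speciesReference", species="S1")
    ET.SubElement(ET.SubElement(rxn, "listOfProducts"), "speciesReference", species="S2")
    return ET.tostring(sbml, encoding="unicode")

doc = minimal_sbml_model()
```

    In practice such documents are produced and consumed through libraries such as libSBML rather than assembled by hand; the point here is only the reaction-network level of description the abstract refers to.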

  16. Ungoverned Territories. Understanding and Reducing Terrorism Risks

    DTIC Science & Technology

    2007-01-01

    Paper 357, Oxford: OUP-IISS, 2003. “El mayor riesgo de ser victima de homicidio en Costa Rica, Guatemala y El Salvador,” Fundacion Genero y Sociedad...Afghan opiates. According to the Paris-based Observa- toire Geopolitique de Drouges, the movement netted at least $60 million from the heroin trade...Middle East was reorganized after the end of World War I.4 Prior to that period, the area was loosely ruled by the Ottoman Empire, with de facto

  17. Dewey and Sports: An Overview of Sport in His Work

    ERIC Educational Resources Information Center

    Jaitner, David

    2016-01-01

    From beginning to end, John Dewey's oeuvre is filled with philosophical discussions and political comments on the significance de jure and de facto of a wide range of distinct social spaces. In contrast to subjects he addresses regularly and others that he focuses on occasionally, his work does not systematically address sport. Nonetheless, sport…

  18. Working through the Challenges: Struggle and Resilience within the Historically Black Land Grant Institutions

    ERIC Educational Resources Information Center

    Harris, Rosalind P.; Worthen, H. Dreamal

    2004-01-01

    In 1890 the Morrill-McComas Act provided for the establishment of segregated land grant colleges within the sixteen southern and border states practicing both "de jure" and "de facto" racial discrimination. This article reviews the significant role played by the legacies of racial discrimination, funding inequities and the model of institution…

  19. 75 FR 56070 - Certain Steel Nails From the People's Republic of China: Notice of Preliminary Results and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-15

    ..., whether by electroplating or hot-dipping one or more times), phosphate cement, and paint. Head styles... control and thus should be assessed a single antidumping duty rate. See, e.g., Polyethylene Terephthalate... independence through the absence of both de jure and de facto government control over export activities. Id...

  20. 76 FR 56147 - Certain Steel Nails From the People's Republic of China: Preliminary Results and Preliminary...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-12

    ... electroplating or hot dipping one or more times), phosphate cement, and paint. Head styles include, but are not... control and thus should be assessed a single antidumping duty rate.\\28\\ Exporters can demonstrate this independence through the absence of both de jure and de facto government control over export activities. Id...

  1. 47 CFR 1.9035 - Short-term de facto transfer leasing arrangements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... spectrum commercially. (2) Designated entity/entrepreneur rules. Unjust enrichment provisions (see § 1.2111... pursuant to the Commission's small business and/or entrepreneur provisions (see § 1.2110 and § 24.709 of...

  2. 78 FR 25782 - Certificates of Public Convenience and Necessity and Foreign Air Carrier Permits

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-02

    ...., US Airways, Inc., American Eagle Airlines, Inc., PSA Airlines, Inc. and Piedmont Airlines, Inc. (collectively, the ``Joint Applicants'') requesting approval of the de facto route transfer of US Airways', PSA...

  3. A New Data Access Mechanism for HDFS

    NASA Astrophysics Data System (ADS)

    Li, Qiang; Sun, Zhenyu; Wei, Zhanchen; Sun, Gongxing

    2017-10-01

    With the era of big data emerging, Hadoop has become the de facto standard big data processing platform. However, it is still difficult to get legacy applications, such as High Energy Physics (HEP) applications, to run efficiently on the Hadoop platform. There are two reasons for this difficulty: first, random access is not supported on the Hadoop File System (HDFS); second, it is difficult to make legacy applications adapt to HDFS's streaming data processing mode. To address these two issues, a new read and write mechanism for HDFS is proposed. With this mechanism, data access is done on the local file system instead of through HDFS streaming interfaces. To enable users to modify files, three attributes (permissions, owner and group) are imposed on Block objects. Blocks stored on Datanodes have the same attributes as the file they belong to. Users can modify blocks while the Map task runs locally, and HDFS is responsible for updating the remaining replicas after the block modification finishes. To further improve the performance of the Hadoop system, a complete localized task execution mechanism is implemented for I/O-intensive jobs. Test results show that average CPU utilization is improved by 10% with the new task selection strategy, and data read and write performance is improved by about 10% and 30%, respectively.
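    The block-attribute idea described above can be sketched as follows. The class and function names are hypothetical (they are not HDFS's or the paper's code); the sketch only shows blocks inheriting permissions/owner/group from their file, and the other replicas being queued for the asynchronous update pass after a local modification.

```python
from dataclasses import dataclass, field

@dataclass
class HdfsFile:
    path: str
    owner: str
    group: str
    permissions: int          # e.g. 0o644

@dataclass
class Block:
    block_id: int
    owner: str
    group: str
    permissions: int
    stale_replicas: list = field(default_factory=list)

def allocate_block(f: HdfsFile, block_id: int) -> Block:
    # Per the paper: blocks carry the same permissions/owner/group as their
    # file, so a local (non-streaming) writer can be permission-checked per block.
    return Block(block_id, f.owner, f.group, f.permissions)

def modify_locally(block: Block, replica_hosts):
    # The local replica is changed in place; the remaining replicas are marked
    # for the asynchronous re-replication HDFS performs after the write finishes.
    block.stale_replicas = [r for r in replica_hosts if r != "localhost"]

f = HdfsFile("/hep/run42.root", "alice", "hep", 0o644)
b = allocate_block(f, 1)
modify_locally(b, ["localhost", "dn2", "dn3"])
```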

  4. The Legacy Effect: Understanding How Segregation and Environmental Injustice Unfold over Time in Baltimore

    Treesearch

    Morgan Grove; Laura Ogden; Steward Pickett; Chris Boone; Geoff Buckley; Dexter H. Locke; Charlie Lord; Billy Hall

    2018-01-01

    Legacies of social and environmental injustices can leave an imprint on the present and constrain transitions for more sustainable futures. In this article, we ask this question: What is the relationship of environmental inequality and histories of segregation? The answer for Baltimore is complex, where past practices of de jure and de facto segregation have created...

  5. Ethics and choosing appropriate means to an end: problems with coal mine and nuclear workplace safety.

    PubMed

    Shrader-Frechette, Kristin; Cooke, Roger

    2004-02-01

    A common problem in ethics is that people often desire an end but fail to take the means necessary to achieve it. Employers and employees may desire the safety end mandated by performance standards for pollution control, but they may fail to employ the means, specification standards, necessary to achieve this end. This article argues that current (de jure) performance standards, for lowering employee exposures to ionizing radiation, fail to promote de facto worker welfare, in part because employers and employees do not follow the necessary means (practices known as specification standards) to achieve the end (performance standards) of workplace safety. To support this conclusion, the article argues that (1) safety requires attention to specification, as well as performance, standards; (2) coal-mine specification standards may fail to promote performance standards; (3) nuclear workplace standards may do the same; (4) choosing appropriate means to the end of safety requires attention to the ways uncertainties and variations in exposure may mask violations of standards; and (5) correcting regulatory inattention to differences between de jure and de facto is necessary for achievement of ethical goals for safety.

  6. Microcomputers in Libraries: The Quiet Revolution.

    ERIC Educational Resources Information Center

    Boss, Richard

    1985-01-01

    This article defines three separate categories of microcomputers--personal, desk-top, and multi-user devices--and relates storage capabilities (expandability, floppy disks) to library applications. Highlights include de facto standards, operating systems, database management systems, applications software, circulation control systems, dumb and…

  7. Development of an E-Prime Based Computer Simulation of an Interactive Human Rights Violation Negotiation Script (Developpement d’un Programme de Simulation par Ordinateur Fonde sur le Logiciel E Prime pour la Negociation Interactive en cas de Violation des Droits de la Personne)

    DTIC Science & Technology

    2010-12-01

    Base (CFB) Kingston. The computer simulation developed in this project is intended to be used for future research and as a possible training platform...DRDC Toronto No. CR 2010-055 Development of an E-Prime based computer simulation of an interactive Human Rights Violation negotiation script...Abstract This report describes the method of developing an E-Prime computer simulation of an interactive Human Rights Violation (HRV) negotiation. An

  8. Summer and winter plankton fish assemblages around offshore oil and gas platforms in south-eastern Australia

    NASA Astrophysics Data System (ADS)

    Neira, Francisco J.

    2005-06-01

    Opportunistic plankton surveys were conducted within a 5-nmi radius of nine offshore oil and gas platforms in Bass Strait, south-eastern Australia, in February 1998 and 1999 (summer) and August 1998 (winter). The 108 day-night samples collected alongside (vertical tows) and nearby (surface and oblique tows) platforms yielded 1526 larval and early juvenile fishes representing 55 taxa from 45 families. Epipelagic/mesopelagic taxa dominated the catches, whereas hard/soft habitat-associated taxa were uncommon. Carangidae (36.2%) and Myctophidae (31.5%) dominated in summer and winter, respectively. The most abundant taxon was Trachurus declivis (Carangidae, 35.1%), followed by Bovichtus angustifrons (Bovichtidae, 8.7%), Scomberesox saurus (Scomberesocidae, 3.7%), Centroberyx affinis (Berycidae, 3.0%) and Arripis trutta (Arripidae, 1.7%). Fish concentrations (nos. per 100 m³) alongside platforms did not differ significantly between day and night across all surveys. Likewise, concentrations nearby platforms in February 1999, including those of T. declivis, did not vary significantly by tow type (surface vs. oblique) or day vs. night. The far greater diversity and abundance recorded in February 1999 are likely the result of upwelling conditions over the eastern Bass Strait shelf during the sampling period, which were not detected in February 1998. In the absence of data on adult fishes associated with the Bass Strait platforms, and given the limited availability of reefs directly around the area, it could be argued that some of the taxa caught may originate from spawning around neighboring natural reefs, particularly those off the Gippsland coastline and the south-east corner of mainland Australia. However, the prime position of the platforms almost right in the center of a productivity "hotspot" would have a confounding effect on the potential source(s) of larval fishes in that region of south-eastern Australia. The role of platforms as potential de-facto reefs for juvenile fishes in Bass Strait, as well as spawning areas, is discussed based on the findings of this study, the first on early stages of fishes around oil and gas platforms in Australia.

  9. Contaminants of Emerging Concern During De Facto Water Reuse

    EPA Science Inventory

    Our drinking water and wastewater cycles are integrally linked. Chemicals that are present in household wastewater may be sufficiently mobile and persistent to survive both on-site or municipal wastewater treatment and post-discharge environmental processes. Thus, such contamin...

  10. PyPele Rewritten To Use MPI

    NASA Technical Reports Server (NTRS)

    Hockney, George; Lee, Seungwon

    2008-01-01

    A computer program known as PyPele, originally written as a Python-language extension module of a C++ program, has been rewritten in pure Python. The original version of PyPele dispatches and coordinates parallel-processing tasks on cluster computers and provides a conceptual framework for spacecraft-mission-design and -analysis software tools to run in an embarrassingly parallel mode. The original version of PyPele uses SSH (Secure Shell, a set of standards and an associated network protocol for establishing a secure channel between a local and a remote computer) to coordinate parallel processing. Instead of SSH, the present Python version of PyPele uses the Message Passing Interface (MPI) [an unofficial de-facto standard language-independent application programming interface for message-passing on a parallel computer] while keeping the same user interface. The use of MPI instead of SSH, together with the preservation of the original PyPele user interface, makes it possible for parallel application programs written previously for the original version of PyPele to run on MPI-based cluster computers. As a result, engineers using the previously written application programs can take advantage of embarrassing parallelism without needing to rewrite those programs.
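    The dispatch-and-gather pattern PyPele implements can be sketched with the standard library. This is only an analogy, using a thread pool in place of MPI ranks (the function names are invented); it shows why embarrassingly parallel tasks need no changes of their own when the transport underneath moves from SSH to MPI.

```python
from concurrent.futures import ThreadPoolExecutor

def dispatch(task, cases, workers=4):
    """Fan independent cases out to workers and gather results in submission
    order -- the embarrassingly parallel pattern PyPele coordinates (with MPI
    processes rather than threads)."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(task, cases))

# Hypothetical mission-analysis kernel: evaluate a cost for each trajectory case.
results = dispatch(lambda dv: dv * dv, [1.0, 2.0, 3.0])
```

    Because each case is independent, the caller sees the same interface whether the work is spread over local threads, SSH-launched processes, or MPI ranks.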

  11. Terrorism in the United States 1999

    DTIC Science & Technology

    1999-01-01

    Rican separatist groups such as the Armed Forces for Puerto Rican National Liberation (FALN–Fuerzas Armadas de Liberacion Nacional Puertorriquena...use violence and intimidation as a tool of foreign policy. The U.S. Department of State currently designates seven nations–Cuba, Iran, Iraq...organizations, and his prominent standing among the mujahedin and other disaffected populations have established him as a type of de facto state sponsor of

  12. To Marry or Not to Marry: Marital Status and the Household Division of Labor

    ERIC Educational Resources Information Center

    Baxter, Janeen

    2005-01-01

    Data from an Australian national survey (1996 to 1997) are used to examine domestic labor patterns among de facto and married men and women. The results show that women spend more time on housework and do a greater proportion of housework than men. However, the patterns are most traditional among married men and women. Women in de facto…

  13. Army Contracting Command Workforce Model Analysis

    DTIC Science & Technology

    2012-02-09

    Empresas in Madrid. His Air Force contracting experience includes F-22 Fighter, C-17 Cargo Transport, and a contingency deployment as director of Joint...and the University of Maryland (University College). He has also conducted visiting seminars at American University in Cairo and Instituto de ...the long total process times that are sometimes involved in weapon system contracting, such an assessment may equate to a de facto future work

  14. What About Legal Literacy?

    ERIC Educational Resources Information Center

    Taylor, Kelly R.

    2001-01-01

    A brief list of basic legal concepts and terms to help principals become more familiar with some of the Constitutional and statutory procedural and substantive aspects of school law, including, for example, concepts like due process and privacy and terms like En Banc and De Facto. (PKP)

  15. Understanding U.S. Strategy: A Reader,

    DTIC Science & Technology

    1983-01-01

    Originally intended as a means of preventing Spanish and Portuguese ... in the Western Hemisphere after ..., the Monroe Doctrine eventually...surreptitiously (which could easily go undiscovered by the United States), and abrogated the treaty in de facto fashion by using its space weaponry at the

  16. China’s Forbearance Has Limits: Chinese Threat and Retaliation Signaling and Its Implications for a Sino-American Military Confrontation

    DTIC Science & Technology

    2013-04-01

    the United States. Consequently, China should not be fearful of employing military force to deter Taiwan's de jure independence because the United...critical ways. First is the reality that while those two regions are integral components of the PRC, Taiwan has functioned as a de facto independent...sovereign territories, and political interests from predatory adversaries. These security and national interest issues have varied in their degree of

  17. An Energy Efficient Protocol For The Internet Of Things

    NASA Astrophysics Data System (ADS)

    Venčkauskas, Algimantas; Jusas, Nerijus; Kazanavičius, Egidijus; Štuikys, Vytautas

    2015-01-01

    The Internet of Things (IoT) is a technological revolution that represents the future of computing and communications. One of the most important challenges of IoT is security: protection of data and privacy. The SSL protocol is the de-facto standard for secure Internet communications. The extra energy cost of encrypting and authenticating the application data with SSL is around 15%. For IoT devices, where energy resources are limited, this increase in energy cost is a very significant factor. In this paper we present an energy-efficient SSL protocol which ensures the maximum bandwidth and the required level of security with minimum energy consumption. The proper selection of the security level and CPU multiplier can save up to 85% of the energy required for data encryption.
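    The idea of matching cipher strength to an energy budget can be sketched with Python's stdlib ssl module. The cipher strings and the low_power policy below are illustrative assumptions, not the protocol proposed in the paper, and the actual savings depend on the hardware.

```python
import ssl

def context_for_budget(low_power: bool) -> ssl.SSLContext:
    """Pick a cipher policy by energy budget: a lighter symmetric cipher when
    energy is scarce, a heavier one otherwise.  This only sketches the idea of
    selecting the security level; the paper's CPU-multiplier tuning and 85%
    figure are not reproduced here."""
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    if low_power:
        ctx.set_ciphers("ECDHE+AES128")   # cheaper per byte than AES-256
    else:
        ctx.set_ciphers("ECDHE+AES256")
    return ctx

ctx = context_for_budget(low_power=True)
names = [c["name"] for c in ctx.get_ciphers()]
```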

  18. Multilayer modeling and analysis of human brain networks

    PubMed Central

    2017-01-01

    Understanding how the human brain is structured, and how its architecture is related to function, is of paramount importance for a variety of applications, including but not limited to new ways to prevent, deal with, and cure brain diseases, such as Alzheimer’s or Parkinson’s, and psychiatric disorders, such as schizophrenia. The recent advances in structural and functional neuroimaging, together with the increasing attitude toward interdisciplinary approaches involving computer science, mathematics, and physics, are fostering interesting results from computational neuroscience that are quite often based on the analysis of complex network representation of the human brain. In recent years, this representation experienced a theoretical and computational revolution that is breaching neuroscience, allowing us to cope with the increasing complexity of the human brain across multiple scales and in multiple dimensions and to model structural and functional connectivity from new perspectives, often combined with each other. In this work, we will review the main achievements obtained from interdisciplinary research based on magnetic resonance imaging and establish, de facto, the birth of multilayer network analysis and modeling of the human brain. PMID:28327916

  19. The Distorted Looking Glass: Examining How Housing Identity Privilege Obviates the Goals of "Brown v. Board of Education" at 60

    ERIC Educational Resources Information Center

    Gooden, Mark A.; Thompson Dorsey, Dana N.

    2014-01-01

    Background: In 1954, the "Brown v. Board of Education" case involved four states and their school segregation laws and policies. During that period, de jure and de facto segregation were a way of life in America. Sixty years later, as most schools across the country have resegregated, the authors ask the question of whether we should be…

  20. Unique Problems of the Inner City Colleges.

    ERIC Educational Resources Information Center

    Lombardi, John

    Urban changes such as population increase, shifts in population groups, suburban growth, and central city decay have produced special problems for the inner city college--de facto segregation, inadequate education programs, racial imbalance of employees, and discrimination in student participant activities. The attempts to prevent…

  1. The Language Situation in Mexico

    ERIC Educational Resources Information Center

    Terborg, Roland; Landa, Laura Garcia; Moore, Pauline

    2006-01-01

    This monograph will cover the language situation in Mexico; a linguistically very complex country with 62 recognised indigenous languages, the "de facto" official language, Spanish, and some immigrant languages of lesser importance. Throughout the monograph, we will concentrate on three distinct challenges which we consider relevant for…

  2. Empowerment, Coercive Persuasion and Organizational Learning: Do They Connect?

    ERIC Educational Resources Information Center

    Schein, Edgar H.

    1999-01-01

    Individual learning in organizations can be de facto coercive persuasion when organizational learning and culture change require that learners develop appropriate attitudes and thinking. If the goal of organizational learning--innovative organizations--is accepted, moral choices that restrict individual freedom must be made. (SK)

  3. Librarians and Graphic Design: Preparation, Roles, and Desired Support

    ERIC Educational Resources Information Center

    Wakimoto, Diana K.

    2015-01-01

    Librarians often become de facto graphic designers for their libraries, taking responsibility for designing signage, handouts, brochures, web pages, and many other promotional, instructional, and wayfinding documents. However, the majority of librarians with graphic design responsibilities are not trained as graphic designers. This exploratory…

  4. Anglicising Postapartheid South Africa

    ERIC Educational Resources Information Center

    Louw, P. Eric

    2004-01-01

    The apartheid state deliberately encouraged linguistic diversity and actively built cultural infrastructures which impeded Anglicisation. With the end of apartheid has come "de facto" Anglicisation. So although South Africa has, since 1994, had 11 official languages, in reality, English is swamping the other 10 languages. Afrikaans has,…

  5. Assessing the public health impacts of legalizing recreational cannabis use in the USA.

    PubMed

    Hall, W; Weier, M

    2015-06-01

    A major challenge in assessing the public health impact of legalizing cannabis use in Colorado and Washington State is the absence of any experience with legal cannabis markets. The Netherlands created a de facto legalized cannabis market for recreational use, but policy analysts disagree about how it has affected rates of cannabis use. Some US states have created de facto legal supply of cannabis for medical use. So far this policy does not appear to have increased cannabis use or cannabis-related harm. Given experience with more liberal alcohol policies, the legalization of recreational cannabis use is likely to increase use among current users. It is also likely that legalization will increase the number of new users among young adults but it remains uncertain how many may be recruited, within what time frame, among which groups within the population, and how many of these new users will become regular users. © 2015 American Society for Clinical Pharmacology and Therapeutics.

  6. Eigenvector decomposition of full-spectrum x-ray computed tomography.

    PubMed

    Gonzales, Brian J; Lalush, David S

    2012-03-07

    Energy-discriminated x-ray computed tomography (CT) data were projected onto a set of basis functions to suppress the noise in filtered back-projection (FBP) reconstructions. The x-ray CT data were acquired using a novel x-ray system which incorporated a single-pixel photon-counting x-ray detector to measure the x-ray spectrum for each projection ray. A matrix of the spectral response of different materials was decomposed using eigenvalue decomposition to form the basis functions. Projection of FBP onto basis functions created a de facto image segmentation of multiple contrast agents. Final reconstructions showed significant noise suppression while preserving important energy-axis data. The noise suppression was demonstrated by a marked improvement in the signal-to-noise ratio (SNR) along the energy axis for multiple regions of interest in the reconstructed images. Basis functions used on a more coarsely sampled energy axis still showed an improved SNR. We conclude that the noise-resolution trade-off along the energy axis was significantly improved using the eigenvalue decomposition basis functions.
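    The projection step can be illustrated in miniature. The sketch below uses power iteration (a stand-in for the full eigenvalue decomposition in the paper) on a toy 2x2 "spectral response" matrix, then keeps only the component of a noisy spectrum along the dominant basis vector; all numbers are invented.

```python
import math

def dominant_eigenvector(M, iters=200):
    """Power iteration on a small symmetric matrix (pure Python, for
    illustration only -- not the paper's decomposition code)."""
    n = len(M)
    v = [1.0] * n
    for _ in range(iters):
        w = [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    return v

def project(spectrum, basis):
    """Keep only the component of a (noisy) spectrum along one basis vector:
    energy-axis noise orthogonal to the material responses is discarded."""
    coeff = sum(s * b for s, b in zip(spectrum, basis))
    return [coeff * b for b in basis]

# Toy material-response covariance: dominant direction is (1, 1)/sqrt(2).
M = [[2.0, 1.9], [1.9, 2.0]]
v = dominant_eigenvector(M)
denoised = project([1.0, 0.9], v)   # noisy spectrum projected onto the basis
```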

  7. OpenID Connect as a security service in cloud-based medical imaging systems.

    PubMed

    Ma, Weina; Sartipi, Kamran; Sharghigoorabi, Hassan; Koff, David; Bak, Peter

    2016-04-01

    The evolution of cloud computing is driving the next generation of medical imaging systems. However, privacy and security concerns have consistently been regarded as the major obstacles to the adoption of cloud computing in healthcare domains. OpenID Connect, which combines OpenID and OAuth, is an emerging representational state transfer-based federated identity solution. It is one of the most widely adopted open standards and may become the de facto standard for securing cloud computing and mobile applications; it has been called the "Kerberos of the cloud." We introduce OpenID Connect as an authentication and authorization service in cloud-based diagnostic imaging (DI) systems, and propose enhancements that allow this technology to be incorporated within distributed enterprise environments. The objective of this study is to offer solutions for secure sharing of medical images among diagnostic imaging repositories (DI-r) and heterogeneous picture archiving and communication systems (PACS), as well as Web-based and mobile clients, in the cloud ecosystem. The main objective is to use the OpenID Connect open-source single sign-on and authorization service in a user-centric manner, so that deploying DI-r and PACS to private or community clouds provides security levels equivalent to the traditional computing model.
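    The first leg of the OpenID Connect authorization code flow that such a deployment relies on can be sketched with the stdlib. The endpoint, client ID, and redirect URI below are placeholders, not part of the systems described above; the parameter names (response_type, scope, state, nonce) are the ones defined by OpenID Connect Core.

```python
from urllib.parse import urlencode

def authorization_url(authorize_endpoint, client_id, redirect_uri, state, nonce):
    """Build the redirect that starts the OpenID Connect authorization code
    flow.  All endpoint and client values are illustrative placeholders."""
    params = {
        "response_type": "code",       # authorization code flow
        "scope": "openid profile",     # 'openid' marks this as OIDC, not plain OAuth
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "state": state,                # CSRF protection for the callback
        "nonce": nonce,                # binds the ID token to this request
    }
    return authorize_endpoint + "?" + urlencode(params)

url = authorization_url("https://idp.example.org/authorize",
                        "di-r-client", "https://pacs.example.org/cb",
                        "st-123", "n-456")
```

    The provider authenticates the user once and returns a code to the callback, which the client exchanges for an ID token and access token; that single sign-on is what lets DI-r, PACS, and mobile clients share one identity service.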

  8. The nursing department's view towards moroccan patients.

    PubMed

    Sánchez-Ojeda, María Angustias; Alemany Arrebola, Inmaculada; Gallardo Vigil, Miguel Ángel

    2017-05-25

    To determine the attitude of the Melilla Hospital nursing department towards Moroccan patients. A descriptive ex post facto study; a questionnaire based on the Immigration Attitude Scale for Nursing was administered to staff. In general, nurses exhibit negative attitudes towards Moroccan patients, such as: the increase in crime is caused by the arrival of immigrants, those who commit offenses must be expelled from Spain, they take advantage of the Spanish health system, and too many resources are devoted to immigration. The worst-rated immigrants are the Moroccans, who are seen as paying little attention to their personal hygiene and not adapting to their host countries. It is necessary to work with the nursing staff to change these attitudes. Future degree students must be trained in cultural skills, and the care of immigrants will improve with a greater commitment to cultural difference.

  9. "Becoming" Teachers of Inner-City Students: Identification Creativity and Curriculum Wisdom of Committed White Male Teachers

    ERIC Educational Resources Information Center

    Jupp, James C.; Slattery, G. Patrick, Jr.

    2012-01-01

    Broadly speaking, this reflection approaches the on-going concern of capacitating an overwhelmingly White teaching profession for effectively teaching inner-city students attending "de facto" segregated schools. Using professional identifications, this reflection presents narrativized understanding of respondents' "becoming"…

  10. 47 CFR 1.9030 - Long-term de facto transfer leasing arrangements.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... additional qualifications are met: (A) The license does not involve spectrum licensed in a Wireless Radio... facilitate international or Interdepartment Radio Advisory Committee (IRAC) coordination). (4) The spectrum... spectrum leasing arrangement involving a licensee in the Public Safety Radio Services (see part 90, subpart...

  11. 47 CFR 1.9030 - Long-term de facto transfer leasing arrangements.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... additional qualifications are met: (A) The license does not involve spectrum licensed in a Wireless Radio... facilitate international or Interdepartment Radio Advisory Committee (IRAC) coordination). (4) The spectrum... spectrum leasing arrangement involving a licensee in the Public Safety Radio Services (see part 90, subpart...

  12. 47 CFR 1.9030 - Long-term de facto transfer leasing arrangements.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... additional qualifications are met: (A) The license does not involve spectrum licensed in a Wireless Radio... facilitate international or Interdepartment Radio Advisory Committee (IRAC) coordination). (4) The spectrum... spectrum leasing arrangement involving a licensee in the Public Safety Radio Services (see part 90, subpart...

  13. Spatial Planning of School Districts

    ERIC Educational Resources Information Center

    Maxfield, Donald W.

    1972-01-01

    The development of several plans based on linear programming and geographic methodology will permit school administrators to make better decisions concerning the planning of school districts: where to locate boundaries, how to eliminate overcrowding, where to locate new classrooms, and how to overcome de facto segregation. The primal and dual…

  14. Parameters, Journal of the US Army War College, Volume 16, Number 2, Summer 1986,

    DTIC Science & Technology

    1986-01-01

    ...(London: International Institute for Strategic Studies, 1985)....American military...adversaries. Although the theory and declared intent behind SDI seem logical, some allies still have reservations...NATO nations view the...Overall, it will be actions, rather than words, that will determine the...strategic nuclear capabilities of our allies and...caused their de facto disarmament

  15. Route Sanitizer: Connected Vehicle Trajectory De-Identification Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carter, Jason M; Ferber, Aaron E

    Route Sanitizer is ORNL's connected vehicle moving object database de-identification tool and a graphical user interface to ORNL's connected vehicle de-identification algorithm. It uses the Google Chrome (soon to be Electron) platform so it will run on different computing platforms. The basic de-identification strategy is record redaction: portions of a vehicle trajectory (e.g. sequences of precise temporal-spatial records) are removed. It does not alter retained records. The algorithm uses custom techniques to find areas within trajectories that may be considered private, then it suppresses those in addition to enough of the trajectory surrounding those locations to protect against "inference attacks" in a mathematically sound way. Map data is integrated into the process to make this possible.
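    Record redaction of this kind can be sketched as follows. The radius, padding, and coordinates are invented, and the real tool's mathematically grounded suppression bounds and map integration are not reproduced; the sketch only shows dropping records near a sensitive location plus a few surrounding records, leaving the retained records unaltered.

```python
import math

def redact(trajectory, sensitive, radius=0.0015, pad=2):
    """Drop trajectory records within 'radius' of any sensitive location,
    plus 'pad' records on either side of each hit; everything kept is
    returned exactly as recorded (retained records are never altered)."""
    def near(p, s):
        return math.hypot(p[0] - s[0], p[1] - s[1]) <= radius
    drop = set()
    for i, p in enumerate(trajectory):
        if any(near(p, s) for s in sensitive):
            drop.update(range(max(0, i - pad), min(len(trajectory), i + pad + 1)))
    return [p for i, p in enumerate(trajectory) if i not in drop]

# A 10-point track driving past a hypothetical sensitive location.
track = [(35.93, -84.31 + 0.001 * i) for i in range(10)]
home = [(35.93, -84.305)]
kept = redact(track, home)
```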

  16. How Have Concepts of Informal Learning Developed over Time?

    ERIC Educational Resources Information Center

    Carliner, Saul

    2013-01-01

    Although the current interest in informal learning seems recent, performance improvement professionals have long had an interest in informal learning-the ways that people learn outside of formal structures. The earliest forms of learning for work were informal, including de facto and formal apprenticeship programs and the "school of…

  17. 77 FR 61330 - Policies Regarding Mobile Spectrum Holdings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-09

    ... the transaction and is based on the size of the post-transaction Herfindahl-Hirschman Index (HHI)... further case-by-case market analysis those markets in which, post-transaction, the HHI would be greater... De Facto Transfer Leasing Arrangements and Petition for Declaratory Ruling that the Transaction is...
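
    The HHI screen mentioned above reduces to a short calculation: the index is the sum of squared market shares in percentage points, so a market split into four equal shares scores 2,500. The carrier shares below are hypothetical, not from the docket; a minimal sketch:

```python
def hhi(shares_pct):
    """Herfindahl-Hirschman Index: sum of squared market shares (percent)."""
    return sum(s * s for s in shares_pct)

# Hypothetical post-transaction market: four carriers with these shares.
market = [35, 30, 20, 15]
index = hhi(market)  # 35^2 + 30^2 + 20^2 + 15^2 = 2750
```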

  18. 47 CFR 24.720 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... vertical ownership chain. (e) Control Group. A control group is an entity, or a group of individuals or... assure de facto control, such as for example, when the voting stock of the control group is widely... member of the applicant's (or licensee's) control group and whose gross revenues and total assets, when...

  19. 47 CFR 24.720 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... vertical ownership chain. (e) Control Group. A control group is an entity, or a group of individuals or... assure de facto control, such as for example, when the voting stock of the control group is widely... member of the applicant's (or licensee's) control group and whose gross revenues and total assets, when...

  20. 47 CFR 24.720 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... vertical ownership chain. (e) Control Group. A control group is an entity, or a group of individuals or... assure de facto control, such as for example, when the voting stock of the control group is widely... member of the applicant's (or licensee's) control group and whose gross revenues and total assets, when...

  1. 47 CFR 24.720 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... vertical ownership chain. (e) Control Group. A control group is an entity, or a group of individuals or... assure de facto control, such as for example, when the voting stock of the control group is widely... member of the applicant's (or licensee's) control group and whose gross revenues and total assets, when...

  2. 47 CFR 24.720 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... vertical ownership chain. (e) Control Group. A control group is an entity, or a group of individuals or... assure de facto control, such as for example, when the voting stock of the control group is widely... member of the applicant's (or licensee's) control group and whose gross revenues and total assets, when...

  3. Whitened Rainbows: How White College Students Protect Whiteness through Diversity Discourses

    ERIC Educational Resources Information Center

    Hikido, Annie; Murray, Susan B.

    2016-01-01

    This qualitative study investigates white students' attitudes toward campus diversity at a large, multiracial public university. Drawing upon focus group data gathered from a larger campus climate study, we identified four themes: participants voiced that: (1) racial diversity fosters campus tolerance; (2) diversity fragments into de facto racial…

  4. CRS Issue Statement on Latin America and the Caribbean

    DTIC Science & Technology

    2010-01-12

    government, while the refusal of de facto President Roberto Micheletti to resign his interim presidency before Lobo’s inauguration could make it...jbeittel@crs.loc.gov, 7-7613 Peter J. Meyer Analyst in Latin American Affairs pmeyer@crs.loc.gov, 7-5474 Mary Jane Bolle Specialist in

  5. GLOBAL ASSESSMENT OF WASTEWATER IRRIGATION: UNDERSTANDING HEALTH RISKS AND CONTRIBUTIONS TO FOOD SECURITY USING AN ENVIRONMENTAL SYSTEMS APPROACH

    EPA Science Inventory

    This research will quantify the extent of de facto reuse of untreated wastewater at the global scale. Through the integration of multiple existing spatial data sources, this project will produce rigorous analyses assessing the relationship between wastewater irrigation, hea...

  6. The Northern Territories: Case Study in Japanese-Soviet Relations.

    DTIC Science & Technology

    1982-06-01

    types (watermelons and tomatoes) due to the relatively warm climatic conditions. Pigs, sheep, and horses populate nearly all settlements, and dairy...reluctantly accepted these restrictions because of its dependence on fish as its primary source of protein, but in doing so has given de facto recognition to

  7. MBTA: Management By Timeshifting Around

    NASA Astrophysics Data System (ADS)

    Carmel, Erran

    How do good managers manage and coordinate? As technologies evolve the answer has also been evolving—from MBWA (Management By Wandering Around), to MBFA (Management By Flying Around), and now to MBTA (Management By Timeshifting Around). The purpose of this chapter is to surface and introduce this de-facto managerial approach.

  8. Exclusionary Discipline and the Forfeiture of Special Education Rights: A Survey.

    ERIC Educational Resources Information Center

    King, Ashley Thomas

    1996-01-01

    A survey of exclusionary discipline practices with handicapped students revealed a national pattern of "de facto" differential treatment. In denying a school's unilateral authority to remove dangerous or disruptive students, the Supreme Court's judgment in "Honig v. Doe" (1988) took precedence over all earlier court decisions.…

  9. Annotated Bibliography and Summaries of Reference Materials. School Desegregation/Integration Notebook.

    ERIC Educational Resources Information Center

    American Civil Liberties Union, New York, NY.

    This annotated bibliography provides a framework within which questions and answers about the school desegregation process can be formulated and addressed. A glossary of terms dealing with school integration is included. Among these terms are the following: ability grouping, annexation, bilingual education, clustering, consolidation, de facto and de…

  10. Neoliberalism, Performance Measurement, and the Governance of American Academic Science

    ERIC Educational Resources Information Center

    Feller, Irwin

    2008-01-01

    The international thrust of neoliberal policies on higher education systems has generally been to reduce governmental control over the operations of universities in de facto exchange for these institutions assuming increased responsibility for generating a larger share of their revenues and for providing quantitative evidence of…

  11. Pulling Away from the Racial Gerrymander.

    ERIC Educational Resources Information Center

    Ehrenhalt, Alan

    1983-01-01

    Although Black and Hispanic congressmen managed to overcome redistricting of urban areas and maintain or increase their seats in the House, demographic trends remain a threat to minority representation. There are two solutions: (1) a break-up of de facto residential segregation, and (2) increased willingness of White majorities to elect minority…

  12. TOKEN DESEGREGATION AND BEYOND.

    ERIC Educational Resources Information Center

    MORLAND, J. KENNETH

    TOKEN INTEGRATION IN THE SOUTH HAD ESSENTIALLY THE SAME GOAL THAT MASSIVE RESISTANCE HAD: IT TRIED TO PRESERVE THE ALREADY ESTABLISHED SEGREGATION. THOUGH IT MET THE DEMANDS OF THE SUPREME COURT BY ALLOWING SOME INTEGRATION, IT STILL MAINTAINED DE FACTO SEGREGATION. METHODS OF KEEPING SCHOOL INTEGRATION AT A TOKEN LEVEL INCLUDED PUPIL PLACEMENT…

  13. DESEGREGATION--A COMMUNITY DESIGN.

    ERIC Educational Resources Information Center

    SCHERMER, GEORGE

    ALTHOUGH DISCRIMINATION AND EXCLUSION ARE DOOMED ACCORDING TO THE LAW, THERE APPEARS TO BE CONSIDERABLE DISTINCTION IN MEN'S MINDS BETWEEN ABSENCE OF DISCRIMINATION ON THE ONE HAND AND AFFIRMATIVE MEASURES TO IMPLEMENT INTEGRATION ON THE OTHER. PHILADELPHIA HAS A GREAT DEAL OF DE FACTO SEGREGATION. AS THE POPULATION GROWS, NEGROES AND WHITES…

  14. Nana for a New Generation

    ERIC Educational Resources Information Center

    Sellers, Denise T.

    2010-01-01

    In this article, the author reflects on all she received from playing at her grandmother's (Nana) house, where Nana was in essence, a volunteer childcare provider, overseeing neighborhood children during their out-of-school time. This was "de facto" childcare. Parents knew and trusted Nana, and the children loved her. Out-of-school time…

  15. In the Eye of the Examinee: Likable Examiners Interfere with Performance

    ERIC Educational Resources Information Center

    Vormittag, Isabella; Ortner, Tuulia M.

    2014-01-01

    We investigated effects of examiners' ascribed likability and examiners' gender on test performance during a standardized face-to-face testing situation assessing self-estimated and de facto verbal knowledge. One hundred fourteen nonpsychology students were individually tested by one of 22 examiners. A moderated regression analysis…

  16. EDUCATING THE CULTURALLY DEPRIVED IN THE GREAT CITIES.

    ERIC Educational Resources Information Center

    KAPLAN, BERNARD A.; AND OTHERS

    A SERIES OF ARTICLES FEATURES THE EDUCATION OF THE CULTURALLY DISADVANTAGED IN URBAN AREAS. SUBJECTS RANGE FROM CONSIDERATION OF GENERAL ISSUES INVOLVED (IDENTIFICATION OF THE DISADVANTAGED, DE FACTO SEGREGATION, FINANCIAL ASSISTANCE, INVOLVEMENT OF THE FAMILY), TO SPECIFIC PROBLEMS FACING THE INNER-CITY TEACHER AND A PERSONAL EXPERIENCE OF A…

  17. 23 CFR 1208.3 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... (including a club which is de facto open to the public). The term does not apply to the possession of alcohol..., nurse, hospital or medical institution; in private clubs or establishments; or to the sale, handling, transport, or service in dispensing of any alcoholic beverage pursuant to lawful employment of a person...

  18. 23 CFR 1208.3 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... (including a club which is de facto open to the public). The term does not apply to the possession of alcohol..., nurse, hospital or medical institution; in private clubs or establishments; or to the sale, handling, transport, or service in dispensing of any alcoholic beverage pursuant to lawful employment of a person...

  19. 23 CFR 1208.3 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... (including a club which is de facto open to the public). The term does not apply to the possession of alcohol..., nurse, hospital or medical institution; in private clubs or establishments; or to the sale, handling, transport, or service in dispensing of any alcoholic beverage pursuant to lawful employment of a person...

  20. 23 CFR 1208.3 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... (including a club which is de facto open to the public). The term does not apply to the possession of alcohol..., nurse, hospital or medical institution; in private clubs or establishments; or to the sale, handling, transport, or service in dispensing of any alcoholic beverage pursuant to lawful employment of a person...

  1. Dissect, Design, and Customize the Curriculum

    ERIC Educational Resources Information Center

    Tienken, Christopher H.

    2013-01-01

    Education bureaucrats in 45 states have approved the Common Core State Standards ([CCSS], 2010) as the de facto national curriculum. The implementation of the CCSS will be monitored by a national standardized test in language arts and mathematics. The confluence of a standardized curriculum enforced with a standardized test will entrench a…

  2. Preparing School Leaders: Standards-Based Curriculum in the United States

    ERIC Educational Resources Information Center

    Young, Michelle D.; Anderson, Erin; Nash, Angel Miles

    2017-01-01

    For the last few decades, leadership standards have served as a de facto "recommended curriculum" for preparation programs in the United States. In this article, we: (1) share the new National Educational Leadership Preparation (NELP) standards, (2) present the literature supporting the standards, and (3) critically assess the influence…

  3. OrChem - An open source chemistry search engine for Oracle(R).

    PubMed

    Rijnbeek, Mark; Steinbeck, Christoph

    2009-10-22

    Registration, indexing and searching of chemical structures in relational databases is one of the core areas of cheminformatics. However, little detail has been published on the inner workings of search engines and their development has been mostly closed-source. We decided to develop an open source chemistry extension for Oracle, the de facto database platform in the commercial world. Here we present OrChem, an extension for the Oracle 11G database that adds registration and indexing of chemical structures to support fast substructure and similarity searching. The cheminformatics functionality is provided by the Chemistry Development Kit. OrChem provides similarity searching with response times in the order of seconds for databases with millions of compounds, depending on a given similarity cut-off. For substructure searching, it can make use of multiple processor cores on today's powerful database servers to provide fast response times in equally large data sets. OrChem is free software and can be redistributed and/or modified under the terms of the GNU Lesser General Public License as published by the Free Software Foundation. All software is available via http://orchem.sourceforge.net.
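
    Fingerprint-based similarity searching of the kind OrChem provides is commonly built on the Tanimoto coefficient with a similarity cut-off. The sketch below is a generic Python illustration of that technique, not OrChem's actual Oracle/PL/SQL API; the fingerprint sets and function names are invented for the example.

```python
def tanimoto(fp_a, fp_b):
    """Tanimoto coefficient between two fingerprint bit sets."""
    inter = len(fp_a & fp_b)
    union = len(fp_a | fp_b)
    return inter / union if union else 1.0

def similarity_search(query_fp, db, cutoff=0.7):
    """Return (compound_id, score) pairs at or above the cutoff, best first."""
    hits = [(cid, tanimoto(query_fp, fp)) for cid, fp in db.items()]
    return sorted([h for h in hits if h[1] >= cutoff], key=lambda h: -h[1])
```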

  4. Time Series Expression Analyses Using RNA-seq: A Statistical Approach

    PubMed Central

    Oh, Sunghee; Song, Seongho; Grabowski, Gregory; Zhao, Hongyu; Noonan, James P.

    2013-01-01

    RNA-seq is becoming the de facto standard approach for transcriptome analysis with ever-reducing cost. It has considerable advantages over conventional technologies (microarrays) because it allows for direct identification and quantification of transcripts. Many time series RNA-seq datasets have been collected to study the dynamic regulations of transcripts. However, statistically rigorous and computationally efficient methods are needed to explore the time-dependent changes of gene expression in biological systems. These methods should explicitly account for the dependencies of expression patterns across time points. Here, we discuss several methods that can be applied to model timecourse RNA-seq data, including statistical evolutionary trajectory index (SETI), autoregressive time-lagged regression (AR(1)), and hidden Markov model (HMM) approaches. We use three real datasets and simulation studies to demonstrate the utility of these dynamic methods in temporal analysis. PMID:23586021

  5. Time series expression analyses using RNA-seq: a statistical approach.

    PubMed

    Oh, Sunghee; Song, Seongho; Grabowski, Gregory; Zhao, Hongyu; Noonan, James P

    2013-01-01

    RNA-seq is becoming the de facto standard approach for transcriptome analysis with ever-reducing cost. It has considerable advantages over conventional technologies (microarrays) because it allows for direct identification and quantification of transcripts. Many time series RNA-seq datasets have been collected to study the dynamic regulations of transcripts. However, statistically rigorous and computationally efficient methods are needed to explore the time-dependent changes of gene expression in biological systems. These methods should explicitly account for the dependencies of expression patterns across time points. Here, we discuss several methods that can be applied to model timecourse RNA-seq data, including statistical evolutionary trajectory index (SETI), autoregressive time-lagged regression (AR(1)), and hidden Markov model (HMM) approaches. We use three real datasets and simulation studies to demonstrate the utility of these dynamic methods in temporal analysis.
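
    Of the methods listed, the AR(1) idea is the simplest to sketch: each expression value is regressed on its lag-1 predecessor, so the fitted coefficient captures time-point dependence. The following mean-centered least-squares estimate is a minimal illustration of that idea, not the published AR(1) time-lagged regression method.

```python
def fit_ar1(series):
    """Least-squares estimate of phi in x_t = phi * x_{t-1} + e_t,
    after mean-centering the series. Returns 0.0 for a constant series."""
    m = sum(series) / len(series)
    x = [v - m for v in series]
    num = sum(x[t] * x[t - 1] for t in range(1, len(x)))
    den = sum(v * v for v in x[:-1])
    return num / den if den else 0.0
```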

  6. An Arab NATO in the Making Middle Eastern Military Cooperation Since 2011

    DTIC Science & Technology

    2016-09-01

    with a focus on strategy and security. In addition to monitoring post-conflict developments in Iraq, Lebanon, and Libya, she researches Arab...non-Arab states such as Iran, Turkey, or post-conflict Israel. Announced at the 2010 Sirte Summit, the League's Arab Neighborhood Policy has failed...preceding decade. For example, Saudi Arabia had increased its air force to 305 fighter jets—and currently has a de facto monopoly on Airborne

  7. Large Scale Data Analytics of User Behavior for Improving Content Delivery

    DTIC Science & Technology

    2014-12-01

    video streaming, web browsing ... The Internet is fast becoming the de facto content delivery network of the world...operators everywhere and they seek to design and manage their networks better to improve content delivery and provide better quality of experience...

  8. Clear, Hold, and Build: The Role of Culture in the Creation of Local Security Forces

    DTIC Science & Technology

    2006-05-25

    welcome a national agency as a neutral arbitrator of local (especially inter-tribal) disputes if it is seen as scrupulously neutral and honest and...practice of military dictatorship. As used here it refers to the tendency for state security institutions to become the de facto or de jure dominant...or inter-tribal or political faction violence. These, in turn, may accelerate local problems and trends; if not balanced between necessity and reform

  9. Ada Quality and Style: Guidelines for Professional Programmers

    DTIC Science & Technology

    1989-01-01

    ...parts of typical header comment blocks. Including other, de facto extraneous or superfluous information is a waste of time. Most of the information...specification and to export only what is necessary for another unit to use the package properly. Visibility of objects such as DEFAULT... in package TEXT_IO

  10. Development of Avionics Installation Interface Standards. Revision.

    DTIC Science & Technology

    1981-08-01

    requirements for new avionics in the Navy during the period 1985 to 1990, however, will be the F-18 program, which is design-committed (and which will probably...programs that will continue late into the 1980s. Avionics programs currently in development will establish a de facto functional baseline as well...the equipment, appropriate sensors must be included at the cooling-air inlet to detect air-flow conditions directly, or to detect excessive heat

  11. PROGRESS AND PROBLEMS FROM THE SOCIAL SCIENTIST'S VIEWPOINT.

    ERIC Educational Resources Information Center

    CLARK, KENNETH

    REVIEWED FROM A SOCIAL SCIENTIST'S VIEWPOINT IS THE EFFECT OF THE SUPREME COURT'S 1954 BROWN DECISION ON PATTERNS OF DE FACTO SEGREGATION IN NORTHERN COMMUNITIES. THE DECISION HAD PROFOUND EFFECTS ON DE FACTO SEGREGATION, PARTICULARLY IN RELATION TO THE DEMOCRATIC IDEALS OF EQUALITY AND TO THE DAMAGED SELF-IMAGE CREATED BY SEGREGATED SCHOOLS. IT…

  12. Kiswahili as Vehicle of Unity and Development in the Great Lakes Region

    ERIC Educational Resources Information Center

    Kishe, Anna M.

    2003-01-01

    This paper discusses the potentiality of Kiswahili in accelerating social, political, economic and cultural integration within the Great Lakes Region. Presently, Kiswahili is a "de facto" lingua franca spoken by almost 100 million people in the world (Ntakirutimana, 2000). This is an indication of its viability in promoting unity among…

  13. Separation as an Important Risk Factor for Suicide: A Systematic Review

    ERIC Educational Resources Information Center

    Ide, Naoko; Wyder, Marianne; Kolves, Kairi; De Leo, Diego

    2010-01-01

    Examining how different phases of relationship separation affect the development of suicidal behaviors has been largely ignored in suicide studies. The few studies conducted suggest that individuals experiencing the acute phase of marital/de facto separation may be at greater risk of suicide compared with those experiencing long-term separation…

  14. Whose Citizenship Education? Hong Kong from a Spatial and Cultural Politics Perspective

    ERIC Educational Resources Information Center

    Tse, Thomas Kwan-choi

    2007-01-01

    Citizenship (education) is "de facto" a political and spatial concept and should be considered in local, national, and global contexts. Adopting a spatial and cultural politics perspective and with the dynamic formation of Hong Kong's citizenship education as a case study, this article tries to illustrate the politics at three different…

  15. Factors affecting people's response to invasive species management

    Treesearch

    Paul H. Gobster

    2011-01-01

    Natural areas managers contend with an increasingly diverse array of invasive species in their mission to conserve the health and integrity of ecosystems under their charge. As users, nearby neighbours and de facto 'owners' of the lands where many significant natural areas reside, the public is often highly supportive of broad programme goals for management...

  16. Repurposing Education

    ERIC Educational Resources Information Center

    McClung, Merle

    2013-01-01

    The economic purpose of getting a job, or getting into college in order to get a better job, has evolved into the de facto primary purpose of K-12 (and higher) education. Business model solutions are seen by businessmen as the answer to education problems. But the business models they advocate and help fund are not a good fit for education…

  17. From Crisis to Empowerment: African American Women in Community Colleges

    ERIC Educational Resources Information Center

    Bates, Marcie Ann

    2012-01-01

    Social challenges tear at the fabric of the African American family, revealing complexities that identify a de facto leader, the African American woman. She exists in a chasm of overt circumstances which heavily influences her successes. The purpose of this study is to identify factors that motivated seven female African American community college…

  18. 76 FR 62349 - Preliminary Results Freshwater Crawfish Tail Meat From the People's Republic of China: of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-07

    ... value by Xiping Opeck Food Co., Ltd., our analysis of the applicable transactions requires additional... Co., Ltd. (Shanghai Ocean Flavor), Xiping Opeck Food Co., Ltd. (Xiping Opeck), Xuzhou Jinjiang... analysis of de facto control is critical in determining whether the respondents are, in fact, subject to a...

  19. The Debate over National Testing. ERIC Digest.

    ERIC Educational Resources Information Center

    Boston, Carol

    This digest presents various views of the federal role in testing and offers a brief examination of the National Assessment of Educational Progress (NAEP), the "nation's report card," in its national sample format and its state administration, which critics fear has the potential to become a de facto national test if selected as the…

  20. Brown versus Board at 62: Marching Back into the Future

    ERIC Educational Resources Information Center

    Brown Henderson, Cheryl; Brown, Steven M.

    2017-01-01

    Sixty-two years after the "Brown" decision, American schools are collapsing under the weight of an antiquated system of school finance, pockets of poverty, and a "Black and Browning" urban core. This article focuses on the "march backwards" to the de facto re-segregation of our nation's public schools. In 2016, the…

  1. Universal Patterns or the Tale of Two Systems? Mathematics Achievement and Educational Expectations in Post-Socialist Europe

    ERIC Educational Resources Information Center

    Bodovski, Katerina; Kotok, Stephen; Henck, Adrienne

    2014-01-01

    Although communist ideology claimed to destroy former class stratification based on labour market capitalist relationships, "de facto" during socialism one social class hierarchy was substituted for another that was equally unequal. The economic transition during the 1990s increased stratification by wealth, which affected educational…

  2. 47 CFR 1.9030 - Long-term de facto transfer leasing arrangements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... use restrictions apply to the spectrum lessee as well. (4) Designated entity/entrepreneur rules. (i) A licensee that holds a license pursuant to small business and/or entrepreneur provisions (see § 1.2110 and... mobile voice and/or data services; (B) The licensee is not a designated entity or entrepreneur subject to...

  3. Icelandic for Adult Foreigners: Effects of Imposing an Icelandic Language Test

    ERIC Educational Resources Information Center

    Innes, Pamela; Skaptadóttir, Unnur Dís

    2017-01-01

    Legislation linking language course attendance and passage of a language test for residence visas and citizenship, respectively, was enacted in Iceland in the early 2000s. Curricular guidelines and the language test were developed as a result. Research in other countries suggests such structures cause teachers to create "de facto"…

  4. The Rise and Fall of School Integration in Israel: Research and Policy Analysis

    ERIC Educational Resources Information Center

    Resh, Nura; Dar, Yechezkel

    2012-01-01

    School integration (desegregation) was introduced in Israeli junior high schools in 1968 with the aim of increasing educational equality and decreasing (Jewish) ethnic divides. While never officially abandoned, a "de facto" retreat from this policy has been observed since the early 1990s, despite the voluminous research that revealed its…

  5. Melinda: De Facto Primary School Music Teacher

    ERIC Educational Resources Information Center

    de Vries, Peter

    2013-01-01

    A series of reviews dating back to the 1960s and a body of research literature points to the inadequate delivery of music education by generalist primary school teachers in Australian schools. Despite recommendations for specialist music teachers to teach music in all Australian primary schools to counter this ongoing trend, such an approach has…

  6. Evaluating the Impact of Different Early Literacy Interventions on Low-Income Costa Rican Kindergarteners

    ERIC Educational Resources Information Center

    Rolla San Francisco, Andrea; Arias, Melissa; Villers, Renata; Snow, Catherine

    2006-01-01

    Grade retention has been the de facto policy for children with academic difficulties in many Latin American countries [Schiefelbein, E., & Wolff, L. (1992). "Repetition and inadequate achievement in Latin America's primary schools: a review of magnitudes, causes, relationships, and strategies." Washington, DC: World Bank.]. In Costa…

  7. Evaluation of reference crop evapotranspiration methods in arid, semi-arid and humid regions

    USDA-ARS?s Scientific Manuscript database

    It is necessary to find a simpler method in different climatic regions to calculate reference crop evapotranspiration (ETo) since the application of the FAO-56 Penman-Monteith method is often restricted due to unavailability of a full weather data set. Seven ETo methods, the de facto standard FAO-56...

  8. Orthopedic board certification and physician performance: an analysis of medical malpractice, hospital disciplinary action, and state medical board disciplinary action rates.

    PubMed

    Kocher, Mininder S; Dichtel, Laura; Kasser, James R; Gebhardt, Mark C; Katz, Jeffery N

    2008-02-01

    Specialty board certification status has become the de facto standard of competency by which the profession and the public recognize physician specialists. However, the relationship between orthopedic board certification and physician performance has not been established. Rates of medical malpractice claims, hospital disciplinary actions, and state medical board disciplinary actions were compared between 1309 board-certified (BC) and 154 non-board-certified (NBC) orthopedic surgeons in 3 states. There was no significant difference between BC and NBC surgeons in medical malpractice claim proportions (BC, 19.1%; NBC, 16.9%; P = .586) or in hospital disciplinary action proportions (BC, 0.9%; NBC, 0.8%; P = 1.000). There was a significantly higher proportion of state medical board disciplinary action for NBC surgeons (BC, 7.6%; NBC, 13.0%; P = .028). An association between board certification status and physician performance is necessary to validate its status as the de facto standard of competency. In this study, BC surgeons had lower rates of state medical board disciplinary action.
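
    Proportion comparisons of the kind reported here (e.g., 7.6% vs. 13.0%) are typically tested with a chi-square statistic on a 2x2 table. The sketch below shows the standard Pearson statistic without continuity correction; the cell counts in the usage line are hypothetical, not the study's data.

```python
def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic for the 2x2 table [[a, b], [c, d]],
    computed with the shortcut formula n*(ad - bc)^2 / row and column totals."""
    n = a + b + c + d
    num = n * (a * d - b * c) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den if den else 0.0

# Hypothetical counts: disciplined / not disciplined for two groups.
stat = chi2_2x2(20, 10, 10, 20)
```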

  9. Welfare State Replacements: Deinstitutionalization, Privatization and the Outsourcing to Immigrant Women Enterprise.

    PubMed

    Nazareno, Jennifer

    2018-04-01

    The U.S. government has a long tradition of providing direct care services to many of its most vulnerable citizens through market-based solutions and subsidized private entities. The privatized welfare state has led to the continued displacement of some of our most disenfranchised groups in need of long-term care. Situated after the U.S. deinstitutionalization era, this is the first study to examine how immigrant Filipino women emerged as owners of de facto mental health care facilities that cater to the displaced, impoverished, severely mentally ill population. These immigrant women-owned businesses serve as welfare state replacements, overseeing the health and illness of these individuals by providing housing, custodial care, and medical services after the massive closure of state mental hospitals that occurred between 1955 and 1980. This study explains the onset of these businesses and the challenges that one immigrant group faces as owners, the meanings of care associated with their de facto mental health care enterprises, and the conditions under which they have operated for more than 40 years.

  10. PARVMEC: An Efficient, Scalable Implementation of the Variational Moments Equilibrium Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seal, Sudip K; Hirshman, Steven Paul; Wingen, Andreas

    The ability to sustain magnetically confined plasma in a state of stable equilibrium is crucial for optimal and cost-effective operations of fusion devices like tokamaks and stellarators. The Variational Moments Equilibrium Code (VMEC) is the de-facto serial application used by fusion scientists to compute magnetohydrodynamics (MHD) equilibria and study the physics of three dimensional plasmas in confined configurations. Modern fusion energy experiments have larger system scales with more interactive experimental workflows, both demanding faster analysis turnaround times on computational workloads that are stressing the capabilities of sequential VMEC. In this paper, we present PARVMEC, an efficient, parallel version of its sequential counterpart, capable of scaling to thousands of processors on distributed memory machines. PARVMEC is a non-linear code, with multiple numerical physics modules, each with its own computational complexity. A detailed speedup analysis supported by scaling results on 1,024 cores of a Cray XC30 supercomputer is presented. Depending on the mode of PARVMEC execution, speedup improvements of one to two orders of magnitude are reported. PARVMEC equips fusion scientists for the first time with a state-of-the-art capability for rapid, high fidelity analyses of magnetically confined plasmas at unprecedented scales.
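
    Scaling results like those reported for PARVMEC are often read against Amdahl's-law bounds: the serial fraction of a code caps the speedup no matter how many cores are added. The function below is a generic model for interpreting such numbers, not the paper's own speedup analysis.

```python
def amdahl_speedup(parallel_fraction, n_procs):
    """Amdahl's-law upper bound on speedup for a code whose runtime is
    `parallel_fraction` parallelizable and the rest strictly serial."""
    p = parallel_fraction
    return 1.0 / ((1.0 - p) + p / n_procs)
```

    For instance, even a 99%-parallel code is bounded at roughly 91x on 1,024 cores, which is why two-orders-of-magnitude speedups require a very small serial fraction.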

  11. Quantifying pathogen risks associated with potable reuse: A risk assessment case study for Cryptosporidium.

    PubMed

    Amoueyan, Erfaneh; Ahmad, Sajjad; Eisenberg, Joseph N S; Pecson, Brian; Gerrity, Daniel

    2017-08-01

    This study evaluated the reliability and equivalency of three different potable reuse paradigms: (1) surface water augmentation via de facto reuse with conventional wastewater treatment; (2) surface water augmentation via planned indirect potable reuse (IPR) with ultrafiltration, pre-ozone, biological activated carbon (BAC), and post-ozone; and (3) direct potable reuse (DPR) with ultrafiltration, ozone, BAC, and UV disinfection. A quantitative microbial risk assessment (QMRA) was performed to (1) quantify the risk of infection from Cryptosporidium oocysts; (2) compare the risks associated with different potable reuse systems under optimal and sub-optimal conditions; and (3) identify critical model/operational parameters based on sensitivity analyses. The annual risks of infection associated with the de facto and planned IPR systems were generally consistent with those of conventional drinking water systems [mean of (9.4 ± 0.3) × 10^-5 to (4.5 ± 0.1) × 10^-4], while DPR was clearly superior [mean of (6.1 ± 67) × 10^-9 during sub-optimal operation]. Because the advanced treatment train in the planned IPR system was highly effective in reducing Cryptosporidium concentrations, the associated risks were generally dominated by the pathogen loading already present in the surface water. As a result, risks generally decreased with higher recycled water contributions (RWCs). Advanced treatment failures were generally inconsequential, either due to the robustness of the advanced treatment train (i.e., DPR) or the resiliency provided by the environmental buffer (i.e., planned IPR). Storage time in the environmental buffer was important for the de facto reuse system, and the model indicated a critical storage time of approximately 105 days. Storage times shorter than the critical value resulted in significant increases in risk.
The conclusions from this study can be used to inform regulatory decision making and aid in the development of design or operational criteria for IPR and DPR systems. Copyright © 2017 Elsevier Ltd. All rights reserved.
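
    The risk figures above rest on standard QMRA building blocks. A minimal sketch, assuming an exponential dose-response model and independent daily exposures (the parameter r and the dose below are illustrative placeholders, not the study's fitted values):

```python
import math

# Sketch of two standard QMRA steps (not the paper's full model):
# an exponential dose-response for Cryptosporidium, and conversion of a
# per-event infection probability to an annual risk. The value of r is
# illustrative; published estimates vary by oocyst isolate.

def p_infection(dose_oocysts: float, r: float = 0.09) -> float:
    """Exponential dose-response: P = 1 - exp(-r * dose)."""
    return 1.0 - math.exp(-r * dose_oocysts)

def annual_risk(daily_risk: float, exposures_per_year: int = 365) -> float:
    """Assume independent daily exposures of equal magnitude."""
    return 1.0 - (1.0 - daily_risk) ** exposures_per_year

daily = p_infection(1e-5)   # hypothetical mean dose of 1e-5 oocysts/day
print(annual_risk(daily))
```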

  12. Russo-Japanese Relations and the Future of the U.S.-Japanese Alliance

    DTIC Science & Technology

    1993-01-01

    return of the southern Kurils is the Russian sense of threat from the U.S. force structure in the area, coupled with the existence of the U.S. military...intransigence by continuing to allude to a de facto For a historical review of the evolution of their decisionmaking system, see Hiarty Gdninn, De jam...Foreign Minister Mikhail Kapitsa. Kovalenko was head of the Japan section of the Central Committee International Department from 1963, and a deputy chief

  13. Counterinsurgency on the Ground in Afghanistan. How Different Units Adapted to Local Conditions

    DTIC Science & Technology

    2010-11-01

    insurgents were defeated at the hands of the Marines. From the day the Marines arrived, they executed a deliberate campaign plan developed by the platoon...The downside of the oil spot approach was that it left many areas under de facto Taliban rule. For example, the Dutch followed an oil spot strategy... plan in 2006 was to oil spot out from Helmand's provincial capital, Lashkar Gah, but when the Taliban attacked all of the towns in northern Helmand

  14. OpenID Connect as a security service in cloud-based medical imaging systems

    PubMed Central

    Ma, Weina; Sartipi, Kamran; Sharghigoorabi, Hassan; Koff, David; Bak, Peter

    2016-01-01

    The evolution of cloud computing is driving the next generation of medical imaging systems. However, privacy and security concerns have consistently been regarded as the major obstacles to the adoption of cloud computing by healthcare domains. OpenID Connect, which combines OpenID and OAuth, is an emerging representational state transfer-based federated identity solution. It is among the most widely adopted open standards and is positioned to become the de facto standard for securing cloud computing and mobile applications; it has even been called the "Kerberos of the cloud." We introduce OpenID Connect as an authentication and authorization service in cloud-based diagnostic imaging (DI) systems, and propose enhancements that allow this technology to be incorporated within distributed enterprise environments. The objective of this study is to offer solutions for secure sharing of medical images among diagnostic imaging repositories (DI-r) and heterogeneous picture archiving and communication systems (PACS), as well as Web-based and mobile clients, in the cloud ecosystem. The approach uses OpenID Connect's open-source single sign-on and authorization services in a user-centric manner, so that deploying DI-r and PACS to private or community clouds provides security levels equivalent to the traditional computing model. PMID:27340682
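
    An ID token in OpenID Connect is a JSON Web Token (JWT) whose middle segment carries the identity claims. A minimal, stdlib-only sketch of inspecting that payload (it deliberately skips signature verification, which any real deployment must perform against the provider's published JWKS keys; the demo token below is fabricated for illustration):

```python
import base64
import json

# Decode the claims segment of a JWT. This is for inspection only:
# it does NOT verify the signature, expiry, issuer, or audience.

def decode_jwt_payload(token: str) -> dict:
    payload_b64 = token.split(".")[1]
    payload_b64 += "=" * (-len(payload_b64) % 4)   # restore stripped padding
    return json.loads(base64.urlsafe_b64decode(payload_b64))

# Example with an unsigned demo token (header {"alg":"none"}):
header = base64.urlsafe_b64encode(b'{"alg":"none"}').decode().rstrip("=")
payload = base64.urlsafe_b64encode(b'{"sub":"alice"}').decode().rstrip("=")
print(decode_jwt_payload(f"{header}.{payload}.")["sub"])  # prints: alice
```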

  15. Role of PROLOG (Programming and Logic) in natural-language processing. Report for September-December 1987

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McHale, M.L.

    The field of artificial intelligence strives to produce computer programs that exhibit intelligent behavior. One of the areas of interest is the processing of natural language. This report discusses the role of the computer language PROLOG in Natural Language Processing (NLP) from both theoretic and pragmatic viewpoints. The reasons for using PROLOG for NLP are numerous. First, linguists can write natural-language grammars almost directly as PROLOG programs; this allows fast prototyping of NLP systems and facilitates analysis of NLP theories. Second, semantic representations of natural-language texts that use logic formalisms are readily produced in PROLOG because of PROLOG's logical foundations. Third, PROLOG's built-in inferencing mechanisms are often sufficient for inferences on the logical forms produced by NLPs. Fourth, the logical, declarative nature of PROLOG may make it the language of choice for parallel computing systems. Finally, the fact that PROLOG has a de facto standard (Edinburgh) makes the porting of code from one computer system to another virtually trouble-free. Perhaps the strongest tie one could make between NLP and PROLOG was stated by John Stuart Mill in his inaugural address at St. Andrews: The structure of every sentence is a lesson in logic.
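
    The claim that grammars map almost directly onto programs is easiest to see with Prolog's definite clause grammars, where one writes rules like `sentence --> noun_phrase, verb_phrase.` A rough transliteration of that idea into Python (the toy grammar and combinators are illustrative, not from the report): each nonterminal is a function that consumes words from the front of a list and returns the remainder, or None on failure.

```python
# Toy DCG-style parser combinators: a full parse succeeds when the
# remainder is the empty list; None signals a failed parse.

def terminal(word):
    def parse(words):
        return words[1:] if words and words[0] == word else None
    return parse

def seq(*parsers):
    """Sequence nonterminals, like the comma in a Prolog DCG body."""
    def parse(words):
        for p in parsers:
            if words is None:
                return None
            words = p(words)
        return words
    return parse

noun_phrase = seq(terminal("the"), terminal("cat"))
verb_phrase = seq(terminal("sleeps"))
sentence = seq(noun_phrase, verb_phrase)   # sentence --> noun_phrase, verb_phrase.

print(sentence(["the", "cat", "sleeps"]))  # prints: []  (full parse)
```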

  16. Averting Denver Airports on a Chip

    NASA Technical Reports Server (NTRS)

    Sullivan, Kevin J.

    1995-01-01

    As a result of recent advances in software engineering capabilities, we are now in a more stable environment. De-facto hardware and software standards are emerging. Work on software architecture and design patterns signals a consensus on the importance of early system-level design decisions, and agreements on the uses of certain paradigmatic software structures. We now routinely build systems that would have been risky or infeasible a few years ago. Unfortunately, technological developments threaten to destabilize software design again. Systems designed around novel computing and peripheral devices will spark ambitious new projects that will stress current software design and engineering capabilities. Micro-electro-mechanical systems (MEMS) and related technologies provide the physical basis for new systems with the potential to produce this kind of destabilizing effect. One important response to anticipated software engineering and design difficulties is carefully directed engineering-scientific research. Two specific problems meriting substantial research attention are: A lack of sufficient means to build software systems by generating, extending, specializing, and integrating large-scale reusable components; and a lack of adequate computational and analytic tools to extend and aid engineers in maintaining intellectual control over complex software designs.

  17. Application and Evaluation of Interactive 3D PDF for Presenting and Sharing Planning Results for Liver Surgery in Clinical Routine

    PubMed Central

    Newe, Axel; Becker, Linda; Schenk, Andrea

    2014-01-01

    Background & Objectives: The Portable Document Format (PDF) is the de-facto standard for the exchange of electronic documents. It is platform-independent, suitable for the exchange of medical data, and allows for the embedding of three-dimensional (3D) surface mesh models. In this article, we present the first clinical routine application of interactive 3D surface mesh models which have been integrated into PDF files for the presentation and the exchange of Computer Assisted Surgery Planning (CASP) results in liver surgery. We aimed to prove the feasibility of applying 3D PDF in medical reporting and investigated the user experience with this new technology. Methods: We developed an interactive 3D PDF report document format and implemented a software tool to create these reports automatically. After more than 1000 liver CASP cases that have been reported in clinical routine using our 3D PDF report, an international user survey was carried out online to evaluate the user experience. Results: Our solution enables the user to interactively explore the anatomical configuration and to have different analyses and various resection proposals displayed within a 3D PDF document covering only a single page that acts more like a software application than like a typical PDF file ("PDF App"). The new 3D PDF report offers many advantages over the previous solutions. According to the results of the online survey, the users have assessed the pragmatic quality (functionality, usability, perspicuity, efficiency) as well as the hedonic quality (attractiveness, novelty) very positively. Conclusion: The usage of 3D PDF for reporting and sharing CASP results is feasible and well accepted by the target audience. Using interactive PDF with embedded 3D models is an enabler for presenting and exchanging complex medical information in an easy and platform-independent way. Medical staff as well as patients can benefit from the possibilities provided by 3D PDF.
Our results open the door for a wider use of this new technology, since the basic idea can and should be applied for many medical disciplines and use cases. PMID:25551375

  18. Application and evaluation of interactive 3D PDF for presenting and sharing planning results for liver surgery in clinical routine.

    PubMed

    Newe, Axel; Becker, Linda; Schenk, Andrea

    2014-01-01

    The Portable Document Format (PDF) is the de-facto standard for the exchange of electronic documents. It is platform-independent, suitable for the exchange of medical data, and allows for the embedding of three-dimensional (3D) surface mesh models. In this article, we present the first clinical routine application of interactive 3D surface mesh models which have been integrated into PDF files for the presentation and the exchange of Computer Assisted Surgery Planning (CASP) results in liver surgery. We aimed to prove the feasibility of applying 3D PDF in medical reporting and investigated the user experience with this new technology. We developed an interactive 3D PDF report document format and implemented a software tool to create these reports automatically. After more than 1000 liver CASP cases that have been reported in clinical routine using our 3D PDF report, an international user survey was carried out online to evaluate the user experience. Our solution enables the user to interactively explore the anatomical configuration and to have different analyses and various resection proposals displayed within a 3D PDF document covering only a single page that acts more like a software application than like a typical PDF file ("PDF App"). The new 3D PDF report offers many advantages over the previous solutions. According to the results of the online survey, the users have assessed the pragmatic quality (functionality, usability, perspicuity, efficiency) as well as the hedonic quality (attractiveness, novelty) very positively. The usage of 3D PDF for reporting and sharing CASP results is feasible and well accepted by the target audience. Using interactive PDF with embedded 3D models is an enabler for presenting and exchanging complex medical information in an easy and platform-independent way. Medical staff as well as patients can benefit from the possibilities provided by 3D PDF. 
Our results open the door for a wider use of this new technology, since the basic idea can and should be applied for many medical disciplines and use cases.

  19. Moving the Field Forward: A Micro-Meso-Macro Model for Critical Language Planning. The Case of Estonia

    ERIC Educational Resources Information Center

    Skerrett, Delaney Michael

    2016-01-01

    This study investigates "de facto" language policy in Estonia. It investigates how language choices at the micro (or individual) level are negotiated within the macro (or social and historical) context: how official language policy and other features of the discursive environment surrounding language and its use in Estonia translate into…

  20. Innovative Educational Technology Programs in Low- and Middle-Income Countries

    ERIC Educational Resources Information Center

    Trucano, Michael

    2017-01-01

    For many people, the use of technology in education constitutes a de facto "innovation." Whether or not this belief is actually accurate, or useful, is a legitimate question for discussion. That said, there is no denying that many of the educational innovations celebrated (or at least touted) today are enabled by the use of such…

  1. The New Latino South and the Challenge to Public Education: Strategies for Educators and Policymakers in Emerging Immigrant Communities

    ERIC Educational Resources Information Center

    Wainer, Andrew

    2004-01-01

    The lack of resources devoted to educating Latinos in emerging immigrant communities is generating negative educational outcomes and de facto educational segregation in the South. While Latino immigrants continue to dominate employment in the meat processing, service, and construction sectors in these communities, they are underrepresented on…

  2. Line and Rabble: Drill, Doctrine, and Military Books in Revolutionary America

    DTIC Science & Technology

    2018-04-20

    professionalism demanded the de facto wholesale adoption of European military knowledge. Yet, in addition to more sophisticated works suited for...angustiis, feroci hostium saevitiae, belloque crudeli ex inopinato patriae nostrae illato debitis, maxime accommodata. (Philadelphiae: Ex officina...civitatum pertinentis ; hodiernae nostrae inopiae rerumque angustiis, feroci hostium saevitiae, belloque crudeli ex inopinato patriae nostrae illato

  3. 78 FR 75563 - Commission Policies and Procedures Under the Communications Act, Foreign Investment in Broadcast...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-12

    ...; see also CBI May 28, 2013, Ex Parte at 1. CBI members comprise national broadcast networks, radio and... alien de facto control and real-party-in-interest issues for section 310(b)(4) compliance). \\21\\ See, e... Designs Compromised by Chinese Cyberspies, by Ellen Nakashima, The Washington Post (May 27, 2013). IV...

  4. 48 CFR 652.237-73 - Statement of Qualifications for Preference as a U.S. Person.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... de facto joint venture with no written agreement. To be considered a “qualified joint venture person...-venturer agrees to be individually responsible for performance of the contract, notwithstanding the terms...: (ii) Type of return (e.g., income tax, franchise tax, etc.). Include all that apply: 3. Section 136(d...

  5. The Discursive Construction of Superintendent Statesmanship on Twitter

    ERIC Educational Resources Information Center

    Hurst, Todd M.

    2017-01-01

    The modern school superintendent fulfills a unique role in the U.S. public education system. He or she is structurally empowered as the de facto head of the local educational system, thereby granted with a certain amount of trust and authority regarding educational issues. At the same time, the superintendent is, in most cases, an employee of a…

  6. Affirmative School Integration: Efforts to Overcome De Facto Segregation in Urban Schools.

    ERIC Educational Resources Information Center

    Hill, Roscoe, Ed.; Feeley, Malcolm, Ed.

    This book contains abbreviated accounts of eight community case studies and various reviews of a cluster of recent studies relating to race and education. The foreword discusses three phases of school integration, and the introductory chapter relates law, violence, and civil rights. The eight case studies on Evanston, Berkeley, New Haven,…

  7. Ability-Based Criteria and the Lower Class Student: The De Facto Screw.

    ERIC Educational Resources Information Center

    Amato, Josephine; Backman, Carl B.

    Social class discrimination in the schooling process is less a product of teachers' attitudes than of the systems by which schools group students. Although class bias has been demonstrated in grading practices and other key decision points, numerous studies have shown that teachers are less to blame for inequitable learning situations than…

  8. Arnold's Advantages: How Governor Schwarzenegger Acquired English through De Facto Bilingual Education

    ERIC Educational Resources Information Center

    Ramos, Francisco; Krashen, Stephen

    2013-01-01

    Governor Arnold Schwarzenegger has repeatedly mentioned that immigrants to the United States should do what he did to acquire English: Avoid using their first languages and speak, listen to, and read a vast amount of materials in English--a combination he referred to as "immersion." Yet, Schwarzenegger's real path to successful English…

  9. The Librarian Lion: Constructing Children's Literature through Connections, Capital, and Criticism (1906-1941)

    ERIC Educational Resources Information Center

    Martens, Marianne

    2013-01-01

    While much has been written about the pioneering children's librarian Anne Carroll Moore, little has been written about her role as a "de facto" literary agent. As such, Moore was an innovator not only in children's librarianship, but also in the field of children's publishing. This paper analyzes Moore's letters at the Manuscripts and…

  10. Keeping up with Our Students: The Evolution of Technology and Standards in Art Education

    ERIC Educational Resources Information Center

    Patton, Ryan M.; Buffington, Melanie L.

    2016-01-01

    This article addresses the standards of technology in the visual arts, arguing the standards function as de facto policy, the guidelines that shape what teachers teach. In this study, we investigate how art education standards approach technology as a teaching tool and artmaking medium, analyzing the current National Visual Arts Standards, the…

  11. Effects of Resource Allocation on Student Academic Achievement and Self-Perceptions of Success in an Urban Setting

    ERIC Educational Resources Information Center

    Harris, Kimberly

    2014-01-01

    Despite Civil Rights legislation, now 50 years old, "de facto" segregation based on socioeconomic factors, such as poverty and ethnicity, in urban areas translates into the surrounding schools, with a legacy of limited funding, reduced services, and teachers with limited training to successfully engage students in high-poverty areas. This study…

  12. Influence of an Education Abroad Program on the Intercultural Sensitivity of STEM Undergraduates: A Mixed Methods Study

    ERIC Educational Resources Information Center

    Demetry, Chrysanthe; Vaz, Richard F.

    2017-01-01

    Education abroad programs are becoming more common as a mechanism for developing the global competencies of engineering graduates. An increasing body of research shows that intercultural learning does not occur "de facto" in such programs. This study used quantitative and qualitative methods to explore changes in students' intercultural…

  13. Distinctive Curriculum Materials in K-6 Social Studies. Elementary Subjects Center Series No. 35.

    ERIC Educational Resources Information Center

    Brophy, Jere

    In a previous report, the author critiqued the 1988 Silver Burdette & Ginn elementary social studies series (Silver Burdett & Ginn Social Studies), treating it as a representative example of what has been called the de facto national curriculum in elementary social studies. The present report begins with brief critiques of three other market-share…

  14. American College Biology and Zoology Course Requirements: A de facto Standardized Curriculum.

    ERIC Educational Resources Information Center

    Heppner, Frank; And Others

    Without a formal mechanism to produce consensus, American colleges generally have come to agree on what constitutes an appropriate set of course requirements for Biology and Zoology majors. This report describes a survey of American four-year colleges and universities offering biology and/or zoology degrees. Questionnaires were sent to 741 biology…

  15. Well-Being Is a Process of Becoming: Respondent-Led Research with Organic Farmers in Madagascar

    ERIC Educational Resources Information Center

    Farnworth, Cathy Rozel

    2009-01-01

    Malagasy "players"--farmers, middle men, organic organisations and policy makers--see in export-orientated organic agriculture a way for Madagascar to build upon its historic export strengths: spices, essential oils, medicinal plants and tropical fruits. They point to the "de facto" organic status of most farming in the country…

  16. Transgenic Crops and Sustainable Agriculture in the European Context

    ERIC Educational Resources Information Center

    Ponti, Luigi

    2005-01-01

    The rapid adoption of transgenic crops in the United States, Argentina, and Canada stands in strong contrast to the situation in the European Union (EU), where a de facto moratorium has been in place since 1998. This article reviews recent scientific literature relevant to the problematic introduction of transgenic crops in the EU to assess if…

  17. The Minority Languages Dilemmas in Turkey: A Critical Approach to an Emerging Literature

    ERIC Educational Resources Information Center

    Ozfidan, Burhan; Burlbaw, Lynn M.; Aydin, Hasan

    2018-01-01

    Turkey comprises many ethnic groups other than Turks including, but not limited to, Armenians, Assyrians, Alevi, Arabs, Circassians, Greeks, Kurds, Laz, and Zaza. These groups are ethnically different from Turks and were incorporated into the Ottoman Empire's eastern provinces with de facto autonomy. The main objective of this study is to…

  18. De Facto Language Policy in Legislation Defining Adult Basic Education in the United States

    ERIC Educational Resources Information Center

    Vanek, Jenifer

    2016-01-01

    This paper investigates the impact of differing interpretation of federal education policy in three different states. The policy, the Workforce Investment Act Title II, has defined the services provided for adult English language learners (ELLs) enrolled in Adult Basic Education programs in the United States since it was passed in 1998. At the…

  19. Questioning the Fidelity of the "Next Generation Science Standards" for Astronomy and Space Sciences Education

    ERIC Educational Resources Information Center

    Slater, Stephanie J.; Slater, Timothy F.

    2015-01-01

    Although the Next Generation Science Standards (NGSS) are not federally mandated national standards or performance expectations for K-12 schools in the United States, they stand poised to become a de facto national science and education policy, as state governments, publishers of curriculum materials, and assessment providers across the country…

  20. De Facto Language Education Policy through Teachers' Attitudes and Practices: A Critical Ethnographic Study in Three Jamaican Schools

    ERIC Educational Resources Information Center

    Nero, Shondel J.

    2014-01-01

    Using Jamaica, a former British colony where Jamaican Creole (JC) is the mass vernacular but Standard Jamaican English is the official language, as an illustrative case, this critical ethnographic study in three Jamaican schools examines the theoretical and practical challenges of language education policy (LEP) development and implementation in…

  1. Choosing estimands in clinical trials with missing data.

    PubMed

    Mallinckrodt, Craig; Molenberghs, Geert; Rathmann, Suchitrita

    2017-01-01

    Recent research has fostered new guidance on preventing and treating missing data. Consensus exists that clear objectives should be defined along with the causal estimands; trial design and conduct should maximize adherence to the protocol-specified interventions; and a sensible primary analysis should be used along with plausible sensitivity analyses. Two general categories of estimands are effects of the drug as actually taken (de facto, effectiveness) and effects of the drug if taken as directed (de jure, efficacy). Motivated by examples, we argue that no single estimand is likely to meet the needs of all stakeholders and that each estimand has strengths and limitations. Therefore, stakeholder input should be part of an iterative study development process that includes choosing estimands that are consistent with trial objectives. To this end, an example is used to illustrate the benefit from assessing multiple estimands in the same study. A second example illustrates that maximizing adherence reduces sensitivity to missing data assumptions for de jure estimands but may reduce generalizability of results for de facto estimands if efforts to maximize adherence in the trial are not feasible in clinical practice. A third example illustrates that whether or not data after initiation of rescue medication should be included in the primary analysis depends on the estimand to be tested and the clinical setting. We further discuss the sample size and total exposure to placebo implications of including post-rescue data in the primary analysis. Copyright © 2016 John Wiley & Sons, Ltd.

  2. Flexible Description Language for HPC based Processing of Remote Sense Data

    NASA Astrophysics Data System (ADS)

    Nandra, Constantin; Gorgan, Dorian; Bacu, Victor

    2016-04-01

    When talking about Big Data, the most challenging aspect lies in processing the data in order to gain new insight, find new patterns, and extract knowledge. This problem is perhaps most apparent in the case of Earth Observation (EO) data. With ever higher numbers of data sources and increasing data acquisition rates, dealing with EO data is indeed a challenge [1]. Geoscientists should address this challenge by using flexible and efficient tools and platforms. To answer this trend, the BigEarth project [2] aims to combine the advantages of high-performance computing solutions with flexible processing description methodologies in order to reduce both task execution times and task definition time and effort. As a component of the BigEarth platform, WorDeL (Workflow Description Language) [3] is intended to offer a flexible, compact, and modular approach to the task definition process. WorDeL, unlike other description alternatives such as Python or shell scripts, is oriented towards the description of processing topologies, using them as abstractions for the processing programs. This feature is intended to make it an attractive alternative for users lacking programming experience. By promoting modular designs, WorDeL not only makes the processing descriptions more user-readable and intuitive, but also helps organize the processing tasks into independent sub-tasks, which can be executed in parallel on multi-processor platforms in order to improve execution times. As a BigEarth platform [4] component, WorDeL represents the means by which the user interacts with the system, describing processing algorithms in terms of existing operators and workflows [5], which are ultimately translated into sets of executable commands. The WorDeL language has been designed to help in the definition of compute-intensive batch tasks which can be distributed and executed on high-performance, cloud, or grid-based architectures in order to improve processing time.
    Main references for further information:
    [1] Gorgan, D., "Flexible and Adaptive Processing of Earth Observation Data over High Performance Computation Architectures", International Conference and Exhibition Satellite 2015, August 17-19, Houston, Texas, USA.
    [2] BigEarth project - flexible processing of big Earth data over high performance computing architectures. http://cgis.utcluj.ro/bigearth (2014).
    [3] Nandra, C., Gorgan, D., "Workflow Description Language for Defining Big Earth Data Processing Tasks", Proceedings of the Intelligent Computer Communication and Processing (ICCP), IEEE Press, pp. 461-468 (2015).
    [4] Bacu, V., Stefan, T., Gorgan, D., "Adaptive Processing of Earth Observation Data on Cloud Infrastructures Based on Workflow Description", Proceedings of the Intelligent Computer Communication and Processing (ICCP), IEEE Press, pp. 444-454 (2015).
    [5] Mihon, D., Bacu, V., Colceriu, V., Gorgan, D., "Modeling of Earth Observation Use Cases through the KEOPS System", Proceedings of the Intelligent Computer Communication and Processing (ICCP), IEEE Press, pp. 455-460 (2015).
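
    The workflow-as-topology idea can be sketched, hypothetically, as a DAG of named operators executed in dependency order (the node names and operator bodies below are invented for illustration, and this is not WorDeL's actual syntax): independent nodes at the same level could run in parallel once their inputs are ready.

```python
from graphlib import TopologicalSorter

# Hypothetical illustration of a workflow topology: each node names its
# dependencies and an operator that consumes its dependencies' results.
workflow = {
    "ndvi":      {"deps": [], "op": lambda d: "ndvi-band"},
    "threshold": {"deps": ["ndvi"], "op": lambda d: f"mask({d['ndvi']})"},
    "stats":     {"deps": ["ndvi"], "op": lambda d: f"hist({d['ndvi']})"},
    "report":    {"deps": ["threshold", "stats"],
                  "op": lambda d: f"report({d['threshold']}, {d['stats']})"},
}

ts = TopologicalSorter({name: spec["deps"] for name, spec in workflow.items()})
results = {}
for node in ts.static_order():   # 'threshold' and 'stats' are independent
    deps = {d: results[d] for d in workflow[node]["deps"]}
    results[node] = workflow[node]["op"](deps)

print(results["report"])  # prints: report(mask(ndvi-band), hist(ndvi-band))
```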

  3. A spline-based approach for computing spatial impulse responses.

    PubMed

    Ellis, Michael A; Guenther, Drake; Walker, William F

    2007-05-01

    Computer simulations are an essential tool for the design of phased-array ultrasonic imaging systems. FIELD II, which determines the two-way temporal response of a transducer at a point in space, is the current de facto standard for ultrasound simulation tools. However, the need often arises to obtain two-way spatial responses at a single point in time, a set of dimensions for which FIELD II is not well optimized. This paper describes an analytical approach for computing the two-way, far-field, spatial impulse response from rectangular transducer elements under arbitrary excitation. The described approach determines the response as the sum of polynomial functions, making computational implementation quite straightforward. The proposed algorithm, named DELFI, was implemented as a C routine under Matlab and results were compared to those obtained under similar conditions from the well-established FIELD II program. Under the specific conditions tested here, the proposed algorithm was approximately 142 times faster than FIELD II for computing spatial sensitivity functions with similar amounts of error. For temporal sensitivity functions with similar amounts of error, the proposed algorithm was about 1.7 times slower than FIELD II using rectangular elements and 19.2 times faster than FIELD II using triangular elements. DELFI is shown to be an attractive complement to FIELD II, especially when spatial responses are needed at a specific point in time.
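
    The "sum of polynomial functions" formulation the abstract describes is indeed straightforward to implement; a generic sketch (the coefficients below are invented for illustration, not DELFI's actual polynomials) evaluates each polynomial with Horner's rule and sums the results.

```python
# Generic sum-of-polynomials evaluation via Horner's rule.

def horner(coeffs, x):
    """Evaluate a polynomial; coeffs run from highest to lowest degree."""
    result = 0.0
    for c in coeffs:
        result = result * x + c
    return result

def sum_of_polynomials(poly_list, x):
    """Sum several polynomials evaluated at the same point x."""
    return sum(horner(coeffs, x) for coeffs in poly_list)

# Two illustrative segments: (x^2 + 1) + (2x - 3) at x = 2 gives 5 + 1 = 6.
print(sum_of_polynomials([[1, 0, 1], [2, -3]], 2.0))  # prints: 6.0
```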

  4. Democratisation or Management and Corporate Capture?: Theses on the Governance Crisis of Australia's Semi-Privatised Public Universities

    ERIC Educational Resources Information Center

    Bonnell, Andrew G.

    2016-01-01

    This paper proceeds from the view that managerial capture has already become a fundamental problem after a couple of decades of largely untrammelled managerialism in our public universities, and that this problem is likely to be compounded by further shifts towards deregulation and de facto privatisation, which is the direction that current…

  5. The Case for a Joint Evaluation

    DTIC Science & Technology

    2017-01-01

    intelligence, and engineering. Finally, the comparative time expended by the combatant commanders (CCDRs) on fulfilling four different evaluation...template for the joint-centric construct would align with the four de facto sections noted earlier: an identification section, a performance metric...intangible or have not been properly researched. For example, under one evaluation system, a Servicemember’s separation or retirement into a post

  6. The Hermeneutical Function of the Family in Right Understanding of Catholic Social Teaching and Its Use for Catholic University Education

    ERIC Educational Resources Information Center

    Anderson, Justin

    2016-01-01

    In this paper, I argue that a "de facto" politicizing approach to the principles of Catholic Social Teaching miscasts several qualities of that body of teaching, leading to several negative prejudices. As a remedy to this politicization, I propose a "familial hermeneutical" approach that renders the principles of Catholic…

  7. The Growth of Foreign Qualification Suppliers in Sri Lanka: "de facto" Decentralisation?

    ERIC Educational Resources Information Center

    Little, Angela W.; Evans, Jane

    2005-01-01

    Based mainly on a study of newspaper adverts for qualifications and tuition courses in Sri Lanka over a period from 1965 to 2000, this paper describes a decentralisation of control over the supply of qualifications. It is argued that this has occurred not through a deliberate policy mechanism to decentralise qualifications, but rather by default,…

  8. Oral Societies and Colonial Experiences: Sub-Saharan Africa and the "de-facto" Power of the Written Word

    ERIC Educational Resources Information Center

    Abdi, Ali A.

    2007-01-01

    Pre-colonial traditional societies in Sub-Saharan Africa were mostly oral societies whose languages were not written. In the African context, especially, it was clear that the mostly oral traditions of these societies' languages were neither being appreciated nor promoted as media of communication, or means of education by the invading Europeans.…

  9. Global Trends and Research Aims for English Academic Oral Presentations: Changes, Challenges, and Opportunities for Learning Technology

    ERIC Educational Resources Information Center

    Barrett, Neil E.; Liu, Gi-Zen

    2016-01-01

    English has become the de facto language for communication in academia in many parts of the world, but English language learners often lack the language resources to make effective oral academic presentations. However, English for academic purposes (EAP) research is beginning to provide valuable insights into this emerging field. This literature…

  10. Composition, structure, and dynamics of a mature, unmanaged, pine-dominated old-field stand in southeastern Arkansas

    Treesearch

    Don C. Bragg; Eric Heitzman

    2009-01-01

    This study describes the composition and structure of a mature, second-growthPinus taeda (Loblolly Pine) and Pinus echinata (Shortleaf Pine)-dominatedold-field stand. Now owned by the University of Arkansas, this 22.5-ha parcel justoutside of the city of Monticello, AR, has been protected as a de facto natural area

  11. Architecture Knowledge for Evaluating Scalable Databases

    DTIC Science & Technology

    2015-01-16

    problems, arising from the proliferation of new data models and distributed technologies for building scalable, available data stores. Architects must...longer are relational databases the de facto standard for building data repositories. Highly distributed, scalable “NoSQL” databases [11] have emerged...This is especially challenging at the data storage layer. The multitude of competing NoSQL database technologies creates a complex and rapidly

  12. The Eclipse of Imagination within Educational "Official" Framework and Why It Should Be Returned to Educational Discourse: A Deweyan Perspective

    ERIC Educational Resources Information Center

    d'Agnese, Vasco

    2017-01-01

    In recent decades, the shift towards the "learnification" of educational discourse has "de facto" reframed educational purposes and schooling practice, thus reframing what students should know, strive for, and, in a sense, be. In this paper, given the efforts to disrupt the dominance of learning discourse, I seek to engage…

  13. European Union's Moratorium Impact on Food Biotechnology: A Discussion-Based Scenario

    ERIC Educational Resources Information Center

    Snyder, Lori Unruh; Gallo, Maria; Fulford, Stephen G.; Irani, Tracy; Rudd, Rick; DiFino, Sharon M.; Durham, Timothy C.

    2008-01-01

    Genetically modified (GM) crops such as maize (Zea mays L.), cotton (Gossypium hirsutum L.), soybean [Glycine max (L.) Moench], and canola (Brassica rapa L.) have been widely adopted by American farmers. In spite of their use in the United States, the European Union (EU) imposed a 6-year de facto moratorium (1998-2004) on the cultivation/import of…

  14. Exploration of Peer Leader Verbal Behaviors as They Intervene with Small Groups in College General Chemistry

    ERIC Educational Resources Information Center

    Kulatunga, Ushiri; Lewis, Jennifer E.

    2013-01-01

    Current literature has emphasized the lack of research into verbal behaviors of teachers as a barrier to understanding the effectiveness of instructional interventions. This study focuses on the verbal behaviors of peer leaders, who serve as de facto teachers in a college chemistry teaching reform based on cooperative learning. Video data obtained…

  15. They Say We Suck: The Failure of IPEDS Graduation Rates to Fully Measure Student Success

    ERIC Educational Resources Information Center

    Weber, Jennifer Kathryn

    2017-01-01

    IPEDS graduation rates have become de facto means for higher education accountability in the United States, used by the federal government, state and local agencies, non-profits and media to compare and rank institutions. IPEDS uses a limited subset of students, as well as an institutional perspective to measure graduation rate. Under this model,…

  16. F-16 MMC Strafe in Mountainous Terrain

    DTIC Science & Technology

    2016-04-01

    Figure 10. Steep fast (480 KIAS) high-angle strafe...a fast (480 KIAS in this example) approach varies from the recovery radius (4000 feet in this case) and down track range prior to recovery initiation...Steep fast (480 KIAS) high-angle strafe Alternate Analysis The F-16 community, to some extent, has adopted high-angle strafe as the de facto strafe

  17. Tenured Faculty at Colleges and Universities in the United States: A De Facto Private Membership Club

    ERIC Educational Resources Information Center

    Spanbauer, Julie M.

    2009-01-01

    There has been a gradual increase at U.S. universities and colleges in the appointment of women to full time faculty positions with women currently comprising approximately 40% of full time faculty. When status, job security, and institutional affiliation are taken into account, the percentage drops significantly: Women occupy only 24% of tenured…

  18. The De Facto National Curriculum in Elementary Social Studies: Critique of a Representative Sample. Elementary Subjects Center Series No. 17.

    ERIC Educational Resources Information Center

    Brophy, Jere

    Despite scholarly disagreement about the nature and purposes of social studies education, the most widely adopted elementary social studies textbook series tend to be remarkably uniform, consisting of compendia of facts organized within the expanding communities curriculum structure. Content selection and explication tend to be guided primarily by…

  19. Russian in Estonia's Public Sector: "Playing on the Borderline" between Official Policy and Real-Life Needs

    ERIC Educational Resources Information Center

    Berezkina, Maimu

    2017-01-01

    This article examines the use of Russian in state communication in officially monolingual Estonia. Drawing on interviews with high-level public employees in four central state institutions and an analysis of these institutions' websites, the article shows that while Russian is not specifically mentioned in the laws, it is "de facto"…

  20. Neo-Liberalism, Globalization, Language Policy and Practice Issues in the Asia-Pacific Region

    ERIC Educational Resources Information Center

    Majhanovich, Suzanne

    2014-01-01

    By the beginning of the twenty-first century, the English language had become the de facto "lingua franca" of the modern world. It is the most popular second or foreign language studied, such that now there are more people who have learned English as a second language and speak it with some competence than there are native English…

  1. Mother's Schooling, Fertility, and Children's Education: Evidence from a Natural Experiment. NBER Working Paper No. 16856

    ERIC Educational Resources Information Center

    Lavy, Victor; Zablotsky, Alexander

    2011-01-01

    This paper studies the effect of mothers' education on their fertility and their children's schooling. We base our evidence on a natural experiment that sharply reduced the cost of attending school and, as a consequence, significantly increased the education of affected cohorts. This natural experiment was the result of the de facto revocation in…

  2. Assessing Grand Strategies: How the EU and NATO Rock the Strategic Boat

    DTIC Science & Technology

    2015-05-01

    part to constitute the embryos of the European Union and NATO grand strategies briefly illustrated in the extracts presented in the epigraph. These...embryos gestated over time, becoming de facto grand strategies for these institutions. Applying the characterization of grand strategy elaborated in...6. Michio Kaku, Physics of the Future: How Science Will Shape Human Destiny and Our Daily

  3. MRUniNovo: an efficient tool for de novo peptide sequencing utilizing the hadoop distributed computing framework.

    PubMed

    Li, Chuang; Chen, Tao; He, Qiang; Zhu, Yunping; Li, Kenli

    2017-03-15

    Tandem mass spectrometry-based de novo peptide sequencing is a complex and time-consuming process. Current algorithms for de novo peptide sequencing cannot rapidly and thoroughly process large mass spectrometry datasets. In this paper, we propose MRUniNovo, a novel tool for parallel de novo peptide sequencing. MRUniNovo parallelizes UniNovo on the Hadoop computing platform. Our experimental results demonstrate that MRUniNovo significantly reduces the computation time of de novo peptide sequencing without sacrificing the correctness and accuracy of the results, and thus can process very large datasets that UniNovo cannot. MRUniNovo is an open-source software tool implemented in Java. The source code and parameter settings are available at http://bioinfo.hupo.org.cn/MRUniNovo/index.php. Contact: s131020002@hnu.edu.cn; taochen1019@163.com. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
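
    The map/reduce decomposition that Hadoop provides for such workloads can be sketched in a few lines. Everything here is a stand-in (a thread pool instead of a Hadoop cluster, a dummy sequencer instead of UniNovo); it shows only the partitioning pattern, not MRUniNovo's actual code:

```python
from collections import defaultdict
from multiprocessing.dummy import Pool   # thread pool standing in for a cluster

def sequence_spectrum(spectrum):
    """Stand-in for de novo sequencing of one spectrum (dummy result)."""
    sid, peaks = spectrum
    return sid, "PEPTIDE-%d" % len(peaks)

def map_reduce(spectra, workers=4):
    # Map phase: spectra are independent, so they parallelize cleanly
    # across workers; on Hadoop each worker would be a cluster node.
    with Pool(workers) as pool:
        mapped = pool.map(sequence_spectrum, spectra)
    # Reduce phase: gather per-spectrum results by spectrum id.
    reduced = defaultdict(list)
    for sid, peptide in mapped:
        reduced[sid].append(peptide)
    return dict(reduced)
```

    Because each spectrum is sequenced independently, the speedup scales with the number of workers, which is what makes the Hadoop port effective on very large datasets.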

  4. OrChem - An open source chemistry search engine for Oracle®

    PubMed Central

    2009-01-01

    Background Registration, indexing and searching of chemical structures in relational databases is one of the core areas of cheminformatics. However, little detail has been published on the inner workings of search engines and their development has been mostly closed-source. We decided to develop an open source chemistry extension for Oracle, the de facto database platform in the commercial world. Results Here we present OrChem, an extension for the Oracle 11G database that adds registration and indexing of chemical structures to support fast substructure and similarity searching. The cheminformatics functionality is provided by the Chemistry Development Kit. OrChem provides similarity searching with response times in the order of seconds for databases with millions of compounds, depending on a given similarity cut-off. For substructure searching, it can make use of multiple processor cores on today's powerful database servers to provide fast response times in equally large data sets. Availability OrChem is free software and can be redistributed and/or modified under the terms of the GNU Lesser General Public License as published by the Free Software Foundation. All software is available via http://orchem.sourceforge.net. PMID:20298521
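
    The similarity searching described above is typically fingerprint-based. A minimal sketch of the idea follows, with invented compound ids and bit positions (real systems such as OrChem derive bit fingerprints from structures via the Chemistry Development Kit):

```python
# Tanimoto similarity screening over fingerprints, represented here as plain
# Python sets of "on" bit positions. Ids and bits are made up for illustration.

def tanimoto(fp_a, fp_b):
    """Tanimoto coefficient: |intersection| / |union| of the on-bits."""
    union = len(fp_a | fp_b)
    return len(fp_a & fp_b) / union if union else 0.0

def similarity_search(query_fp, database, cutoff=0.7):
    """Return (id, score) pairs at or above the cutoff, best first."""
    hits = ((cid, tanimoto(query_fp, fp)) for cid, fp in database.items())
    return sorted((h for h in hits if h[1] >= cutoff), key=lambda h: -h[1])
```

    Lowering the cutoff admits more candidates and hence more verification work, which is the similarity-cutoff/response-time trade-off the abstract mentions.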

  5. OpenID connect as a security service in Cloud-based diagnostic imaging systems

    NASA Astrophysics Data System (ADS)

    Ma, Weina; Sartipi, Kamran; Sharghi, Hassan; Koff, David; Bak, Peter

    2015-03-01

    The evolution of cloud computing is driving the next generation of diagnostic imaging (DI) systems. Cloud-based DI systems are able to deliver better services to patients without being constrained by their own physical facilities. However, privacy and security concerns have consistently been regarded as the major obstacle to the adoption of cloud computing in healthcare domains. Furthermore, the traditional computing models and interfaces employed by DI systems are not ready for accessing diagnostic images through mobile devices. RESTful is an ideal technology for provisioning both mobile services and cloud computing. OpenID Connect, which combines OpenID and OAuth, is an emerging REST-based federated identity solution. It is one of the most promising open standards, with the potential to become the de facto standard for securing cloud computing and mobile applications; it has been called the "Kerberos of the Cloud". We introduce OpenID Connect as an identity and authentication service in cloud-based DI systems and propose enhancements that allow this technology to be incorporated within a distributed enterprise environment. The objective of this study is to offer solutions for secure radiology image sharing among a DI-r (Diagnostic Imaging Repository), heterogeneous PACS (Picture Archiving and Communication Systems), and mobile clients in the cloud ecosystem. By using OpenID Connect as an open-source identity and authentication service, deploying DI-r and PACS to private or community clouds should attain a security level equivalent to that of the traditional computing model.
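
    The first step of the OpenID Connect authorization code flow — redirecting the user agent to the provider's authorization endpoint — can be sketched as follows. The endpoint and client values are placeholders; the parameter names themselves come from the OIDC/OAuth 2.0 specifications:

```python
from urllib.parse import urlencode

def build_authorization_url(authz_endpoint, client_id, redirect_uri, state):
    """Compose an OpenID Connect authorization request URL (code flow)."""
    params = {
        "response_type": "code",   # authorization code flow
        "scope": "openid",         # the "openid" scope marks an OIDC request
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "state": state,            # opaque CSRF token, echoed back unchanged
    }
    return authz_endpoint + "?" + urlencode(params)

# Hypothetical DI-r mobile client redirecting to a hospital identity provider:
url = build_authorization_url("https://idp.example.org/authorize",
                              "dir-mobile-client",
                              "https://pacs.example.org/callback",
                              "af0ifjsldkj")
```

    After the user authenticates, the provider redirects back to redirect_uri with a one-time code that the client exchanges for an ID token; the DI-r and PACS then only need to validate that token rather than manage credentials themselves.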

  6. The Cyborg Astrobiologist: testing a novelty detection algorithm on two mobile exploration systems at Rivas Vaciamadrid in Spain and at the Mars Desert Research Station in Utah

    NASA Astrophysics Data System (ADS)

    McGuire, P. C.; Gross, C.; Wendt, L.; Bonnici, A.; Souza-Egipsy, V.; Ormö, J.; Díaz-Martínez, E.; Foing, B. H.; Bose, R.; Walter, S.; Oesker, M.; Ontrup, J.; Haschke, R.; Ritter, H.

    2010-01-01

    In previous work, a platform was developed for testing computer-vision algorithms for robotic planetary exploration. This platform consisted of a digital video camera connected to a wearable computer for real-time processing of images at geological and astrobiological field sites. The real-time processing included image segmentation and the generation of interest points based upon uncommonness in the segmentation maps. Also in previous work, this platform was ported to a more ergonomic alternative, consisting of a phone camera connected via the Global System for Mobile Communications (GSM) network to a remote-server computer. The wearable-computer platform has been tested at geological and astrobiological field sites in Spain (Rivas Vaciamadrid and Riba de Santiuste), and the phone camera has been tested at a geological field site in Malta. In this work, we (i) apply a Hopfield neural-network algorithm for novelty detection based upon colour, (ii) integrate a field-capable digital microscope on the wearable-computer platform, (iii) test this novelty detection with the digital microscope at Rivas Vaciamadrid, (iv) develop a Bluetooth communication mode for the phone-camera platform in order to allow access to a mobile processing computer at the field sites, and (v) test the novelty detection on the Bluetooth-enabled phone camera connected to a netbook computer at the Mars Desert Research Station in Utah. This systems engineering and field testing have together allowed us to develop a real-time computer-vision system that is capable, for example, of identifying lichens as novel within a series of images acquired in semi-arid desert environments. We acquired sequences of images of geologic outcrops in Utah and Spain, consisting of various rock types and colours, to test this algorithm. The algorithm robustly recognized previously observed units by their colour, while requiring only a single image or a few images to learn colours as familiar, demonstrating its fast learning capability.
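
    The novelty-detection idea — colours seen during training become familiar, unseen colours flag novelty — can be sketched with a simple histogram memory. This stand-in is not the Hopfield network used in the paper; it only illustrates the learn-then-flag pattern:

```python
import numpy as np

# Toy colour-novelty detector: quantise RGB pixels into a coarse 3-D colour
# histogram; bins touched during training become "familiar", and the novelty
# of a new image is the fraction of its pixels in never-seen bins.
class ColourNovelty:
    def __init__(self, bins=8):
        self.bins = bins
        self.seen = np.zeros((bins, bins, bins), dtype=bool)

    def _quantise(self, pixels):
        """Map (N, 3) float RGB in [0, 1] to integer bin indices."""
        return np.clip((pixels * self.bins).astype(int), 0, self.bins - 1)

    def learn(self, image):
        q = self._quantise(image)
        self.seen[q[:, 0], q[:, 1], q[:, 2]] = True

    def novelty(self, image):
        q = self._quantise(image)
        familiar = self.seen[q[:, 0], q[:, 1], q[:, 2]]
        return 1.0 - familiar.mean()       # fraction of novel pixels
```

    After a few training images of ordinary outcrop, an image containing a previously unseen colour (say, a lichen's green) yields a high novelty score, mirroring the fast learning behaviour reported above.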

  7. Lawful Hacking: Toward a Middle-Ground Solution to the Going Dark Problem

    DTIC Science & Technology

    2017-03-01

    2. 9/11 Terrorist Attacks...3. The Rise of Encryption Post-Snowden’s...taking pictures, posting pictures, shopping, conducting business, and searching the web. To investigate these 100,000 [the number of criminal cases his...non-U.S. person who does not have strong ties to the United States, FISA is the de facto Fourth Amendment limitation on government’s domestic

  8. The De-Facto Privatization of Secondary Education in Egypt: A Study of Private Tutoring in Technical and General Schools

    ERIC Educational Resources Information Center

    Sobhy, Hania

    2012-01-01

    Most secondary school students in Egypt enrol in private tutoring in almost all subjects throughout the school year. A large proportion of students have stopped attending school altogether due to their reliance on tutoring. This study of how educational markets are perpetuated at school level finds that in the technical track catering to the…

  9. A STATEMENT OF CONCERN FOR PUBLIC EDUCATION IN PHILADELPHIA, WITH PARTICULAR REFERENCE TO THE SPECIAL NEEDS OF CHILDREN IN UNDERPRIVILEGED, SEGREGATED AREAS.

    ERIC Educational Resources Information Center

    Commission on Human Relations, Philadelphia, PA.

    THIS 1960 STATEMENT OF THE COMMISSION ON HUMAN RELATIONS DESCRIBES THE PROBLEM OF DE FACTO SEGREGATION IN PHILADELPHIA. THE EXTENT OF SCHOOL AND FACULTY SEGREGATION IS INDICATED, AND THE REASONS FOR ITS EXISTENCE ARE DISCUSSED. IT IS FELT THAT THE CULTURAL DEFICIT WHICH RESULTS FROM RACIAL DISCRIMINATION AND SEGREGATION REQUIRES COMPENSATORY…

  10. School Dropouts or Student Pushouts?; A Case Study of the Possible Violation of Property Rights & Liberties by the de Facto Exclusion of Students From the Public Schools.

    ERIC Educational Resources Information Center

    Berlowitz, Marvin J.; Durand, Henry

    The purpose of this study is to demonstrate that the disproportionate number of poor, minority, and working class students represented among the population of "school dropouts" are, objectively, the victims of an institutional syndrome of systematic exclusion referred to as "the student pushout" phenomenon. Cincinnati's junior…

  11. Three Most Essential Teacher Attributes Needed for Success as Perceived by Teachers in Defacto Segregated Schools.

    ERIC Educational Resources Information Center

    Eubanks, E.

    This paper presents two studies involving 97 teachers from six de facto segregated high schools in a large midwestern school district and their perceptions of the teacher attributes most essential for success in their school. The first study compares the responses of 47 teachers in white high schools with 50 teachers in black high schools on the…

  12. Can Less Be More? Comparison of an 8-Item Placement Quality Measure with the 50-Item Dundee Ready Educational Environment Measure (DREEM)

    ERIC Educational Resources Information Center

    Kelly, Martina; Bennett, Deirdre; Muijtjens, Arno; O'Flynn, Siun; Dornan, Tim

    2015-01-01

    Clinical clerks learn more than they are taught and not all they learn can be measured. As a result, curriculum leaders evaluate clinical educational environments. The quantitative Dundee Ready Environment Measure (DREEM) is a "de facto" standard for that purpose. Its 50 items and 5 subscales were developed by consensus. Reasoning that…

  13. THE CASE AGAINST DE FACTO SEGREGATION EDUCATION IN THE NORTH AND WEST--A CONTEMPORARY CASE STUDY.

    ERIC Educational Resources Information Center

    TILLMAN, JAMES A., JR.

    IT IS THE RESPONSIBILITY OF THE LOCAL GOVERNMENT TO PROVIDE EDUCATION WHICH ENABLES CHILDREN TO ADJUST NORMALLY TO AMERICAN DEMOCRACY. THIS CANNOT BE DONE IN A SCHOOL SYSTEM THAT IS RACIALLY IMBALANCED. THE MINNEAPOLIS HUMAN RELATIONS TASK FORCE FEELS THAT THE BOARD OF EDUCATION MUST NOT WAIT FOR A SOUND COMMUNITY BUT MUST TAKE POSITIVE STEPS TO…

  14. Languages of Science in the Era of Nation-State Formation: The Israeli Universities and Their (Non)Participation in the Revival of Hebrew

    ERIC Educational Resources Information Center

    Kheimets, Nina G.; Epstein, Alek D.

    2005-01-01

    This paper presents sociological analysis of the linguistic and cultural identity of two of Israel's most influential and high-ranked universities during their formative years, that were also the "de facto" formative years of the Israeli state-in-the-making (1924-1948). We argue that the influence of external universal factors on a…

  15. The Freedom of Religion and the Freedom of Education in Twenty-First-Century Belgium: A Critical Approach

    ERIC Educational Resources Information Center

    Franken, Leni

    2016-01-01

    In spite of recent tendencies of secularisation and religious pluralism, most Belgian schools are Catholic schools, where Roman Catholic religious education is a compulsory subject. As we will argue, this can lead to a "de facto" undermining of the freedom of religion and education and a shift in the system is therefore required. In the…

  16. The Divisive Gate-Keeping Role of Languages in Jamaica: Establishing Post Primary Schools as Centres for Immersion in the Target Language

    ERIC Educational Resources Information Center

    Davids, Melva P.

    2013-01-01

    This paper seeks to examine the role of the official and "de facto" languages of anglophone Caribbean islands such as Jamaica. In reflecting on their statuses as users of language, tertiary level students registered in a Year One Performing Arts Program provided much insights into the interplay of both languages that coexist in society…

  17. The present and future of de novo whole-genome assembly.

    PubMed

    Sohn, Jang-Il; Nam, Jin-Wu

    2018-01-01

    With the advent of next-generation sequencing (NGS) technology, various de novo assembly algorithms based on the de Bruijn graph have been developed to construct chromosome-level sequences. However, numerous technical and computational challenges in de novo assembly remain, although many bright ideas and heuristics have been suggested to tackle these challenges in both experimental and computational settings. In this review, we categorize de novo assemblers on the basis of the type of de Bruijn graph (Hamiltonian and Eulerian) and discuss the challenges of de novo assembly for short NGS reads with regard to computational complexity and assembly ambiguity. We then discuss how the limitations of short reads can be overcome by using a single-molecule sequencing platform that generates long reads of up to several kilobases. Indeed, long-read assembly has caused a paradigm shift in whole-genome assembly in terms of algorithms and supporting steps. We also summarize (i) hybrid assemblies using both short and long reads and (ii) overlap-based assemblies for long reads, and discuss their challenges and future prospects. This review provides guidelines for determining the optimal approach for a given input data type, computational budget, or genome. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
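
    The Eulerian de Bruijn construction these assemblers share can be sketched in a few lines: nodes are (k-1)-mers, and every k-mer in a read contributes one edge from its prefix to its suffix. Real assemblers layer error correction, coverage filtering, and graph simplification on top of this core:

```python
from collections import defaultdict

def de_bruijn(reads, k):
    """Build a de Bruijn multigraph: (k-1)-mer -> list of successor (k-1)-mers."""
    graph = defaultdict(list)
    for read in reads:
        for i in range(len(read) - k + 1):
            kmer = read[i:i + k]
            graph[kmer[:-1]].append(kmer[1:])   # edge: prefix -> suffix
    return dict(graph)
```

    For example, de_bruijn(["ACGT"], 3) yields edges AC→CG and CG→GT; an Eulerian walk over such edges re-spells the underlying sequence, which is where repeats shorter than k-1 create the assembly ambiguity discussed above.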

  18. The Effects of Read 180 on Student Achievement

    ERIC Educational Resources Information Center

    Plony, Doreen A.

    2013-01-01

    The purpose of this ex post facto study was to analyze archive data to investigate the effects of Read 180, a computer-based supplemental reading intervention, on students' academic achievement for the academic school year 2011-2012. Further analyses examined if influences existed in variables such as grade level, gender, and ethnicity of the…

  19. Personal Services Contracts. Is It Time to Lift the Ban

    DTIC Science & Technology

    2016-03-01

    Defense AT&L: March-April 2016. Fasko is a professor of Contract...Carbondale and has extensive professional experience in both U.S. Army global logistics services and Veterans Administration personal services...integrated offices. One issue has remained unchanged: the risk of creating a de facto personal services contract due to this relationship.

  20. Enacting English Language Ownership in the Outer Circle: A Study of Singaporean Indians' Orientations to English Norms

    ERIC Educational Resources Information Center

    Rubdy, Rani; Mckay, Sandra Lee; Alsagoff, Lubna; Bokhorst-Heng, Wendy D.

    2008-01-01

    Singapore is unique in that it has not only embraced English as one of its official languages, but has made the language of its colonizers the "de facto" working language of the nation and the sole medium of instruction in all its schools, while assigning its other three official languages, Mandarin, Malay, and Tamil, an L2 status in the…

  1. PLA Reforms and Chinas Nuclear Forces

    DTIC Science & Technology

    2016-10-01

    mission set. In some respects, the formal elevation of the Rocket Force to the level of a service merely codifies its de facto status. The Second...Zhiyuan, then-commander of the Second Artillery, and his navy and air force counterparts became ex officio members of the CMC. Rocket Force...class SSBN, which never conducted a single operational patrol. To the extent that greater operational experience with nuclear weapons increases

  2. Gaols or De Facto Mental Institutions? Why Individuals with a Mental Illness Are Over-Represented in the Criminal Justice System in New South Wales, Australia

    ERIC Educational Resources Information Center

    Henderson, Corinne

    2007-01-01

    The over-representation of people with mental illness in the criminal justice system highlights the need for legislative reform and the implementation of programs breaking the cycle of mental illness, poverty, unemployment and substance abuse across Australia. Whilst there is no inherent association between mental illness and crime, there is a…

  3. Russian Energy Policy vis-a-vis Europe: Natural Resources as a Means of Foreign Policy

    DTIC Science & Technology

    2012-06-01

    that lag. For instance, after the March 2011 Fukushima disaster, German political leadership decided...memories for the rest of my life. I. INTRODUCTION A. BACKGROUND After the resignation of Boris Yeltsin...the first president of the Russian Federation after the dissolution of the USSR, on December 31, 1999, and the de facto appointment of his successor

  4. Machine Learning with Distances

    DTIC Science & Technology

    2015-02-16

    of training class-wise densities p(x|y) to test input density p′(x). For the fitting of qπ to p′, we may use the Kullback-Leibler (KL...Problems of Information Transmission, 23(9):95–101, 1987. [103] S. Kullback and R. A. Leibler. On information and sufficiency. The Annals of...distributions. The Kullback-Leibler (KL) distance is the de facto standard distance measure in statistics and machine learning, because
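
    The KL distance named in the snippet has a simple discrete form, KL(p‖q) = Σᵢ pᵢ log(pᵢ/qᵢ), which can be computed directly (assuming qᵢ > 0 wherever pᵢ > 0):

```python
import math

def kl_divergence(p, q):
    """Discrete KL divergence KL(p || q); q_i must be > 0 wherever p_i > 0.
    Terms with p_i == 0 contribute nothing, by the convention 0*log(0) = 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
```

    Note that KL(p‖q) ≠ KL(q‖p) in general; it is a divergence rather than a metric, which is one reason the machine-learning literature treats it as a "distance" only informally.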

  5. Why Thailand’s Military Stepped In

    DTIC Science & Technology

    2011-03-01

    so the political leader is the de facto moral leader with some divine legitimacy. Mongkut, the monk turned monarch, shifted the cosmology of...might not simply be accepted as reflections of an unchanging cosmology but might be changed by humans through application of knowledge of nature.”...king was allowed all the sources of legitimization he needed. In return, Sarit as prime minister enjoyed the cosmological sanction of the throne

  6. Cross-Border Higher Education for Regional Integration:Analysis of the JICA-RI Survey on Leading Universities in East Asia. JICA-RI Working Paper. No. 26

    ERIC Educational Resources Information Center

    Kuroda, Kazuo; Yuki, Takako; Kang, Kyuwon

    2010-01-01

    Set against the backdrop of increasing economic interdependence in East Asia, the idea of regional integration is now being discussed as a long-term political process in the region. As in the field of the international economy, de facto integration and interdependence exist with respect to the internationalization of the higher education system…

  7. Comparison of USDA Forest Service and stakeholder motivations and experiences in collaborative federal forest governance in the Western United States

    Treesearch

    Emily Jane Davis; Eric M. White; Lee K. Cerveny; David Seesholtz; Meagan L. Nuss; Donald R. Ulrich

    2017-01-01

    In the United States, over 191 million acres of land is managed by the United States Department of Agriculture Forest Service, a federal government agency. In several western U.S. states, organized collaborative groups have become a de facto governance approach to providing sustained input on management decisions on much public land. This is most extensive in Oregon,...

  8. The Illusion of Control: Great Powers Interacting with Tribal Societies and Weak Nation-states

    DTIC Science & Technology

    2009-12-01

    Carnatic region around Pondicherry. The Indian troops, led by French company officers, attacked and defeated the minimally defended British Company... Orange (Netherlands) became William III, King of England, Scotland and Ireland. Refer to Williamson Short History: The Old Colonial Empire Chapter VI...the French absence to extend their influence in the Carnatic region. Simultaneously, the British became the de facto rulers of Bengal. During the

  9. The Ideals of Today’s Modernizing People’s Liberation Army

    DTIC Science & Technology

    2010-06-01

    Approved for public release; distribution is unlimited...was a tutor and teacher for the children of the de facto ruler of Lu, Meng Xizi, where he taught ritual etiquette observed at state religious...appropriately to one’s circumstances and role. One must observe the proper rituals, ceremonies, and protocols to promote social cohesion and individual

  10. Design and Implementation of a Metadata-rich File System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ames, S; Gokhale, M B; Maltzahn, C

    2010-01-19

    Despite continual improvements in the performance and reliability of large scale file systems, the management of user-defined file system metadata has changed little in the past decade. The mismatch between the size and complexity of large scale data stores and their ability to organize and query their metadata has led to a de facto standard in which raw data is stored in traditional file systems, while related, application-specific metadata is stored in relational databases. This separation of data and semantic metadata requires considerable effort to maintain consistency and can result in complex, slow, and inflexible system operation. To address these problems, we have developed the Quasar File System (QFS), a metadata-rich file system in which files, user-defined attributes, and file relationships are all first class objects. In contrast to hierarchical file systems and relational databases, QFS defines a graph data model composed of files and their relationships. QFS incorporates Quasar, an XPATH-extended query language for searching the file system. Results from our QFS prototype show the effectiveness of this approach. Compared to the de facto standard, the QFS prototype shows superior ingest performance and comparable query performance on user metadata-intensive operations and superior performance on normal file metadata operations.
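
    The graph data model described above — files, attributes, and relationships as first-class objects — can be sketched minimally. The class and the find/related helpers below are invented for illustration; QFS itself exposes querying through its XPath-extended Quasar language:

```python
# Toy graph-structured namespace in the spirit of QFS: files carry
# user-defined attribute dicts and typed links to other files, so metadata
# queries need no side-channel relational database.
class MetaFS:
    def __init__(self):
        self.files = {}      # name -> attribute dict
        self.links = []      # (src, relationship, dst) triples

    def add_file(self, name, **attrs):
        self.files[name] = attrs

    def link(self, src, rel, dst):
        self.links.append((src, rel, dst))

    def related(self, name, rel):
        """Follow typed relationship edges out of a file."""
        return [d for s, r, d in self.links if s == name and r == rel]

    def find(self, **attrs):
        """Return files whose attributes match all given key/value pairs."""
        return [n for n, a in self.files.items()
                if all(a.get(k) == v for k, v in attrs.items())]
```

    Keeping attributes and relationships beside the files themselves is what removes the consistency burden of the split file-system-plus-database arrangement the abstract criticizes.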

  11. Cost-of-living indexes and demographic change.

    PubMed

    Diamond, C A

    1990-06-01

The Consumer Price Index (CPI), although not without problems, is the most often used mechanism for adjusting contracts for cost-of-living changes in the US. The US Bureau of Labor Statistics lists several problems associated with using the CPI as a cost-of-living index where the proportion of 2-worker families is increasing, population is shifting, and work week hours are changing. This study shows how to compute cost-of-living indexes which are inexpensive to update, use less restrictive assumptions about consumer preferences, do not require statistical estimation, and handle the problem of increasing numbers of families where both the husband and wife work. This study also examines how widely the CPI in fact varies from alternative true cost-of-living indexes; in the end, this de facto cost-of-living measure holds up quite well. In times of severe price inflation people change their preferences by substitution, necessitating a flexible cost-of-living index that accounts for this fundamental economic behavior.
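The fixed-basket construction the abstract critiques is a Laspeyres index. A minimal sketch with invented prices and quantities shows why it overstates cost-of-living increases when consumers substitute, and how a substitution-aware index (here the Fisher ideal index, used purely as an illustration, not the study's method) falls below it:

```python
# Invented two-good economy: bread doubles in price, the consumer substitutes.
p0 = {"bread": 1.0, "rice": 1.0}   # base-period prices
q0 = {"bread": 10,  "rice": 10}    # base-period quantities
p1 = {"bread": 2.0, "rice": 1.0}   # new prices
q1 = {"bread": 5,   "rice": 15}    # new quantities after substitution

def laspeyres(p0, p1, q0):
    # Cost of the *old* basket at new prices vs old prices (CPI-style).
    return sum(p1[g] * q0[g] for g in q0) / sum(p0[g] * q0[g] for g in q0)

def paasche(p0, p1, q1):
    # Cost of the *new* basket at new prices vs old prices.
    return sum(p1[g] * q1[g] for g in q1) / sum(p0[g] * q1[g] for g in q1)

L, P = laspeyres(p0, p1, q0), paasche(p0, p1, q1)
fisher = (L * P) ** 0.5            # geometric mean: one substitution-aware compromise
assert P <= fisher <= L            # the fixed basket overstates the rise
```

With these invented numbers the old basket costs 50% more (L = 1.5) while the substitution-adjusted rise is smaller, which is the behavior the abstract's flexible indexes are meant to capture.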

  12. SSME Investment in Turbomachinery Inducer Impeller Design Tools and Methodology

    NASA Technical Reports Server (NTRS)

    Zoladz, Thomas; Mitchell, William; Lunde, Kevin

    2010-01-01

Within the rocket engine industry, SSME turbomachines are the de facto standards of success with regard to meeting aggressive performance requirements under challenging operational environments. Over the Shuttle era, SSME has invested heavily in our national inducer impeller design infrastructure. While both low and high pressure turbopump failures/anomaly resolution efforts spurred some of these investments, the SSME program was a major beneficiary of key areas of turbomachinery inducer-impeller research outside of flight manifest pressures. Over the past several decades, key turbopump internal environments have been interrogated via highly instrumented hot-fire and cold-flow testing. Likewise, SSME has sponsored the advancement of time-accurate and cavitating inducer impeller computational fluid dynamics (CFD) tools. Together, these investments have led to a better understanding of the complex internal flow fields within aggressive high-performing inducers and impellers. New design tools and methodologies have evolved that aim to provide confident blade designs striking an appropriate balance between performance and self-induced load management.

  13. Automatic Parallelization of Numerical Python Applications using the Global Arrays Toolkit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Daily, Jeffrey A.; Lewis, Robert R.

    2011-11-30

Global Arrays is a software system from Pacific Northwest National Laboratory that enables an efficient, portable, and parallel shared-memory programming interface to manipulate distributed dense arrays. The NumPy module is the de facto standard for numerical calculation in the Python programming language, a language whose use is growing rapidly in the scientific and engineering communities. NumPy provides a powerful N-dimensional array class as well as other scientific computing capabilities. However, like the majority of the core Python modules, NumPy is inherently serial. Using a combination of Global Arrays and NumPy, we have reimplemented NumPy as a distributed drop-in replacement called Global Arrays in NumPy (GAiN). Serial NumPy applications can become parallel, scalable GAiN applications with only minor source code changes. Scalability studies of several different GAiN applications will be presented showing the utility of developing serial NumPy codes which can later run on more capable clusters or supercomputers.
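A hedged sketch of the drop-in idea the abstract describes: the serial NumPy kernel below would, per the record, need only minor changes (essentially the import) to run under GAiN. The GAiN module path in the comment is an assumption, not verified.

```python
import numpy as np   # per the abstract, switching to GAiN would mostly mean
                     # changing this line, e.g. "import ga.gain as np"
                     # (module path hypothetical)

# A typical serial NumPy kernel: Jacobi relaxation sweeps on a 2-D grid.
grid = np.zeros((6, 6))
grid[0, :] = 100.0                      # fixed hot boundary along the top edge

def jacobi_sweep(g):
    out = g.copy()                      # boundary rows/columns stay fixed
    out[1:-1, 1:-1] = 0.25 * (g[:-2, 1:-1] + g[2:, 1:-1] +
                              g[1:-1, :-2] + g[1:-1, 2:])
    return out

for _ in range(50):
    grid = jacobi_sweep(grid)

assert grid[1, 3] > grid[4, 3] > 0.0    # heat diffuses away from the hot edge
```

The point of GAiN's design is that the slicing and arithmetic above map onto distributed arrays, so the same source scales to clusters without rewriting the kernel.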

  14. Efficient Stochastic Rendering of Static and Animated Volumes Using Visibility Sweeps.

    PubMed

    von Radziewsky, Philipp; Kroes, Thomas; Eisemann, Martin; Eisemann, Elmar

    2017-09-01

Stochastically solving the rendering integral (particularly visibility) is the de-facto standard for physically-based light transport, but it is computationally expensive, especially when displaying heterogeneous volumetric data. In this work, we present efficient techniques to speed up the rendering process via a novel visibility-estimation method in concert with unbiased importance sampling (involving environmental lighting and visibility inside the volume), filtering, and update techniques for both static and animated scenes. Our major contributions include a progressive estimate of partial occlusions based on a fast sweeping-plane algorithm. These occlusions are stored in an octahedral representation, which can be conveniently transformed into a quadtree-based hierarchy suited for joint importance sampling. Further, we propose sweep-space filtering, which suppresses the occurrence of fireflies, and we investigate different update schemes for animated scenes. Our technique is unbiased, requires little precomputation, is highly parallelizable, and is applicable to various volume data sets, dynamic transfer functions, animated volumes, and changing environmental lighting.
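As background for the abstract, the kind of stochastic visibility query such methods accelerate can be sketched with delta (Woodcock) tracking. This is a generic textbook estimator with an invented 1-D extinction profile, not the paper's sweeping-plane algorithm:

```python
import random, math

def sigma(x):                       # made-up heterogeneous extinction on [0, 1]
    return 2.0 + 1.5 * math.sin(6.28 * x)

SIGMA_MAX = 3.5                     # majorant: sigma(x) <= SIGMA_MAX everywhere

def transmittance(n_samples=20000, rng=random.Random(1)):
    """Fraction of sampled photons that cross the unit slab without a real collision."""
    hits = 0
    for _ in range(n_samples):
        x = 0.0
        while True:
            # Tentative free flight sampled against the constant majorant.
            x += -math.log(1.0 - rng.random()) / SIGMA_MAX
            if x >= 1.0:            # escaped the slab: photon transmitted
                break
            if rng.random() < sigma(x) / SIGMA_MAX:   # real (not null) collision
                hits += 1
                break
    return 1.0 - hits / n_samples

T = transmittance()
# The sine term integrates to ~0 over [0, 1], so T should be near exp(-2) ~ 0.135.
assert 0.12 < T < 0.15
```

Each such estimate is unbiased but noisy; the paper's contribution is, in effect, reusing and organizing partial occlusion information so that far fewer of these raw queries are wasted.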

  15. Integrating Multimodal Radiation Therapy Data into i2b2.

    PubMed

    Zapletal, Eric; Bibault, Jean-Emmanuel; Giraud, Philippe; Burgun, Anita

    2018-04-01

    Clinical data warehouses are now widely used to foster clinical and translational research, and the Informatics for Integrating Biology and the Bedside (i2b2) platform has become a de facto standard for storing clinical data in many projects. However, to design predictive models and assist in personalized treatment planning in cancer or radiation oncology, all available patient data need to be integrated into i2b2, including radiation therapy data that are currently not addressed in many existing i2b2 sites. To use radiation therapy data in projects related to rectal cancer patients, we assessed the feasibility of integrating radiation oncology data into the i2b2 platform. The Georges Pompidou European Hospital, a hospital from the Assistance Publique - Hôpitaux de Paris group, has developed an i2b2-based clinical data warehouse of various structured and unstructured clinical data for research since 2008. To store and reuse various radiation therapy data (dose details, activity scheduling, and dose-volume histogram (DVH) curves) in this repository, we first extracted raw data by using reverse engineering techniques and a vendor's application programming interface. Then, we implemented a hybrid storage approach by combining the standard i2b2 "Entity-Attribute-Value" storage mechanism with a "JavaScript Object Notation (JSON) document-based" storage mechanism, without modifying the i2b2 core tables. Validation was performed using (1) the Business Objects framework for replicating vendor application screens showing dose details and activity scheduling data and (2) the R software for displaying the DVH curves. We developed a pipeline to integrate the radiation therapy data into the Georges Pompidou European Hospital i2b2 instance and evaluated it on a cohort of 262 patients. In a preliminary use case, we fetched the DVH curve data from the clinical data warehouse and displayed them in an R chart.
By adding radiation therapy data to the clinical data warehouse, we were able to analyze radiation therapy response in cancer patients. We leveraged the i2b2 platform to store radiation therapy data, including detailed information such as the DVH, and to create new ontology-based modules that provide research investigators with a wider spectrum of clinical data. Schattauer GmbH Stuttgart.
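The hybrid storage approach can be sketched in miniature: scalar facts go into an EAV-style table, while bulky DVH curves are kept as JSON documents. The table and column names below are illustrative simplifications, not the actual i2b2 schema or the authors' pipeline:

```python
import sqlite3, json

# Hypothetical mini-schema: one EAV fact table, one JSON document table.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE observation_fact (patient_num INT, concept_cd TEXT, nval_num REAL)")
db.execute("CREATE TABLE rt_documents (patient_num INT, doc_type TEXT, payload TEXT)")

# EAV row: a scalar radiotherapy fact (concept code is invented).
db.execute("INSERT INTO observation_fact VALUES (?, ?, ?)",
           (42, "RT:TOTAL_DOSE_GY", 50.4))

# JSON document: a whole dose-volume histogram for one structure.
dvh = {"structure": "rectum", "dose_gy": [0, 10, 20, 30],
       "volume_pct": [100, 80, 45, 10]}
db.execute("INSERT INTO rt_documents VALUES (?, ?, ?)",
           (42, "DVH", json.dumps(dvh)))

# Queries can mix both: scalar facts stay queryable, curves stay whole.
(dose,) = db.execute("SELECT nval_num FROM observation_fact "
                     "WHERE patient_num = 42").fetchone()
(doc,) = db.execute("SELECT payload FROM rt_documents "
                    "WHERE doc_type = 'DVH'").fetchone()
curve = json.loads(doc)
assert dose == 50.4 and curve["volume_pct"][0] == 100
```

The design point is the one the abstract makes: scalar values fit the EAV model naturally, whereas serializing a curve as one document avoids exploding it into thousands of EAV rows while leaving the core tables untouched.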

  16. A Kaiser Family Foundation and Children Now National Survey: Parents Speak Up about Television Today. A Summary of Findings.

    ERIC Educational Resources Information Center

    Kaiser Foundation, Oakland, CA.

    In the midst of a growing national debate about the role of television as a de facto "sex educator" for young people today, this survey asked parents nationwide in the fall of 1996 about their views on kids and television. A random sample of 853 parents and children ages 6 to 15 were surveyed by telephone (the data reported here focus on…

  17. Greed and Grievance and Drug Cartels: Mexico’s Commercial Insurgency

    DTIC Science & Technology

    2017-05-25

impunity as signs the problem has grown beyond mere organized crime. 23 As Sullivan and Elkus summarize, The fragmented and post-ideological quality...are likely to dominate the post-Cold War world. Spiritual insurgency is the descent of the Cold War-era revolutionary insurgency. It will be driven by...the legitimacy of the organization as, “the de facto authority… [guaranteeing] living conditions for its inhabitants.”61 In areas where the

  18. The Quest for Military Cooperation in North Africa: Prospects and Challenges

    DTIC Science & Technology

    2016-10-01

transition for the new political regimes, but rather the opposite. Despite the tremendous differences between their post-2011 revolution political...inability of its post-revolution governments to maintain order and control the country’s borders, has had an immense impact on regional security...as Libya’s de facto government was granted only after the death of Qadhafi. Algerian estimates of the security risks were vindicated by events

  19. JPRS Report, East Europe.

    DTIC Science & Technology

    1990-08-24

solve global problems. 4. It is a fact of history that Stalinism, totalitarian "command socialism," has strongly affected the development of the...parties are concerned, there we must differentiate. The People’s Party and the Socialist Party paid for the fact that in the past they were de facto...subordinated to the CPCZ, and not just politically. The Social Democratic Party, which was banned for 40 years, on the other hand paid for the fact

  20. United States Security Policy Implications of a Post-Fidel Cuba

    DTIC Science & Technology

    2012-03-10

that displaced any real civil or economic sovereignty. A de facto colony was in the making and America had no qualms flexing its hard and soft...independence remained from the cancellation of the Platt Amendment in 1934 through 1959, all critical elements of that self-rule remained contingent...Fidel, the Maximum Leader announced just days before the 2008 single-party, rubber-stamping National Assembly that he would not accept a nomination to

  1. Providing Comfort to Iraq’s Kurds: Forming a De Facto Relationship

    DTIC Science & Technology

    2016-03-01

See Kenneth Waltz, Theory of International Politics (New York: McGraw-Hill, 1979). 3 (economic/demographic) and tangible (military) terms,4 and...policy objectives soon became eclipsed by the Iranian Revolution in 1979 and Iraq’s invasion of Iran in 1980. Still scarred from the ignominy of the...1979 U.S. Embassy takeover and hostage crisis in Tehran, the Reagan administration began to tilt its foreign policy toward supporting Iraq during

  2. Patterns of Global Terrorism 1999

    DTIC Science & Technology

    2000-04-01

Italian labor leader Massimo D’Antona in May. In the United Kingdom, the Good Friday accords effectively prolonged the de facto peace while the various...French court as an accessory in the Paris metro bombings in 1995—for attempted murder, criminal association, sedition, and forgery and sentenced him to...outstanding at yearend. In addition, the French Government’s nationwide “Vigi-Pirate” plan—launched in 1998 to prevent a repeat of the Paris metro

  3. The UNIX Operating System: A Model for Software Design.

    PubMed

    Kernighan, B W; Morgan, S P

    1982-02-12

The UNIX operating system, a general-purpose time-sharing system, has, without marketing, advertising, or technical support, become widely used by universities and scientific research establishments. It is the de facto standard of comparison for such systems and has spawned a small industry of suppliers of UNIX variants and look-alikes. This article attempts to uncover the reasons for its success and to draw some lessons for the future of operating systems.

  4. The growing importance of costs and ways to maintain cost control on a large program in today's competitive environment

    NASA Technical Reports Server (NTRS)

    Newman, J. J.; Grimes, D. W.; Gaetano, F. W.

    1973-01-01

    Discussion of management techniques that make it possible to overcome inflationary and developmental cost rises while holding schedule and performance fixed in scientific space programs. The techniques reviewed pertain to high personnel motivation, continual review of contract rigidity for de facto modification by senior judgment, standardization vs design innovation, cooperative customer/contractor goal orientation vs task orientation, and deep real-time management visibility.

  5. Gaining Leverage Over Vendor Lock to Improve Acquisition Performance and Cost Efficiencies

    DTIC Science & Technology

    2014-04-30

Virginia Wydler, MITRE Corporation Open Systems Architecture License Rights: A New Era for the Public–Private Market-Place Nickolas Guertin, DASN RDT...if only one vendor can replace or upgrade those key components, that de facto monopolist may be able to exert excess negotiating leverage over...instead of sole-source subcontractor procurement. The Program Office should have the prime vendor provide full market research data in accordance

  6. Treatment of Battlefield Detainees in the War on Terrorism

    DTIC Science & Technology

    2006-11-14

2002), available at [http://hrw.org/press/2002/01/us012802-ltr.htm] (last visited March 22, 2006). 24 See Civil and Political Rights, Including the...expert Hayes Parks, who advocates a purely de facto standard, without regard to political factors). 72 See INTERNATIONAL COMMITTEE OF THE RED CROSS...likely meant to be covered, but recognizing ambiguity with respect to groups supporting the invading army). 140 Military Prosecutor v. Kassem, 47 I.L.R

  7. Venezuela: Issues in the 111th Congress

    DTIC Science & Technology

    2010-09-03

in January 2010 under the de facto government of Roberto Micheletti. In June 2009, three additional countries joined—Ecuador, St. Vincent and the...files from Colombia’s March 2008 raid of a FARC camp in Ecuador had raised questions about potential support of the FARC by the Chávez government...has been friction in relations for almost a decade under the government of populist President Hugo Chávez. U.S. officials have expressed concerns

  8. Non-Parametric Model Drift Detection

    DTIC Science & Technology

    2016-07-01

    de,la,n,o,the,m, facto ,et,des,of Group num: 42, TC(X;Y_j): 0.083 42:resolution,council,resolutions,draft,recalling,pursuant,reaffirming,sponsors...Y_j): 0.019 80: posts ,cost, post ,expenditure,overall,infrastructure,expected,operational,external, savings Group num: 81, TC(X;Y_j): 0.018 81...90, TC(X;Y_j): 0.014 90:its,expresses,mandate,reiterates,appreciation,expressing,endorsed,reiterated, ex peditiously,literature Group num: 91, TC(X

  9. From Signature-Based Towards Behaviour-Based Anomaly Detection (Extended Abstract)

    DTIC Science & Technology

    2010-11-01

data acquisition can serve as sensors. The de facto standard for IP flow monitoring is the NetFlow format. Although NetFlow was originally developed by Cisco...packets with some common properties that pass through a network device. These collected flows are exported to an external device, the NetFlow...Thanks to the network-based approach using NetFlow data, the detection algorithm is host independent and highly scalable. Deep Packet Inspection

  10. Need low-cost networking? Consider DeviceNet

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moss, W.H.

    1996-11-01

The drive to reduce production costs and optimize system performance in manufacturing facilities causes many end users to invest in network solutions. Because of distinct differences between the way tasks are performed and the way data are handled for various applications, it is clear that more than one network will be needed in most facilities. What is not clear is which network is most appropriate for a given application. The information layer is the link between automation and information environments via management information systems (MISs) and manufacturing execution systems (MESs). Here the market has chosen a de facto standard in Ethernet, primarily transmission control protocol/internet protocol (TCP/IP) and secondarily manufacturing messaging system (MMS). There is no single standard at the device layer. However, the DeviceNet communication standard has made strides to reach this goal. This protocol eliminates expensive hardwiring and provides improved communication between devices, as well as important device-level diagnostics not easily accessible or available through hardwired I/O interfaces. DeviceNet is a low-cost communications link connecting industrial devices to a network. Many original equipment manufacturers and end users have chosen the DeviceNet platform for several reasons, but most frequently because of four key features: interchangeability, low cost, advanced diagnostics, and the ability to insert devices under power.

  11. New Interfaces to Web Documents and Services

    NASA Technical Reports Server (NTRS)

    Carlisle, W. H.

    1996-01-01

This paper reports on investigations into how to extend the capabilities of the Virtual Research Center (VRC) for NASA's Advanced Concepts Office. The work was performed as part of NASA's 1996 Summer Faculty Fellowship program and involved research into, and prototype development of, software components that provide documents and services for the World Wide Web (WWW). The WWW has become a de-facto standard for sharing resources over the internet, primarily because web browsers are freely available for the most common hardware platforms and their operating systems. As a consequence of the popularity of the internet, tools and techniques associated with web browsers are changing rapidly. New capabilities are offered by companies that support web browsers in order to achieve or remain a dominant participant in internet services. Because a goal of the VRC is to build an environment for NASA centers, universities, and industrial partners to share information associated with Advanced Concepts Office activities, the VRC tracks new techniques and services associated with the web in order to determine their usefulness for distributed and collaborative engineering research activities. Most recently, Java has emerged as a new tool for providing internet services. Because the major web browser providers have decided to include Java in their software, investigations into Java were conducted this summer.

  12. [Synergy between D.P.R. 1124/65 and D.Lgs. 81/2008: current status and perspectives].

    PubMed

    Savino, E; Miccio, A; Ossicini, A

    2010-01-01

    Already with the enactment of D.P.R. 1124/65, the legislator had, de jure and de facto, established an interaction between INAIL and the Competent Doctor (formerly the Factory Doctor). With the enactment of the new Consolidated Act on Safety, the synergy between D.P.R. 1124/65 and D.Lgs. 81/2008, and consequently between the INAIL doctor and the Competent Doctor, has been further enhanced, laying down, according to the authors, the essential requirements for an ever closer collaboration between the two professional figures, with the objective of increasingly attentive protection of workers' health in the occupational field.

  13. Simulation of two dimensional electrophoresis and tandem mass spectrometry for teaching proteomics.

    PubMed

    Fisher, Amanda; Sekera, Emily; Payne, Jill; Craig, Paul

    2012-01-01

    In proteomics, complex mixtures of proteins are separated (usually by chromatography or electrophoresis) and identified by mass spectrometry. We have created 2DE Tandem MS, a computer program designed for use in the biochemistry, proteomics, or bioinformatics classroom. It contains two simulations: 2D electrophoresis and tandem mass spectrometry. The two simulations are integrated and are designed to teach the concept of proteome analysis of prokaryotic and eukaryotic organisms. 2DE Tandem MS can be used as a freestanding simulation, or in conjunction with a wet lab, to introduce proteomics in the undergraduate classroom. 2DE Tandem MS is a free program available on Sourceforge at https://sourceforge.net/projects/jbf/. It was developed using Java Swing and functions in Mac OSX, Windows, and Linux, ensuring that every student sees a consistent and informative graphical user interface no matter the computer platform they choose. Java must be installed on the host computer to run 2DE Tandem MS. Example classroom exercises are provided in the Supporting Information. Copyright © 2012 Wiley Periodicals, Inc.

  14. Conventional-Flow Liquid Chromatography-Mass Spectrometry for Exploratory Bottom-Up Proteomic Analyses.

    PubMed

    Lenčo, Juraj; Vajrychová, Marie; Pimková, Kristýna; Prokšová, Magdaléna; Benková, Markéta; Klimentová, Jana; Tambor, Vojtěch; Soukup, Ondřej

    2018-04-17

    Due to its sensitivity and productivity, bottom-up proteomics based on liquid chromatography-mass spectrometry (LC-MS) has become the core approach in the field. The de facto standard LC-MS platform for proteomics operates at sub-μL/min flow rates, and nanospray is required for efficiently introducing peptides into a mass spectrometer. Although this is almost a "dogma", this view is being reconsidered in light of developments in highly efficient chromatographic columns, and especially with the introduction of exceptionally sensitive MS instruments. Although conventional-flow LC-MS platforms have recently penetrated targeted proteomics successfully, their possibilities in discovery-oriented proteomics have not yet been thoroughly explored. Our objective was to determine the extra sample costs, and the optimization and adjustments to a conventional-flow LC-MS system required, to identify a number of proteins comparable to that identified on a nanoLC-MS system. We demonstrate that the amount of a complex tryptic digest needed for comparable proteome coverage can be roughly 5-fold greater, providing the column dimensions are properly chosen, extra-column peak dispersion is minimized, column temperature and flow rate are set to levels appropriate for peptide separation, and the composition of mobile phases is fine-tuned. Indeed, we identified 2,835 proteins from 2 μg of HeLa cells tryptic digest separated during a 60 min gradient at 68 μL/min on a 1.0 mm × 250 mm column held at 55 °C and using aqueous-acetonitrile mobile phases containing 0.1% formic acid, 0.4% acetic acid, and 3% dimethyl sulfoxide. Our results document that conventional-flow LC-MS is an attractive alternative for bottom-up exploratory proteomics.

  15. Military Contingencies in Megacities and Sub megacities

    DTIC Science & Technology

    2016-12-01

characterized by violence and disorder. In between these extremes are fragile cities that can tip in either direction. This has led to some...some point become a difference in kind. A megacity, for example, could swallow up a military division in a way that a city of a million people could...seek to reclaim land that, in their view, has been illegally occupied, or, those who have occupied land demand that their de facto ownership

  16. A Randomized Effectiveness Trial of a Systems-Level Approach to Stepped Care for War-Related PTSD

    DTIC Science & Technology

    2016-05-01

period rather than storing the hard copies at their respective posts was approved. Also, an amendment changing the study Initiating PI from COL...care is the de facto mental health system; in Collaborative Medicine Case Studies: Evidence in Practice. Edited by Kessler R, Stafford D. New York...During the 6.5-year study period, investigators developed the STEPS UP

  17. Iraq’s Shia Warlords and Their Militias: Political and Security Challenges and Options

    DTIC Science & Technology

    2015-06-01

Middle East history and Arabic. He has studied and traveled widely in the Middle East. SUMMARY As America’s de facto co-belligerents who often...U.S. Army War College, Strategic Studies Institute, 47 Ashburn Drive, Carlisle...routinely engage in discourse and debate concerning the role of ground forces in achieving national security objectives. The Strategic Studies Institute

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haynes, R.A.

    The Network File System (NFS) is used in UNIX-based networks to provide transparent file sharing between heterogeneous systems. Although NFS is well-known for being weak in security, it is widely used and has become a de facto standard. This paper examines the user authentication shortcomings of NFS and the approach Sandia National Laboratories has taken to strengthen it with Kerberos. The implementation on a Cray Y-MP8/864 running UNICOS is described and resource/performance issues are discussed. 4 refs., 4 figs.

  19. China and the Great Power Balance.

    DTIC Science & Technology

    1983-08-18

analyst assigned to US Army Japan/IX Corps. SUMMARY Recent indications of a thaw in Sino-Soviet relations, coupled with...of Chinese leaders indicates that their basic assessment of the Soviet Union as a hegemonist power 25 has been altered in the least. Given the...allowed itself to be absorbed as a de facto Soviet satellite, or if another nation (presumably the United States) somehow supplanted the Soviet Union

  20. The Future of US Nuclear Deterrence and the Impact of the Global Zero Movement

    DTIC Science & Technology

    2013-02-10

The most appealing GZC recommendation is for a de facto minimum deterrence model en route to GZ. To its detriment, however, the GZC proposal relies...difficult to dispute that this path leads, at least eventually, to a minimum deterrence model. It is likely that the US can continue to wield a...instead, a capable, credible deterrent is critical to countering these threats and is especially so in a minimum deterrence model. Two key

  1. Treatment of Battlefield Detainees in the War on Terrorism

    DTIC Science & Technology

    2005-01-13

available at [http://hrw.org/press/2002/01/us012802-ltr.htm]. 19 See Civil and Political Rights, Including the Question of Torture and Detention, Report of...war expert Hayes Parks, who advocates a purely de facto standard, without regard to political factors). 64 See INTERNATIONAL COMMITTEE OF THE RED...Prosecutor v. Kassem, 47 I.L.R. 470 (1971) (excerpts reprinted in DOCUMENTS ON PRISONERS OF WAR, document no. 160 (U.S. Naval War College 1979

  2. Treatment of Battlefield Detainees in the War on Terrorism

    DTIC Science & Technology

    2006-03-27

Advisor (Jan. 28, 2002), available at [http://hrw.org/press/2002/01/us012802-ltr.htm](last visited March 22, 2006). 23 See Civil and Political Rights...advocates a purely de facto standard, without regard to political factors). 71 See INTERNATIONAL COMMITTEE OF THE RED CROSS, COMMENTARY ON THE GENEVA...the invading army). 137 Military Prosecutor v. Kassem, 47 I.L.R. 470 (1971) (excerpts reprinted in DOCUMENTS ON PRISONERS OF WAR, document no. 160 (U.S

  3. Treatment of ’Battlefield Detainees’ in the War on Terrorism

    DTIC Science & Technology

    2007-01-23

Political Rights, Including the Question of Torture and Detention, Report of the Working Group on Arbitrary Detention, U.N. Commission on Human Rights...June 1999 (citing interview with DOD law of war expert Hayes Parks, who advocates a purely de facto standard, without regard to political factors). 72...Military Prosecutor v. Kassem, 47 I.L.R. 470 (1971) (excerpts reprinted in DOCUMENTS (continued...) GPW Art. 4A(1): Does Al Qaeda Form “Part of” the

  4. Treatment of Battlefield Detainees in the War on Terrorism

    DTIC Science & Technology

    2003-09-17

Security Advisor (Jan. 28, 2002), available at http://hrw.org/press/2002/01/us012802-ltr.htm. 8See Civil and Political Rights, Including the Question...June 1999 (citing interview with DoD law of war expert Hayes Parks, who advocates a purely de facto standard, without regard to political factors...117Military Prosecutor v. Kassem, 47 I.L.R. 470 (1971) (excerpts reprinted in DOCUMENTS ON PRISONERS OF WAR, document no. 160 (U.S. Naval War

  5. Treatment of Battlefield Detainees in the War on Terrorism

    DTIC Science & Technology

    2002-04-11

DoD law of war expert Hayes Parks, who advocates a purely de facto standard, without regard to political factors). 93See INTERNATIONAL COMMITTEE OF...108Military Prosecutor v. Kassem, 47 I.L.R. 470 (1971) (excerpts reprinted in DOCUMENTS ON PRISONERS OF WAR, document no. 160 (U.S. Naval War College 1979...most important consideration given to POW status has been whether there is evidence that they serve a government or political entity that exercises

  6. Gangs in Honduras: A Threat to National Security

    DTIC Science & Technology

    2012-03-22

police, judiciary and political elites. In some parts of the country, “drug cartels act as de facto authority and there is evidence of a training camp...practical to politicians…to expect the military to solve the problem by force, nor is it practical for the military to plan and execute a purely military...Honduras: Tegucigalpa, National Defense College and El Salvador University, October 30, 2007), http://cdn.usalnet.org/tesis/files/osorio.pdf

  7. 78 FR 51821 - Sentencing Guidelines for United States Courts

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-21

    ... inserting ``However, the Supreme Court has held that the ex post facto clause applies to sentencing.... Ct. 2072, 2078 (2013) (holding that 'there is an ex post facto violation when a defendant is... ex post facto clause, in which case the court shall apply the Guidelines Manual in effect on the date...

  8. 75 FR 81849 - Office of the Attorney General; Applicability of the Sex Offender Registration and Notification Act

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-29

    ... convictions would violate the Constitution's prohibition of ex post facto laws or other provisions of the... purposes, and accordingly do not implicate the Constitution's prohibition of ex post facto laws. See 42 U.S... distinction for ex post facto purposes between the SORNA requirements and the sex offender registration and...

  9. Computational drug discovery

    PubMed Central

    Ou-Yang, Si-sheng; Lu, Jun-yan; Kong, Xiang-qian; Liang, Zhong-jie; Luo, Cheng; Jiang, Hualiang

    2012-01-01

    Computational drug discovery is an effective strategy for accelerating and economizing the drug discovery and development process. Because of the dramatic increase in the availability of biological macromolecule and small-molecule information, the applicability of computational drug discovery has been extended and broadly applied to nearly every stage in the drug discovery and development workflow, including target identification and validation, lead discovery and optimization, and preclinical tests. Over the past decades, computational drug discovery methods such as molecular docking, pharmacophore modeling and mapping, de novo design, molecular similarity calculation, and sequence-based virtual screening have been greatly improved. In this review, we present an overview of these important computational methods, platforms, and successful applications in this field. PMID:22922346

  10. Location or Hukou: What Most Limits Fertility of Urban Women in China?

    PubMed Central

    Liang, Yun

    2017-01-01

    China's fertility rate is below replacement level. The government is attempting to increase this rate by relaxing the one‐child policy. China faces a possible tradeoff because further urbanization is needed to raise incomes but may reduce future fertility. We decompose China's rural–urban fertility gaps using both de facto and de jure criteria for defining the urban population. The fertility‐depressing effects of holding urban hukou are more than three times larger than effects of urban residence. Less of the rural–urban fertility gap by hukou status is due to differences in characteristics than is the case for the fertility gap by place of residence. PMID:29081975

  11. One-Party Dominance: Future Political Implications for the Conservatives in South Korea

    DTIC Science & Technology

    2016-12-01

    the vote to 26.1%.48 One of the reasons for the progressive defeat was the failure of ex-President Roh Moo-hyun to uphold his anti-corruption...appoint individuals to posts in a way that creates an advantage in implementing policies.65 The biggest advantage of the presidency is the ability to self...were necessary for South Korea to grow; he legitimized his rule with results.114 One method was the distribution of de facto pork barrel not only

  12. Two Faces of Attrition: Analysis of a Mismatched Strategy against Mexican and Central American Drug Traffickers

    DTIC Science & Technology

    2017-03-31

    Washington Post, August 15, 2014. 30 William Kelleher Storey, The First World War: A Concise Global History (USA, Maryland: Rowman & Littlefield, 2014...Hector became the de facto leader to replace Arturo because of his familial relationship. The group's operations continued smuggling drugs because...through a "narco banner" hung across Nuevo Laredo's Avenida Reforma reading "Los Zetas wants you, soldier or ex-soldier. We offer a good salary

  13. First War Syndrome: Military Culture, Professionalization, and Counterinsurgency Doctrine

    DTIC Science & Technology

    2010-02-01

    in Ivan Musicant, The Banana Wars: A History of U.S. Intervention in Latin America from the Spanish-American War to the Invasion of Panama, (New...the "banana wars," many Marines grew very comfortable with these conflicts and the Corps' role as de facto imperial police force. 2 289 Millet, pp. 278-280...focus on the "banana wars" under the leadership of World War I veterans like John Lejeune. See also Bickel, pg. 54-55. 157 The war predictably disrupted

  14. U.S. Military Arms Sales to Taiwan: Deterrent or Provocation?

    DTIC Science & Technology

    2002-03-01

    de facto Sino-U.S. alliance could provide a means of containing the USSR.” See Derek McDougall, The International Politics of the New Asia Pacific...Survey, July/August 2000, vol. 40, no. 4, pp. 622-640. McDougall, Derek. The International Politics of the New Asia Pacific. Boulder: Lynne Rienner...Corporation. E-mail correspondence on 25 & 31 January 2002. Ong, R.C.M. “Japan and China: Security Interest in the Post-Cold War Era.” East Asia

  15. Calculation of Protein Heat Capacity from Replica-Exchange Molecular Dynamics Simulations with Different Implicit Solvent Models

    DTIC Science & Technology

    2008-10-30

    rigorous Poisson-based methods generally apply a Lee-Richards molecular surface.9 This surface is considered the de facto description for continuum...definition and calculation of the Born radii. To evaluate the Born radii, two approximations are invoked. The first is the Coulomb field approximation (CFA...energy term, and depending on the particular GB formulation, higher-order non-Coulomb correction terms may be added to the Born radii to account for the

  16. Assessment of Technologies for Forensic Auditing to Combat Money Laundering in the U.S. Banking Industry

    DTIC Science & Technology

    1999-12-17

    Officer. 5 years. (Referred to as: Federal Bank Examiner) Respondent #5: Former FBI Supervisory Agent, Asian Organized Crime & La Cosa Nostra. 18 years...controlled by "Mr. Big" at Internet banks that accept cyber-cash.11 To be safe, Larry has these transfers limited to a maximum of $8,000 each. Once...difficult if not impossible for government authorities to monitor or detect. Therefore, bank auditors become the de facto mainline of defense

  17. The Lome Conventions and Their Implications for the United States,

    DTIC Science & Technology

    1981-11-01

    treatment when international obligations and/or changed de facto circumstances so necessitate." 196 J. Moss LOME Study Chapter VI Not only does the host...should not be interpreted as representing the views or policy of the Department of State...The preparation of this study has been a challenging...and rewarding experience. As with any study of this magnitude, one can't do it alone. I would like to express very special appreciation to John

  18. [Light pollution. A connection between ecology and health].

    PubMed

    Jedidi, H; Depierreux, F; Jedidi, Z; Beckers, A

    2015-11-01

    Light pollution is defined as the abnormal and disturbing nocturnal presence of light, together with its adverse consequences for flora, fauna, and ecosystems, and its suspected or proven effects on human health. Light pollution is a fairly recent and increasing phenomenon in our society; it causes major environmental damage, harming not only wildlife but also human health (cancers, obesity, fatigue, depression...). The solutions to this problem are, however, simple, efficient and, de facto, inexpensive, because they involve substantial energy savings.

  19. Overview of TPC Benchmark E: The Next Generation of OLTP Benchmarks

    NASA Astrophysics Data System (ADS)

    Hogan, Trish

    Set to replace the aging TPC-C, the TPC Benchmark E is the next generation OLTP benchmark, which more accurately models client database usage. TPC-E addresses the shortcomings of TPC-C. It has a much more complex workload, requires the use of RAID-protected storage, generates much less I/O, and is much cheaper and easier to set up, run, and audit. After a period of overlap, it is expected that TPC-E will become the de facto OLTP benchmark.

  20. Responsibility Determinations of Department of Defense Environmental Cleanup Contractors: Caveat Vendori

    DTIC Science & Technology

    1993-09-01

    114 b. DE FACTO SUSPENSIONS/DEBARMENTS ............ 117 c. INTEGRITY & ENVIRONMENTAL CONTRACTS ........ 119 7. COMPLIANCE WITH APPLICABLE LAWS...Dec. 28, 1973); Comp. Gen. Dec. B-135718, 37 Comp. Gen. 798 (May 28, 1958). 294 4 C.F.R. S 21.3(m)(5) (1991); Ingenieria Y Construcciones, S.A., B...Marathon Watch Co., Ltd., B-247043, Apr. 23, 1992, 92- 1 CPD 11 384; Ingenieria Y Construcciones, S.A., B-241043, Dec. 28, 1990, 90-2 CPD 1 524; Garten-und

  1. Use of Unified Modeling Language (UML) in Model-Based Development (MBD) For Safety-Critical Applications

    DTIC Science & Technology

    2014-12-01

    appears that UML is becoming the de facto MBD language. OMG® states the following on the MDA® FAQ page: “Although not formally required [for MBD], UML...a known limitation [42], so UML users should plan accordingly, especially for safety-critical programs. For example, “models are not used to...description of the MBD tool chain can be produced. That description could be resident in a Plan for Software Aspects of Certification (PSAC) or Software

  2. Exploring Cultural Predictors of Military Intervention Success

    DTIC Science & Technology

    2015-04-01

    research employed a sequential, mixed-method analysis consisting of a quantitative ex post facto analysis of United Nations (UN) interventions...research. Results In spite of the many assumptions and limitations forced upon the research by its ex post facto design, it nonetheless provided some...post facto exploration of predictors of military intervention success. As such, the research examined pre- and post-intervention

  3. PVM Wrapper

    NASA Technical Reports Server (NTRS)

    Katz, Daniel

    2004-01-01

    PVM Wrapper is a software library that makes it possible for code that utilizes the Parallel Virtual Machine (PVM) software library to run using the message-passing interface (MPI) software library, without needing to rewrite the entire code. PVM and MPI are the two most common software libraries used for applications that involve passing of messages among parallel computers. Since about 1996, MPI has been the de facto standard. Codes written when PVM was popular often feature patterns of {"initsend," "pack," "send"} and {"receive," "unpack"} calls. In many cases, these calls are not contiguous and one set of calls may even exist over multiple subroutines. These characteristics make it difficult to obtain equivalent functionality via a single MPI "send" call. Because PVM Wrapper is written to run with MPI-1.2, some PVM functions are not permitted and must be replaced - a task that requires some programming expertise. The "pvm_spawn" and "pvm_parent" function calls are not replaced, but a programmer can use "mpirun" and knowledge of the ranks of parent and child tasks with supplied macroinstructions to enable execution of codes that use "pvm_spawn" and "pvm_parent."
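
    The buffering problem described above can be sketched in a few lines. This is a toy Python illustration, not the real PVM or MPI API: a wrapper object accumulates successive "pack" calls into one contiguous buffer so that "send" can emit a single message, which is what mapping the initsend/pack/send pattern onto a single MPI send ultimately requires.

```python
import struct

class PvmStyleSender:
    """Toy analogue of the PVM initsend/pack/send pattern: pack calls are
    buffered so that send() emits one contiguous message."""

    def __init__(self):
        self._buf = bytearray()

    def initsend(self):
        self._buf = bytearray()               # pvm_initsend analogue: reset buffer

    def pack_int(self, value: int):
        self._buf += struct.pack("!i", value)  # pvm_pkint analogue (network order)

    def pack_double(self, value: float):
        self._buf += struct.pack("!d", value)  # pvm_pkdouble analogue

    def send(self) -> bytes:
        # In a real wrapper, this is where a single MPI send would fire.
        return bytes(self._buf)

s = PvmStyleSender()
s.initsend()
s.pack_int(42)
s.pack_double(3.5)
msg = s.send()
print(len(msg))  # 4-byte int + 8-byte double = 12 bytes
```

    The difficulty the abstract describes is that in real codes these pack calls are scattered across subroutines, so the wrapper cannot always know when the buffer is complete.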

  4. Organizing Community-Based Data Standards: Lessons from Developing a Successful Open Standard in Systems Biology

    NASA Astrophysics Data System (ADS)

    Hucka, M.

    2015-09-01

    In common with many fields, including astronomy, a vast number of software tools for computational modeling and simulation are available today in systems biology. This wealth of resources is a boon to researchers, but it also presents interoperability problems. Despite working with different software tools, researchers want to disseminate their work widely as well as reuse and extend the models of other researchers. This situation led in the year 2000 to an effort to create a tool-independent, machine-readable file format for representing models: SBML, the Systems Biology Markup Language. SBML has since become the de facto standard for its purpose. Its success and general approach has inspired and influenced other community-oriented standardization efforts in systems biology. Open standards are essential for the progress of science in all fields, but it is often difficult for academic researchers to organize successful community-based standards. I draw on personal experiences from the development of SBML and summarize some of the lessons learned, in the hope that this may be useful to other groups seeking to develop open standards in a community-oriented fashion.
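
    The kind of tool-independent, machine-readable model file described above can be illustrated schematically. The element names and attributes below are simplified stand-ins in the spirit of SBML, not the actual SBML schema:

```python
import xml.etree.ElementTree as ET

# Build a toy model document: species and reactions as structured XML,
# readable by any tool that understands the (simplified) schema.
model = ET.Element("model", id="toy_pathway")
species = ET.SubElement(model, "listOfSpecies")
ET.SubElement(species, "species", id="A", initialAmount="10")
ET.SubElement(species, "species", id="B", initialAmount="0")
reactions = ET.SubElement(model, "listOfReactions")
ET.SubElement(reactions, "reaction", id="A_to_B", reversible="false")

xml_text = ET.tostring(model, encoding="unicode")
print(xml_text)
```

    The point of such a format is exactly what the abstract emphasizes: the file encodes the model itself, not any one simulator's internal representation, so different tools can exchange it.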

  5. Relying on experts as we reason together.

    PubMed

    Richardson, Henry S

    2012-06-01

    In various contexts, it is thought to be important that we reason together. For instance, an attractive conception of democracy requires that citizens reach lawmaking decisions by reasoning with one another. Reasoning requires that reasoners survey the considerations that they take to be reasons, proceed by a coherent train of thought, and reach conclusions freely. De facto reliance on experts threatens the possibility of collective reasoning by making some reasons collectively unsurveyable, raising questions about the coherence of the resulting train of thought. De jure reliance on experts threatens the possibility of collective reasoning by seeming to make some conclusions irreversible. The paper argues that collective reasoning that relies on experts would nonetheless be possible if the unsurveyable reasons "mesh," if the expert considerations are at least in principle publicly recoverable, and if de jure authority of expert decision is always subject to appeal.

  6. A survey to identify the clinical coding and classification systems currently in use across Europe.

    PubMed

    de Lusignan, S; Minmagh, C; Kennedy, J; Zeimet, M; Bommezijn, H; Bryant, J

    2001-01-01

    This is a survey to identify the clinical coding systems currently in use across the European Union and the states seeking membership to it. We sought to identify which systems are currently used and to what extent they are subject to local adaptation. Clinical coding should facilitate identifying key medical events in a computerised medical record and aggregating information across groups of records. The emerging new driver is its role as the enabler of the life-long computerised medical record. A prerequisite for this level of functionality is the transfer of information between different computer systems. This transfer can be facilitated either by working on the interoperability problems between disparate systems or by harmonising the underlying data. This paper examines the extent to which the latter has occurred across Europe. Methods: literature and Internet search, and requests for information via electronic mail to pan-European mailing lists of health informatics professionals. Coding systems are now a de facto part of health information systems across Europe, and relatively few coding systems are in use: ICD-9, ICD-10, ICPC and Read were the most established. However, the local adaptation of these classification systems, whether on a per-country or per-software-manufacturer basis, significantly reduces the ability of the meaning coded within patients' computer records to be transferred easily from one medical record system to another. There is no longer any debate as to whether a coding or classification system should be used. Convergence of different classification systems should be encouraged, and countries and computer manufacturers within the EU should be encouraged to stop making local modifications to coding and classification systems, as this practice risks significantly slowing progress towards easy transfer of records between computer systems.
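
    The transfer problem the survey identifies can be made concrete with a small sketch. The codes below are invented for illustration (not real ICD, ICPC or Read terms): two systems share a base code dictionary, but a locally adapted code cannot be resolved by the receiving system.

```python
# Shared base dictionary of clinical codes (codes invented for illustration).
BASE = {"A10": "asthma", "D20": "type 2 diabetes"}

# One vendor's locally adapted dictionary adds a code of its own.
vendor_a = dict(BASE)
vendor_a["X99"] = "local clinic-specific finding"

def transfer(record_codes, target_dictionary):
    """Split a record's codes into those the target system can resolve
    and those whose meaning is lost in transfer."""
    resolved = [c for c in record_codes if c in target_dictionary]
    unresolved = [c for c in record_codes if c not in target_dictionary]
    return resolved, unresolved

resolved, lost = transfer(["A10", "X99"], BASE)
print(lost)  # the locally adapted code does not survive the transfer
```

    This is the mechanism by which local modifications "significantly reduce the ability for the meaning coded within patients' records to be transferred," even when both systems nominally use the same classification.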

  7. Extra dimensions: 3D in PDF documentation

    DOE PAGES

    Graf, Norman A.

    2011-01-11

    Experimental science is replete with multi-dimensional information which is often poorly represented by the two dimensions of presentation slides and print media. Past efforts to disseminate such information to a wider audience have failed for a number of reasons, including a lack of standards which are easy to implement and have broad support. Adobe's Portable Document Format (PDF) has in recent years become the de facto standard for secure, dependable electronic information exchange. It has done so by creating an open format, providing support for multiple platforms and being reliable and extensible. By providing support for the ECMA standard Universal 3D (U3D) file format in its free Adobe Reader software, Adobe has made it easy to distribute and interact with 3D content. By providing support for scripting and animation, temporal data can also be easily distributed to a wide, non-technical audience. We discuss how the field of radiation imaging could benefit from incorporating full 3D information about not only the detectors, but also the results of the experimental analyses, in its electronic publications. In this article, we present examples drawn from high-energy physics, mathematics and molecular biology which take advantage of this functionality. Furthermore, we demonstrate how 3D detector elements can be documented, using either CAD drawings or other sources such as GEANT visualizations as input.

  8. Extra dimensions: 3D and time in PDF documentation

    NASA Astrophysics Data System (ADS)

    Graf, N. A.

    2011-01-01

    Experimental science is replete with multi-dimensional information which is often poorly represented by the two dimensions of presentation slides and print media. Past efforts to disseminate such information to a wider audience have failed for a number of reasons, including a lack of standards which are easy to implement and have broad support. Adobe's Portable Document Format (PDF) has in recent years become the de facto standard for secure, dependable electronic information exchange. It has done so by creating an open format, providing support for multiple platforms and being reliable and extensible. By providing support for the ECMA standard Universal 3D (U3D) file format in its free Adobe Reader software, Adobe has made it easy to distribute and interact with 3D content. By providing support for scripting and animation, temporal data can also be easily distributed to a wide, non-technical audience. We discuss how the field of radiation imaging could benefit from incorporating full 3D information about not only the detectors, but also the results of the experimental analyses, in its electronic publications. In this article, we present examples drawn from high-energy physics, mathematics and molecular biology which take advantage of this functionality. We demonstrate how 3D detector elements can be documented, using either CAD drawings or other sources such as GEANT visualizations as input.

  9. Extra Dimensions: 3D and Time in PDF Documentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Graf, N.A.; /SLAC

    2012-04-11

    Experimental science is replete with multi-dimensional information which is often poorly represented by the two dimensions of presentation slides and print media. Past efforts to disseminate such information to a wider audience have failed for a number of reasons, including a lack of standards which are easy to implement and have broad support. Adobe's Portable Document Format (PDF) has in recent years become the de facto standard for secure, dependable electronic information exchange. It has done so by creating an open format, providing support for multiple platforms and being reliable and extensible. By providing support for the ECMA standard Universal 3D (U3D) file format in its free Adobe Reader software, Adobe has made it easy to distribute and interact with 3D content. By providing support for scripting and animation, temporal data can also be easily distributed to a wide, non-technical audience. We discuss how the field of radiation imaging could benefit from incorporating full 3D information about not only the detectors, but also the results of the experimental analyses, in its electronic publications. We demonstrate how 3D detector elements can be documented, using either CAD drawings or other sources such as GEANT visualizations as input.

  10. Evaluating the Potential for Marine and Hydrokinetic Devices to Act As Artificial Reefs or Fish Aggregating Devices

    NASA Astrophysics Data System (ADS)

    Kramer, S.; Nelson, P.

    2016-02-01

    Wave energy converters (WECs) and tidal energy converters (TECs) are only beginning to be deployed along the U.S. West Coast and in Hawai'i, and a better understanding of their ecological effects on fish, particularly on special-status fish, is needed to facilitate project siting, design and environmental permitting. The structures of WECs and TECs placed on the seabed, such as anchors and foundations, may function as artificial reefs that attract reef-associated fishes, while the midwater and surface structures, such as mooring lines, buoys, and wave or tidal power devices, may function as fish aggregating devices (FADs). We evaluated these potential ecological interactions by comparing them to surrogate structures, such as artificial reefs, natural reefs, kelp vegetation, floating and sunken debris, oil and gas platforms, anchored FADs deployed to enhance fishing opportunities, net cages used for mariculture, and piers and marinas. We also conducted guided discussions with scientists and resource managers to gather unpublished observations. Our findings indicate that the structures of WECs and TECs placed on or near the seabed in coastal waters of the U.S. West Coast and Hawai`i likely will function as small-scale artificial reefs and attract potentially high densities of reef-associated fishes, and that the midwater and surface structures of WECs placed in the tropical waters of Hawai`i likely will function as de facto FADs.

  11. Spatial heterogeneity in fishing creates de facto refugia for endangered Celtic Sea elasmobranchs.

    PubMed

    Shephard, Samuel; Gerritsen, Hans; Kaiser, Michel J; Reid, David G

    2012-01-01

    The life history characteristics of some elasmobranchs make them particularly vulnerable to fishing mortality; about a third of all species are listed by the IUCN as Threatened or Near Threatened. Marine Protected Areas (MPAs) have been suggested as a tool for conservation of elasmobranchs, but they are likely to be effective only if such populations respond to fishing impacts at spatial scales corresponding to MPA size. Using the example of the Celtic Sea, we modelled elasmobranch biomass (kg h(-1)) in fisheries-independent survey hauls as a function of environmental variables and 'local' (within 20 km radius) fishing effort (h y(-1)) recorded from Vessel Monitoring Systems data. Model selection using AIC suggested strongest support for linear mixed effects models in which the variables (i) fishing effort, (ii) geographic location and (iii) demersal fish assemblage had approximately equal importance in explaining elasmobranch biomass. In the eastern Celtic Sea, sampling sites that occurred in the lowest 10% of the observed fishing effort range recorded 10 species of elasmobranch including the critically endangered Dipturus spp. The most intensely fished 10% of sites had only three elasmobranch species, with two IUCN listed as Least Concern. Our results suggest that stable spatial heterogeneity in fishing effort creates de facto refugia for elasmobranchs in the Celtic Sea. However, changes in the present fisheries management regime could impair the refuge effect by changing fishers' behaviour and displacing effort into these areas.

  12. Spatial Heterogeneity in Fishing Creates de facto Refugia for Endangered Celtic Sea Elasmobranchs

    PubMed Central

    Shephard, Samuel; Gerritsen, Hans; Kaiser, Michel J.; Reid, David G.

    2012-01-01

    The life history characteristics of some elasmobranchs make them particularly vulnerable to fishing mortality; about a third of all species are listed by the IUCN as Threatened or Near Threatened. Marine Protected Areas (MPAs) have been suggested as a tool for conservation of elasmobranchs, but they are likely to be effective only if such populations respond to fishing impacts at spatial scales corresponding to MPA size. Using the example of the Celtic Sea, we modelled elasmobranch biomass (kg h−1) in fisheries-independent survey hauls as a function of environmental variables and ‘local’ (within 20 km radius) fishing effort (h y−1) recorded from Vessel Monitoring Systems data. Model selection using AIC suggested strongest support for linear mixed effects models in which the variables (i) fishing effort, (ii) geographic location and (iii) demersal fish assemblage had approximately equal importance in explaining elasmobranch biomass. In the eastern Celtic Sea, sampling sites that occurred in the lowest 10% of the observed fishing effort range recorded 10 species of elasmobranch including the critically endangered Dipturus spp. The most intensely fished 10% of sites had only three elasmobranch species, with two IUCN listed as Least Concern. Our results suggest that stable spatial heterogeneity in fishing effort creates de facto refugia for elasmobranchs in the Celtic Sea. However, changes in the present fisheries management regime could impair the refuge effect by changing fishers' behaviour and displacing effort into these areas. PMID:23166635
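
    The model-selection step reported above (choosing among candidate mixed-effects models by AIC) rests on a simple criterion, AIC = 2k - 2 ln L, where k is the number of parameters and ln L the maximised log-likelihood. A sketch with invented fits, purely to show the mechanics:

```python
def aic(k: int, log_likelihood: float) -> float:
    """Akaike Information Criterion: AIC = 2k - 2*lnL (lower is better)."""
    return 2 * k - 2 * log_likelihood

# Hypothetical candidate models: (parameter count, maximised log-likelihood).
candidates = {
    "effort only": (2, -110.0),
    "effort + location": (4, -104.0),
    "effort + location + assemblage": (6, -103.5),
}

scores = {name: aic(k, ll) for name, (k, ll) in candidates.items()}
best = min(scores, key=scores.get)
print(best, round(scores[best], 1))
```

    AIC rewards fit but penalises parameters, which is why the third model, despite its slightly better likelihood, does not win in this invented example.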

  13. Methods for the scientific study of discrimination and health: an ecosocial approach.

    PubMed

    Krieger, Nancy

    2012-05-01

    The scientific study of how discrimination harms health requires theoretically grounded methods. At issue is how discrimination, as one form of societal injustice, becomes embodied inequality and is manifested as health inequities. As clarified by ecosocial theory, methods must address the lived realities of discrimination as an exploitative and oppressive societal phenomenon operating at multiple levels and involving myriad pathways across both the life course and historical generations. An integrated embodied research approach hence must consider (1) the structural level-past and present de jure and de facto discrimination; (2) the individual level-issues of domains, nativity, and use of both explicit and implicit discrimination measures; and (3) how current research methods likely underestimate the impact of racism on health.

  14. The Joy of Playing with Oceanographic Data

    NASA Astrophysics Data System (ADS)

    Smith, A. T.; Xing, Z.; Armstrong, E. M.; Thompson, C. K.; Huang, T.

    2013-12-01

    The web is no longer just an afterthought. It is no longer just a presentation layer filled with HTML, CSS, JavaScript, Frameworks, 3D, and more. It has become the medium of our communication. It is the database of all databases. It is the computing platform of all platforms. It has transformed the way we do science. Web service is the de facto method for communication between machines over the web. Representational State Transfer (REST) has standardized the way we architect services and their interfaces. In the Earth Science domain, we are familiar with tools and services such as Open-Source Project for Network Data Access Protocol (OPeNDAP), Thematic Realtime Environmental Distributed Data Services (THREDDS), and Live Access Server (LAS). We are also familiar with various data formats such as NetCDF3/4, HDF4/5, GRIB, TIFF, etc. One of the challenges for the Earth Science community is accessing information within these data. There are community-accepted readers that our users can download and install. However, the Application Programming Interface (API) between these readers is not standardized, which leads to non-portable applications. Webification (w10n) is an emerging technology, developed at the Jet Propulsion Laboratory, which exploits the hierarchical nature of a science data artifact to assign a URL to each element within the artifact (e.g., a granule file). By embracing standards such as JSON, XML, and HTML5 and predictable URLs, w10n provides a simple interface that enables tool-builders and researchers to develop portable tools/applications to interact with artifacts of various formats. The NASA Physical Oceanographic Distributed Active Archive Center (PO.DAAC) is the designated data center for observational products relevant to the physical state of the ocean.
Over the past year PO.DAAC has been evaluating w10n technology by webifying its archive holdings to provide simplified access to oceanographic science artifacts and as a service to enable future tools and services development. In this talk, we will focus on a w10n-based system called Distributed Oceanographic Webification Service (DOWS) being developed at PO.DAAC to provide a newer and simpler method for working with observational data artifacts. As a continued effort at PO.DAAC to provide better tools and services to visualize our data, the talk will discuss the latest in web-based data visualization tools/frameworks (such as d3.js, Three.js, Leaflet.js, and more) and techniques for working with webified oceanographic science data in both a 2D and 3D web approach.
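
    The core idea behind webification, assigning a URL to each element within a hierarchical artifact, can be sketched in a few lines. This is an illustrative toy, not the actual w10n implementation, and the granule contents are invented:

```python
def webify(node, prefix=""):
    """Yield (url_path, leaf_value) pairs for every leaf in a nested dict,
    mirroring how a hierarchical artifact maps onto predictable URLs."""
    for key, value in node.items():
        path = f"{prefix}/{key}"
        if isinstance(value, dict):
            yield from webify(value, path)  # recurse into groups
        else:
            yield path, value               # leaf: addressable element

# A toy granule: group -> attribute, all values invented for illustration.
granule = {"sst": {"units": "kelvin", "mean": 288.4}, "time": {"units": "days"}}

urls = dict(webify(granule))
print(sorted(urls))
```

    Once every element has a predictable path, a client can fetch just the piece it needs over HTTP instead of downloading and parsing the whole file, which is the portability gain the abstract describes.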

  15. Melanie II--a third-generation software package for analysis of two-dimensional electrophoresis images: I. Features and user interface.

    PubMed

    Appel, R D; Palagi, P M; Walther, D; Vargas, J R; Sanchez, J C; Ravier, F; Pasquali, C; Hochstrasser, D F

    1997-12-01

    Although two-dimensional electrophoresis (2-DE) computer analysis software packages have existed ever since 2-DE technology was developed, it is only now that the hardware and software technology allows large-scale studies to be performed on low-cost personal computers or workstations, and that setting up a 2-DE computer analysis system in a small laboratory is no longer considered a luxury. After a first attempt in the seventies and early eighties to develop 2-DE analysis software systems on hardware that had poor or even no graphical capabilities, followed in the late eighties by a wave of innovative software developments that were possible thanks to new graphical interface standards such as XWindows, a third generation of 2-DE analysis software packages has now come to maturity. It can be run on a variety of low-cost, general-purpose personal computers, thus making the purchase of a 2-DE analysis system easily attainable for even the smallest laboratory that is involved in proteome research. Melanie II 2-D PAGE, developed at the University Hospital of Geneva, is such a third-generation software system for 2-DE analysis. Based on unique image processing algorithms, this user-friendly object-oriented software package runs on multiple platforms, including Unix, MS-Windows 95 and NT, and Power Macintosh. It provides efficient spot detection and quantitation, state-of-the-art image comparison, statistical data analysis facilities, and is Internet-ready. Linked to proteome databases such as those available on the World Wide Web, it represents a valuable tool for the "Virtual Lab" of the post-genome era.

  16. The Army vs. the People: The Opposition of the Soviet Military to Baltic Independence

    DTIC Science & Technology

    1991-04-01

    untouched. General Fyodor Kuz’min, the commander of the Baltic Military District, became Gorbachev’s de facto "Military Governor", the only powerful and... Sindrome’, Neustavshchinye i Natsional’noi Aspekte Voinskoi Sluzhby" [Youth Try on Overcoats: on the ’Occupation Syndrome’, Non-Regulation Behavior, and...that despite the 4 V. Sein, "Komu na Ruku ’Antiarmeiskyi Sindrom’?" [Who is Responsible for the ’Anti-Army Syndrome’?], Sovetskaya Latvia, July 5, 1989

  17. Modular analysis of biological networks.

    PubMed

    Kaltenbach, Hans-Michael; Stelling, Jörg

    2012-01-01

    The analysis of complex biological networks has traditionally relied on decomposition into smaller, semi-autonomous units such as individual signaling pathways. With the increased scope of systems biology (models), rational approaches to modularization have become an important topic. With increasing acceptance of de facto modularity in biology, widely different definitions of what constitutes a module have sparked controversies. Here, we therefore review prominent classes of modular approaches based on formal network representations. Despite some promising research directions, several important theoretical challenges remain open on the way to formal, function-centered modular decompositions for dynamic biological networks.

  18. Regulations which govern the availability of therapeutic drugs in Australia.

    PubMed

    Shaw, J

    1988-10-01

    This article looks at the history and development of drug regulation in Australia, concentrating on the changes that have occurred in the last few years. It touches briefly on the system by which many drugs are subsidized and examines the way in which a de facto "restricted" list became less restricted. It deals with clinical trial approval procedures as these have changed markedly and turns to consider advertising and promotion of drugs. In concluding, it discusses the advantages and some of the shortcomings of the system.

  19. What about ex post facto

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McKee, H.C.

    This note presents a legal question in an attempt to find an answer. The author hopes that some legal scholar can explain the principle of ex post facto in the US Constitution as it applies to some of the environmental laws. One of the major advantages of the Association is that it provides a forum for communication among members with different backgrounds and experience. An ex post facto law is one that is applied retroactively. Under such a law, a person could be punished for some act that violated a law passed after the act was committed, even though the act was legal when committed. The US Constitution prohibits such laws, as will be discussed later. States are also prohibited from enacting ex post facto laws.

  20. Quality of life is significantly associated with survival in women with advanced epithelial ovarian cancer: An ancillary data analysis of the NRG Oncology/Gynecologic Oncology Group (GOG-0218) study.

    PubMed

    Phippen, N T; Secord, A A; Wolf, S; Samsa, G; Davidson, B; Abernethy, A P; Cella, D; Havrilesky, L J; Burger, R A; Monk, B J; Leath, C A

    2017-10-01

    To evaluate the association of baseline quality of life (QOL) and changes in QOL, measured by the FACT-O Trial Outcome Index (TOI), with progression-free survival (PFS) and overall survival (OS) in advanced epithelial ovarian cancer (EOC). Patients enrolled in GOG-0218 with completed FACT-O TOI assessments at baseline and at least one follow-up assessment were eligible. Baseline FACT-O TOI scores were sorted by quartiles (Q1-4) and outcomes compared between Q1 and Q2-4 with the log-rank statistic and multivariate Cox regression adjusting for age, stage, post-surgical residual disease size, and performance status (PS). Trends in FACT-O TOI scores from baseline to the latest follow-up assessment were evaluated for impact on intragroup (Q1 or Q2-4) outcome by log-rank analysis. Of 1152 eligible patients, 283 formed Q1 and 869 formed Q2-4. Mean baseline FACT-O TOI scores were 47.5 for Q1 vs. 74.7 for Q2-4 (P<0.001). Q1 compared to Q2-4 had worse median OS (37.5 vs. 45.6 months, P=0.001) and worse median PFS (12.5 vs. 13.1 months, P=0.096). Q2-4 patients had decreased risks of disease progression (HR 0.974, 95% CI 0.953-0.995, P=0.018) and death (HR 0.963, 95% CI 0.939-0.987, P=0.003) for each five-point increase in baseline FACT-O TOI. Improving versus worsening trends in FACT-O TOI scores were associated with longer median PFS (Q1: 12.7 vs. 8.6 months, P=0.001; Q2-4: 16.7 vs. 11.1 months, P<0.001) and median OS (Q1: 40.8 vs. 16 months, P<0.001; Q2-4: 54.4 vs. 33.6 months, P<0.001). Baseline FACT-O TOI scores were independently prognostic of PFS and OS, while improving compared to worsening QOL was associated with significantly better PFS and OS in women with EOC. Copyright © 2017 Elsevier Inc. All rights reserved.

  1. De facto Privatization and Inequalities in Educational Opportunity in the Transition to Secondary School in Rural Malawi.

    PubMed

    Grant, Monica J

    2017-09-01

    There has been a recent, rapid de facto privatization of education in many African countries, as the number of private secondary schools operating in the region grew. The majority of these schools are "low-cost" private schools where tuition and fees are set as low as possible to cover operating costs and still generate profit. Proponents of low-cost private schools argue that these schools have proliferated in impoverished areas to meet unmet demand for access to education and where private schools may offer better quality than locally available public schools. Theories of inequality of educational opportunity suggest that if private schools offer better quality education, students from more advantaged families will be more likely to enroll at these institutions, potentially exacerbating educational inequality in the region. This analysis uses data from a school-based longitudinal survey, the Malawi Schooling and Adolescent Study, to examine socio-economic inequalities in the transition to secondary school and on-time enrollment in upper secondary. My findings indicate that youth from non-poor households are not only more likely to enroll in secondary school than poor youth, but they are also more likely to substitute enrollment in private schools for enrollment in second-tier government schools. Enrollment at private schools, however, does not yield schooling advantages; relative to both tiers of government secondary schooling, students who initially enrolled at private schools were the least likely to enroll on time in upper secondary school. These patterns suggest that these schooling circumstances may yield less segregation of opportunity than might otherwise be assumed.

  2. Applying rapid 'de-facto' HTA in resource-limited settings: experience from Romania.

    PubMed

    Lopert, Ruth; Ruiz, Francis; Chalkidou, Kalipso

    2013-10-01

    In attempting to constrain healthcare expenditure growth, health technology assessment (HTA) can enable policy-makers to look beyond budget impact and facilitate more rational decision-making. However, lack of technical capacity and poor governance can limit its use in some countries. Undertaking de facto HTA by adapting decisions taken in countries with established processes is a method that may be applied while building domestic HTA capacity. We explored the potential for applying this approach in Romania. As part of a review of the basic health benefits available to insured Romanians, we examined the listing process and content of the Romanian drug reimbursement formulary. We assessed value for money indirectly by drawing on appraisals by the UK's NICE and, for products considered cost-effective in the UK, adjusting prices by the ratio of Romanian per capita GDP to UK per capita GDP. We found more than 30 of the top 50 medicines on the Romanian formulary unlikely to be cost-effective, suggesting that existing external reference pricing mechanisms may not be delivering good value for money. While this method does not take local costs or treatment patterns into account, in the absence of local value-for-money assessments it offers a guide for both drug selection and pricing. Until robust local HTA processes are established, this approach could support further analysis of existing prices and pricing mechanisms. Applied more generally, it is arguably preferable to external reference pricing, product delisting or arbitrary price cuts, and may support the future development of more rigorous, evidence-based decision-making. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
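
    The adjustment the authors describe is a simple scaling of a reference price by the ratio of per-capita GDPs. A minimal sketch, using illustrative figures rather than the paper's data (the function name and numbers are hypothetical):

    ```python
    def gdp_adjusted_price(uk_price: float, gdp_per_capita_local: float,
                           gdp_per_capita_uk: float) -> float:
        """Scale a UK reference price by the ratio of per-capita GDPs."""
        return uk_price * (gdp_per_capita_local / gdp_per_capita_uk)

    # Illustrative figures only: a medicine priced at 100 in the UK, with
    # local per-capita GDP of 9,000 vs. 39,000 in the UK.
    print(round(gdp_adjusted_price(100.0, 9_000, 39_000), 2))  # 23.08
    ```

    A product priced above this benchmark would be flagged as unlikely to be cost-effective under the indirect assessment the authors apply.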

  3. Household headship and child nutrition: a case study in western Kenya.

    PubMed

    Onyango, A; Tucker, K; Eisemon, T

    1994-12-01

    The effect of female household headship on child nutrition has been hypothesized by some to be negative, based on the assumption that mothers alone will be poorer and will have greater demands on their time and resources. On the other hand, there is some evidence in Kenya that the nutritional status of children of female heads may be better than that of children of male heads, suggesting that when women have more control over resources, more goes to the children. In Kenya, de facto female headship is common due to male labor migration. This study examines the relationship between child nutrition and de facto female vs male household headship in western Kenya through the examination of family income and decision making patterns. Women in male-headed households had greater financial responsibility for household maintenance. Female heads assumed more farming responsibilities and had higher remittances from husbands. Husbands of female heads purchased food and other goods in the city for use by the household. Male headed households produced more food crops and used a larger proportion of them for home consumption. In this study, children of female heads consumed a greater variety of foods. Despite a greater prevalence of stunting, there was a lower prevalence of low weight for age among children of female heads. However, in statistical analyses, headship did not relate significantly to nutritional intake or status. In attempting to understand the possible factors influencing the relationship between headship and nutritional status, we found trade-offs in the ways families were coping, which appeared to balance some of the negative and positive effects of each situation.

  4. Methods for the Scientific Study of Discrimination and Health: An Ecosocial Approach

    PubMed Central

    2012-01-01

    The scientific study of how discrimination harms health requires theoretically grounded methods. At issue is how discrimination, as one form of societal injustice, becomes embodied inequality and is manifested as health inequities. As clarified by ecosocial theory, methods must address the lived realities of discrimination as an exploitative and oppressive societal phenomenon operating at multiple levels and involving myriad pathways across both the life course and historical generations. An integrated embodied research approach hence must consider (1) the structural level—past and present de jure and de facto discrimination; (2) the individual level—issues of domains, nativity, and use of both explicit and implicit discrimination measures; and (3) how current research methods likely underestimate the impact of racism on health. PMID:22420803

  5. Electromagnetic gauge as an integration condition: De Broglie's argument revisited and expanded

    NASA Astrophysics Data System (ADS)

    Costa de Beauregard, O.

    1992-12-01

    Einstein's mass-energy equivalence law, argues de Broglie, by fixing the zero of the potential energy of a system, ipso facto selects a gauge in electromagnetism. We examine how this works in electrostatics and in magnetostatics and bring in, as a “trump card,” the familiar, but highly peculiar, system consisting of a toroidal magnet m and a current coil c, where none of the mutual energy W resides in the vacuum. We propose the principle of a crucial test for measuring the fractions of W residing in m and in c; if the latter is nonzero, the (fieldless) vector potential has physicality. Also, using induction for transferring energy from the magnet to a superconducting current, we prove that W is equipartitioned between m and c.

  6. Development of telescope control system for the 50cm telescope of UC Observatory Santa Martina

    NASA Astrophysics Data System (ADS)

    Shen, Tzu-Chiang; Soto, Ruben; Reveco, Johnny; Vanzi, Leonardo; Fernández, Jose M.; Escarate, Pedro; Suc, Vincent

    2012-09-01

    The main telescope of the UC Observatory Santa Martina is a 50cm optical telescope donated by ESO to Pontificia Universidad Catolica de Chile. During the past years the telescope has been refurbished and used as the main facility for testing and validating new instruments under construction by the center of Astro-Engineering UC. As part of this work, the need to develop a more efficient and flexible control system arose. The new distributed control system has been developed on top of the Internet Communication Engine (ICE), a framework developed by ZeroC Inc. This framework features a lightweight but powerful and flexible inter-process communication infrastructure and provides bindings to classic and modern programming languages such as C/C++, Java, C#, Ruby, and Objective-C. The result of this work shows ICE as a real alternative to CORBA and other de facto distributed programming frameworks. A classical control software architecture has been chosen, comprising an observation control system (OCS), the orchestrator of the observation, which controls the telescope control system (TCS) and the detector control system (DCS). The real-time control and monitoring system is deployed and running on ARM-based single-board computers. Other features, such as logging and configuration services, have been developed as well. Inter-operation with other major astronomical control frameworks is foreseen in order to achieve smooth integration of instruments when they are integrated into the main observatories in northern Chile.
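
    The OCS/TCS/DCS layering described above can be sketched in a few lines. This is a hypothetical in-process illustration of the orchestration pattern only; in the real system the OCS holds ZeroC Ice proxies to remote subsystem servants rather than local objects, and all class and method names below are invented:

    ```python
    class Subsystem:
        """Common interface every controlled subsystem implements."""
        def handle(self, command: str) -> str:
            raise NotImplementedError

    class TelescopeControlSystem(Subsystem):
        def handle(self, command: str) -> str:
            return f"TCS: {command} done"

    class DetectorControlSystem(Subsystem):
        def handle(self, command: str) -> str:
            return f"DCS: {command} done"

    class ObservationControlSystem:
        """Orchestrates an observation by dispatching to its subsystems."""
        def __init__(self) -> None:
            self.subsystems = {"tcs": TelescopeControlSystem(),
                               "dcs": DetectorControlSystem()}

        def observe(self, target: str, exposure_s: float) -> list[str]:
            # Point the telescope first, then start the detector exposure.
            return [self.subsystems["tcs"].handle(f"slew to {target}"),
                    self.subsystems["dcs"].handle(f"expose {exposure_s}s")]

    print(ObservationControlSystem().observe("HD 12345", 30.0))
    ```

    The value of a middleware like ICE is that each of these objects can live in a separate process (or on a separate ARM board) while the orchestration code stays essentially unchanged.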

  7. Economic Analysis Model Evaluation for Technology Modernization Programs.

    DTIC Science & Technology

    1983-09-01

    program and ’ expost - facto ’ utilized the ASD model to evaluate the accuracy of the ASD model. The following chapter reviews the literature on Tech Mod and...was applied to the CAR 80 Tech Mod project ’ expost - facto ’ in order to develop an ASD model estimated rate o . return. The ASD model was applied in...ASD estimate to achieve a range for the actual IRR. The ASD model was applied expost - facto to the CAR 80 Tech Mod program to derive an IRR. Initial data

  8. Cleanup liability and the Constitution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friedland, D.M.; Hagen, P.E.

    It was observed in the July 1992 issue of this Journal that a plain reading of the Constitution's prohibition on "ex post facto" laws suggests that some environmental statutes such as the Comprehensive Environmental Response, Compensation, and Liability Act of 1980 (CERCLA or Superfund) and the Resource Conservation and Recovery Act of 1976 (RCRA) conflict with Constitutional principles. Like many Constitutional principles, however, the Supreme Court's interpretation of the Constitution's bar on ex post facto laws has a long history. The Court has consistently interpreted this clause as limited to criminal or penal statutes. This article discusses the history of the ex post facto clause, the retroactive application of CERCLA and RCRA, the decision that retroactive application of CERCLA and RCRA does not violate the ex post facto clause, and laws, regulations, and guidance. 27 refs.

  9. Xi-cam: a versatile interface for data visualization and analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pandolfi, Ronald J.; Allan, Daniel B.; Arenholz, Elke

    Xi-cam is an extensible platform for data management, analysis and visualization. Xi-cam aims to provide a flexible and extensible approach to synchrotron data treatment as a solution to rising demands for high-volume/high-throughput processing pipelines. The core of Xi-cam is an extensible plugin-based graphical user interface platform which provides users with an interactive interface to processing algorithms. Plugins are available for SAXS/WAXS/GISAXS/GIWAXS, tomography and NEXAFS data. With Xi-cam's 'advanced' mode, data processing steps are designed as a graph-based workflow, which can be executed live, locally or remotely. Remote execution utilizes high-performance computing or de-localized resources, allowing for the effective reduction of high-throughput data. Xi-cam's plugin-based architecture targets cross-facility and cross-technique collaborative development, in support of multi-modal analysis. Xi-cam is open-source and cross-platform, and available for download on GitHub.

  10. Xi-cam: a versatile interface for data visualization and analysis

    DOE PAGES

    Pandolfi, Ronald J.; Allan, Daniel B.; Arenholz, Elke; ...

    2018-05-31

    Xi-cam is an extensible platform for data management, analysis and visualization. Xi-cam aims to provide a flexible and extensible approach to synchrotron data treatment as a solution to rising demands for high-volume/high-throughput processing pipelines. The core of Xi-cam is an extensible plugin-based graphical user interface platform which provides users with an interactive interface to processing algorithms. Plugins are available for SAXS/WAXS/GISAXS/GIWAXS, tomography and NEXAFS data. With Xi-cam's 'advanced' mode, data processing steps are designed as a graph-based workflow, which can be executed live, locally or remotely. Remote execution utilizes high-performance computing or de-localized resources, allowing for the effective reduction of high-throughput data. Xi-cam's plugin-based architecture targets cross-facility and cross-technique collaborative development, in support of multi-modal analysis. Xi-cam is open-source and cross-platform, and available for download on GitHub.
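
    The graph-based workflow idea can be sketched minimally: operations are nodes, dependency edges determine execution order, and each node consumes its predecessors' outputs. This is an illustration of the general pattern only, not Xi-cam's actual API; all names and the toy operations are invented:

    ```python
    from graphlib import TopologicalSorter

    def run_workflow(ops, deps, inputs):
        """Execute ops in dependency order; each op receives its deps' outputs.

        ops:    mapping node name -> callable
        deps:   mapping node name -> tuple of predecessor node names
        inputs: mapping of raw-input node names to their data
        """
        results = dict(inputs)
        for name in TopologicalSorter(deps).static_order():
            if name in results:          # raw input, nothing to compute
                continue
            args = [results[d] for d in deps.get(name, ())]
            results[name] = ops[name](*args)
        return results

    # Toy two-step pipeline: normalize a trace, then integrate it.
    ops = {"normalize": lambda raw: [x / max(raw) for x in raw],
           "integrate": lambda norm: sum(norm)}
    deps = {"normalize": ("raw",), "integrate": ("normalize",)}
    out = run_workflow(ops, deps, {"raw": [1.0, 2.0, 4.0]})
    print(out["integrate"])  # 1.75
    ```

    Because the graph is data, the same workflow description can be dispatched to a local thread or a remote HPC resource, which is what makes the "executed live, locally or remotely" claim practical.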

  11. Progress toward Modular UAS for Geoscience Applications

    NASA Astrophysics Data System (ADS)

    Dahlgren, R. P.; Clark, M. A.; Comstock, R. J.; Fladeland, M.; Gascot, H., III; Haig, T. H.; Lam, S. J.; Mazhari, A. A.; Palomares, R. R.; Pinsker, E. A.; Prathipati, R. T.; Sagaga, J.; Thurling, J. S.; Travers, S. V.

    2017-12-01

    Small Unmanned Aerial Systems (UAS) have become accepted tools for geoscience, ecology, agriculture, disaster response, land management, and industry. A variety of consumer UAS options exist as science and engineering payload platforms, but their incompatibilities with one another contribute to high operational costs compared with those of piloted aircraft. This research explores the concept of modular UAS, demonstrating airframes that can be reconfigured in the field for experimental optimization, to enable multi-mission support, facilitate rapid repair, or respond to changing field conditions. Modular UAS is revolutionary in allowing aircraft to be optimized around the payload, reversing the conventional wisdom of designing the payload to accommodate an unmodifiable aircraft. UAS that are reconfigurable like Legos™ are ideal for airborne science service providers, system integrators, instrument designers and end users to fulfill a wide range of geoscience experiments. Modular UAS facilitate the adoption of open-source software and rapid prototyping technology where design reuse is important in the context of a highly regulated industry like aerospace. The industry is now at a stage where consolidation, acquisition, and attrition will reduce the number of small manufacturers, with a reduction of innovation and motivation to reduce costs. Modularity leads to interface specifications, which can evolve into de facto or formal standards which contain minimum (but sufficient) details such that multiple vendors can then design to those standards and demonstrate interoperability. At that stage, vendor coopetition leads to robust interface standards, interoperability standards and multi-source agreements which in turn drive costs down significantly.

  12. Extra dimensions: 3d and time in pdf documentation

    NASA Astrophysics Data System (ADS)

    Graf, N. A.

    2008-07-01

    High energy physics is replete with multi-dimensional information which is often poorly represented by the two dimensions of presentation slides and print media. Past efforts to disseminate such information to a wider audience have failed for a number of reasons, including a lack of standards which are easy to implement and have broad support. Adobe's Portable Document Format (PDF) has in recent years become the de facto standard for secure, dependable electronic information exchange. It has done so by creating an open format, providing support for multiple platforms and being reliable and extensible. By providing support for the ECMA standard Universal 3D (U3D) file format in its free Adobe Reader software, Adobe has made it easy to distribute and interact with 3D content. By providing support for scripting and animation, temporal data can also be easily distributed to a wide audience. In this talk, we present examples of HEP applications which take advantage of this functionality. We demonstrate how 3D detector elements can be documented, using either CAD drawings or other sources such as GEANT visualizations as input. Using this technique, higher dimensional data, such as LEGO plots or time-dependent information can be included in PDF files. In principle, a complete event display, with full interactivity, can be incorporated into a PDF file. This would allow the end user not only to customize the view and representation of the data, but to access the underlying data itself.

  13. Extra Dimensions: 3D and Time in PDF Documentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Graf, Norman A.; /SLAC

    2011-11-10

    High energy physics is replete with multi-dimensional information which is often poorly represented by the two dimensions of presentation slides and print media. Past efforts to disseminate such information to a wider audience have failed for a number of reasons, including a lack of standards which are easy to implement and have broad support. Adobe's Portable Document Format (PDF) has in recent years become the de facto standard for secure, dependable electronic information exchange. It has done so by creating an open format, providing support for multiple platforms and being reliable and extensible. By providing support for the ECMA standard Universal 3D (U3D) file format in its free Adobe Reader software, Adobe has made it easy to distribute and interact with 3D content. By providing support for scripting and animation, temporal data can also be easily distributed to a wide audience. In this talk, we present examples of HEP applications which take advantage of this functionality. We demonstrate how 3D detector elements can be documented, using either CAD drawings or other sources such as GEANT visualizations as input. Using this technique, higher dimensional data, such as LEGO plots or time-dependent information can be included in PDF files. In principle, a complete event display, with full interactivity, can be incorporated into a PDF file. This would allow the end user not only to customize the view and representation of the data, but to access the underlying data itself.

  14. Information-Flow-Based Access Control for Web Browsers

    NASA Astrophysics Data System (ADS)

    Yoshihama, Sachiko; Tateishi, Takaaki; Tabuchi, Naoshi; Matsumoto, Tsutomu

    The emergence of Web 2.0 technologies such as Ajax and Mashup has revealed the weaknesses of the same-origin policy[1], the current de facto standard for the Web browser security model. We propose a new browser security model that allows fine-grained access control in client-side Web applications, for secure mashups and user-generated content. The model is based on information-flow-based access control (IBAC), to overcome the dynamic nature of client-side Web applications and to accurately determine the privilege of scripts in the event-driven programming model.
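
    The core idea of information-flow-based access control can be sketched with a toy label model: every value carries the set of origins that influenced it, labels join when values are combined, and a sink may read a value only if it is trusted for every origin in the label. This is a hypothetical illustration of the general IBAC principle, not the authors' browser mechanism; all names and origins below are invented:

    ```python
    class Labeled:
        """A value tagged with the set of origins that influenced it."""
        def __init__(self, value, origins):
            self.value, self.origins = value, frozenset(origins)

    def combine(a: Labeled, b: Labeled, op) -> Labeled:
        # Label join: flows from both inputs taint the result.
        return Labeled(op(a.value, b.value), a.origins | b.origins)

    def can_read(sink_trusted: set, v: Labeled) -> bool:
        # A sink may observe a value only if it is trusted for all its origins.
        return v.origins <= sink_trusted

    page = Labeled("secret-token", {"https://bank.example"})
    gadget = Labeled("widget", {"https://ads.example"})
    mixed = combine(page, gadget, lambda x, y: x + y)
    print(can_read({"https://bank.example"}, mixed))  # False: tainted by the ad origin
    ```

    The dynamic, event-driven setting the authors target is precisely where such labels help: a script's effective privilege is computed from the actual flows at run time rather than from the single origin of the enclosing page.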

  15. Fourth Generation Undersea Warfare: Getting C2 for Undersea Connectivity Right

    DTIC Science & Technology

    2013-06-01

    information dominance (ID) will not be realized unless undersea connectivity issues are given the same priority as the vehicles and sensors themselves. Said another way, failure to establish an effective C2 architecture in this degraded and austere environment will be a de facto surrender to the adversary’s attempts to deny access. One way in which the U.S. Navy has moved forward to address this issue is by changing the programmatic organization that supports this vehicle and sensor integration. As Executive Agent for the Navy’s undersea connectivity efforts,

  16. A qualitative study of relationships among parenting strategies, social capital, the juvenile justice system, and mental health care for at-risk African American male youth.

    PubMed

    Richardson, Joseph B; Brakle, Mischelle Van

    2011-10-01

    For many poor, African American families living in the inner city, the juvenile justice system has become a de facto mental health service provider. In this article, longitudinal, ethnographic study methods were used to examine how resource-deprived, inner-city parents in a New York City community relied on the juvenile justice system to provide their African American male children with mental health care resources. The results of three case studies indicate that this strategy actually contributed to an escalation in delinquency among the youth.

  17. Dataset for forensic analysis of B-tree file system.

    PubMed

    Wani, Mohamad Ahtisham; Bhat, Wasim Ahmad

    2018-06-01

    Since the B-tree file system (Btrfs) is set to become the de facto standard file system on Linux (and Linux-based) operating systems, a Btrfs dataset for forensic analysis is of great interest and immense value to the forensic community. This article presents a novel dataset for forensic analysis of Btrfs that was collected using a proposed data-recovery procedure. The dataset identifies various generalized and common file system layouts and operations, specific node-balancing mechanisms triggered, logical addresses of various data structures, on-disk records, recovered data in the form of directory entries and extent data from leaf and internal nodes, and the percentage of data recovered.

  18. Political life and half-life: the future formulation of nuclear waste public policy in the United States.

    PubMed

    Leroy, David

    2006-11-01

    The United States continues to need forward-thinking and revised public policy to assure safe nuclear waste disposal. Both the high- and low-level disposal plans enacted by Congress in the 1980s have been frustrated by practical and political interventions. In the interim, ad hoc solutions and temporary fixes have emerged as de facto policy. Future statutory, regulatory, and administrative guidance will likely be less bold, more narrowly focused, and adopted more informally at lower levels of government, in contrast to the top-down, statutory policies of the 1980s.

  19. Social Semantics for an Effective Enterprise

    NASA Technical Reports Server (NTRS)

    Berndt, Sarah; Doane, Mike

    2012-01-01

    An evolution of the Semantic Web, the Social Semantic Web (s2w), facilitates knowledge sharing with "useful information based on human contributions, which gets better as more people participate." The s2w reaches beyond the search box to move us from a collection of hyperlinked facts to meaningful, real-time context. When focused through the lens of Enterprise Search, the Social Semantic Web facilitates the fluid transition of meaningful business information from the source to the user. It is the confluence of human thought and computer processing, structured through the iterative application of taxonomies, folksonomies, ontologies, and metadata schemas. The importance and nuances of human interaction are often de-emphasized when the focus is on automatic generation of semantic markup, which results in dissatisfied users and unrealized return on investment. Users consistently qualify the value of information sets through the act of selection, making them the de facto stakeholders of the Social Semantic Web. Employers are the ultimate beneficiaries of s2w utilization, gaining a better informed, more decisive workforce; one achieved not with an IT miracle technology, but by improved human-computer interactions. Johnson Space Center Taxonomist Sarah Berndt and Mike Doane, principal owner of Term Management, LLC, discuss the planning, development, and maintenance stages for components of a semantic system while emphasizing the necessity of a Social Semantic Web for the Enterprise. Identification of risks and variables associated with layering the successful implementation of a semantic system is also modeled.

  20. Studying Axon-Astrocyte Functional Interactions by 3D Two-Photon Ca2+ Imaging: A Practical Guide to Experiments and "Big Data" Analysis.

    PubMed

    Savtchouk, Iaroslav; Carriero, Giovanni; Volterra, Andrea

    2018-01-01

    Recent advances in fast volumetric imaging have enabled rapid generation of large amounts of multi-dimensional functional data. While many computer frameworks exist for data storage and analysis of multi-gigabyte Ca2+ imaging experiments in neurons, they are less useful for analyzing Ca2+ dynamics in astrocytes, where transients do not follow a predictable spatio-temporal distribution pattern. In this manuscript, we provide a detailed protocol and commentary for recording and analyzing three-dimensional (3D) Ca2+ transients through time in GCaMP6f-expressing astrocytes of adult brain slices in response to axonal stimulation, using our recently developed tools to perform interactive exploration, filtering, and time-correlation analysis of the transients. In addition to the protocol, we release our in-house software tools and discuss parameters pertinent to conducting axonal stimulation/response experiments across various brain regions and conditions. Our software tools are available from the Volterra Lab webpage at https://wwwfbm.unil.ch/dnf/group/glia-an-active-synaptic-partner/member/volterra-andrea-volterra in the form of software plugins for ImageJ (NIH), a de facto standard in scientific image analysis. Three programs are available: MultiROI_TZ_profiler for interactive graphing of several movable ROIs simultaneously, Gaussian_Filter5D for Gaussian filtering in several dimensions, and Correlation_Calculator for computing various cross-correlation parameters on voxel collections through time.
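
    The kind of time-correlation the Correlation_Calculator plugin computes can be sketched in pure Python as a lagged Pearson correlation between two traces. This is a minimal illustration only; the actual plugin operates on voxel collections and offers more parameters, and the traces below are invented:

    ```python
    from math import sqrt

    def pearson(x, y):
        """Pearson correlation coefficient of two equal-length sequences."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = sqrt(sum((a - mx) ** 2 for a in x))
        sy = sqrt(sum((b - my) ** 2 for b in y))
        return cov / (sx * sy)

    def cross_correlation(x, y, lag):
        """Pearson correlation of x against y shifted by `lag` samples."""
        if lag >= 0:
            return pearson(x[lag:], y[:len(y) - lag])
        return cross_correlation(y, x, -lag)

    axon = [0, 1, 3, 1, 0, 0]
    astro = [0, 0, 1, 3, 1, 0]   # same transient, delayed by one frame
    print(cross_correlation(astro, axon, 1))  # peaks at 1.0 at a one-frame lag
    ```

    Scanning the lag at which the correlation peaks is one way to estimate the delay between an axonal stimulus and the astrocytic Ca2+ response.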

  1. Challenges for Rule Systems on the Web

    NASA Astrophysics Data System (ADS)

    Hu, Yuh-Jong; Yeh, Ching-Long; Laun, Wolfgang

    The RuleML Challenge started in 2007 with the objective of highlighting issues in the implementation, management, integration, interoperation and interchange of rules in an open distributed environment such as the Web. Rules are usually classified into three types: deductive rules, normative rules, and reactive rules. Reactive rules are further classified as ECA rules and production rules. The study of combining rules and ontologies traces back to earlier active rule systems for relational and object-oriented (OO) databases. Recently, this issue has become one of the most important research problems in the Semantic Web. Once we consider a computer-executable policy as a declarative set of rules and ontologies that guides the behavior of entities within a system, we have a flexible way to implement real-world policies without rewriting computer code, as we did before. Fortunately, we have de facto rule markup languages, such as RuleML or RIF, to achieve portability and interchange of rules between different rule systems. Otherwise, executing real-life rule-based applications on the Web is almost impossible. Several commercial or open-source rule engines are available for rule-based applications. However, we still need a standard rule language and benchmarks, not only to compare rule systems but also to measure progress in the field. Finally, a number of real-life rule-based use cases will be investigated to demonstrate the applicability of current rule systems on the Web.
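
    The production-rule category mentioned above can be illustrated with a toy forward-chaining engine: rules fire whenever their conditions are satisfied by the current fact base, adding consequents until a fixpoint is reached. This is a hypothetical sketch of the general mechanism, not any RuleML/RIF engine; real systems use interchange formats and far richer condition languages, and the example rules are invented:

    ```python
    def forward_chain(facts: set, rules):
        """Fire rules until fixpoint.

        rules: list of (condition_set, consequent) pairs; a rule fires when
        all facts in condition_set are present and its consequent is not.
        """
        facts = set(facts)
        changed = True
        while changed:
            changed = False
            for condition, consequent in rules:
                if condition <= facts and consequent not in facts:
                    facts.add(consequent)
                    changed = True
        return facts

    # A tiny declarative policy: benefits eligibility implies enrollment.
    rules = [({"employee", "full_time"}, "eligible_for_benefits"),
             ({"eligible_for_benefits"}, "enroll_in_plan")]
    print(forward_chain({"employee", "full_time"}, rules))
    ```

    Expressing the policy as data (rather than code) is what allows it to be changed, interchanged, or benchmarked without rewriting the engine, which is the motivation behind RuleML and RIF.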

  2. The Dancer, the Sculptor, and the Astronomer: Science and Aesthetics at the Fin de Siècle

    NASA Astrophysics Data System (ADS)

    Wells, G.

    2013-04-01

    The latter part of the 19th century was a time of remarkable imaginative invention and profound intellectual insight. Increasingly bold pronouncements from the sciences energized and inspired the arts, and in turn the arts provided a de facto language with which to celebrate a new understanding of nature. Scientists and artists found common ground in the spirit of adventure that characterized their respective disciplines. The popularization of science depended upon the ability of artists to find the appropriate visual metaphors with which to express the ideas of the age. This is the case with the friendship between Loie Fuller, Auguste Rodin, and Camille Flammarion. The relationship between the dancer, the sculptor, and the astronomer represented the intersection of science and aesthetics in the public and professional spheres.

  3. Perceived stigma and social support in treatment for pharmaceutical opioid dependence.

    PubMed

    Cooper, Sasha; Campbell, Gabrielle; Larance, Briony; Murnion, Bridin; Nielsen, Suzanne

    2018-02-01

    The dramatic increase in pharmaceutical opioid (PO) use in high-income countries is a growing public health concern. Stigma and social support are important as they may influence treatment uptake and outcomes, yet few studies exist regarding perceived stigma and social support among people with PO dependence. The aims of the study are to: (i) compare characteristics of those with PO dependence from iatrogenic and non-iatrogenic causes; (ii) document perceived stigma and its correlates in people in treatment for PO dependence; and (iii) examine correlates of social support in people in treatment for PO dependence. This prospective cohort study included 108 PO-dependent people referred from treatment services. Telephone interviews were conducted at baseline, 3, 12 and 24 months. Multivariate linear regression was used to examine correlations. Mean age was 41 (SD = 10.5). Half (n = 56, 52%) were female. Two in five met the criteria for iatrogenic dependence (n = 41, 38%), with iatrogenic dependence associated with chronic pain and no history of injection or heroin use. One quarter of study subjects reported past-month unsanctioned opioid use (n = 25, 23%). Being married/de facto or female was associated with higher levels of perceived stigma. Unsanctioned opioid use, iatrogenic dependence and mental health conditions were associated with lower social support. Stigma affects all people in treatment. Those who are married/de facto and female may benefit from interventions to address stigma. The association of low social support with poorer mental health and ongoing substance use indicates that treatment could focus more on this area. © 2017 Australasian Professional Society on Alcohol and other Drugs.

  4. GPU-Accelerated Large-Scale Electronic Structure Theory on Titan with a First-Principles All-Electron Code

    NASA Astrophysics Data System (ADS)

    Huhn, William Paul; Lange, Björn; Yu, Victor; Blum, Volker; Lee, Seyong; Yoon, Mina

    Density-functional theory has been well established as the dominant quantum-mechanical computational method in the materials community. Large, accurate simulations become very challenging on small to mid-scale computers and require high-performance compute platforms to succeed. GPU acceleration is one promising approach. In this talk, we present a first implementation of all-electron density-functional theory in the FHI-aims code for massively parallel GPU-based platforms. Special attention is paid to the update of the density and to the integration of the Hamiltonian and overlap matrices, realized in a domain decomposition scheme on non-uniform grids. The initial implementation scales well across nodes on ORNL's Titan Cray XK7 supercomputer (8 to 64 nodes, 16 MPI ranks/node) and shows an overall 1.4x runtime speedup from utilization of the K20X Tesla GPUs on each Titan node, with the charge density update showing a 2x speedup. Further acceleration opportunities will be discussed. Work supported by the LDRD Program of ORNL, managed by UT-Battelle, LLC, for the U.S. DOE, and by the Oak Ridge Leadership Computing Facility, which is a DOE Office of Science User Facility supported under Contract DE-AC05-00OR22725.

  5. Vertical Interaction in Open Software Engineering Communities

    DTIC Science & Technology

    2009-03-01

    Program in CASOS (NSF, DGE-9972762), the Office of Naval Research under the Dynamic Network Analysis program (N00014-02-1-0973), the Air Force Office of...W91WAW07C0063) for research in the area of dynamic network analysis. Additional support was provided by CASOS - the center for Computational Analysis of Social...methods across the domain. For a given project, developers can choose from dozens of models, tools, platforms, and languages for specification, design

  6. Césarienne à Lubumbashi, République Démocratique du Congo II: facteurs de risque de mortalité maternelle et périnatale [Cesarean section in Lubumbashi, Democratic Republic of the Congo II: risk factors for maternal and perinatal mortality]

    PubMed Central

    Kinenkinda, Xavier; Mukuku, Olivier; Chenge, Faustin; Kakudji, Prosper; Banzulu, Peter; Kakoma, Jean-Baptiste; Kizonde, Justin

    2017-01-01

    Introduction: The objective was to analyze risk factors for maternal and perinatal mortality among women undergoing cesarean section in Lubumbashi, Democratic Republic of Congo (DRC). Methods: We conducted a multicenter study of 3643 cesarean sections performed between 1 January 2009 and 31 December 2013, out of 34199 deliveries in five referral hospitals in Lubumbashi (DRC). Sociodemographic data, indications, obstetrical context, and maternal and perinatal morbidity and mortality were analyzed using Epi Info 2011 software. Frequencies are expressed as percentages and means with their standard deviations. The chi-square test, and Fisher's exact test where appropriate, were used to compare frequencies. Odds ratios with Cornfield 95% confidence intervals were computed in a logistic regression model to determine the strength of risk factors. The significance threshold was set at p < 0.05. Results: The cesarean section rate was 10.65%. The mean age of the women undergoing cesarean section was 28.83 ± 6.8 years (range 14 to 49 years). Parity ranged from 1 to 16, with a mean of 2.6. One in nine of these women (10.9%) had a uterine scar from a previous cesarean section, and 22.3% had been admitted as obstetric evacuations (inter-facility referrals). Maternal and perinatal mortality rates during cesarean section were 1.4% and 7.07%, respectively. Analysis of risk factors shows that grand multiparity (≥5), absence of pregnancy monitoring, and the urgent nature of the surgical indication significantly affect maternal mortality. For perinatal mortality, additional factors were advanced maternal age (>35 years), admission by obstetric evacuation, and fetal immaturity. Conclusion: This study shows that cesarean section under our working conditions is associated with high maternal and perinatal mortality. The risk factors identified are largely preventable, especially since they are, rightly or wrongly, attributed to the operation itself, thereby masking, ipso facto, the often irrational circumstances of its practice. PMID:28690723

  7. Computing Platforms for Big Biological Data Analytics: Perspectives and Challenges.

    PubMed

    Yin, Zekun; Lan, Haidong; Tan, Guangming; Lu, Mian; Vasilakos, Athanasios V; Liu, Weiguo

    2017-01-01

    The last decade has witnessed an explosion in the amount of available biological sequence data, due to the rapid progress of high-throughput sequencing projects. However, the amount of biological data is becoming so great that traditional data analysis platforms and methods can no longer meet the need to rapidly perform data analysis tasks in the life sciences. As a result, both biologists and computer scientists face the challenge of gaining profound insight into the deepest biological functions from big biological data, which in turn requires massive computational resources. High-performance computing (HPC) platforms are therefore needed, along with efficient and scalable algorithms that can take advantage of them. In this paper, we survey state-of-the-art HPC platforms for big biological data analytics. We first list the characteristics of big biological data and popular computing platforms. Then we provide a taxonomy of different biological data analysis applications and a survey of the way they have been mapped onto various computing platforms. After that, we present a case study comparing the efficiency of different computing platforms for handling the classical biological sequence alignment problem. Finally, we discuss the open issues in big biological data analytics.
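    The sequence-alignment kernel used in the survey's case study reduces to a dynamic-programming recurrence; a minimal pure-Python sketch of local (Smith-Waterman) alignment scoring, with illustrative match/mismatch/gap weights chosen here for the example, could look like this:

```python
def smith_waterman_score(a, b, match=2, mismatch=-1, gap=-2):
    """Best local-alignment score between sequences a and b.

    Classic O(len(a) * len(b)) dynamic program, kept to two rows of the
    matrix. HPC platforms accelerate exactly this kind of kernel with
    SIMD, GPUs, or distribution across nodes.
    """
    cols = len(b) + 1
    prev = [0] * cols                          # row i-1 of the DP matrix
    best = 0
    for i in range(1, len(a) + 1):
        curr = [0] * cols                      # row i
        for j in range(1, cols):
            s = match if a[i - 1] == b[j - 1] else mismatch
            curr[j] = max(0,                   # local alignment may restart
                          prev[j - 1] + s,     # diagonal: pair a[i-1], b[j-1]
                          prev[j] + gap,       # gap in b
                          curr[j - 1] + gap)   # gap in a
            best = max(best, curr[j])
        prev = curr
    return best
```

    Production aligners implement the same recurrence but vectorize along anti-diagonals or tile it across accelerators; the survey's platform comparison is largely about how this inner loop is parallelized.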

  8. OWLS as platform technology in OPTOS satellite

    NASA Astrophysics Data System (ADS)

    Rivas Abalo, J.; Martínez Oter, J.; Arruego Rodríguez, I.; Martín-Ortega Rico, A.; de Mingo Martín, J. R.; Jiménez Martín, J. J.; Martín Vodopivec, B.; Rodríguez Bustabad, S.; Guerrero Padrón, H.

    2017-12-01

    The aim of this work is to show the Optical Wireless Links for intra-Spacecraft Communications (OWLS) technology as a platform technology for space missions, and more specifically its use within the on-board communication system of the OPTOS satellite. OWLS technology was proposed by the Instituto Nacional de Técnica Aeroespacial (INTA) at the end of the 1990s and developed over 10 years through a number of ground demonstrations, technological developments and in-orbit experiments. Its main benefits are mass reduction, flexibility, and simplification of the Assembly, Integration and Test phases. The final step was to go from an experimental technology to a platform one. This step was carried out in the OPTOS satellite, which makes use of optical wireless links in a distributed network based on an OWLS implementation of the CAN bus. OPTOS is the first fully wireless satellite. It is based on the triple configuration (3U) of the popular CubeSat standard, and was built entirely at INTA. It was conceived to provide a fast-development, low-cost, yet reliable platform to the Spanish scientific community, acting as a test bed for space-borne science and technology. OPTOS presents a distributed OBDH architecture in which all of the satellite's subsystems and payloads incorporate a small Distributed On-Board Computer (OBC) Terminal (DOT). All DOTs (7 in total) communicate with one another by means of the OWLS-CAN, which enables full data-sharing capabilities. This collaboration allows them to perform all tasks that would normally be carried out by a centralized On-Board Computer.

  9. Bringing your tools to CyVerse Discovery Environment using Docker

    PubMed Central

    Devisetty, Upendra Kumar; Kennedy, Kathleen; Sarando, Paul; Merchant, Nirav; Lyons, Eric

    2016-01-01

    Docker has become a very popular container-based virtualization platform for software distribution that has revolutionized the way in which scientific software and software dependencies (software stacks) can be packaged, distributed, and deployed. Docker makes the complex and time-consuming installation procedures needed for scientific software a one-time process. Because it enables platform-independent installation, versioning of software environments, and easy redeployment and reproducibility, Docker is an ideal candidate for the deployment of identical software stacks on different compute environments such as XSEDE and Amazon AWS. CyVerse’s Discovery Environment also uses Docker for integrating its powerful, community-recommended software tools into CyVerse’s production environment for public use. This paper will help users bring their tools into the CyVerse Discovery Environment (DE), which not only allows users to integrate their tools with relative ease compared to the earlier method of tool deployment in DE, but also helps them share their apps with collaborators and release them for public use. PMID:27803802

  10. Bringing your tools to CyVerse Discovery Environment using Docker.

    PubMed

    Devisetty, Upendra Kumar; Kennedy, Kathleen; Sarando, Paul; Merchant, Nirav; Lyons, Eric

    2016-01-01

    Docker has become a very popular container-based virtualization platform for software distribution that has revolutionized the way in which scientific software and software dependencies (software stacks) can be packaged, distributed, and deployed. Docker makes the complex and time-consuming installation procedures needed for scientific software a one-time process. Because it enables platform-independent installation, versioning of software environments, and easy redeployment and reproducibility, Docker is an ideal candidate for the deployment of identical software stacks on different compute environments such as XSEDE and Amazon AWS. CyVerse's Discovery Environment also uses Docker for integrating its powerful, community-recommended software tools into CyVerse's production environment for public use. This paper will help users bring their tools into the CyVerse Discovery Environment (DE), which not only allows users to integrate their tools with relative ease compared to the earlier method of tool deployment in DE, but also helps them share their apps with collaborators and release them for public use.

  11. Use of Magnetic Resonance Imaging to Monitor Iron Overload

    PubMed Central

    Wood, John C.

    2014-01-01

    SYNOPSIS Treatment of iron overload requires robust estimates of total body iron burden and its response to iron chelation therapy. Compliance with chelation therapy varies considerably among patients and individual reporting is notoriously unreliable. Even with perfect compliance, intersubject variability in chelator effectiveness is extremely high, necessitating reliable iron estimates to guide dose titration. In addition, each chelator has a unique profile with respect to clearing iron stores from different organs. This chapter will present the tools available to clinicians monitoring their patients, focusing on non-invasive magnetic resonance imaging methods because they have become the de-facto standard of care. PMID:25064711

  12. Electrotherapy and mental illness: then and now.

    PubMed

    Gilman, Sander L

    2008-09-01

    Today electrotherapy has reappeared as a therapy of choice for the treatment of depression and other forms of mental illness. It had de facto vanished from allopathic medicine from the 1920s to the end of the century. The debates about electrotherapy mirror the question of whether mental illness was somatic and to be treated by somatic means or psychological to be treated with psychotherapy. Sigmund Freud's move from an advocate to an opponent of electrotherapy is exemplary for a shift in attitude and the decline of electrotherapy. With the re-somaticization of mental illness over the past decades has come the reappearance of somatic therapies such as electrotherapy.

  13. Unification - An international aerospace information issue

    NASA Technical Reports Server (NTRS)

    Cotter, Gladys A.; Lahr, Thomas F.

    1992-01-01

    Scientific and Technical Information (STI) represents the results of large investments in research and development (R&D) and the expertise of a nation and is a valuable resource. For more than four decades, NASA and its predecessor organizations have developed and managed the preeminent aerospace information system. NASA obtains foreign materials through its international exchange relationships, continually increasing the comprehensiveness of the NASA Aerospace Database (NAD). The NAD is de facto the international aerospace database. This paper reviews current NASA goals and activities with a view toward maintaining compatibility among international aerospace information systems, eliminating duplication of effort, and sharing resources through international cooperation wherever possible.

  14. Where Is the Evidence for "Evidence-Based" Therapy?

    PubMed

    Shedler, Jonathan

    2018-06-01

    The term evidence-based therapy is a de facto code word for manualized therapy, most often brief cognitive behavior therapy and its variants. It is widely asserted that "evidence-based" therapy is scientifically proven, superior to other forms of psychotherapy, and the gold standard of care. Research findings do not support such assertions. Research on evidence-based therapies demonstrates that they are weak treatments. They have not shown superiority to other forms of psychotherapy, few patients get well, and treatment benefits do not last. Questionable research practices create a distorted picture of the actual benefits of these therapies. Copyright © 2018 Jonathan Shedler. Published by Elsevier Inc. All rights reserved.

  15. Assessing the need for a medical respite: perceptions of service providers and homeless persons.

    PubMed

    Biederman, Donna J; Gamble, Julia; Manson, Marigny; Taylor, Destry

    2014-01-01

    For homeless persons, posthospitalization care is increasingly provided in formal medical respite programs, and their success is now reported in the literature. However, there is a dearth of literature on posthospitalization transitional care for homeless persons in the absence of a respite program. Through this formative study, we sought to understand the process of securing posthospitalization care in the absence of a formal homeless medical respite. Results demonstrated a de facto patchwork respite process that has emerged. We describe both human and monetary costs associated with patchwork respite and demonstrate opportunities for improvement in homeless health care transitions.

  16. Psychotherapeutic issues with "kinky" clients: clinical problems, yours and theirs.

    PubMed

    Nichols, Margaret

    2006-01-01

    People whose sexual repertoire includes BDSM, fetish, or other "kinky" practices have become increasingly visible, on the Internet, in the real world, and in psychotherapists' offices. Unfortunately, the prevailing psychiatric view of BDSM remains a negative one: These sexual practices are usually considered paraphilias, i.e., de facto evidence of pathology. A different, affirming view of BDSM is taken in this paper. After defining BDSM and reviewing common misconceptions, a variety of issues the practitioner will face are described. These include problems of countertransference, of working with people with newly emerging sexual identities, working with spouses and partners, and discriminating between abuse and sexual "play."

  17. CSB: a Python framework for structural bioinformatics.

    PubMed

    Kalev, Ivan; Mechelke, Martin; Kopec, Klaus O; Holder, Thomas; Carstens, Simeon; Habeck, Michael

    2012-11-15

    Computational Structural Biology Toolbox (CSB) is a cross-platform Python class library for reading, storing and analyzing biomolecular structures with rich support for statistical analyses. CSB is designed for reusability and extensibility and comes with a clean, well-documented API following good object-oriented engineering practice. Stable release packages are available for download from the Python Package Index (PyPI) as well as from the project's website http://csb.codeplex.com. ivan.kalev@gmail.com or michael.habeck@tuebingen.mpg.de

  18. SU-D-BRD-01: Cloud-Based Radiation Treatment Planning: Performance Evaluation of Dose Calculation and Plan Optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Na, Y; Kapp, D; Kim, Y

    2014-06-01

    Purpose: To report the first experience on the development of a cloud-based treatment planning system and investigate the performance improvement of dose calculation and treatment plan optimization on the cloud computing platform. Methods: A cloud computing-based radiation treatment planning system (cc-TPS) was developed for clinical treatment planning. Three de-identified clinical head and neck, lung, and prostate cases were used to evaluate the cloud computing platform. The de-identified clinical data were encrypted with the 256-bit Advanced Encryption Standard (AES) algorithm. VMAT and IMRT plans were generated for the three de-identified clinical cases to determine the quality of the treatment plans and the computational efficiency. All plans generated from the cc-TPS were compared to those obtained with the PC-based TPS (pc-TPS). The performance evaluation of the cc-TPS was quantified as the speedup factors for Monte Carlo (MC) dose calculations and large-scale plan optimizations, as well as the performance ratios (PRs) of the amount of performance improvement compared to the pc-TPS. Results: Speedup factors improved up to 14.0-fold, depending on the clinical case and plan type. The computation times for VMAT and IMRT plans with the cc-TPS were reduced by 91.1% and 89.4%, respectively, averaged over the clinical cases, compared to those with the pc-TPS. The PRs were mostly better for VMAT plans (1.0 ≤ PRs ≤ 10.6 for the head and neck case, 1.2 ≤ PRs ≤ 13.3 for the lung case, and 1.0 ≤ PRs ≤ 10.3 for the prostate cases) than for IMRT plans. The isodose curves of plans on both cc-TPS and pc-TPS were identical for each of the clinical cases. Conclusion: A cloud-based treatment planning system has been set up, and our results demonstrate that the computational efficiency of treatment planning with the cc-TPS can be dramatically improved while maintaining plan quality identical to that obtained with the pc-TPS.
    This work was supported in part by the National Cancer Institute (1R01 CA133474) and by the Leading Foreign Research Institute Recruitment Program through the National Research Foundation of Korea (NRF), funded by the Ministry of Science, ICT and Future Planning (MSIP) (Grant No. 2009-00420).
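    The reported metrics relate in a simple way: a fractional reduction in computation time determines the speedup factor, and vice versa. A small illustrative helper (not part of the study's software) makes the relationship explicit:

```python
def speedup(t_reference, t_accelerated):
    """Speedup factor: how many times faster the accelerated run is."""
    return t_reference / t_accelerated

def time_reduction_pct(t_reference, t_accelerated):
    """Percent reduction in runtime relative to the reference run."""
    return 100.0 * (1.0 - t_accelerated / t_reference)

# A 91.1% average reduction (the VMAT figure above) corresponds to a
# speedup of 1 / (1 - 0.911), roughly 11.2x; the 14.0-fold best case
# corresponds to a 92.9% reduction.
```

    This is why "up to 14.0-fold" and "reduced by 91.1% on average" are consistent statements about the same measurements.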

  19. Distributed Processing of Sentinel-2 Products using the BIGEARTH Platform

    NASA Astrophysics Data System (ADS)

    Bacu, Victor; Stefanut, Teodor; Nandra, Constantin; Mihon, Danut; Gorgan, Dorian

    2017-04-01

    The constellation of observational satellites orbiting the Earth is constantly growing, providing more data that need to be processed in order to extract meaningful information and knowledge. The Sentinel-2 satellites, part of the Copernicus Earth Observation program, are intended for use in agriculture, forestry and many other land management applications. ESA's SNAP toolbox can be used to process data gathered by the Sentinel-2 satellites, but it is limited to the resources of a stand-alone computer. In this paper we present a cloud-based software platform that uses this toolbox, together with other remote sensing software applications, to process Sentinel-2 products. The BIGEARTH software platform [1] offers an integrated solution for processing Earth Observation data coming from different sources (such as satellites or on-site sensors). The flow of processing is defined as a chain of tasks based on the WorDeL description language [2]. Each task can rely on a different software technology (such as GRASS GIS and ESA's SNAP) to process the input data. An important feature of the BIGEARTH platform is this ability to interconnect and integrate various well-known software technologies within the same processing flow. The integration is transparent from the user's perspective. The proposed platform extends SNAP's capabilities by enabling specialists to easily scale processing over distributed architectures, according to their specific needs and resources. The software platform [3] can be used in multiple configurations. In the basic one, the platform runs as a standalone application inside a virtual machine. The computational resources are then limited, but this configuration gives an overview of the platform's functionality and allows a processing flow to be defined and later executed on a more complex infrastructure.
    The most complex and robust configuration is based on cloud computing and allows installation on a private or public cloud infrastructure. In this configuration, processing resources can be dynamically allocated, and the execution time can be considerably improved depending on the available virtual resources and the number of parallelizable sequences in the processing flow. The presentation highlights the benefits and issues of the proposed solution by analyzing some significant experimental use cases. Main references for further information: [1] BigEarth project, http://cgis.utcluj.ro/projects/bigearth [2] Constantin Nandra, Dorian Gorgan: "Defining Earth data batch processing tasks by means of a flexible workflow description language", ISPRS Ann. Photogramm. Remote Sens. Spatial Inf. Sci., III-4, 59-66, (2016). [3] Victor Bacu, Teodor Stefanut, Dorian Gorgan, "Adaptive Processing of Earth Observation Data on Cloud Infrastructures Based on Workflow Description", Proceedings of the Intelligent Computer Communication and Processing (ICCP), IEEE-Press, pp.444-454, (2015).

  20. Enhancing the Selection of Backoff Interval Using Fuzzy Logic over Wireless Ad Hoc Networks

    PubMed Central

    Ranganathan, Radha; Kannan, Kathiravan

    2015-01-01

    IEEE 802.11 is the de facto standard for medium access over wireless ad hoc network. The collision avoidance mechanism (i.e., random binary exponential backoff—BEB) of IEEE 802.11 DCF (distributed coordination function) is inefficient and unfair especially under heavy load. In the literature, many algorithms have been proposed to tune the contention window (CW) size. However, these algorithms make every node select its backoff interval between [0, CW] in a random and uniform manner. This randomness is incorporated to avoid collisions among the nodes. But this random backoff interval can change the optimal order and frequency of channel access among competing nodes which results in unfairness and increased delay. In this paper, we propose an algorithm that schedules the medium access in a fair and effective manner. This algorithm enhances IEEE 802.11 DCF with additional level of contention resolution that prioritizes the contending nodes according to its queue length and waiting time. Each node computes its unique backoff interval using fuzzy logic based on the input parameters collected from contending nodes through overhearing. We evaluate our algorithm against IEEE 802.11, GDCF (gentle distributed coordination function) protocols using ns-2.35 simulator and show that our algorithm achieves good performance. PMID:25879066
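    For context, the random binary exponential backoff (BEB) that the proposed fuzzy scheme refines can be sketched as follows. This is an illustrative model of the standard DCF mechanism, not the paper's algorithm; the contention-window bounds are the common 802.11 values:

```python
import random

CW_MIN = 15     # initial contention window (in slots), a typical 802.11 DCF value
CW_MAX = 1023   # upper bound on the contention window after repeated collisions

def beb_backoff(retries, rng=random):
    """Draw a uniform backoff in [0, CW], with CW doubling on each collision.

    retries: number of consecutive collisions seen for the pending frame.
    """
    cw = min(CW_MAX, (CW_MIN + 1) * (2 ** retries) - 1)
    return rng.randint(0, cw)
```

    The unfairness the authors target stems from this uniform draw: a freshly arrived node can pick a smaller slot than one that has already waited through several collisions. Their enhancement instead ranks contenders by queue length and waiting time via fuzzy logic before selecting the backoff interval.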

  1. PKI-based secure mobile access to electronic health services and data.

    PubMed

    Kambourakis, G; Maglogiannis, I; Rouskas, A

    2005-01-01

    Recent research has examined the potential employment of public-key cryptography schemes in e-health environments. In such systems, where a Public Key Infrastructure (PKI) is established beforehand, Attribute Certificates (ACs) and public-key-enabled protocols like TLS can provide the appropriate mechanisms to effectively support authentication, authorization and confidentiality services. In other words, mutual trust and secure communications between all the stakeholders, namely physicians, patients and e-health service providers, can be successfully established and maintained. Furthermore, as recently introduced mobile devices with access to computer-based patient record systems proliferate, physicians and nurses increasingly need to interact with such systems. Considering public key infrastructure requirements for mobile online health networks, this paper discusses the potential use of Attribute Certificates (ACs) in an anticipated trust model. Typical trust interactions among doctors, patients and e-health providers are presented, indicating that resourceful security mechanisms and trust control can be obtained and implemented. The application of attribute certificates to support medical mobile service provision, along with the utilization of the de-facto TLS protocol to offer competent confidentiality and authorization services, is also presented and evaluated through experimentation using both 802.11 WLAN and General Packet Radio Service (GPRS) networks.
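    The TLS side of such a deployment still begins the same way today: the client verifies the server's certificate chain and hostname before any health data flows. A minimal sketch with Python's standard ssl module (illustrative only; the paper's 2005 stack predates these APIs, and attribute-certificate authorization would sit on top of this transport layer):

```python
import ssl

def make_client_context():
    """Build a TLS client context that verifies the server certificate and
    hostname against the system trust store, as required before exchanging
    confidential health records."""
    ctx = ssl.create_default_context(ssl.Purpose.SERVER_AUTH)
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse legacy protocol versions
    return ctx
```

    In use, one would wrap a connected socket with `ctx.wrap_socket(sock, server_hostname=...)`; the handshake fails closed if the certificate chain or hostname check fails.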

  2. Permutation modulation for quantization and information reconciliation in CV-QKD systems

    NASA Astrophysics Data System (ADS)

    Daneshgaran, Fred; Mondin, Marina; Olia, Khashayar

    2017-08-01

    This paper is focused on the problem of Information Reconciliation (IR) for continuous variable Quantum Key Distribution (QKD). The main problem is quantization and assignment of labels to the samples of the Gaussian variables observed at Alice and Bob. The trouble is that most of the samples, the Gaussian variable being zero-mean (which is de facto the case), tend to have small magnitudes and are easily disturbed by noise. Transmission over longer and longer distances increases the losses, corresponding to a lower effective Signal-to-Noise Ratio (SNR) and exacerbating the problem. Here we propose to use Permutation Modulation (PM) as a means of quantizing Gaussian vectors at Alice and Bob over a d-dimensional space with d ≫ 1. The goal is to achieve the necessary coding efficiency to extend the achievable range of continuous variable QKD by quantizing over larger and larger dimensions. A fractional bit rate per sample is easily achieved using PM at very reasonable computational cost. Ordered statistics is used extensively throughout the development, from generation of the seed vector in PM to analysis of the error rates associated with the signs of the Gaussian samples at Alice and Bob as a function of the magnitude of the samples observed at Bob.
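    The core PM operation is simple to sketch: the codebook consists of all distinct permutations of one sorted seed vector, and quantizing a vector amounts to permuting the seed into that vector's rank order. The sketch below is a generic illustration of this quantization step, under the assumption of a magnitude-ordered (variant-I style) codebook; the paper's seed design and ordered-statistics sign analysis are beyond it:

```python
def pm_quantize(x, seed):
    """Map x to the permutation of `seed` that matches x's rank order.

    `seed` must be sorted in descending order; its largest component is
    assigned to the position of the largest component of x, and so on.
    """
    assert len(x) == len(seed)
    # Indices of x sorted from largest to smallest value.
    order = sorted(range(len(x)), key=lambda i: x[i], reverse=True)
    q = [0.0] * len(x)
    for rank, pos in enumerate(order):
        q[pos] = seed[rank]
    return q
```

    When the seed contains repeated values, the number of distinct codewords is the multinomial coefficient n!/(m1!·m2!·...), so the rate is log2 of that count divided by n; this is how the fractional bit rates per sample mentioned above arise.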

  3. A privacy-preserving solution for compressed storage and selective retrieval of genomic data.

    PubMed

    Huang, Zhicong; Ayday, Erman; Lin, Huang; Aiyar, Raeka S; Molyneaux, Adam; Xu, Zhenyu; Fellay, Jacques; Steinmetz, Lars M; Hubaux, Jean-Pierre

    2016-12-01

    In clinical genomics, the continuous evolution of bioinformatic algorithms and sequencing platforms makes it beneficial to store patients' complete aligned genomic data in addition to variant calls relative to a reference sequence. Due to the large size of human genome sequence data files (varying from 30 GB to 200 GB depending on coverage), two major challenges facing genomics laboratories are the costs of storage and the efficiency of the initial data processing. In addition, privacy of genomic data is becoming an increasingly serious concern, yet no standard data storage solutions exist that enable compression, encryption, and selective retrieval. Here we present a privacy-preserving solution named SECRAM (Selective retrieval on Encrypted and Compressed Reference-oriented Alignment Map) for the secure storage of compressed aligned genomic data. Our solution enables selective retrieval of encrypted data and improves the efficiency of downstream analysis (e.g., variant calling). Compared with BAM, the de facto standard for storing aligned genomic data, SECRAM uses 18% less storage. Compared with CRAM, one of the most compressed nonencrypted formats (using 34% less storage than BAM), SECRAM maintains efficient compression and downstream data processing, while allowing for unprecedented levels of security in genomic data storage. Compared with previous work, the distinguishing features of SECRAM are that (1) it is position-based instead of read-based, and (2) it allows random querying of a subregion from a BAM-like file in an encrypted form. Our method thus offers a space-saving, privacy-preserving, and effective solution for the storage of clinical genomic data. © 2016 Huang et al.; Published by Cold Spring Harbor Laboratory Press.
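    The storage comparison above reduces to simple arithmetic on the reported percentages; a small illustrative helper (real file sizes of course depend on coverage and content) makes the trade-off concrete:

```python
def expected_sizes(bam_bytes):
    """Approximate file sizes implied by the abstract's figures:
    CRAM is about 34% smaller than BAM, SECRAM about 18% smaller."""
    return {
        "BAM": bam_bytes,
        "CRAM": round(bam_bytes * (1 - 0.34)),
        "SECRAM": round(bam_bytes * (1 - 0.18)),
    }
```

    For a 100 GB BAM this gives roughly 66 GB for CRAM and 82 GB for SECRAM; SECRAM trades some of CRAM's compression for encryption with random, position-based querying of encrypted subregions.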

  4. A privacy-preserving solution for compressed storage and selective retrieval of genomic data

    PubMed Central

    Huang, Zhicong; Ayday, Erman; Lin, Huang; Aiyar, Raeka S.; Molyneaux, Adam; Xu, Zhenyu; Hubaux, Jean-Pierre

    2016-01-01

    In clinical genomics, the continuous evolution of bioinformatic algorithms and sequencing platforms makes it beneficial to store patients’ complete aligned genomic data in addition to variant calls relative to a reference sequence. Due to the large size of human genome sequence data files (varying from 30 GB to 200 GB depending on coverage), two major challenges facing genomics laboratories are the costs of storage and the efficiency of the initial data processing. In addition, privacy of genomic data is becoming an increasingly serious concern, yet no standard data storage solutions exist that enable compression, encryption, and selective retrieval. Here we present a privacy-preserving solution named SECRAM (Selective retrieval on Encrypted and Compressed Reference-oriented Alignment Map) for the secure storage of compressed aligned genomic data. Our solution enables selective retrieval of encrypted data and improves the efficiency of downstream analysis (e.g., variant calling). Compared with BAM, the de facto standard for storing aligned genomic data, SECRAM uses 18% less storage. Compared with CRAM, one of the most compressed nonencrypted formats (using 34% less storage than BAM), SECRAM maintains efficient compression and downstream data processing, while allowing for unprecedented levels of security in genomic data storage. Compared with previous work, the distinguishing features of SECRAM are that (1) it is position-based instead of read-based, and (2) it allows random querying of a subregion from a BAM-like file in an encrypted form. Our method thus offers a space-saving, privacy-preserving, and effective solution for the storage of clinical genomic data. PMID:27789525

  5. Muecas: A Multi-Sensor Robotic Head for Affective Human Robot Interaction and Imitation

    PubMed Central

    Cid, Felipe; Moreno, Jose; Bustos, Pablo; Núñez, Pedro

    2014-01-01

    This paper presents a multi-sensor humanoid robotic head for human robot interaction. The design of the robotic head, Muecas, is based on ongoing research on the mechanisms of perception and imitation of human expressions and emotions. These mechanisms allow direct interaction between the robot and its human companion through the different natural language modalities: speech, body language and facial expressions. The robotic head has 12 degrees of freedom, in a human-like configuration, including eyes, eyebrows, mouth and neck, and has been designed and built entirely by IADeX (Engineering, Automation and Design of Extremadura) and RoboLab. A detailed description of its kinematics is provided along with the design of the most complex controllers. Muecas can be directly controlled by FACS (Facial Action Coding System), the de facto standard for facial expression recognition and synthesis. This feature facilitates its use by third party platforms and encourages the development of imitation and of goal-based systems. Imitation systems learn from the user, while goal-based ones use planning techniques to drive the user towards a final desired state. To show the flexibility and reliability of the robotic head, the paper presents a software architecture that is able to detect, recognize, classify and generate facial expressions in real time using FACS. This system has been implemented using the robotics framework, RoboComp, which provides hardware-independent access to the sensors in the head. Finally, the paper presents experimental results showing the real-time functioning of the whole system, including recognition and imitation of human facial expressions. PMID:24787636
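The FACS-driven control described above can be sketched as a mapping from Action Unit (AU) intensities to joint targets. The AU numbers below are standard FACS codes, but the joint names and gain values are purely illustrative assumptions, not Muecas' actual controller parameters.

```python
# Hypothetical mapping from FACS Action Units to head joint targets;
# the AU numbers are standard FACS, the joints and gains are illustrative.
AU_TO_JOINTS = {
    1:  {"inner_brow": 0.8},    # AU1: inner brow raiser
    2:  {"outer_brow": 0.8},    # AU2: outer brow raiser
    12: {"mouth_corner": 0.6},  # AU12: lip corner puller (smile)
    26: {"jaw": 0.5},           # AU26: jaw drop
}

def expression_to_joint_targets(au_intensities):
    """Combine weighted AU activations (0..1) into joint angle targets."""
    targets = {}
    for au, intensity in au_intensities.items():
        for joint, gain in AU_TO_JOINTS.get(au, {}).items():
            targets[joint] = targets.get(joint, 0.0) + gain * intensity
    return targets

# A smile-like expression with a slightly dropped jaw.
print(expression_to_joint_targets({12: 1.0, 26: 0.4}))
# {'mouth_corner': 0.6, 'jaw': 0.2}
```

Because AUs are additive in this scheme, recognition and synthesis can share one table: recognized AU intensities feed straight back into the joint targets for imitation.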

  6. IPDA PDS4 Project: Towards an International Planetary Data Standard

    NASA Astrophysics Data System (ADS)

    Martinez, Santa; Roatsch, Thomas; Capria, Maria Teresa; Heather, David; Yamamoto, Yukio; Hughes, Steven; Stein, Thomas; Cecconi, Baptiste; Prashar, Ajay; Batanov, Oleg; Gopala Krishna, Barla

    2016-07-01

The International Planetary Data Alliance (IPDA) is an international collaboration of space agencies with the main objective of facilitating discovery, access and use of planetary data managed across international boundaries. For this purpose, the IPDA has adopted NASA's Planetary Data System (PDS) standard as the de-facto archiving standard, and is working towards the internationalisation of the new generation of the standards, called PDS4. PDS4 is the largest upgrade in the history of the PDS, and is a significant step towards an online, distributed, model-driven, service-oriented international archive architecture. Following the successful deployment of PDS4 to support NASA's LADEE and MAVEN missions, PDS4 was endorsed by IPDA in 2014. This has led to the adoption of PDS4 by a number of international space agencies (ESA, JAXA, ISRO and Roscosmos, among others) for their upcoming missions. In order to closely follow the development of the PDS4 standards and to coordinate the international contribution and participation in its evolution, a group of experts from each international agency is dedicated to reviewing different aspects of the standards and to capturing recommendations and requirements to ensure that international needs are met. The activities performed by this group cover the assessment and implementation of all aspects of PDS4, including its use, documentation, tools, validation strategies and information model. This contribution will present the activities carried out by this group and how this partnership between PDS and IPDA provides an excellent foundation towards an international platform for planetary science research.

  7. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    NASA Astrophysics Data System (ADS)

    McCray, Wilmon Wil L., Jr.

The research was prompted by the need to assess the process improvement, quality management, and analytical techniques taught to undergraduate and graduate students in U.S. systems engineering and computing science degree programs (e.g., software engineering, computer science, and information technology) that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods, and how they relate to process performance. Organizations are starting to embrace the Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as de facto process improvement frameworks for improving business process performance. High maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques, together with process performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The study identifies and discusses in detail the gaps in the process improvement and quantitative analysis techniques taught in U.S. systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis identifying the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models.
The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrates the use of statistical methods, statistical process control, sensitivity analysis, quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.
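A Monte Carlo baseline for a satisfaction index can be sketched as follows. The driver distributions and linear weights below are illustrative assumptions, not the ACSI model's published coefficients; the point is only how sampling uncertain inputs yields a baseline mean and spread for the predicted score.

```python
import random
import statistics

random.seed(42)

def simulate_acsi(n_runs=10000):
    """Monte Carlo sketch: propagate uncertainty in hypothetical satisfaction
    drivers (quality, value, expectations) to an index score distribution."""
    scores = []
    for _ in range(n_runs):
        quality = random.gauss(8.0, 0.5)       # illustrative driver distributions
        value = random.gauss(7.5, 0.7)
        expectations = random.gauss(7.0, 0.4)
        # Illustrative linear weights, not the official ACSI coefficients.
        index = 10 * (0.5 * quality + 0.3 * value + 0.2 * expectations)
        scores.append(index)
    return statistics.mean(scores), statistics.stdev(scores)

mean, sd = simulate_acsi()
print(f"baseline index {mean:.1f} +/- {sd:.1f}")
```

Sensitivity analysis then follows naturally: perturb one driver's distribution, re-run the simulation, and compare the shift in the baseline.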

  8. Open Babel: An open chemical toolbox

    PubMed Central

    2011-01-01

    Background A frequent problem in computational modeling is the interconversion of chemical structures between different formats. While standard interchange formats exist (for example, Chemical Markup Language) and de facto standards have arisen (for example, SMILES format), the need to interconvert formats is a continuing problem due to the multitude of different application areas for chemistry data, differences in the data stored by different formats (0D versus 3D, for example), and competition between software along with a lack of vendor-neutral formats. Results We discuss, for the first time, Open Babel, an open-source chemical toolbox that speaks the many languages of chemical data. Open Babel version 2.3 interconverts over 110 formats. The need to represent such a wide variety of chemical and molecular data requires a library that implements a wide range of cheminformatics algorithms, from partial charge assignment and aromaticity detection, to bond order perception and canonicalization. We detail the implementation of Open Babel, describe key advances in the 2.3 release, and outline a variety of uses both in terms of software products and scientific research, including applications far beyond simple format interconversion. Conclusions Open Babel presents a solution to the proliferation of multiple chemical file formats. In addition, it provides a variety of useful utilities from conformer searching and 2D depiction, to filtering, batch conversion, and substructure and similarity searching. For developers, it can be used as a programming library to handle chemical data in areas such as organic chemistry, drug design, materials science, and computational chemistry. It is freely available under an open-source license from http://openbabel.org. PMID:21982300
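The economics that motivate a shared toolkit can be sketched as a hub-and-spoke design: each format registers one reader and one writer against a common in-memory model, so N formats need 2N codecs rather than N*(N-1) pairwise converters. The toy "formats" below are illustrative stand-ins, not Open Babel's API.

```python
# Hub-and-spoke conversion sketch: every format parses into, and serializes
# from, one common molecule model (here just a list of element symbols).
READERS, WRITERS = {}, {}

def register(fmt, reader, writer):
    READERS[fmt], WRITERS[fmt] = reader, writer

def convert(data, src_fmt, dst_fmt):
    molecule = READERS[src_fmt](data)   # parse into the common model
    return WRITERS[dst_fmt](molecule)   # serialize from the common model

# Two toy "formats": comma-separated and line-separated element lists.
register("csv", lambda s: s.split(","), lambda m: ",".join(m))
register("lines", lambda s: s.splitlines(), lambda m: "\n".join(m))

print(convert("C,C,O", "csv", "lines"))  # prints C, C and O on separate lines
```

Adding a new format means writing one reader/writer pair, after which it interconverts with every format already registered; this is the structural reason a single library can support over 110 formats.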

  9. Data sharing platforms for de-identified data from human clinical trials.

    PubMed

    Huser, Vojtech; Shmueli-Blumberg, Dikla

    2018-04-01

Data sharing of de-identified individual participant data is being adopted by an increasing number of sponsors of human clinical trials. In addition to standardizing data syntax for shared trial data, semantic integration of various data elements is the focus of several initiatives that define research common data elements. This perspective article, in the first part, compares several data sharing platforms for de-identified clinical research data in terms of their size, policies and supported features. In the second part, we use a case study approach to describe in greater detail one data sharing platform (Data Share, from the National Institute on Drug Abuse). We present data on the past use of the platform, data formats offered, data de-identification approaches and its use of research common data elements. We conclude with a summary of current and expected future trends that facilitate secondary research use of data from completed human clinical trials.

  10. Computational analyses of spectral trees from electrospray multi-stage mass spectrometry to aid metabolite identification.

    PubMed

    Cao, Mingshu; Fraser, Karl; Rasmussen, Susanne

    2013-10-31

Mass spectrometry coupled with chromatography has become the major technical platform in metabolomics. Aided by peak detection algorithms, the detected signals are characterized by mass-over-charge ratio (m/z) and retention time. Chemical identities often remain elusive for the majority of the signals. Multi-stage mass spectrometry based on electrospray ionization (ESI) allows collision-induced dissociation (CID) fragmentation of selected precursor ions. These fragment ions can assist in structural inference for metabolites of low molecular weight. Computational investigations of fragmentation spectra have increasingly received attention in metabolomics and various public databases house such data. We have developed an R package "iontree" that can capture, store and analyze MS2 and MS3 mass spectral data from high throughput metabolomics experiments. The package includes functions for ion tree construction, an algorithm (distMS2) for MS2 spectral comparison, and tools for building platform-independent ion tree (MS2/MS3) libraries. We have demonstrated the utilization of the package for the systematic analysis and annotation of fragmentation spectra collected in various metabolomics platforms, including direct infusion mass spectrometry, and liquid chromatography coupled with either low resolution or high resolution mass spectrometry. Assisted by the developed computational tools, we have demonstrated that spectral trees can provide informative evidence complementary to retention time and accurate mass to aid with annotating unknown peaks. These experimental spectral trees, once subjected to a quality control process, can be used for querying public MS2 databases or de novo interpretation. The putatively annotated spectral trees can be readily incorporated into reference libraries for routine identification of metabolites.
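A minimal spectral-comparison routine in the spirit of distMS2 (sketched in Python rather than the package's R, and not reproducing the actual algorithm) bins peaks by m/z and scores cosine similarity between the binned intensity vectors:

```python
import math
from collections import defaultdict

def bin_spectrum(peaks, bin_width=0.5):
    """Bin (m/z, intensity) peaks; a hypothetical stand-in for distMS2's
    preprocessing, tolerating small m/z measurement differences."""
    bins = defaultdict(float)
    for mz, intensity in peaks:
        bins[round(mz / bin_width)] += intensity
    return bins

def cosine_similarity(spec_a, spec_b, bin_width=0.5):
    """Cosine of the angle between two binned spectra (1.0 = identical)."""
    a, b = bin_spectrum(spec_a, bin_width), bin_spectrum(spec_b, bin_width)
    dot = sum(a[k] * b[k] for k in a.keys() & b.keys())
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

ms2_a = [(121.1, 100.0), (149.0, 40.0), (165.1, 10.0)]
ms2_b = [(121.1, 90.0), (149.1, 50.0)]
print(round(cosine_similarity(ms2_a, ms2_b), 3))
```

Scores like this, computed pairwise over MS2 nodes, are what allow experimental spectral trees to be matched against library trees.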

  11. Fast multi-core based multimodal registration of 2D cross-sections and 3D datasets.

    PubMed

    Scharfe, Michael; Pielot, Rainer; Schreiber, Falk

    2010-01-11

Solving bioinformatics tasks often requires extensive computational power. Recent trends in processor architecture combine multiple cores into a single chip to improve overall performance. The Cell Broadband Engine (CBE), a heterogeneous multi-core processor, provides power-efficient and cost-effective high-performance computing. One application area is image analysis and visualisation, in particular registration of 2D cross-sections into 3D image datasets. Such techniques can be used to put different image modalities into spatial correspondence, for example, 2D images of histological cuts into morphological 3D frameworks. We evaluate the CBE-driven PlayStation 3 as a high performance, cost-effective computing platform by adapting a multimodal alignment procedure to several characteristic hardware properties. The optimisations are based on partitioning, vectorisation, branch reducing and loop unrolling techniques with special attention to 32-bit multiplies and limited local storage on the computing units. We show how a typical image analysis and visualisation problem, the multimodal registration of 2D cross-sections and 3D datasets, benefits from the multi-core based implementation of the alignment algorithm. We discuss several CBE-based optimisation methods and compare our results to standard solutions. More information and the source code are available from http://cbe.ipk-gatersleben.de. The results demonstrate that the CBE processor in a PlayStation 3 accelerates computationally intensive multimodal registration, which is of great importance in biological/medical image processing. The PlayStation 3 as a low cost CBE-based platform offers an efficient alternative to conventional hardware to solve computational problems in image processing and bioinformatics.
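Multimodal registration typically scores candidate alignments with an intensity-based similarity measure, and mutual information is a common choice when the modalities have different intensity characteristics. A compact, unaccelerated sketch of that measure (the paper's CBE-specific optimisations are omitted):

```python
import math
from collections import Counter

def mutual_information(img_a, img_b, levels=8):
    """Histogram-based mutual information between two equally sized images
    (flattened intensity lists with values in [0, 1)). Higher values mean
    the intensities predict each other better, i.e. a better alignment."""
    qa = [min(int(v * levels), levels - 1) for v in img_a]
    qb = [min(int(v * levels), levels - 1) for v in img_b]
    n = len(qa)
    joint = Counter(zip(qa, qb))          # joint intensity histogram
    pa, pb = Counter(qa), Counter(qb)     # marginal histograms
    mi = 0.0
    for (a, b), count in joint.items():
        p_ab = count / n
        mi += p_ab * math.log(p_ab * n * n / (pa[a] * pb[b]))
    return mi

fixed = [i / 16 for i in range(16)]
aligned = [i / 16 for i in range(16)]                 # perfectly aligned copy
scrambled = [((i * 7) % 16) / 16 for i in range(16)]  # misaligned permutation
print(mutual_information(fixed, aligned) > mutual_information(fixed, scrambled))
```

A registration loop would evaluate this score over candidate transforms of the 2D slice and keep the maximiser; the histogram inner loops are exactly the kind of work that benefits from the partitioning and vectorisation described above.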

  12. GenomeVIP: a cloud platform for genomic variant discovery and interpretation

    PubMed Central

    Mashl, R. Jay; Scott, Adam D.; Huang, Kuan-lin; Wyczalkowski, Matthew A.; Yoon, Christopher J.; Niu, Beifang; DeNardo, Erin; Yellapantula, Venkata D.; Handsaker, Robert E.; Chen, Ken; Koboldt, Daniel C.; Ye, Kai; Fenyö, David; Raphael, Benjamin J.; Wendl, Michael C.; Ding, Li

    2017-01-01

    Identifying genomic variants is a fundamental first step toward the understanding of the role of inherited and acquired variation in disease. The accelerating growth in the corpus of sequencing data that underpins such analysis is making the data-download bottleneck more evident, placing substantial burdens on the research community to keep pace. As a result, the search for alternative approaches to the traditional “download and analyze” paradigm on local computing resources has led to a rapidly growing demand for cloud-computing solutions for genomics analysis. Here, we introduce the Genome Variant Investigation Platform (GenomeVIP), an open-source framework for performing genomics variant discovery and annotation using cloud- or local high-performance computing infrastructure. GenomeVIP orchestrates the analysis of whole-genome and exome sequence data using a set of robust and popular task-specific tools, including VarScan, GATK, Pindel, BreakDancer, Strelka, and Genome STRiP, through a web interface. GenomeVIP has been used for genomic analysis in large-data projects such as the TCGA PanCanAtlas and in other projects, such as the ICGC Pilots, CPTAC, ICGC-TCGA DREAM Challenges, and the 1000 Genomes SV Project. Here, we demonstrate GenomeVIP's ability to provide high-confidence annotated somatic, germline, and de novo variants of potential biological significance using publicly available data sets. PMID:28522612
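The orchestration pattern described here, running several task-specific callers and merging their calls with provenance, can be sketched with stand-in functions. Real callers such as VarScan or GATK are external programs; the stubs below are hypothetical and exist only to show the merge logic.

```python
# Hypothetical stand-ins for external variant callers; a real orchestrator
# would launch the tools as containerized jobs and parse their VCF output.
def varscan_stub(region):
    return {(region, 101, "A", "T")}

def gatk_stub(region):
    return {(region, 101, "A", "T"), (region, 250, "G", "C")}

CALLERS = {"varscan": varscan_stub, "gatk": gatk_stub}

def orchestrate(region):
    """Run each caller on a region and merge calls, recording which
    callers support each variant (useful for consensus filtering)."""
    merged = {}
    for name, caller in CALLERS.items():
        for variant in caller(region):
            merged.setdefault(variant, set()).add(name)
    return merged

for variant, support in sorted(orchestrate("chr1").items()):
    print(variant, sorted(support))
```

Variants supported by multiple callers can then be promoted to high confidence, which is the usual payoff of running a tool ensemble rather than a single caller.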

  13. Maritime Analytics Prototype: Final Development Report

    DTIC Science & Technology

    2014-04-01

...support for the open source authentication and access management platform OpenAM, support for multiple instances of the same type of widget, and support for installation-specific configuration files to...

  14. Energy Consumption Management of Virtual Cloud Computing Platform

    NASA Astrophysics Data System (ADS)

    Li, Lin

    2017-11-01

Research on energy consumption management for virtual cloud computing platforms requires a deeper understanding of how both virtual machines and the underlying cloud platform consume energy; only then can the problems facing energy consumption management be solved. The central problem is the high energy consumption of data centers, which creates a real need for new scientific techniques. Virtualization technology and cloud computing have become powerful tools in everyday life, work and production because of their strengths and many advantages, including very high resource utilization, and both are developing rapidly, making them indispensable in the constantly developing information age. This paper summarizes, explains and further analyzes the energy consumption management issues of virtual cloud computing platforms, giving readers a clearer understanding of energy consumption management on such platforms.

  15. Atomic layer deposition of metal oxide by non-aqueous sol-gel chemistry =

    NASA Astrophysics Data System (ADS)

    Marichy, Catherine

The work presented in this manuscript was developed within the doctoral programme entitled "Atomic Layer Deposition (ALD) of metal oxides by non-aqueous sol-gel". The objective of this work was the preparation of functional heterostructures by ALD and their characterization. A new process for depositing tin oxide at low-to-moderate temperature was developed using a non-aqueous ALD method, and was successfully applied to the controlled coating of the inner and outer walls of carbon nanotubes. Since the preparation of functional nanomaterials requires high accuracy in the deposition process, the precise deposition of films that conform to the shape of the substrate, as well as of nanostructured films consisting of particles, was demonstrated on various substrates. In addition, several metal oxides were deposited with great accuracy on carbon nanotubes, and it was shown that the ALD coating can be tuned by controlling the surface functionalization of the nanostructured carbon substrate. The resulting heterostructures were subsequently applied as gas sensors. The observed improvement in sensitivity was attributed to the formation of p-n heterojunctions between the metal oxide film and the support. The work aimed at controlling the ALD coating through functionalization of the support surface is certainly of interest for the design of functional heterostructures based on carbon substrates. Indeed, during the final period of the doctoral programme, this concept was extended to the functionalization and metal oxide coating of carbon fibers prepared by electrospinning, in order to improve the stability and electrocatalytic activity of Pt-based catalysts. 
This work was carried out mostly at the University of Aveiro, and partly at Seoul National University, and benefited from several international collaborations owing to the multidisciplinary nature of the research area in which it is embedded.

  16. Xyce Parallel Electronic Simulator Users' Guide Version 6.8

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keiter, Eric R.; Aadithya, Karthik Venkatraman; Mei, Ting

This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator, and has been written to support the simulation needs of the Sandia National Laboratories electrical designers. This development has focused on improving capability over the current state-of-the-art in the following areas: capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors), including support for most popular parallel and serial computers; a differential-algebraic-equation (DAE) formulation, which better isolates the device model package from solver algorithms and allows one to develop new types of analysis without requiring the implementation of analysis-specific device models; device models that are specifically tailored to meet Sandia's needs, including some radiation-aware devices (for Sandia users only); and object-oriented code design and implementation using modern coding practices. Xyce is a parallel code in the most general sense of the phrase (a message passing parallel implementation), which allows it to run efficiently on a wide range of computing platforms, including serial, shared-memory and distributed-memory parallel platforms. Attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows.

  17. Toward a web-based real-time radiation treatment planning system in a cloud computing environment.

    PubMed

    Na, Yong Hum; Suh, Tae-Suk; Kapp, Daniel S; Xing, Lei

    2013-09-21

To exploit the potential dosimetric advantages of intensity modulated radiation therapy (IMRT) and volumetric modulated arc therapy (VMAT), an in-depth approach is required to provide efficient computing methods. This needs to incorporate clinically related organ specific constraints, Monte Carlo (MC) dose calculations, and large-scale plan optimization. This paper describes our first steps toward a web-based real-time radiation treatment planning system in a cloud computing environment (CCE). The Amazon Elastic Compute Cloud (EC2) with a master node (named m2.xlarge containing 17.1 GB of memory, two virtual cores with 3.25 EC2 Compute Units each, 420 GB of instance storage, 64-bit platform) is used as the backbone of cloud computing for dose calculation and plan optimization. The master node is able to scale the workers on an 'on-demand' basis. MC dose calculation is employed to generate accurate beamlet dose kernels by parallel tasks. The intensity modulation optimization uses total-variation regularization (TVR) and generates piecewise constant fluence maps for each initial beam direction in a distributed manner over the CCE. The optimized fluence maps are segmented into deliverable apertures. The shape of each aperture is iteratively rectified to be a sequence of arcs using the manufacturer's constraints. The output plan file from the EC2 is sent to the simple storage service. Three de-identified clinical cancer treatment plans have been studied for evaluating the performance of the new planning platform with 6 MV flattening filter free beams (40 × 40 cm(2)) from the Varian TrueBeam(TM) STx linear accelerator. A CCE leads to speed-ups of up to 14-fold for both dose kernel calculations and plan optimizations in the head and neck, lung, and prostate cancer cases considered in this study. The proposed system relies on a CCE that is able to provide an infrastructure for parallel and distributed computing. 
The resultant plans from cloud computing are identical to PC-based IMRT and VMAT plans, confirming the reliability of the cloud computing platform. This cloud computing infrastructure has been established for radiation treatment planning. It substantially improves the speed of inverse planning and makes future on-treatment adaptive re-planning possible.
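Total-variation regularization favors piecewise constant solutions, which is why it suits fluence maps. A minimal 1D subgradient-descent sketch of the idea (not the paper's optimizer, and ignoring all delivery constraints) shows the smoothing effect on a noisy signal:

```python
def tv_denoise_1d(signal, lam=0.3, step=0.05, iters=2000):
    """Subgradient descent on 0.5*||x - y||^2 + lam * TV(x), where
    TV(x) = sum |x[i+1] - x[i]|. The TV term drives neighbouring values
    together, producing piecewise constant output."""
    x = list(signal)
    for _ in range(iters):
        grad = [x[i] - signal[i] for i in range(len(x))]  # data fidelity term
        for i in range(len(x) - 1):                       # TV subgradient
            d = x[i + 1] - x[i]
            s = (d > 0) - (d < 0)                         # sign(d)
            grad[i] -= lam * s
            grad[i + 1] += lam * s
        x = [x[i] - step * grad[i] for i in range(len(x))]
    return x

def total_variation(x):
    return sum(abs(x[i + 1] - x[i]) for i in range(len(x) - 1))

noisy = [0.1, -0.05, 0.12, 1.1, 0.92, 1.05, 0.0, 0.08, -0.1]
smooth = tv_denoise_1d(noisy)
print(total_variation(smooth) < total_variation(noisy))
```

Piecewise constant fluence maps translate into fewer, larger deliverable apertures after segmentation, which is the practical reason TVR is used in this planning pipeline.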

  18. Toward a web-based real-time radiation treatment planning system in a cloud computing environment

    NASA Astrophysics Data System (ADS)

    Hum Na, Yong; Suh, Tae-Suk; Kapp, Daniel S.; Xing, Lei

    2013-09-01

To exploit the potential dosimetric advantages of intensity modulated radiation therapy (IMRT) and volumetric modulated arc therapy (VMAT), an in-depth approach is required to provide efficient computing methods. This needs to incorporate clinically related organ specific constraints, Monte Carlo (MC) dose calculations, and large-scale plan optimization. This paper describes our first steps toward a web-based real-time radiation treatment planning system in a cloud computing environment (CCE). The Amazon Elastic Compute Cloud (EC2) with a master node (named m2.xlarge containing 17.1 GB of memory, two virtual cores with 3.25 EC2 Compute Units each, 420 GB of instance storage, 64-bit platform) is used as the backbone of cloud computing for dose calculation and plan optimization. The master node is able to scale the workers on an ‘on-demand’ basis. MC dose calculation is employed to generate accurate beamlet dose kernels by parallel tasks. The intensity modulation optimization uses total-variation regularization (TVR) and generates piecewise constant fluence maps for each initial beam direction in a distributed manner over the CCE. The optimized fluence maps are segmented into deliverable apertures. The shape of each aperture is iteratively rectified to be a sequence of arcs using the manufacturer’s constraints. The output plan file from the EC2 is sent to the simple storage service. Three de-identified clinical cancer treatment plans have been studied for evaluating the performance of the new planning platform with 6 MV flattening filter free beams (40 × 40 cm2) from the Varian TrueBeamTM STx linear accelerator. A CCE leads to speed-ups of up to 14-fold for both dose kernel calculations and plan optimizations in the head and neck, lung, and prostate cancer cases considered in this study. The proposed system relies on a CCE that is able to provide an infrastructure for parallel and distributed computing. 
The resultant plans from cloud computing are identical to PC-based IMRT and VMAT plans, confirming the reliability of the cloud computing platform. This cloud computing infrastructure has been established for radiation treatment planning. It substantially improves the speed of inverse planning and makes future on-treatment adaptive re-planning possible.

  19. FORCEnet Net Centric Architecture - A Standards View

    DTIC Science & Technology

    2006-06-01

USER-FACING SERVICES; SHARED SERVICES; NETWORKING/COMMUNICATIONS; STORAGE; COMPUTING PLATFORM; DATA INTERCHANGE/INTEGRATION; DATA MANAGEMENT; APPLICATION; SERVICE PLATFORM; SERVICE FRAMEWORK

  20. Homicide-suicide in Victoria, Australia.

    PubMed

    Milroy, C M; Dratsas, M; Ranson, D L

    1997-12-01

    Thirty-nine incidents of homicide-suicide occurring in Victoria, Australia between 1985 and 1989 were examined. In 33 cases the assailants were men. The victims were spouses or women living in a de facto marriage. The majority of the victims were shot, and this was also the most frequent method of suicide. Breakdown in a relationship was the most frequent reason for killing. Mental illness of the assailant accounted for the killing in approximately 20% of cases. Physical ill health and financial stress were identified as important associative factors, particularly in the elderly. The pattern of homicide-suicide in Victoria is similar to that observed in other jurisdictions and represents an important and distinct subgroup of homicide.

  1. The Transition from Paper to Digital: Lessons for Medical Specialty Societies

    PubMed Central

    Miller, Donald W.

    2008-01-01

    Medical specialty societies often serve their membership by publishing paper forms that may simultaneously include practice guidelines, dataset specifications, and suggested layouts. Many times these forms become de facto standards for the specialty but transform poorly to the logic, structure, preciseness, and flexibility needed in modern electronic medical records. This paper analyzes one such form - a prenatal record published by the American College of Obstetricians and Gynecologists - with the intent to elucidate lessons for other specialty societies who might craft their recommendations to be effectively incorporated within modern electronic medical records. Lessons learned include separating datasets from guidelines/recommendations, specifying, codifying, and qualifying atomic data elements, and leaving graphic design to professionals. PMID:18998856

  2. Health plans and selection: formal risk adjustment vs. market design and contracts.

    PubMed

    Frank, R G; Rosenthal, M B

    2001-01-01

    In this paper, we explore the demand for risk adjustment by health plans that contract with private employers by considering the conditions under which plans might value risk adjustment. Three factors reduce the value of risk adjustment from the plans' point of view. First, only a relatively small segment of privately insured Americans face a choice of competing health plans. Second, health plans share much of their insurance risk with payers, providers, and reinsurers. Third, de facto experience rating that occurs during the premium negotiation process and management of coverage appear to substitute for risk adjustment. While the current environment has not generated much demand for risk adjustment, we reflect on its future potential.

  3. Somatic Embryogenesis in Two Orchid Genera (Cymbidium, Dendrobium).

    PubMed

    da Silva, Jaime A Teixeira; Winarto, Budi

    2016-01-01

    The protocorm-like body (PLB) is the de facto somatic embryo in orchids. Here we describe detailed protocols for two orchid genera (hybrid Cymbidium Twilight Moon 'Day Light' and Dendrobium 'Jayakarta', D. 'Gradita 31', and D. 'Zahra FR 62') for generating PLBs. These protocols will most likely have to be tweaked for different cultivars as the response of orchids in vitro tends to be dependent on genotype. In addition to primary somatic embryogenesis, secondary (or repetitive) somatic embryogenesis is also described for both genera. The use of thin cell layers as a sensitive tissue assay is outlined for hybrid Cymbidium while the protocol outlined is suitable for bioreactor culture of D. 'Zahra FR 62'.

  4. [Sensible cooperation in urology].

    PubMed

    Jonitz, H

    2006-08-01

    The main features of the reform of the healthcare system disclosed by the grand coalition on 4 July 2006 include, among other points, annulment of all budgets, payment for services rendered at fixed euro rates, and introduction of complex flat rates starting in 2009. The direct and medium-term consequences involve establishment of a health fund, but also drawing on tax money, e.g., to cofinance underage children. In addition, hospital outpatient departments in specialty fields are to be completely opened. All of these measures can lead to a marked reduction of specialized practices. All in all, one must bear in mind that establishing a "health fund" represents de facto the institution of state-run medicine.

  5. Using e-Learning Platforms for Mastery Learning in Developmental Mathematics Courses

    ERIC Educational Resources Information Center

    Boggs, Stacey; Shore, Mark; Shore, JoAnna

    2004-01-01

    Many colleges and universities have adopted e-learning platforms to utilize computers as an instructional tool in developmental (i.e., beginning and intermediate algebra) mathematics courses. An e-learning platform is a computer program used to enhance course instruction via computers and the Internet. Allegany College of Maryland is currently…

  6. Ex Post Facto Monte Carlo Variance Reduction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Booth, Thomas E.

The variance in Monte Carlo particle transport calculations is often dominated by a few particles whose importance increases manyfold on a single transport step. This paper describes a novel variance reduction method that uses a large importance change as a trigger to resample the offending transport step. That is, the method is employed only after (ex post facto) a random walk attempts a transport step that would otherwise introduce a large variance in the calculation. Improvements in two Monte Carlo transport calculations are demonstrated empirically using an ex post facto method. First, the method is shown to reduce the variance in a penetration problem with a cross-section window. Second, the method empirically appears to modify a point detector estimator from an infinite variance estimator to a finite variance estimator.
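Keeping a post-hoc trigger unbiased requires the careful weight bookkeeping developed in the paper, which is not reproduced here. The sketch below shows only the underlying splitting idea the trigger invokes: replaying a high-variance step n times with each replay carrying weight 1/n, which leaves the mean unchanged while shrinking the variance roughly n-fold.

```python
import random
import statistics

random.seed(7)

P_RARE = 0.01   # probability a step makes a very large contribution
SCORE = 100.0   # contribution of the rare event

def plain_history():
    """One unmodified random walk: occasionally a huge score, usually zero."""
    return SCORE if random.random() < P_RARE else 0.0

def split_history(n=20):
    """Sample the high-variance step n times, each copy at weight 1/n.
    The estimator stays unbiased (same mean) with much lower variance."""
    hits = sum(1 for _ in range(n) if random.random() < P_RARE)
    return SCORE * hits / n

plain = [plain_history() for _ in range(20000)]
split = [split_history() for _ in range(20000)]
print(round(statistics.mean(plain), 2), round(statistics.mean(split), 2))
print(statistics.pvariance(split) < statistics.pvariance(plain))
```

The ex post facto refinement is to pay the n-fold resampling cost only on the rare histories where the trigger fires, rather than on every step.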

  7. Boutiques: a flexible framework to integrate command-line applications in computing platforms.

    PubMed

    Glatard, Tristan; Kiar, Gregory; Aumentado-Armstrong, Tristan; Beck, Natacha; Bellec, Pierre; Bernard, Rémi; Bonnet, Axel; Brown, Shawn T; Camarasu-Pop, Sorina; Cervenansky, Frédéric; Das, Samir; Ferreira da Silva, Rafael; Flandin, Guillaume; Girard, Pascal; Gorgolewski, Krzysztof J; Guttmann, Charles R G; Hayot-Sasson, Valérie; Quirion, Pierre-Olivier; Rioux, Pierre; Rousseau, Marc-Étienne; Evans, Alan C

    2018-05-01

    We present Boutiques, a system to automatically publish, integrate, and execute command-line applications across computational platforms. Boutiques applications are installed through software containers described in a rich and flexible JSON language. A set of core tools facilitates the construction, validation, import, execution, and publishing of applications. Boutiques is currently supported by several distinct virtual research platforms, and it has been used to describe dozens of applications in the neuroinformatics domain. We expect Boutiques to improve the quality of application integration in computational platforms, to reduce redundancy of effort, to contribute to computational reproducibility, and to foster Open Science.
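    The abstract's "rich and flexible JSON language" refers to tool descriptors that map declared inputs onto placeholders in a command-line template. The sketch below builds and sanity-checks a minimal descriptor of that shape; the field names follow the published Boutiques schema, but the example tool and its values are hypothetical.

```python
import json

# A minimal, illustrative Boutiques-style descriptor: a JSON document that
# maps declared inputs onto placeholders in a command-line template.
# Field names follow the published Boutiques schema, but treat the example
# tool and values as hypothetical.
descriptor = {
    "name": "example-tool",
    "tool-version": "1.0.0",
    "description": "Illustrative command-line wrapper",
    "command-line": "example_tool [INPUT_FILE] -o [OUTPUT_NAME]",
    "inputs": [
        {"id": "input_file", "name": "Input file", "type": "File",
         "value-key": "[INPUT_FILE]"},
        {"id": "output_name", "name": "Output name", "type": "String",
         "value-key": "[OUTPUT_NAME]"},
    ],
}

def check_value_keys(desc):
    """Return ids of inputs whose value-key placeholder is missing from
    the command-line template (an empty list means the descriptor is
    internally consistent)."""
    return [i["id"] for i in desc["inputs"]
            if i["value-key"] not in desc["command-line"]]

print(json.dumps(descriptor, indent=2))
print(check_value_keys(descriptor))  # [] -> every placeholder resolves
```

    Validation of this kind is what lets a platform substitute concrete values for each value-key and launch the container without tool-specific glue code.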

  8. A high performance scientific cloud computing environment for materials simulations

    NASA Astrophysics Data System (ADS)

    Jorissen, K.; Vila, F. D.; Rehr, J. J.

    2012-09-01

    We describe the development of a scientific cloud computing (SCC) platform that offers high performance computation capability. The platform consists of a scientific virtual machine prototype containing a UNIX operating system and several materials science codes, together with essential interface tools (an SCC toolset) that offer functionality comparable to local compute clusters. In particular, our SCC toolset provides automatic creation of virtual clusters for parallel computing, including tools for execution and monitoring performance, as well as efficient I/O utilities that enable seamless connections to and from the cloud. Our SCC platform is optimized for the Amazon Elastic Compute Cloud (EC2). We present benchmarks for prototypical scientific applications and demonstrate performance comparable to local compute clusters. To facilitate code execution and provide user-friendly access, we have also integrated cloud computing capability in a Java-based GUI. Our SCC platform may be an alternative to traditional HPC resources for materials science or quantum chemistry applications.

  9. Boutiques: a flexible framework to integrate command-line applications in computing platforms

    PubMed Central

    Glatard, Tristan; Kiar, Gregory; Aumentado-Armstrong, Tristan; Beck, Natacha; Bellec, Pierre; Bernard, Rémi; Bonnet, Axel; Brown, Shawn T; Camarasu-Pop, Sorina; Cervenansky, Frédéric; Das, Samir; Ferreira da Silva, Rafael; Flandin, Guillaume; Girard, Pascal; Gorgolewski, Krzysztof J; Guttmann, Charles R G; Hayot-Sasson, Valérie; Quirion, Pierre-Olivier; Rioux, Pierre; Rousseau, Marc-Étienne; Evans, Alan C

    2018-01-01

    Abstract We present Boutiques, a system to automatically publish, integrate, and execute command-line applications across computational platforms. Boutiques applications are installed through software containers described in a rich and flexible JSON language. A set of core tools facilitates the construction, validation, import, execution, and publishing of applications. Boutiques is currently supported by several distinct virtual research platforms, and it has been used to describe dozens of applications in the neuroinformatics domain. We expect Boutiques to improve the quality of application integration in computational platforms, to reduce redundancy of effort, to contribute to computational reproducibility, and to foster Open Science. PMID:29718199

  10. A Lightweight Remote Parallel Visualization Platform for Interactive Massive Time-varying Climate Data Analysis

    NASA Astrophysics Data System (ADS)

    Li, J.; Zhang, T.; Huang, Q.; Liu, Q.

    2014-12-01

    Today's climate datasets feature large volumes, a high degree of spatiotemporal complexity, and rapid evolution over time. Because visualizing large distributed climate datasets is computationally intensive, traditional desktop-based visualization applications cannot handle the load. Recently, scientists have developed remote visualization techniques to address this computational issue. Remote visualization techniques usually leverage server-side parallel computing capabilities to perform visualization tasks and deliver the results to clients over the network. In this research, we aim to build a remote parallel visualization platform for visualizing and analyzing massive climate data. Our visualization platform is built on ParaView, one of the most popular open source remote visualization and analysis applications. To further enhance the scalability and stability of the platform, we have employed cloud computing techniques to support its deployment. In this platform, all climate datasets are regular grid data stored in NetCDF format. Three types of data access are supported: remote datasets served by OPeNDAP servers, datasets hosted on the web visualization server, and local datasets. Regardless of the access method, all visualization tasks are completed on the server side to reduce the workload on clients. As a proof of concept, we have implemented a set of scientific visualization methods to show the feasibility of the platform. Preliminary results indicate that the framework can address the computational limitations of desktop-based visualization applications.
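    Of the three access methods, the OPeNDAP route is notable because subsetting happens server-side: the client appends a constraint expression to the dataset URL and the server returns only the requested hyperslab. A minimal sketch of building such a request, with a hypothetical server and variable name:

```python
# Sketch of server-side subsetting via an OPeNDAP (DAP2) constraint
# expression: rather than downloading a whole NetCDF file, the client
# appends index ranges to the dataset URL and receives only that
# hyperslab. The host and variable names below are hypothetical.

def opendap_subset_url(base_url, var, time_range, lat_range, lon_range):
    """Build a DAP constraint URL selecting var[t0:t1][y0:y1][x0:x1]."""
    constraint = "{}[{}:{}][{}:{}][{}:{}]".format(
        var, time_range[0], time_range[1],
        lat_range[0], lat_range[1],
        lon_range[0], lon_range[1])
    return "{}?{}".format(base_url, constraint)

url = opendap_subset_url(
    "http://example.org/dap/climate/tas_monthly.nc",  # hypothetical server
    "tas", (0, 11), (100, 140), (200, 260))
print(url)
```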

  11. An Ex Post Facto Study Exploring the Impact of Parental Level of Education and Parental Support on First-Year College Adjustment

    ERIC Educational Resources Information Center

    Huntley, Kristy M.

    2011-01-01

    This study explored the impact that parental levels of education and parental support have on college adjustment for first-year students. An ex post facto design was used to examine parental level of education and parental support as variables. Parental level of education is a categorical variable based on report from the student. Parental support…

  12. Designing research: ex post facto designs.

    PubMed

    Giuffre, M

    1997-06-01

    The research design is the overall plan or structure of the study. The goal of a good research design is to ensure internal validity and answer the question being asked. The only clear rule in selecting a design is that the question dictates the design. Over the next few issues, this column will cover types of research designs and their inherent strengths and weaknesses. This article discusses ex post facto research.

  13. Studies on the Multi-functional Nature of Courses in Economics and the Role of Domain Specific Expertise. Ex Post Facto Research 1.

    ERIC Educational Resources Information Center

    Dochy, F. J. R. C.; Bouwens, M. R. J.

    This paper reports an ex post facto investigation of the hypothesis that, in defined economics courses, economics students achieved better results than law students taking the same courses. This should not be the case if the courses are truly multifunctional. Information on an economics and money course and a course on the…

  14. Fast multi-core based multimodal registration of 2D cross-sections and 3D datasets

    PubMed Central

    2010-01-01

    Background Solving bioinformatics tasks often requires extensive computational power. Recent trends in processor architecture combine multiple cores into a single chip to improve overall performance. The Cell Broadband Engine (CBE), a heterogeneous multi-core processor, provides power-efficient and cost-effective high-performance computing. One application area is image analysis and visualisation, in particular registration of 2D cross-sections into 3D image datasets. Such techniques can be used to put different image modalities into spatial correspondence, for example, 2D images of histological cuts into morphological 3D frameworks. Results We evaluate the CBE-driven PlayStation 3 as a high performance, cost-effective computing platform by adapting a multimodal alignment procedure to several characteristic hardware properties. The optimisations are based on partitioning, vectorisation, branch reducing and loop unrolling techniques with special attention to 32-bit multiplies and limited local storage on the computing units. We show how a typical image analysis and visualisation problem, the multimodal registration of 2D cross-sections and 3D datasets, benefits from the multi-core based implementation of the alignment algorithm. We discuss several CBE-based optimisation methods and compare our results to standard solutions. More information and the source code are available from http://cbe.ipk-gatersleben.de. Conclusions The results demonstrate that the CBE processor in a PlayStation 3 accelerates computationally intensive multimodal registration, which is of great importance in biological/medical image processing. The PlayStation 3 as a low cost CBE-based platform offers an efficient alternative to conventional hardware for solving computational problems in image processing and bioinformatics. PMID:20064262

  15. Institute for Sustained Performance, Energy, and Resilience (SuPER)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jagode, Heike; Bosilca, George; Danalis, Anthony

    The University of Tennessee (UTK) and University of Texas at El Paso (UTEP) partnership supported the three main thrusts of the SUPER project---performance, energy, and resilience. The UTK-UTEP effort thus helped advance the main goal of SUPER, which was to ensure that DOE's computational scientists can successfully exploit the emerging generation of high performance computing (HPC) systems. This goal is being met by providing application scientists with strategies and tools to productively maximize performance, conserve energy, and attain resilience. The primary vehicle through which UTK provided performance measurement support to SUPER and the larger HPC community is the Performance Application Programming Interface (PAPI). PAPI is an ongoing project that provides a consistent interface and methodology for collecting hardware performance information from various hardware and software components, including most major CPUs, GPUs and accelerators, interconnects, I/O systems, and power interfaces, as well as virtual cloud environments. The PAPI software is widely used for performance modeling of scientific and engineering applications---for example, the HOMME (High Order Methods Modeling Environment) climate code, and the GAMESS and NWChem computational chemistry codes---on DOE supercomputers. PAPI is widely deployed as middleware for use by higher-level profiling, tracing, and sampling tools (e.g., CrayPat, HPCToolkit, Scalasca, Score-P, TAU, Vampir, PerfExpert), making it the de facto standard for hardware counter analysis. PAPI has established itself as fundamental software infrastructure in every application domain (spanning academia, government, and industry), where improving performance can be mission critical.
Ultimately, as more application scientists migrate their applications to HPC platforms, they will benefit from the extended capabilities this grant brought to PAPI to analyze and optimize performance in these environments, whether they use PAPI directly or via third-party performance tools. Capabilities added to PAPI through this grant include support for new architectures, such as the latest GPU and Xeon Phi accelerators, and advanced power measurement and management features. Another important topic for the UTK team was providing support for a rich ecosystem of different fault management strategies in the context of parallel computing. Our long-term efforts have been oriented toward proposing flexible strategies and providing building blocks that application developers can use to build the most efficient fault management technique for their application. These efforts span the entire software spectrum: from theoretical models of existing strategies that make their performance easy to assess, to algorithmic modifications that take advantage of specific mathematical properties for data redundancy, to extensions of widely used programming paradigms that empower application developers to deal with all types of faults. We have also continued our tight collaborations with users to help them adopt these technologies and ensure their applications always deliver meaningful scientific data. Large supercomputer systems are becoming more and more power and energy constrained, and future systems and applications running on them will need to be optimized to run under power caps and/or minimize energy consumption. The UTEP team contributed to the SUPER energy thrust by developing power modeling methodologies and investigating power management strategies. Scalability modeling results showed that some applications can scale better with respect to an increasing power budget than with respect to only the number of processors.
Power management, in particular shifting power to processors on the critical path of an application execution, can reduce perturbation due to system noise and other sources of runtime variability, which are growing problems on large-scale power-constrained computer systems.

  16. Cloud-based Predictive Modeling System and its Application to Asthma Readmission Prediction

    PubMed Central

    Chen, Robert; Su, Hang; Khalilia, Mohammed; Lin, Sizhe; Peng, Yue; Davis, Tod; Hirsh, Daniel A; Searles, Elizabeth; Tejedor-Sojo, Javier; Thompson, Michael; Sun, Jimeng

    2015-01-01

    The predictive modeling process is time consuming and requires clinical researchers to handle complex electronic health record (EHR) data in restricted computational environments. To address this problem, we implemented a cloud-based predictive modeling system via a hybrid setup combining a secure private server with the Amazon Web Services (AWS) Elastic MapReduce platform. EHR data is preprocessed on a private server and the resulting de-identified event sequences are hosted on AWS. Based on user-specified modeling configurations, an on-demand web service launches a cluster of Elastic Compute Cloud (EC2) instances on AWS to perform feature selection and classification algorithms in a distributed fashion. Afterwards, the secure private server aggregates results and displays them via interactive visualization. We tested the system on a pediatric asthma readmission task on a de-identified EHR dataset of 2,967 patients. We also conducted a larger-scale experiment on the CMS Linkable 2008–2010 Medicare Data Entrepreneurs’ Synthetic Public Use File dataset of 2 million patients, which achieved over 25-fold speedup compared to sequential execution. PMID:26958172

  17. Impact of Data Placement on Resilience in Large-Scale Object Storage Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carns, Philip; Harms, Kevin; Jenkins, John

    Distributed object storage architectures have become the de facto standard for high-performance storage in big data, cloud, and HPC computing. Object storage deployments using commodity hardware to reduce costs often employ object replication as a method to achieve data resilience. Repairing object replicas after failure is a daunting task for systems with thousands of servers and billions of objects, however, and it is increasingly difficult to evaluate such scenarios at scale on real-world systems. Resilience and availability are both compromised if objects are not repaired in a timely manner. In this work we leverage a high-fidelity discrete-event simulation model to investigate replica reconstruction on large-scale object storage systems with thousands of servers, billions of objects, and petabytes of data. We evaluate the behavior of CRUSH, a well-known object placement algorithm, and identify configuration scenarios in which aggregate rebuild performance is constrained by object placement policies. After determining the root cause of this bottleneck, we then propose enhancements to CRUSH and the usage policies atop it to enable scalable replica reconstruction. We use these methods to demonstrate a simulated aggregate rebuild rate of 410 GiB/s (within 5% of projected ideal linear scaling) on a 1,024-node commodity storage system. We also uncover an unexpected phenomenon in rebuild performance based on the characteristics of the data stored on the system.
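    CRUSH computes placements pseudo-randomly rather than reading them from a lookup table. As a rough illustration of that idea (not the CRUSH algorithm itself, which adds weights, hierarchies, and failure domains), a rendezvous-hashing sketch with hypothetical server names:

```python
import hashlib

def place_replicas(object_id, servers, n_replicas=3):
    """Deterministically map an object to n distinct servers by ranking
    every server on a per-object hash (rendezvous hashing). This only
    illustrates the deterministic, table-free placement idea behind
    CRUSH-style algorithms; real CRUSH also handles weights, device
    hierarchies, and failure domains."""
    ranked = sorted(
        servers,
        key=lambda s: hashlib.sha256(
            "{}:{}".format(object_id, s).encode()).hexdigest(),
        reverse=True)
    return ranked[:n_replicas]

servers = ["osd{}".format(i) for i in range(16)]  # hypothetical server names
placement = place_replicas("object-42", servers)
print(placement)  # same object id always maps to the same 3 servers
```

    Because placement is a pure function of the object id and the server set, any client can recompute it, which is what makes billions of objects manageable without a central placement table.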

  18. OVA: integrating molecular and physical phenotype data from multiple biomedical domain ontologies with variant filtering for enhanced variant prioritization.

    PubMed

    Antanaviciute, Agne; Watson, Christopher M; Harrison, Sally M; Lascelles, Carolina; Crinnion, Laura; Markham, Alexander F; Bonthron, David T; Carr, Ian M

    2015-12-01

    Exome sequencing has become a de facto standard method for Mendelian disease gene discovery in recent years, yet identifying disease-causing mutations among thousands of candidate variants remains a non-trivial task. Here we describe a new variant prioritization tool, OVA (ontology variant analysis), in which user-provided phenotypic information is exploited to infer deeper biological context. OVA combines a knowledge-based approach with a variant-filtering framework. It reduces the number of candidate variants by considering genotype and predicted effect on protein sequence, and scores the remainder on biological relevance to the query phenotype. We take advantage of several ontologies in order to bridge knowledge across multiple biomedical domains and facilitate computational analysis of annotations pertaining to genes, diseases, phenotypes, tissues and pathways. In this way, OVA combines information regarding molecular and physical phenotypes and integrates both human and model organism data to effectively prioritize variants. By assessing performance on both known and novel disease mutations, we show that OVA performs biologically meaningful candidate variant prioritization and can be more accurate than another recently published candidate variant prioritization tool. OVA is freely accessible at http://dna2.leeds.ac.uk:8080/OVA/index.jsp. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.

  19. How Heterogeneity Affects the Design of Hadoop MapReduce Schedulers: A State-of-the-Art Survey and Challenges.

    PubMed

    Pandey, Vaibhav; Saini, Poonam

    2018-06-01

    The MapReduce (MR) computing paradigm and its open source implementation Hadoop have become a de facto standard for processing big data in a distributed environment. Initially, the Hadoop system was homogeneous in three significant aspects, namely, user, workload, and cluster (hardware). However, with a growing variety of MR jobs and the inclusion of differently configured nodes in existing clusters, heterogeneity has become an essential part of Hadoop systems. Heterogeneity factors adversely affect the performance of a Hadoop scheduler and limit the overall throughput of the system. To overcome this problem, various heterogeneous Hadoop schedulers have been proposed in the literature. Existing survey works in this area mostly cover homogeneous schedulers and classify them on the basis of the quality-of-service parameters they optimize. Hence, there is a need to study heterogeneous Hadoop schedulers on the basis of the heterogeneity factors they consider. In this survey article, we first discuss the different heterogeneity factors that typically exist in a Hadoop system and then explore the challenges that arise while designing schedulers in the presence of such heterogeneity. Afterward, we present a comparative study of the heterogeneous scheduling algorithms available in the literature and classify them according to the aforementioned heterogeneity factors. Lastly, we examine the different methods and environments used to evaluate the discussed Hadoop schedulers.
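    For readers unfamiliar with the paradigm being scheduled, the MR model can be sketched in-process: map emits key/value pairs, a shuffle groups them by key, and reduce aggregates each group. Hadoop distributes exactly these three phases across a cluster; here everything runs locally, purely for illustration.

```python
from itertools import chain

# Canonical word-count example of the MapReduce paradigm, run entirely
# in one process. In Hadoop, map tasks and reduce tasks run on different
# cluster nodes and the shuffle moves data between them over the network.

def map_phase(record):
    """Map: emit a (word, 1) pair for every word in one input record."""
    return [(word, 1) for word in record.split()]

def shuffle(pairs):
    """Shuffle: group all emitted values by key."""
    grouped = {}
    for key, value in pairs:
        grouped.setdefault(key, []).append(value)
    return grouped

def reduce_phase(key, values):
    """Reduce: aggregate one key's values into a single result."""
    return key, sum(values)

records = ["big data big cluster", "data cluster cluster"]
pairs = list(chain.from_iterable(map_phase(r) for r in records))
result = dict(reduce_phase(k, v) for k, v in shuffle(pairs).items())
print(result)  # {'big': 2, 'data': 2, 'cluster': 3}
```

    Heterogeneity matters precisely because these phases are distributed: a scheduler that assumes identical nodes will misplace map and reduce tasks when hardware or workloads differ.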

  20. Exposure Render: An Interactive Photo-Realistic Volume Rendering Framework

    PubMed Central

    Kroes, Thomas; Post, Frits H.; Botha, Charl P.

    2012-01-01

    The field of volume visualization has undergone rapid development during the past years, both due to advances in suitable computing hardware and due to the increasing availability of large volume datasets. Recent work has focused on increasing the visual realism in Direct Volume Rendering (DVR) by integrating a number of visually plausible but often effect-specific rendering techniques, for instance modeling of light occlusion and depth of field. Besides yielding more attractive renderings, especially the more realistic lighting has a positive effect on perceptual tasks. Although these new rendering techniques yield impressive results, they exhibit limitations in terms of their flexibility and their performance. Monte Carlo ray tracing (MCRT), coupled with physically based light transport, is the de-facto standard for synthesizing highly realistic images in the graphics domain, although usually not from volumetric data. Due to the stochastic sampling of MCRT algorithms, numerous effects can be achieved in a relatively straightforward fashion. For this reason, we have developed a practical framework that applies MCRT techniques also to direct volume rendering (DVR). With this work, we demonstrate that a host of realistic effects, including physically based lighting, can be simulated in a generic and flexible fashion, leading to interactive DVR with improved realism. In the hope that this improved approach to DVR will see more use in practice, we have made available our framework under a permissive open source license. PMID:22768292

  1. Lennard-Jones Lattice Summation in Bilayer Simulations Has Critical Effects on Surface Tension and Lipid Properties.

    PubMed

    Wennberg, Christian L; Murtola, Teemu; Hess, Berk; Lindahl, Erik

    2013-08-13

    The accuracy of electrostatic interactions in molecular dynamics advanced tremendously with the introduction of particle-mesh Ewald (PME) summation almost 20 years ago. Lattice summation electrostatics is now the de facto standard for most types of biomolecular simulations, and in particular, for lipid bilayers, it has been a critical improvement due to the large charges typically present in zwitterionic lipid headgroups. In contrast, Lennard-Jones interactions have continued to be handled with increasingly longer cutoffs, partly because few alternatives have been available despite significant difficulties in tuning cutoffs and parameters to reproduce lipid properties. Here, we present a new Lennard-Jones PME implementation applied to lipid bilayers. We confirm that long-range contributions are well approximated by dispersion corrections in simple systems such as pentadecane (which makes parameters transferable), but for inhomogeneous and anisotropic systems such as lipid bilayers there are large effects on surface tension, resulting in up to 5.5% deviations in area per lipid and order parameters, far larger than many differences for which reparameterization has been attempted. We further propose an approximation for combination rules in reciprocal space that significantly reduces the computational cost of Lennard-Jones PME and makes accurate treatment of all nonbonded interactions competitive with simulations employing long cutoffs. These results could potentially have broad impact on important applications such as membrane proteins and free energy calculations.
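    The dispersion correction referred to here is, for a homogeneous fluid of number density ρ with the Lennard-Jones potential truncated at cutoff r_c, the standard tail correction to the per-particle energy:

```latex
\Delta U_{\text{tail}}
  = \frac{8\pi\rho\,\varepsilon\sigma^{3}}{3}
    \left[\,\frac{1}{3}\left(\frac{\sigma}{r_{c}}\right)^{9}
        - \left(\frac{\sigma}{r_{c}}\right)^{3}\right]
```

    This textbook form assumes uniform density beyond the cutoff, which is precisely the assumption that fails for anisotropic systems such as bilayers and motivates the lattice-summation treatment in the abstract.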

  2. Formal design and verification of a reliable computing platform for real-time control. Phase 1: Results

    NASA Technical Reports Server (NTRS)

    Divito, Ben L.; Butler, Ricky W.; Caldwell, James L.

    1990-01-01

    A high-level design is presented for a reliable computing platform for real-time control applications. Design tradeoffs and analyses related to the development of the fault-tolerant computing platform are discussed. The architecture is formalized and shown to satisfy a key correctness property. The reliable computing platform uses replicated processors and majority voting to achieve fault tolerance. Under the assumption of a majority of processors working in each frame, it is shown that the replicated system computes the same results as a single processor system not subject to failures. Sufficient conditions are obtained to establish that the replicated system recovers from transient faults within a bounded amount of time. Three different voting schemes are examined and proved to satisfy the bounded recovery time conditions.
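    The frame-level voting described above can be sketched minimally. The example below assumes exact-match voting on each frame's replicated outputs, with three hypothetical replicas; real platforms vote per redundant channel, often in hardware.

```python
from collections import Counter

def majority_vote(frame_outputs):
    """Return the value computed by a strict majority of replicated
    processors in one frame, or None when no strict majority exists.
    Sketch of the voting step only: exact-match voting on the outputs
    of one frame, not a full fault-tolerant architecture."""
    value, count = Counter(frame_outputs).most_common(1)[0]
    return value if count > len(frame_outputs) // 2 else None

# Three replicas, one transient fault: the majority result masks it.
print(majority_vote([42, 42, 7]))  # 42
print(majority_vote([1, 2, 3]))    # None: no majority, fault not masked
```

    Under the paper's assumption of a working majority per frame, the voted value equals the output of a fault-free single processor, which is exactly the correctness property being proved.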

  3. Supervised classification of aerial imagery and multi-source data fusion for flood assessment

    NASA Astrophysics Data System (ADS)

    Sava, E.; Harding, L.; Cervone, G.

    2015-12-01

    Floods are among the most devastating natural hazards, and the ability to produce an accurate and timely flood assessment before, during, and after an event is critical for mitigation and response. Remote sensing technologies have become the de-facto approach for observing the Earth and its environment. However, satellite remote sensing data are not always available. For these reasons, it is crucial to develop new techniques to produce flood assessments during and after an event. Recent advancements in fusing remote sensing data with near-real-time heterogeneous datasets have allowed emergency responders to more efficiently extract increasingly precise and relevant knowledge from the available information. This research presents a fusion technique using satellite remote sensing imagery coupled with non-authoritative data such as Civil Air Patrol (CAP) imagery and tweets. A new computational methodology is proposed, based on machine learning algorithms, to automatically identify water pixels in CAP imagery. Specifically, wavelet transformations are paired with multiple classifiers, run in parallel, to build models discriminating water and non-water regions. The learned classification models are first tested against a set of control cases and then used to automatically classify each image separately. A measure of uncertainty is computed for each pixel in an image, proportional to the number of models classifying the pixel as water. Geo-tagged tweets are continuously harvested, stored in a MongoDB database, and queried in real time. They are fused with the classified CAP data and with satellite-derived flood extent results to produce comprehensive flood assessment maps. The final maps are then compared with FEMA-generated flood extents to assess their accuracy. The proposed methodology is applied to two test cases: the 2013 floods in Boulder, CO, and the 2015 floods in Texas.
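    The per-pixel uncertainty measure described here reduces to the fraction of ensemble models voting "water". A toy sketch; the classifier outputs below are hypothetical, and the real pipeline classifies wavelet features rather than raw pixels:

```python
def water_fraction(votes_per_pixel):
    """For each pixel, the fraction of ensemble models labelling it
    water: 0.0 (all models agree: not water) to 1.0 (all agree: water).
    Intermediate values flag uncertain pixels, mirroring the per-pixel
    uncertainty measure proportional to the number of models voting
    'water'."""
    return [sum(votes) / len(votes) for votes in votes_per_pixel]

# Rows: pixels; columns: binary water/non-water decisions from four
# hypothetical classifiers trained in parallel on wavelet features.
votes = [
    (1, 1, 1, 1),  # confidently water
    (1, 0, 1, 0),  # ambiguous: models disagree
    (0, 0, 0, 0),  # confidently dry
]
print(water_fraction(votes))  # [1.0, 0.5, 0.0]
```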

  4. Prototype methodology for obtaining cloud seeding guidance from HRRR model data

    NASA Astrophysics Data System (ADS)

    Dawson, N.; Blestrud, D.; Kunkel, M. L.; Waller, B.; Ceratto, J.

    2017-12-01

    Weather model data, along with real time observations, are critical to determine whether atmospheric conditions are prime for super-cooled liquid water during cloud seeding operations. Cloud seeding groups can either use operational forecast models, or run their own model on a computer cluster. A custom weather model provides the most flexibility, but is also expensive. For programs with smaller budgets, openly-available operational forecasting models are the de facto method for obtaining forecast data. The new High-Resolution Rapid Refresh (HRRR) model (3 x 3 km grid size), developed by the Earth System Research Laboratory (ESRL), provides hourly model runs with 18 forecast hours per run. While the model cannot be fine-tuned for a specific area or edited to provide cloud-seeding-specific output, model output is openly available on a near-real-time basis. This presentation focuses on a prototype methodology for using HRRR model data to create maps which aid in near-real-time cloud seeding decision making. The R programming language is utilized to run a script on a Windows® desktop/laptop computer either on a schedule (such as every half hour) or manually. The latest HRRR model run is downloaded from NOAA's Operational Model Archive and Distribution System (NOMADS). A GRIB-filter service, provided by NOMADS, is used to obtain surface and mandatory pressure level data for a subset domain which greatly cuts down on the amount of data transfer. Then, a set of criteria, identified by the Idaho Power Atmospheric Science Group, is used to create guidance maps. These criteria include atmospheric stability (lapse rates), dew point depression, air temperature, and wet bulb temperature. The maps highlight potential areas where super-cooled liquid water may exist, reasons as to why cloud seeding should not be attempted, and wind speed at flight level.
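    The GRIB-filter step described above amounts to assembling an HTTP query that names the model file, the variables and levels wanted, and a bounding box, so only that subset is transferred. A sketch in Python: the base URL pattern matches NOMADS' HRRR filter service, but the specific parameter names, variable flags, and bounding box are assumptions to be checked against the service's documentation.

```python
from urllib.parse import urlencode

# Assumed NOMADS GRIB-filter endpoint for HRRR 2D surface fields; verify
# the exact script name and field names on the NOMADS filter pages.
BASE = "https://nomads.ncep.noaa.gov/cgi-bin/filter_hrrr_2d.pl"

def grib_filter_url(cycle_hour, forecast_hour, date, bbox):
    """Build a subset-download URL for one HRRR forecast file.
    bbox = (leftlon, rightlon, toplat, bottomlat)."""
    params = {
        "file": "hrrr.t{:02d}z.wrfsfcf{:02d}.grib2".format(
            cycle_hour, forecast_hour),
        "var_TMP": "on",   # air temperature (assumed flag name)
        "var_DPT": "on",   # dew point (assumed flag name)
        "subregion": "",
        "leftlon": bbox[0], "rightlon": bbox[1],
        "toplat": bbox[2], "bottomlat": bbox[3],
        "dir": "/hrrr.{}/conus".format(date),
    }
    return "{}?{}".format(BASE, urlencode(params))

# Hypothetical Idaho-area subset: 12Z cycle, 6-hour forecast.
url = grib_filter_url(12, 6, "20171201", (-117.5, -110.5, 49.0, 42.0))
print(url)
```

    Requesting only a few surface variables over a small domain, as the abstract notes, cuts the transfer from a full CONUS GRIB2 file down to a fraction suitable for a desktop script run every half hour.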

  5. Ideal walking dynamics via a gauged NJL model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rantaharju, Jarno; Pica, Claudio; Sannino, Francesco

    According to the ideal walking technicolor paradigm, large mass anomalous dimensions arise in gauged Nambu–Jona-Lasinio (NJL) models when the four-fermion coupling is sufficiently strong to induce spontaneous symmetry breaking in an otherwise conformal gauge theory. Therefore, we study the SU(2) gauged NJL model with two adjoint fermions using lattice simulations. The model is in an infrared conformal phase at small NJL coupling while it displays a chirally broken phase at large NJL couplings. In the infrared conformal phase, we find that the mass anomalous dimension varies with the NJL coupling, reaching γm ~ 1 close to the chiral symmetry breaking transition, de facto making the present model the first explicit realization of the ideal walking scenario.

  6. The Cyber Defense (CyDef) Model for Assessing Countermeasure Capabilities.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kimura, Margot; DeVries, Troy Robert; Gordon, Susanna P.

    Cybersecurity is essential to maintaining operations, and is now a de facto cost of business. Despite this, there is little consensus on how to systematically make decisions about cyber countermeasure investments. Identifying gaps and determining the expected return on investment (ROI) of adding a new cybersecurity countermeasure is frequently a hand-waving exercise at best. Worse, cybersecurity nomenclature is murky and frequently overloaded, which further complicates issues by inhibiting clear communication. This paper presents a series of foundational models and nomenclature for discussing cybersecurity countermeasures, and then introduces the Cyber Defense (CyDef) model, which provides a systematic and intuitive way for decision-makers to effectively communicate with operations and device experts.

  7. [Genetically modified food--great unknown].

    PubMed

    Cichosz, G; Wiackowski, S K

    2012-08-01

    Genetically modified food (GMF) poses an evident threat to consumers' health. Despite the assurances of biotechnologists, the DNA of transgenic plants is unstable, so synthesis of foreign, allergenic proteins is possible. Owing to its high trypsin inhibitor content, GMF is digested much more slowly, which, like the presence of Bt toxin, increases the probability of alimentary canal diseases. Further threats stem from the presence of phytoestrogens and residues of the Roundup pesticide, which can diminish fertility and even lead to carcinogenic transformation by disturbing human hormonal metabolism. Although food producers and distributors assure that food made from GMF raw materials is labelled, consumers de facto have no choice. Moreover, under food law, products containing less than 0.9% GMF protein are not classified as genetically modified food.

  8. Ideal walking dynamics via a gauged NJL model

    DOE PAGES

    Rantaharju, Jarno; Pica, Claudio; Sannino, Francesco

    2017-07-25

    According to the ideal walking technicolor paradigm, large mass anomalous dimensions arise in gauged Nambu–Jona-Lasinio (NJL) models when the four-fermion coupling is sufficiently strong to induce spontaneous symmetry breaking in an otherwise conformal gauge theory. Therefore, we study the SU(2) gauged NJL model with two adjoint fermions using lattice simulations. The model is in an infrared conformal phase at small NJL coupling while it displays a chirally broken phase at large NJL couplings. In the infrared conformal phase, we find that the mass anomalous dimension varies with the NJL coupling, reaching γm ~ 1 close to the chiral symmetry breaking transition, de facto making the present model the first explicit realization of the ideal walking scenario.

  9. A Casualty in the Class War: Canada's Medicare.

    PubMed

    Evans, Robert G

    2012-02-01

    "There's class warfare, all right, but it's my class, the rich class, that's making war, and we're winning." (Warren Buffett, five years ago.) Last year's Occupy Wall Street movement suggested that people are finally catching on. Note, making war: Buffett meant that there was deliberate intent and agency behind the huge transfer of wealth, since 1980, from the 99% to the 1%. Nor is the war metaphorical. There are real casualties, even if no body bags. Sadly, much Canadian commentary on inequality is pitiably naïve or deliberately obfuscatory. The 1% have captured national governments. The astronomical cost of American elections excludes the 99%. In Canada, parliamentary government permits one man to rule as a de facto dictator. The 1% don't like medicare.

  10. Understanding risk behaviours: how the sociology of deviance may contribute? The case of drug-taking.

    PubMed

    Peretti-Watel, Patrick; Moatti, Jean-Paul

    2006-08-01

    This paper argues that the sociology of deviance can be used to improve our understanding of some difficulties and unintended effects of health-promotion interventions designed to change risk behaviours, especially drug-taking. Firstly, many people engaged in 'risk behaviours' tend to deny the 'risky' label just as delinquents neutralise the 'deviant' label, and preventive information itself may be used by individuals in shaping risk denial. Secondly, deliberate risk-taking may be an 'innovative deviance', which is related to the difficulty of conforming to the dominant 'risk culture'. Health promotion is likely to be quite ineffective if it remains wedded to the dominant risk culture and de facto contributes to its spread.

  11. Gstruct: a system for extracting schemas from GML documents

    NASA Astrophysics Data System (ADS)

    Chen, Hui; Zhu, Fubao; Guan, Jihong; Zhou, Shuigeng

    2008-10-01

    Geography Markup Language (GML) has become the de facto standard for representing geographic information on the internet. A GML schema provides a way to define the structure, content, and semantics of GML documents. It contains useful structural information about GML documents and plays an important role in storing, querying and analyzing GML data. However, a GML schema is not mandatory, and it is common for a GML document to contain no schema. In this paper, we present Gstruct, a tool for GML schema extraction. Gstruct finds the features in the input GML documents, identifies geometry datatypes as well as simple datatypes, then integrates all these features and eliminates improper components to output the optimal schema. Experiments demonstrate that Gstruct is effective in extracting semantically meaningful schemas from GML documents.
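
    The extraction idea can be sketched in miniature: scan a document, record each element's observed children and value types, and emit a skeleton schema. The code below is a hypothetical illustration; the tag names, type-guessing rule, and `infer_schema` helper are invented for this sketch and are not Gstruct's actual algorithm.

```python
# Toy schema inference over a GML-like document (illustrative only).
import xml.etree.ElementTree as ET
from collections import defaultdict

def infer_datatype(text):
    """Guess a simple XSD datatype from element text (illustrative rule)."""
    if text is None or not text.strip():
        return None
    t = text.strip()
    for cast, name in ((int, "xs:integer"), (float, "xs:decimal")):
        try:
            cast(t)
            return name
        except ValueError:
            pass
    return "xs:string"

def infer_schema(xml_string):
    """Map each tag to the child tags and text datatypes observed for it."""
    root = ET.fromstring(xml_string)
    schema = defaultdict(lambda: {"children": set(), "types": set()})
    stack = [root]
    while stack:
        elem = stack.pop()
        entry = schema[elem.tag]
        dtype = infer_datatype(elem.text)
        if dtype:
            entry["types"].add(dtype)
        for child in elem:
            entry["children"].add(child.tag)
            stack.append(child)
    return dict(schema)

doc = """<FeatureCollection>
  <feature><name>road 1</name><lanes>2</lanes><length>12.5</length></feature>
  <feature><name>road 2</name><lanes>4</lanes><length>3.25</length></feature>
</FeatureCollection>"""
schema = infer_schema(doc)
```

    A real extractor must additionally recognize GML geometry types and optimize the merged schema, as the abstract describes.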

  12. Near Real-Time Image Reconstruction

    NASA Astrophysics Data System (ADS)

    Denker, C.; Yang, G.; Wang, H.

    2001-08-01

    In recent years, post-facto image-processing algorithms have been developed to achieve diffraction-limited observations of the solar surface. We present a combination of frame selection, speckle-masking imaging, and parallel computing which provides real-time, diffraction-limited, 256×256 pixel images at a 1-minute cadence. Our approach to achieving diffraction-limited observations is complementary to adaptive optics (AO). At the moment, AO is limited by the fact that it corrects wavefront aberrations only for a field of view comparable to the isoplanatic patch. This limitation does not apply to speckle-masking imaging. However, speckle-masking imaging relies on short-exposure images, which limits its spectroscopic applications. The parallel processing of the data is performed on a Beowulf-class computer which utilizes off-the-shelf, mass-market technologies to provide high computational performance for scientific calculations and applications at low cost. Beowulf computers have great potential, not only for image reconstruction, but for any kind of complex data reduction. Immediate access to high-level data products and direct visualization of dynamic processes on the Sun are two of the advantages to be gained.
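
    Of the three ingredients, frame selection is the simplest to illustrate: rank the short-exposure frames by a sharpness proxy and keep the best. The snippet below is a toy sketch assuming RMS contrast (standard deviation over mean) as the metric; the metric choice and frame data are illustrative, not taken from the paper.

```python
# Toy frame selection: keep the frames with the highest RMS contrast.
from statistics import mean, pstdev

def rms_contrast(frame):
    """RMS contrast of a frame given as a flat list of pixel intensities."""
    m = mean(frame)
    return pstdev(frame) / m if m else 0.0

def select_frames(frames, keep=1):
    """Return the `keep` highest-contrast frames, best first."""
    return sorted(frames, key=rms_contrast, reverse=True)[:keep]

blurry = [100, 101, 99, 100]   # low contrast: fine detail smeared out
sharp  = [50, 150, 40, 160]    # high contrast: detail preserved
best = select_frames([blurry, sharp], keep=1)
```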

  13. Evaluation of Emerging Energy-Efficient Heterogeneous Computing Platforms for Biomolecular and Cellular Simulation Workloads.

    PubMed

    Stone, John E; Hallock, Michael J; Phillips, James C; Peterson, Joseph R; Luthey-Schulten, Zaida; Schulten, Klaus

    2016-05-01

    Many of the continuing scientific advances achieved through computational biology are predicated on the availability of ongoing increases in computational power required for detailed simulation and analysis of cellular processes on biologically-relevant timescales. A critical challenge facing the development of future exascale supercomputer systems is the development of new computing hardware and associated scientific applications that dramatically improve upon the energy efficiency of existing solutions, while providing increased simulation, analysis, and visualization performance. Mobile computing platforms have recently become powerful enough to support interactive molecular visualization tasks that were previously only possible on laptops and workstations, creating future opportunities for their convenient use for meetings, remote collaboration, and as head mounted displays for immersive stereoscopic viewing. We describe early experiences adapting several biomolecular simulation and analysis applications for emerging heterogeneous computing platforms that combine power-efficient system-on-chip multi-core CPUs with high-performance massively parallel GPUs. We present low-cost power monitoring instrumentation that provides sufficient temporal resolution to evaluate the power consumption of individual CPU algorithms and GPU kernels. We compare the performance and energy efficiency of scientific applications running on emerging platforms with results obtained on traditional platforms, identify hardware and algorithmic performance bottlenecks that affect the usability of these platforms, and describe avenues for improving both the hardware and applications in pursuit of the needs of molecular modeling tasks on mobile devices and future exascale computers.
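
    As a rough illustration of how such power monitoring maps to per-kernel energy figures, timestamped power samples can be integrated over each kernel's start/stop interval with the trapezoidal rule. The sampling rate, sample data, and function names below are assumptions for this sketch, not the authors' instrumentation code.

```python
# Estimate energy for one kernel from (time_s, power_W) samples.
def energy_joules(samples, start, end):
    """Trapezoidal integration of power samples over [start, end] seconds."""
    window = [(t, p) for t, p in samples if start <= t <= end]
    total = 0.0
    for (t0, p0), (t1, p1) in zip(window, window[1:]):
        total += 0.5 * (p0 + p1) * (t1 - t0)
    return total

# 10 Hz samples: a constant 5 W draw for one second
samples = [(i / 10.0, 5.0) for i in range(11)]
e = energy_joules(samples, 0.0, 1.0)
```

    Energy per kernel divided by work done (e.g., timesteps simulated) then gives the energy-efficiency comparison the paper reports.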

  14. Flexible Description and Adaptive Processing of Earth Observation Data through the BigEarth Platform

    NASA Astrophysics Data System (ADS)

    Gorgan, Dorian; Bacu, Victor; Stefanut, Teodor; Nandra, Cosmin; Mihon, Danut

    2016-04-01

    Earth Observation data repositories, growing periodically by several terabytes, have become a critical issue for organizations. Managing the storage capacity of such big datasets, access policy, data protection, searching, and complex processing entails high costs, which demands efficient solutions to balance the cost and value of data. Data creates value only when it is used, and data protection has to be oriented toward allowing innovation, which sometimes depends on creative people achieving unexpectedly valuable results in a flexible and adaptive manner. Users need to describe and experiment with complex algorithms themselves, through analytics, in order to extract value from data. Analytics uses descriptive and predictive models to gain valuable knowledge and information from data analysis. Possible solutions for the advanced processing of big Earth Observation data are offered by HPC platforms such as the cloud. As platforms become more complex and heterogeneous, developing applications becomes even harder, and efficiently mapping these applications onto a suitable, optimal platform, working on huge distributed data repositories, is challenging and complex as well, even with specialized software services. From the user's point of view, an optimal environment gives acceptable execution times, offers a high level of usability by hiding the complexity of the computing infrastructure, and supports open accessibility and control of application entities and functionality. The BigEarth platform [1] supports the entire flow: flexible description of processing through basic operators and adaptive execution over cloud infrastructure [2]. The basic modules of the pipeline, such as the KEOPS [3] set of basic operators, the WorDeL language [4], the Planner for sequential and parallel processing, and the Executor based on virtual machines, are detailed as the main components of the BigEarth platform [5].
The presentation exemplifies the development of some Earth Observation oriented applications based on flexible description of processing, and adaptive and portable execution over Cloud infrastructure. Main references for further information: [1] BigEarth project, http://cgis.utcluj.ro/projects/bigearth [2] Gorgan, D., "Flexible and Adaptive Processing of Earth Observation Data over High Performance Computation Architectures", International Conference and Exhibition Satellite 2015, August 17-19, Houston, Texas, USA. [3] Mihon, D., Bacu, V., Colceriu, V., Gorgan, D., "Modeling of Earth Observation Use Cases through the KEOPS System", Proceedings of the Intelligent Computer Communication and Processing (ICCP), IEEE-Press, pp. 455-460, (2015). [4] Nandra, C., Gorgan, D., "Workflow Description Language for Defining Big Earth Data Processing Tasks", Proceedings of the Intelligent Computer Communication and Processing (ICCP), IEEE-Press, pp. 461-468, (2015). [5] Bacu, V., Stefan, T., Gorgan, D., "Adaptive Processing of Earth Observation Data on Cloud Infrastructures Based on Workflow Description", Proceedings of the Intelligent Computer Communication and Processing (ICCP), IEEE-Press, pp.444-454, (2015).

  15. PAVICS: A Platform for the Analysis and Visualization of Climate Science

    NASA Astrophysics Data System (ADS)

    Gauvin St-Denis, B.; Landry, T.; Huard, D. B.; Byrns, D.; Chaumont, D.; Foucher, S.

    2016-12-01

    Climate service providers are boundary organizations working at the interface of climate science research and users of climate information. Users include academics in other disciplines looking for credible, customized future climate scenarios, government planners, resource managers, asset owners, as well as service utilities. These users are looking for relevant information regarding the impacts of climate change as well as informing decisions regarding adaptation options. As climate change concerns become mainstream, the pressure on climate service providers to deliver tailored, high quality information in a timely manner increases rapidly. To meet this growing demand, Ouranos, a climate service center located in Montreal, is collaborating with the Centre de recherche informatique de Montreal (CRIM) to develop a climate data analysis web-based platform interacting with RESTful services covering data access and retrieval, geospatial analysis, bias correction, distributed climate indicator computing and results visualization. The project, financed by CANARIE, relies on the experience of the UV-CDAT and ESGF-CWT teams, as well as on the Birdhouse framework developed by the German Climate Research Center (DKRZ) and French IPSL. Climate data is accessed through OPeNDAP, while computations are carried out through WPS. Regions such as watersheds or user-defined polygons, used as spatial selections for computations, are managed by GeoServer, also providing WMS, WFS and WPS capabilities. The services are hosted on independent servers communicating by high throughput network. Deployment, maintenance and collaboration with other development teams are eased by the use of Docker and OpenStack VMs. Web-based tools are developed with modern web frameworks such as React-Redux, OpenLayers 3, Cesium and Plotly.
Although the main objective of the project is to build a functioning, usable data analysis pipeline within two years, time is also devoted to explore emerging technologies and assess their potential. For instance, sandbox environments will store climate data in HDFS, process it with Apache Spark and allow interaction through Jupyter Notebooks. Data streaming of observational data with OpenGL and Cesium is also considered.
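
    For readers unfamiliar with WPS, a version 1.0.0 Execute request can be issued as a plain key-value GET. The sketch below only assembles such a URL; the endpoint, process identifier (`tx_days_above`), and inputs are hypothetical, not actual PAVICS processes.

```python
# Build a hypothetical WPS 1.0.0 Execute request URL.
from urllib.parse import urlencode

def wps_execute_url(base, process, inputs):
    """Assemble a key-value-pair GET request for a WPS Execute call."""
    params = {
        "service": "WPS",
        "version": "1.0.0",
        "request": "Execute",
        "identifier": process,
        # WPS encodes inputs as semicolon-separated key=value pairs
        "DataInputs": ";".join(f"{k}={v}" for k, v in inputs.items()),
    }
    return base + "?" + urlencode(params)

url = wps_execute_url("https://example.org/wps", "tx_days_above",
                      {"threshold": "25", "freq": "YS"})
```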

  16. Development of a computer model to predict platform station keeping requirements in the Gulf of Mexico using remote sensing data

    NASA Technical Reports Server (NTRS)

    Barber, Bryan; Kahn, Laura; Wong, David

    1990-01-01

    Offshore operations such as oil drilling and radar monitoring require semisubmersible platforms to remain stationary at specific locations in the Gulf of Mexico. Ocean currents, wind, and waves in the Gulf of Mexico tend to move platforms away from their desired locations. A computer model was created to predict the station keeping requirements of a platform. The computer simulation uses remote sensing data from satellites and buoys as input. A background of the project, alternate approaches to the project, and the details of the simulation are presented.

  17. Traffic information computing platform for big data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duan, Zongtao, E-mail: ztduan@chd.edu.cn; Li, Ying; Zheng, Xibin

    The big data environment creates the data conditions for improving the quality of traffic information services. The goal of this article is to construct a traffic information computing platform for the big data environment. Through in-depth analysis of the nature and technological characteristics of big data and traffic information services, a distributed traffic atomic-information computing platform architecture is proposed. Under the big data environment, this type of traffic atomic-information computing architecture helps guarantee safe and efficient traffic operation, and more intelligent and personalized traffic information services can be provided to traffic information users.

  18. Open source acceleration of wave optics simulations on energy efficient high-performance computing platforms

    NASA Astrophysics Data System (ADS)

    Beck, Jeffrey; Bos, Jeremy P.

    2017-05-01

    We compare several modifications to the open-source wave optics package, WavePy, intended to improve execution time. Specifically, we compare the relative performance of the Intel MKL, a CPU-based OpenCV distribution, and a GPU-based version. Performance is compared between distributions both on the same compute platform and between a fully-featured computing workstation and the NVIDIA Jetson TX1 platform. Comparisons are drawn in terms of both execution time and power consumption. We have found that substituting the Fast Fourier Transform operation from OpenCV provides a marked improvement on all platforms. In addition, we show that embedded platforms offer some possibility for extensive improvement in terms of efficiency compared to a fully-featured workstation.
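
    The reported speedups come from swapping the FFT implementation used by the propagation code. One minimal way to structure such a swap is to parameterize the transform, as in this sketch, where a naive O(n²) DFT stands in as the reference backend that MKL, OpenCV, or a GPU library would replace (all names here are illustrative, not WavePy's API).

```python
# Pluggable FFT backend: a naive DFT as the stdlib reference implementation.
import cmath

def dft(signal):
    """Naive O(n^2) discrete Fourier transform (reference backend)."""
    n = len(signal)
    return [sum(signal[k] * cmath.exp(-2j * cmath.pi * j * k / n)
                for k in range(n))
            for j in range(n)]

def propagate(field, fft_backend=dft):
    """Toy propagation step parameterized by the FFT implementation."""
    return fft_backend(field)

spectrum = propagate([1.0, 1.0, 1.0, 1.0])
```

    A faster backend drops in by passing a different `fft_backend`, which is how one would benchmark MKL, OpenCV, and GPU variants against each other without touching the propagation logic.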

  19. An Ex-Post Facto Examination of Relationships among the Developmental Designs Professional Development Model/Classroom Management Approach, School Leadership, Climate, Student Achievement, Attendance, and Behavior in High Poverty, Middle Grades Schools

    ERIC Educational Resources Information Center

    Hough, David L.; Schmitt, Vicki L.

    2011-01-01

    This study reports findings from an ex post facto causal-comparative study utilizing data from a multifaceted program evaluation of a professional development approach to classroom management known as Developmental Designs 1 and Developmental Designs 2 (DD1 & DD2). Data from this program evaluation indicate that teachers implement a number of classroom…

  20. Application of the Modular Command and Control Structure (MCES) to Marine Corps SINCGARS Allocation

    DTIC Science & Technology

    1991-07-01

    The first goal is to delineate the difference between the system being analyzed and its environment. To bound the C3 system, the analyst should...hardware and software entities and structures, is related to the forces it controls and the environmental stimuli to which it responds, including the enemy...MCES represents the environmental factors that require explicit assumptions in the problem. This ring may be seen as including the major scenario

  1. Digital Literacy Development of Students Involved in an ICT Educational Project

    NASA Astrophysics Data System (ADS)

    Quintana, Maria Graciela Badilla; Pujol, Meritxell Cortada

    The impact of Information and Communication Technologies (ICT) has become the core of a change that involves most fields of society; consequently, technological and informational literacy are essential requirements in education. The research is a quasi-experimental, ex-post-facto study in schools in Spain. The aim was to describe and analyze the involvement shown by 219 students who participated in the development of an ICT project named Ponte dos Brozos. The research objective was to determine whether students who usually worked with ICT had better knowledge and command of computing tools, and whether they were better prepared to research and select information. Results showed that students with greater exposure to ICT know the technology and how to use it, have better knowledge and control of the computer and operating systems, and show a high level of information management through the Internet, although their information literacy is lacking.

  2. Interactive Computer-Assisted Instruction in Acid-Base Physiology for Mobile Computer Platforms

    ERIC Educational Resources Information Center

    Longmuir, Kenneth J.

    2014-01-01

    In this project, the traditional lecture hall presentation of acid-base physiology in the first-year medical school curriculum was replaced by interactive, computer-assisted instruction designed primarily for the iPad and other mobile computer platforms. Three learning modules were developed, each with ~20 screens of information, on the subjects…

  3. Evaluation of Emerging Energy-Efficient Heterogeneous Computing Platforms for Biomolecular and Cellular Simulation Workloads

    PubMed Central

    Stone, John E.; Hallock, Michael J.; Phillips, James C.; Peterson, Joseph R.; Luthey-Schulten, Zaida; Schulten, Klaus

    2016-01-01

    Many of the continuing scientific advances achieved through computational biology are predicated on the availability of ongoing increases in computational power required for detailed simulation and analysis of cellular processes on biologically-relevant timescales. A critical challenge facing the development of future exascale supercomputer systems is the development of new computing hardware and associated scientific applications that dramatically improve upon the energy efficiency of existing solutions, while providing increased simulation, analysis, and visualization performance. Mobile computing platforms have recently become powerful enough to support interactive molecular visualization tasks that were previously only possible on laptops and workstations, creating future opportunities for their convenient use for meetings, remote collaboration, and as head mounted displays for immersive stereoscopic viewing. We describe early experiences adapting several biomolecular simulation and analysis applications for emerging heterogeneous computing platforms that combine power-efficient system-on-chip multi-core CPUs with high-performance massively parallel GPUs. We present low-cost power monitoring instrumentation that provides sufficient temporal resolution to evaluate the power consumption of individual CPU algorithms and GPU kernels. We compare the performance and energy efficiency of scientific applications running on emerging platforms with results obtained on traditional platforms, identify hardware and algorithmic performance bottlenecks that affect the usability of these platforms, and describe avenues for improving both the hardware and applications in pursuit of the needs of molecular modeling tasks on mobile devices and future exascale computers. PMID:27516922

  4. Autonomous self-organizing resource manager for multiple networked platforms

    NASA Astrophysics Data System (ADS)

    Smith, James F., III

    2002-08-01

    A fuzzy logic based expert system for resource management has been developed that automatically allocates electronic attack (EA) resources in real-time over many dissimilar autonomous naval platforms defending their group against attackers. The platforms can be very general, e.g., ships, planes, robots, land based facilities, etc. Potential foes the platforms deal with can also be general. This paper provides an overview of the resource manager including the four fuzzy decision trees that make up the resource manager; the fuzzy EA model; genetic algorithm based optimization; co-evolutionary data mining through gaming; and mathematical, computational and hardware based validation. Methods of automatically designing new multi-platform EA techniques are considered. The expert system runs on each defending platform, rendering it an autonomous system requiring no human intervention. There is no commanding platform. Instead the platforms work cooperatively as a function of battlespace geometry; sensor data such as range, bearing, ID, and uncertainty measures for sensor output; intelligence reports; etc. Computational experiments show the defending networked platforms' ability to self-organize. The platforms' ability to self-organize is illustrated through the output of the scenario generator, a software package that automates the underlying data mining problem and creates a computer movie of the platforms' interaction for evaluation.
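
    As a flavor of what a node in such a fuzzy decision tree computes, the sketch below combines two membership functions with min() as the fuzzy AND. The linguistic variables, breakpoints, and function names are invented for illustration and are not the paper's actual rules.

```python
# Toy fuzzy inference: degree to which a contact warrants EA resources.
def ramp_down(x, full, zero):
    """Membership 1 below `full`, falling linearly to 0 at `zero`."""
    if x <= full:
        return 1.0
    if x >= zero:
        return 0.0
    return (zero - x) / (zero - full)

def ramp_up(x, zero, full):
    """Membership 0 below `zero`, rising linearly to 1 at `full`."""
    if x <= zero:
        return 0.0
    if x >= full:
        return 1.0
    return (x - zero) / (full - zero)

def engage_degree(range_km, closing_speed):
    """Fuzzy AND (min) of 'target is close' and 'target is closing fast'."""
    close = ramp_down(range_km, 10.0, 100.0)
    fast = ramp_up(closing_speed, 50.0, 300.0)
    return min(close, fast)
```

    A genetic algorithm, as the abstract notes, would tune breakpoints like the 10/100 km range limits rather than leaving them hand-picked.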

  5. Uncover the Cloud for Geospatial Sciences and Applications to Adopt Cloud Computing

    NASA Astrophysics Data System (ADS)

    Yang, C.; Huang, Q.; Xia, J.; Liu, K.; Li, J.; Xu, C.; Sun, M.; Bambacus, M.; Xu, Y.; Fay, D.

    2012-12-01

    Cloud computing is emerging as the future infrastructure for providing computing resources to support and enable scientific research, engineering development, and application construction, as well as work force education. On the other hand, there is a lot of doubt about the readiness of cloud computing to support a variety of scientific research, development and education. This research is a project funded by NASA SMD to investigate, through holistic studies, how ready cloud computing is to support the geosciences. Four applications with different computing characteristics (data, computing, concurrency, and spatiotemporal intensity) are used to test the readiness of cloud computing to support geosciences. Three popular and representative cloud platforms (Amazon EC2, Microsoft Azure, and NASA Nebula), as well as a traditional cluster, are utilized in the study. Results illustrate that the cloud is ready to some degree, but more research is needed to fully realize the cloud benefits as advertised by many vendors and defined by NIST. Specifically: 1) most cloud platforms can stand up new computing instances, a new computer, in a few minutes as envisioned, and are therefore ready to support most computing needs in an on-demand fashion; 2) load balancing and elasticity, a defining characteristic, are ready in some cloud platforms, such as Amazon EC2, to support bigger jobs (e.g., those needing a response in minutes), while others do not yet support elasticity and load balancing well.
    All cloud platforms need further research and development to support real-time applications at the subminute level; 3) the user interfaces and functionality of cloud platforms vary widely: some, such as Amazon EC2, are very professional and well supported and documented, while others need significant improvement before the general public can adopt cloud computing without professional training or knowledge of computing infrastructure; 4) security is a big concern in cloud computing; given the sharing spirit of cloud computing, it is very hard to ensure a high level of security unless a private cloud is built for a specific organization without public access, and public cloud platforms do not yet support the FISMA moderate level and may never be able to support the FISMA high level; 5) the HPC needs of cloud computing are not well supported, and only Amazon EC2 supports them well. The research is being used by NASA and other agencies as they consider cloud computing adoption. We hope the publication of this research will also help the public adopt cloud computing.

  6. Rotating Desk for Collaboration by Two Computer Programmers

    NASA Technical Reports Server (NTRS)

    Riley, John Thomas

    2005-01-01

    A special-purpose desk has been designed to facilitate collaboration by two computer programmers sharing one desktop computer or computer terminal. The impetus for the design is a trend toward what is known in the software industry as extreme programming, an approach intended to ensure high quality without sacrificing the quantity of computer code produced. Programmers working in pairs is a major feature of extreme programming. The present desk design minimizes the stress of the collaborative work environment. It supports both quality and work flow by making it unnecessary for programmers to get in each other's way. The desk (see figure) includes a rotating platform that supports a computer video monitor, keyboard, and mouse. The desk enables one programmer to work on the keyboard for any amount of time and then the other programmer to take over without breaking the train of thought. The rotating platform is supported by a turntable bearing that, in turn, is supported by a weighted base. The platform contains weights to improve its balance. The base includes a stand for a computer, and is shaped and dimensioned to provide adequate foot clearance for both users. The platform includes an adjustable stand for the monitor, a surface for the keyboard and mouse, and spaces for work papers, drinks, and snacks. The heights of the monitor, keyboard, and mouse are set to minimize stress. The platform can be rotated through an angle of 40° to give either user a straight-on view of the monitor and full access to the keyboard and mouse. Magnetic latches keep the platform preferentially at either of the two extremes of rotation. To switch between users, one simply grabs the edge of the platform and pulls it around. The magnetic latch is easily released, allowing the platform to rotate freely to the position of the other user.

  7. Micromagnetics on high-performance workstation and mobile computational platforms

    NASA Astrophysics Data System (ADS)

    Fu, S.; Chang, R.; Couture, S.; Menarini, M.; Escobar, M. A.; Kuteifan, M.; Lubarda, M.; Gabay, D.; Lomakin, V.

    2015-05-01

    The feasibility of using high-performance desktop and embedded mobile computational platforms is presented, including multi-core Intel central processing units (CPUs), Nvidia desktop graphics processing units (GPUs), and the Nvidia Jetson TK1 platform. The FastMag finite-element-method-based micromagnetic simulator is used as a testbed, showing high efficiency on all the platforms. Optimization aspects of improving the performance of the mobile systems are discussed. The high performance, low cost, low power consumption, and rapid performance increases of embedded mobile systems make them a promising candidate for micromagnetic simulations. Such architectures can be used as standalone systems or can be built into low-power computing clusters.

  8. Continuous measurement of breast tumor hormone receptor expression: a comparison of two computational pathology platforms

    PubMed Central

    Ahern, Thomas P.; Beck, Andrew H.; Rosner, Bernard A.; Glass, Ben; Frieling, Gretchen; Collins, Laura C.; Tamimi, Rulla M.

    2017-01-01

    Background: Computational pathology platforms incorporate digital microscopy with sophisticated image analysis to permit rapid, continuous measurement of protein expression. We compared two computational pathology platforms on their measurement of breast tumor estrogen receptor (ER) and progesterone receptor (PR) expression. Methods: Breast tumor microarrays from the Nurses' Health Study were stained for ER (n=592) and PR (n=187). One expert pathologist scored cases as positive if ≥1% of tumor nuclei exhibited stain. ER and PR were then measured with the Definiens Tissue Studio (automated) and Aperio Digital Pathology (user-supervised) platforms. Platform-specific measurements were compared using boxplots, scatter plots, and correlation statistics. Classification of ER and PR positivity by platform-specific measurements was evaluated with areas under receiver operating characteristic curves (AUC) from univariable logistic regression models, using expert pathologist classification as the standard. Results: Both platforms showed considerable overlap in continuous measurements of ER and PR between positive and negative groups classified by the expert pathologist. Platform-specific measurements were strongly and positively correlated with one another (rho ≥ 0.77). The user-supervised Aperio workflow performed slightly better than the automated Definiens workflow at classifying ER positivity (AUC(Aperio) = 0.97; AUC(Definiens) = 0.90; difference = 0.07, 95% CI: 0.05, 0.09) and PR positivity (AUC(Aperio) = 0.94; AUC(Definiens) = 0.87; difference = 0.07, 95% CI: 0.03, 0.12). Conclusion: Paired hormone receptor expression measurements from two different computational pathology platforms agreed well with one another. The user-supervised workflow yielded better classification accuracy than the automated workflow. Appropriately validated computational pathology algorithms enrich molecular epidemiology studies with continuous protein expression data and may accelerate tumor biomarker discovery.
PMID:27729430
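
    The AUCs quoted above can be computed directly from continuous platform scores and binary pathologist calls via the rank (Mann-Whitney) formulation, as this sketch shows on invented scores (not the study's data).

```python
# AUC via the Mann-Whitney statistic: the probability that a randomly
# chosen positive case scores above a randomly chosen negative case.
def auc(scores_pos, scores_neg):
    """Pairwise-comparison AUC; ties count as half a win."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Perfectly separated scores give AUC = 1.0
perfect = auc([0.9, 0.8], [0.2, 0.1])
```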

  9. Study on the application of mobile internet cloud computing platform

    NASA Astrophysics Data System (ADS)

    Gong, Songchun; Fu, Songyin; Chen, Zheng

    2012-04-01

    The innovative development of computer technology promotes the application of the cloud computing platform, which in essence substitutes for and exchanges a kind of resource service model, meeting users' needs for different resources after adjustments in multiple respects. Cloud computing offers advantages in many respects: it not only reduces the difficulty of operating the system but also makes it easy for users to search for, acquire, and process resources. Accordingly, the author takes the management of digital libraries as the research focus of this paper and analyzes the key technologies of the mobile internet cloud computing platform in its operation. The popularization and promotion of computer technology drive people to create digital library models, the core idea of which is to strengthen the optimal management of library resource information through computers and to construct a high-performance inquiry and search platform that allows users to access the necessary information resources at any time. Cloud computing, moreover, can distribute computations across a large number of distributed computers and hence implement a connection service spanning multiple computers. Digital libraries, as a typical representative of cloud computing applications, can be used to analyze the key technologies of the cloud computing platform.

  10. [The Key Technology Study on Cloud Computing Platform for ECG Monitoring Based on Regional Internet of Things].

    PubMed

    Yang, Shu; Qiu, Yuyan; Shi, Bo

    2016-09-01

    This paper explores methods of building a regional internet of things for ECG monitoring, focusing on the implementation of an ECG monitoring center based on a cloud computing platform. It analyzes the implementation principles of automatic identification of arrhythmia types. It also studies the system architecture and key techniques of the cloud computing platform, including server load balancing technology, reliable storage of massive small files, and the implementation of a quick search function.

  11. A Casualty in the Class War: Canada's Medicare

    PubMed Central

    Evans, Robert G.

    2012-01-01

    “There's class warfare, all right, but it's my class, the rich class, that's making war, and we're winning.” (Warren Buffett, five years ago.) Last year's Occupy Wall Street movement suggested that people are finally catching on. Note, making war: Buffett meant that there was deliberate intent and agency behind the huge transfer of wealth, since 1980, from the 99% to the 1%. Nor is the war metaphorical. There are real casualties, even if no body bags. Sadly, much Canadian commentary on inequality is pitiably naïve or deliberately obfuscatory. The 1% have captured national governments. The astronomical cost of American elections excludes the 99%. In Canada, parliamentary government permits one man to rule as a de facto dictator. The 1% don't like medicare. PMID:23372577

  12. Application of neural networks and sensitivity analysis to improved prediction of trauma survival.

    PubMed

    Hunter, A; Kennedy, L; Henry, J; Ferguson, I

    2000-05-01

    The performance of trauma departments is widely audited by applying predictive models that assess probability of survival, and examining the rate of unexpected survivals and deaths. Although the TRISS methodology, a logistic regression modelling technique, is still the de facto standard, it is known that neural network models perform better. A key issue when applying neural network models is the selection of input variables. This paper proposes a novel form of sensitivity analysis, which is simpler to apply than existing techniques, and can be used for both numeric and nominal input variables. The technique is applied to the audit survival problem, and used to analyse the TRISS variables. The conclusions discuss the implications for the design of further improved scoring schemes and predictive models.

  13. Clinical research with economically disadvantaged populations

    PubMed Central

    Denny, Colleen C; Grady, Christine

    2007-01-01

    Concerns about exploiting the poor or economically disadvantaged in clinical research are widespread in the bioethics community. For some, any research that involves economically disadvantaged individuals is de facto ethically problematic. The economically disadvantaged are thought of as “vulnerable” to exploitation, impaired decision making, or both, thus requiring either special protections or complete exclusion from research. A closer examination of the worries about vulnerabilities among the economically disadvantaged reveals that some of these worries are empirically or logically untenable, while others can be better resolved by improved study designs than by blanket exclusion of poorer individuals from research participation. The scientific objective to generate generalisable results and the ethical objective to fairly distribute both the risks and benefits of research oblige researchers not to unnecessarily bar economically disadvantaged subjects from clinical research participation. PMID:17601862

  14. Route Flap Damping Made Usable

    NASA Astrophysics Data System (ADS)

    Pelsser, Cristel; Maennel, Olaf; Mohapatra, Pradosh; Bush, Randy; Patel, Keyur

    The Border Gateway Protocol (BGP), the de facto inter-domain routing protocol of the Internet, is known to be noisy. The protocol has two main mechanisms to ameliorate this, MinRouteAdvertisementInterval (MRAI), and Route Flap Damping (RFD). MRAI deals with very short bursts on the order of a few to 30 seconds. RFD deals with longer bursts, minutes to hours. Unfortunately, RFD was found to severely penalize sites for being well-connected because topological richness amplifies the number of update messages exchanged. So most operators have disabled it. Through measurement, this paper explores the avenue of absolutely minimal change to code, and shows that a few RFD algorithmic constants and limits can be trivially modified, with the result being damping a non-trivial amount of long term churn without penalizing well-behaved prefixes' normal convergence process.
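The RFD bookkeeping the abstract refers to assigns each prefix a penalty that grows with every flap and decays exponentially with a configured half-life; the prefix is suppressed once the penalty crosses an upper threshold and reinstated when decay brings it back below a lower one. A minimal sketch follows; the constants are illustrative assumptions, not the paper's measured or recommended values:

```python
import math

# Illustrative RFD parameters (assumptions, not the paper's tuned values):
HALF_LIFE_S = 900.0    # penalty half-life: 15 minutes
FLAP_PENALTY = 1000.0  # penalty added per route flap
SUPPRESS_AT = 2000.0   # suppress the prefix above this penalty
REUSE_AT = 750.0       # reinstate it once decay brings the penalty below this

def decayed(penalty: float, elapsed_s: float) -> float:
    """Exponential decay of an RFD penalty over elapsed_s seconds."""
    return penalty * math.exp(-math.log(2.0) * elapsed_s / HALF_LIFE_S)

# Three flaps in quick succession push the penalty over the suppress threshold:
penalty = 0.0
for _ in range(3):  # flaps arriving 10 seconds apart
    penalty = decayed(penalty, 10.0) + FLAP_PENALTY
print(penalty >= SUPPRESS_AT)  # True: the prefix gets suppressed
```

The paper's point is that for well-connected prefixes, topological richness multiplies the update count, so the penalty crosses the suppress threshold during ordinary convergence; raising the threshold-style constants damps only genuinely long-term churn.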

  15. Children Associate Racial Groups with Wealth: Evidence from South Africa

    PubMed Central

    Olson, Kristina R.; Shutts, Kristin; Kinzler, Katherine D.; Weisman, Kara G.

    2012-01-01

    Group-based social hierarchies exist in nearly every society, yet little is known about whether children understand that they exist. The present studies investigated whether 3- to 10-year-old children (N=84) in South Africa associate higher-status racial groups with higher levels of wealth, one indicator of social status. Children matched higher-value belongings with White people more often than with multiracial or Black people and with multiracial people more often than with Black people, thus showing sensitivity to the de facto racial hierarchy in their society. There were no age-related changes in children’s tendency to associate racial groups with wealth differences. The implications of these results are discussed in light of the general tendency for people to legitimize and perpetuate the status quo. PMID:22860510

  16. Optical gesture sensing and depth mapping technologies for head-mounted displays: an overview

    NASA Astrophysics Data System (ADS)

    Kress, Bernard; Lee, Johnny

    2013-05-01

    Head Mounted Displays (HMDs), and especially see-through HMDs, have gained renewed interest in recent times, for the first time outside the traditional military and defense realm, as several high-profile consumer electronics companies prepare to bring their products to market. Consumer electronics HMDs have quite different requirements and constraints than their military counterparts. Voice commands are the de-facto interface for such devices, but when voice recognition does not work (no connection to the cloud, for example), trackpad and gesture sensing technologies have to be used to communicate information to the device. We review in this paper the various technologies developed today that integrate optical gesture sensing in a small footprint, as well as the various related 3D depth mapping sensors.

  17. Dealing with chemical reaction pathways and electronic excitations in molecular systems via renormalized and active-space coupled-cluster methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Piecuch, Piotr; Li, Wei; Lutz, Jesse J.

    Coupled-cluster (CC) theory has become the de facto standard for high-accuracy molecular calculations, but the widely used CC and equation-of-motion (EOM) CC approaches, such as CCSD(T) and EOMCCSD, have difficulties with capturing stronger electron correlations that characterize multi-reference molecular problems. This presentation demonstrates that many of these difficulties can be addressed by exploiting the completely renormalized (CR) CC and EOMCC approaches, such as CR-CC(2,3), CR-EOMCCSD(T), and CR-EOMCC(2,3), and their local correlation counterparts applicable to systems with hundreds of atoms, and the active-space CC/EOMCC approaches, such as CCSDt and EOMCCSDt, and their extensions to valence systems via the electron-attached and ionized formalisms.

  18. DFX via the Internet

    NASA Astrophysics Data System (ADS)

    Wagner, Rick; Castanotto, Giuseppe; Goldberg, Kenneth A.

    1995-11-01

    The Internet offers tremendous potential for rapid development of mechanical products to meet global competition. In the past several years, a number of geometric algorithms have been developed to evaluate manufacturing properties such as feedability, fixturability, assemblability, etc. This class of algorithms is sometimes termed `DFX: Design for X'. One problem is that most of these algorithms are tailored to a particular CAD system and format and so have not been widely tested by industry. The World Wide Web may offer a solution: its simple interface language may become a de facto standard for the exchange of geometric data. In this preliminary paper we describe one model for remote analysis of CAD models that we believe holds promise for use in industry (e.g. during the design cycle) and in research (e.g. to encourage verification of results).

  19. Cloud computing for comparative genomics with windows azure platform.

    PubMed

    Kim, Insik; Jung, Jae-Yoon; Deluca, Todd F; Nelson, Tristan H; Wall, Dennis P

    2012-01-01

    Cloud computing services have emerged as a cost-effective alternative for cluster systems as the number of genomes and required computation power to analyze them increased in recent years. Here we introduce the Microsoft Azure platform with detailed execution steps and a cost comparison with Amazon Web Services.

  20. Cloud Computing for Comparative Genomics with Windows Azure Platform

    PubMed Central

    Kim, Insik; Jung, Jae-Yoon; DeLuca, Todd F.; Nelson, Tristan H.; Wall, Dennis P.

    2012-01-01

    Cloud computing services have emerged as a cost-effective alternative for cluster systems as the number of genomes and required computation power to analyze them increased in recent years. Here we introduce the Microsoft Azure platform with detailed execution steps and a cost comparison with Amazon Web Services. PMID:23032609

  1. Analysis of scalability of high-performance 3D image processing platform for virtual colonoscopy

    NASA Astrophysics Data System (ADS)

    Yoshida, Hiroyuki; Wu, Yin; Cai, Wenli

    2014-03-01

    One of the key challenges in three-dimensional (3D) medical imaging is to enable the fast turn-around time that is often required for interactive or real-time response. This inevitably requires not only high computational power but also high memory bandwidth, due to the massive amount of data that needs to be processed. For this purpose, we previously developed a software platform for high-performance 3D medical image processing, called the HPC 3D-MIP platform, which employs increasingly available and affordable commodity computing systems such as multicore, cluster, and cloud computing systems. To achieve scalable high-performance computing, the platform employed size-adaptive, distributable block volumes as a core data structure for efficient parallelization of a wide range of 3D-MIP algorithms, supported task scheduling for efficient load distribution and balancing, and consisted of layered parallel software libraries that allow image processing applications to share common functionalities. We evaluated the performance of the HPC 3D-MIP platform by applying it to computationally intensive processes in virtual colonoscopy. Experimental results showed a 12-fold performance improvement on a workstation with 12-core CPUs over the original sequential implementation of the processes, indicating the efficiency of the platform. Analysis of performance scalability based on Amdahl's law for symmetric multicore chips showed the potential for high performance scalability of the HPC 3D-MIP platform when a larger number of cores is available.
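The Amdahl's-law scalability analysis mentioned in the abstract can be sketched as follows; the parallel fractions below are hypothetical, since the record does not give the platform's measured fractions:

```python
def amdahl_speedup(parallel_fraction: float, n_cores: int) -> float:
    """Speedup predicted by Amdahl's law: 1 / (s + p/n), where s is the
    serial fraction, p the parallelizable fraction, and n the core count."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_cores)

# A perfectly parallel workload on 12 cores reaches the 12-fold ceiling;
# even a 95%-parallel workload falls well short of it.
print(amdahl_speedup(1.0, 12))   # 12-fold
print(amdahl_speedup(0.95, 12))  # roughly 7.7-fold
```

A measured 12-fold speedup on 12 cores, as the abstract reports, therefore implies a workload whose serial fraction is negligible.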

  2. Sirepo - Warp

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nagler, Robert; Moeller, Paul

    Sirepo is an open source framework for cloud computing. The graphical user interface (GUI) for Sirepo, also known as the client, executes in any HTML5 compliant web browser on any computing platform, including tablets. The client is built in JavaScript, making use of the following open source libraries: Bootstrap, which is fundamental for cross-platform web applications; AngularJS, which provides a model–view–controller (MVC) architecture and GUI components; and D3.js, which provides interactive plots and data-driven transformations. The Sirepo server is built on the following Python technologies: Flask, which is a lightweight framework for web development; Jinja, which is a secure and widely used templating language; and Werkzeug, a utility library that is compliant with the WSGI standard. We use Nginx as the HTTP server and proxy, which provides a scalable event-driven architecture. The physics codes supported by Sirepo execute inside a Docker container. One of the codes supported by Sirepo is Warp. Warp is a particle-in-cell (PIC) code designed to simulate high-intensity charged particle beams and plasmas in both the electrostatic and electromagnetic regimes, with a wide variety of integrated physics models and diagnostics. At present, Sirepo supports a small subset of Warp’s capabilities. Warp is open source and is part of the Berkeley Lab Accelerator Simulation Toolkit.

  3. Cloud Based Applications and Platforms (Presentation)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brodt-Giles, D.

    2014-05-15

    Presentation to the Cloud Computing East 2014 Conference, where we are highlighting our cloud computing strategy, describing the platforms on the cloud (including Smartgrid.gov), and defining our process for implementing cloud based applications.

  4. A cloud computing based platform for sleep behavior and chronic diseases collaborative research.

    PubMed

    Kuo, Mu-Hsing; Borycki, Elizabeth; Kushniruk, Andre; Huang, Yueh-Min; Hung, Shu-Hui

    2014-01-01

    The objective of this study is to propose a Cloud Computing based platform for sleep behavior and chronic disease collaborative research. The platform consists of two main components: (1) a sensing bed sheet with textile sensors to automatically record patients' sleep behaviors and vital signs, and (2) a service-oriented cloud computing architecture (SOCCA) that provides a data repository and allows for sharing and analysis of collected data. We also describe our systematic approach to implementing the SOCCA. We believe that the new cloud-based platform can provide nursing and other health professional researchers located in differing geographic locations with a cost effective, flexible, secure and privacy-preserving research environment.

  5. CBRAIN: a web-based, distributed computing platform for collaborative neuroimaging research

    PubMed Central

    Sherif, Tarek; Rioux, Pierre; Rousseau, Marc-Etienne; Kassis, Nicolas; Beck, Natacha; Adalat, Reza; Das, Samir; Glatard, Tristan; Evans, Alan C.

    2014-01-01

    The Canadian Brain Imaging Research Platform (CBRAIN) is a web-based collaborative research platform developed in response to the challenges raised by data-heavy, compute-intensive neuroimaging research. CBRAIN offers transparent access to remote data sources, distributed computing sites, and an array of processing and visualization tools within a controlled, secure environment. Its web interface is accessible through any modern browser and uses graphical interface idioms to reduce the technical expertise required to perform large-scale computational analyses. CBRAIN's flexible meta-scheduling has allowed the incorporation of a wide range of heterogeneous computing sites, currently including nine national research High Performance Computing (HPC) centers in Canada, one in Korea, one in Germany, and several local research servers. CBRAIN leverages remote computing cycles and facilitates resource-interoperability in a transparent manner for the end-user. Compared with typical grid solutions available, our architecture was designed to be easily extendable and deployed on existing remote computing sites with no tool modification, administrative intervention, or special software/hardware configuration. As of October 2013, CBRAIN serves over 200 users spread across 53 cities in 17 countries. The platform is built as a generic framework that can accept data and analysis tools from any discipline. However, its current focus is primarily on neuroimaging research and studies of neurological diseases such as Autism, Parkinson's and Alzheimer's diseases, Multiple Sclerosis as well as on normal brain structure and development. This technical report presents the CBRAIN Platform, its current deployment and usage, and future directions. PMID:24904400

  6. CBRAIN: a web-based, distributed computing platform for collaborative neuroimaging research.

    PubMed

    Sherif, Tarek; Rioux, Pierre; Rousseau, Marc-Etienne; Kassis, Nicolas; Beck, Natacha; Adalat, Reza; Das, Samir; Glatard, Tristan; Evans, Alan C

    2014-01-01

    The Canadian Brain Imaging Research Platform (CBRAIN) is a web-based collaborative research platform developed in response to the challenges raised by data-heavy, compute-intensive neuroimaging research. CBRAIN offers transparent access to remote data sources, distributed computing sites, and an array of processing and visualization tools within a controlled, secure environment. Its web interface is accessible through any modern browser and uses graphical interface idioms to reduce the technical expertise required to perform large-scale computational analyses. CBRAIN's flexible meta-scheduling has allowed the incorporation of a wide range of heterogeneous computing sites, currently including nine national research High Performance Computing (HPC) centers in Canada, one in Korea, one in Germany, and several local research servers. CBRAIN leverages remote computing cycles and facilitates resource-interoperability in a transparent manner for the end-user. Compared with typical grid solutions available, our architecture was designed to be easily extendable and deployed on existing remote computing sites with no tool modification, administrative intervention, or special software/hardware configuration. As of October 2013, CBRAIN serves over 200 users spread across 53 cities in 17 countries. The platform is built as a generic framework that can accept data and analysis tools from any discipline. However, its current focus is primarily on neuroimaging research and studies of neurological diseases such as Autism, Parkinson's and Alzheimer's diseases, Multiple Sclerosis as well as on normal brain structure and development. This technical report presents the CBRAIN Platform, its current deployment and usage, and future directions.

  7. Acceleration of Cherenkov angle reconstruction with the new Intel Xeon/FPGA compute platform for the particle identification in the LHCb Upgrade

    NASA Astrophysics Data System (ADS)

    Faerber, Christian

    2017-10-01

    The LHCb experiment at the LHC will upgrade its detector by 2018/2019 to a ‘triggerless’ readout scheme, where all the readout electronics and several sub-detector parts will be replaced. The new readout electronics will be able to read out the detector at 40 MHz. This increases the data bandwidth from the detector down to the Event Filter farm to 40 TBit/s, which also has to be processed to select the interesting proton-proton collisions for later storage. The architecture of such a computing farm, which can process this amount of data as efficiently as possible, is a challenging task, and several compute accelerator technologies are being considered for use inside the new Event Filter farm. In the high performance computing sector, more and more FPGA compute accelerators are used to improve compute performance and reduce power consumption (e.g. in the Microsoft Catapult project and the Bing search engine). For the LHCb upgrade, the use of an experimental FPGA-accelerated computing platform in the Event Building or in the Event Filter farm is likewise being considered and therefore tested. This platform from Intel hosts a general CPU and a high performance FPGA linked via a high speed link, which for this platform is a QPI link. An accelerator is implemented on the FPGA. The system used is a two-socket platform from Intel with a Xeon CPU and an FPGA. The FPGA has cache-coherent memory access to the main memory of the server and can collaborate with the CPU. As a first step, a computationally intensive algorithm to reconstruct Cherenkov angles for the LHCb RICH particle identification was successfully ported in Verilog to the Intel Xeon/FPGA platform and accelerated by a factor of 35. The same algorithm was then ported to the Intel Xeon/FPGA platform with OpenCL, and the implementation work and performance are compared. Another FPGA accelerator, the Nallatech 385 PCIe accelerator with the same Stratix V FPGA, was also tested for performance. The results show that Intel Xeon/FPGA platforms, which are built in general for high performance computing, are also very interesting for the High Energy Physics community.

  8. Program Helps Decompose Complex Design Systems

    NASA Technical Reports Server (NTRS)

    Rogers, James L., Jr.; Hall, Laura E.

    1995-01-01

    DeMAID (Design Manager's Aid for Intelligent Decomposition) computer program is knowledge-based software system for ordering sequence of modules and identifying possible multilevel structure for design problems such as large platforms in outer space. Groups modular subsystems on basis of interactions among them. Saves considerable amount of money and time in total design process, particularly in new design problem in which order of modules has not been defined. Originally written for design problems, also applicable to problems containing modules (processes) that take inputs and generate outputs. Available in three machine versions: Macintosh written in Symantec's Think C 3.01, Sun, and SGI IRIS in C language.

  9. Applications of Dynamic Deployment of Services in Industrial Automation

    NASA Astrophysics Data System (ADS)

    Candido, Gonçalo; Barata, José; Jammes, François; Colombo, Armando W.

    Service-oriented Architecture (SOA) is becoming a de facto paradigm for business and enterprise integration. SOA is expanding into several domains of application, envisioning a unified solution suitable across all the different layers of an enterprise infrastructure. The application of SOA based on open web standards can significantly enhance the interoperability and openness of those devices. By embedding a dynamic deployment service even into small field devices, it would be possible both to allow machine builders to place built-in services and to allow the integrator to deploy, on the run, the services that best fit the current application. This approach allows the developer to keep his own preferred development language but still deliver a SOA-compliant application. A dynamic deployment service is envisaged as a fundamental framework to support more complex applications, reducing deployment delays while increasing overall system agility. As a use-case scenario, a dynamic deployment service was implemented over the DPWS and WS-Management specifications, allowing an automation application to be designed and programmed using IEC 61131 languages and these components to be deployed as web services into devices.

  10. MarDRe: efficient MapReduce-based removal of duplicate DNA reads in the cloud.

    PubMed

    Expósito, Roberto R; Veiga, Jorge; González-Domínguez, Jorge; Touriño, Juan

    2017-09-01

    This article presents MarDRe, a de novo cloud-ready duplicate and near-duplicate removal tool that can process single- and paired-end reads from FASTQ/FASTA datasets. MarDRe takes advantage of the widely adopted MapReduce programming model to fully exploit Big Data technologies on cloud-based infrastructures. Written in Java to maximize cross-platform compatibility, MarDRe is built upon the open-source Apache Hadoop project, the most popular distributed computing framework for scalable Big Data processing. On a 16-node cluster deployed on the Amazon EC2 cloud platform, MarDRe is up to 8.52 times faster than a representative state-of-the-art tool. Source code in Java and Hadoop as well as a user's guide are freely available under the GNU GPLv3 license at http://mardre.des.udc.es . rreye@udc.es. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
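The MapReduce programming model that MarDRe builds on can be sketched in miniature. MarDRe itself is a Java/Hadoop tool, so the single-process Python below is only an illustration of the idea it describes: map each read to a key derived from its sequence, then have the reduce step keep one representative per duplicate group. The function names and toy reads are hypothetical:

```python
from collections import defaultdict

def map_phase(reads):
    """Map step: emit (sequence, read_id) pairs, so duplicate reads share a key."""
    for read_id, seq in reads:
        yield seq, read_id

def reduce_phase(pairs):
    """Reduce step: group pairs by sequence and keep one representative per group."""
    groups = defaultdict(list)
    for seq, read_id in pairs:
        groups[seq].append(read_id)
    return [(ids[0], seq) for seq, ids in groups.items()]

reads = [("r1", "ACGT"), ("r2", "ACGT"), ("r3", "TTGA")]
unique = reduce_phase(map_phase(reads))
print(unique)  # one representative read per distinct sequence
```

In a real Hadoop deployment the shuffle between map and reduce distributes each key group to a reducer, which is what lets the deduplication scale across a cluster such as the 16-node EC2 setup the abstract benchmarks.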

  11. Strategies Used by Professors through Virtual Educational Platforms in Face-to-Face Classes: A View from the Chamilo Platform

    ERIC Educational Resources Information Center

    Valencia, Heriberto Gonzalez; Villota Enriquez, Jackeline Amparo; Agredo, Patricia Medina

    2017-01-01

    This study characterized the strategies used by professors, as implemented through virtual educational platforms. The contexts of this research were the classrooms of the Santiago de Cali University and the virtual space of the Chamilo virtual platform, where two professors from the Faculty of Education of the same university…

  12. Platform-independent method for computer aided schematic drawings

    DOEpatents

    Vell, Jeffrey L [Slingerlands, NY; Siganporia, Darius M [Clifton Park, NY; Levy, Arthur J [Fort Lauderdale, FL

    2012-02-14

    A CAD/CAM method is disclosed for a computer system to capture and interchange schematic drawing and associated design information. The schematic drawing and design information are stored in an extensible, platform-independent format.

  13. A Big Data Platform for Storing, Accessing, Mining and Learning Geospatial Data

    NASA Astrophysics Data System (ADS)

    Yang, C. P.; Bambacus, M.; Duffy, D.; Little, M. M.

    2017-12-01

    Big Data is becoming a norm in geoscience domains. A platform that can efficiently manage, access, analyze, mine, and learn from big data to produce new information and knowledge is desired. This paper introduces our latest effort on developing such a platform, based on our past years' experience with cloud and high performance computing, analyzing big data, comparing big data containers, and mining big geospatial data for new information. The platform includes four layers: a) the bottom layer is a computing infrastructure with proper network, computer, and storage systems; b) the second layer is a cloud computing layer based on virtualization to provide on-demand computing services for upper layers; c) the third layer consists of big data containers customized for dealing with different types of data and functionalities; d) the fourth layer is a big data presentation layer that supports the efficient management, access, analysis, mining, and learning of big geospatial data.

  14. The Efficacy of the Internet-Based Blackboard Platform in Developmental Writing Classes

    ERIC Educational Resources Information Center

    Shudooh, Yusuf M.

    2016-01-01

    The application of computer-assisted platforms in writing classes is a relatively new paradigm in education. The adoption of computer-assisted writing classes is gaining ground in many western and non-western universities. Numerous issues can be addressed when conducting computer-assisted classes (CAC). However, a few studies conducted to assess…

  15. What does fault tolerant Deep Learning need from MPI?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amatya, Vinay C.; Vishnu, Abhinav; Siegel, Charles M.

    Deep Learning (DL) algorithms have become the de facto Machine Learning (ML) algorithm for large scale data analysis. DL algorithms are computationally expensive -- even distributed DL implementations which use MPI require days of training (model learning) time on commonly studied datasets. Long running DL applications become susceptible to faults -- requiring development of a fault tolerant system infrastructure, in addition to fault tolerant DL algorithms. This raises an important question: What is needed from MPI for designing fault tolerant DL implementations? In this paper, we address this problem for permanent faults. We motivate the need for a fault tolerant MPI specification by an in-depth consideration of recent innovations in DL algorithms and their properties, which drive the need for specific fault tolerance features. We present an in-depth discussion on the suitability of different parallelism types (model, data and hybrid); a need (or lack thereof) for check-pointing of any critical data structures; and most importantly, consideration for several fault tolerance proposals (user-level fault mitigation (ULFM), Reinit) in MPI and their applicability to fault tolerant DL implementations. We leverage a distributed memory implementation of Caffe, currently available under the Machine Learning Toolkit for Extreme Scale (MaTEx). We implement our approaches by extending MaTEx-Caffe to use a ULFM-based implementation. Our evaluation using the ImageNet dataset and AlexNet neural network topology demonstrates the effectiveness of the proposed fault tolerant DL implementation using OpenMPI-based ULFM.

  16. Spice Tools Supporting Planetary Remote Sensing

    NASA Astrophysics Data System (ADS)

    Acton, C.; Bachman, N.; Semenov, B.; Wright, E.

    2016-06-01

    NASA's "SPICE"* ancillary information system has gradually become the de facto international standard for providing scientists the fundamental observation geometry needed to perform photogrammetry, map making and other kinds of planetary science data analysis. SPICE provides position and orientation ephemerides of both the robotic spacecraft and the target body; target body size and shape data; instrument mounting alignment and field-of-view geometry; reference frame specifications; and underlying time system conversions. SPICE comprises not only data, but also a large suite of software, known as the SPICE Toolkit, used to access those data and subsequently compute derived quantities-items such as instrument viewing latitude/longitude, lighting angles, altitude, etc. In existence since the days of the Magellan mission to Venus, the SPICE system has continuously grown to better meet the needs of scientists and engineers. For example, originally the SPICE Toolkit was offered only in Fortran 77, but is now available in C, IDL, MATLAB, and Java Native Interface. SPICE calculations were originally available only using APIs (subroutines), but can now be executed using a client-server interface to a geometry engine. Originally SPICE "products" were only available in numeric form, but now SPICE data visualization is also available. The SPICE components are free of cost, license and export restrictions. Substantial tutorials and programming lessons help new users learn to employ SPICE calculations in their own programs. The SPICE system is implemented and maintained by the Navigation and Ancillary Information Facility (NAIF)-a component of NASA's Planetary Data System (PDS). * Spacecraft, Planet, Instrument, Camera-matrix, Events

  17. Metagenomic assembly through the lens of validation: recent advances in assessing and improving the quality of genomes assembled from metagenomes.

    PubMed

    Olson, Nathan D; Treangen, Todd J; Hill, Christopher M; Cepeda-Espinoza, Victoria; Ghurye, Jay; Koren, Sergey; Pop, Mihai

    2017-08-07

    Metagenomic samples are snapshots of complex ecosystems at work. They comprise hundreds of known and unknown species, contain multiple strain variants and vary greatly within and across environments. Many microbes found in microbial communities are not easily grown in culture making their DNA sequence our only clue into their evolutionary history and biological function. Metagenomic assembly is a computational process aimed at reconstructing genes and genomes from metagenomic mixtures. Current methods have made significant strides in reconstructing DNA segments comprising operons, tandem gene arrays and syntenic blocks. Shorter, higher-throughput sequencing technologies have become the de facto standard in the field. Sequencers are now able to generate billions of short reads in only a few days. Multiple metagenomic assembly strategies, pipelines and assemblers have appeared in recent years. Owing to the inherent complexity of metagenome assembly, regardless of the assembly algorithm and sequencing method, metagenome assemblies contain errors. Recent developments in assembly validation tools have played a pivotal role in improving metagenomics assemblers. Here, we survey recent progress in the field of metagenomic assembly, provide an overview of key approaches for genomic and metagenomic assembly validation and demonstrate the insights that can be derived from assemblies through the use of assembly validation strategies. We also discuss the potential for impact of long-read technologies in metagenomics. We conclude with a discussion of future challenges and opportunities in the field of metagenomic assembly and validation. © The Author 2017. Published by Oxford University Press.

  18. Multivariate meta-analysis: potential and promise.

    PubMed

    Jackson, Dan; Riley, Richard; White, Ian R

    2011-09-10

    The multivariate random effects model is a generalization of the standard univariate model. Multivariate meta-analysis is becoming more commonly used and the techniques and related computer software, although continually under development, are now in place. In order to raise awareness of the multivariate methods, and discuss their advantages and disadvantages, we organized a one-day 'Multivariate meta-analysis' event at the Royal Statistical Society. In addition to disseminating the most recent developments, we also received an abundance of comments, concerns, insights, critiques and encouragement. This article provides a balanced account of the day's discourse. By giving others the opportunity to respond to our assessment, we hope to ensure that the various viewpoints and opinions are aired before multivariate meta-analysis simply becomes another widely used de facto method without any proper consideration of it by the medical statistics community. We describe the areas of application that multivariate meta-analysis has found, the methods available, the difficulties typically encountered and the arguments for and against the multivariate methods, using four representative but contrasting examples. We conclude that the multivariate methods can be useful, and in particular can provide estimates with better statistical properties, but also that these benefits come at the price of making more assumptions which do not result in better inference in every case. Although there is evidence that multivariate meta-analysis has considerable potential, it must be even more carefully applied than its univariate counterpart in practice. Copyright © 2011 John Wiley & Sons, Ltd.
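
    The univariate model that the multivariate approach generalizes is worth keeping in view. As a sketch only (illustrative numbers, not from the article), the univariate fixed-effect pooled estimate combines per-study effects with inverse-variance weights:

```python
def pooled_estimate(effects, variances):
    """Inverse-variance weighted (fixed-effect) pooled estimate and its variance."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    return pooled, 1.0 / sum(weights)

# Three hypothetical study effects with their within-study variances:
est, var = pooled_estimate([0.2, 0.5, 0.4], [0.04, 0.01, 0.02])
```

A multivariate analysis additionally models the within- and between-study correlations among multiple outcomes, which is the source of both its better statistical properties and its extra assumptions.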

  19. Multivariate meta-analysis: Potential and promise

    PubMed Central

    Jackson, Dan; Riley, Richard; White, Ian R

    2011-01-01

    The multivariate random effects model is a generalization of the standard univariate model. Multivariate meta-analysis is becoming more commonly used and the techniques and related computer software, although continually under development, are now in place. In order to raise awareness of the multivariate methods, and discuss their advantages and disadvantages, we organized a one-day ‘Multivariate meta-analysis’ event at the Royal Statistical Society. In addition to disseminating the most recent developments, we also received an abundance of comments, concerns, insights, critiques and encouragement. This article provides a balanced account of the day's discourse. By giving others the opportunity to respond to our assessment, we hope to ensure that the various viewpoints and opinions are aired before multivariate meta-analysis simply becomes another widely used de facto method without any proper consideration of it by the medical statistics community. We describe the areas of application that multivariate meta-analysis has found, the methods available, the difficulties typically encountered and the arguments for and against the multivariate methods, using four representative but contrasting examples. We conclude that the multivariate methods can be useful, and in particular can provide estimates with better statistical properties, but also that these benefits come at the price of making more assumptions which do not result in better inference in every case. Although there is evidence that multivariate meta-analysis has considerable potential, it must be even more carefully applied than its univariate counterpart in practice. Copyright © 2011 John Wiley & Sons, Ltd. PMID:21268052

  20. Copyright at the Bedside: Should We Stop the Spread?

    PubMed Central

    Feldman, Robin; Newman, John

    2014-01-01

    We recently published an article in the New England Journal of Medicine describing a crisis in cognitive testing, as doctors and medical researchers increasingly face copyright claims in sets of questions used for testing mental state. We encouraged the creation of a cultural norm in medicine, in which medical researchers would ensure continued availability of their tests through open source licensing for any copyrights that might exist. In this piece, we consider the legal side of the question. Although copyrights are being copiously asserted in medical testing, are those rights valid, and should they be upheld? The legal precedents in this area are anything but clear, and the courts are divided in the few analogous circumstances that have arisen. We examine analogies in standardized testing, computer compilations and baseball pitching forms to consider the marvelous question of how to conceptualize a process—which is the purview of patent law—when that process consists of words—which are the purview of copyright law. We also look from an economics perspective at the issue of investment and value creation in the development of de facto standards. Legal scholars are so often in the position of looking backwards, teasing out solutions to problems that have developed within a doctrinal or theoretical area. Rarely does one have the opportunity to affect the course of events before problems become so deeply entrenched that they are intractable. This is such a moment, and the legal and medical fields should take advantage of the opportunities presented. PMID:25221427

  1. Power Efficient Hardware Architecture of SHA-1 Algorithm for Trusted Mobile Computing

    NASA Astrophysics Data System (ADS)

    Kim, Mooseop; Ryou, Jaecheol

    The Trusted Mobile Platform (TMP) is developed and promoted by the Trusted Computing Group (TCG), an industry standards body working to enhance the security of the mobile computing environment. The built-in SHA-1 engine in TMP is one of the most important circuit blocks and contributes to the performance of the whole platform because it is used as a key primitive supporting platform integrity and command authentication. Mobile platforms have very stringent limitations with respect to available power, physical circuit area, and cost. Therefore, special architecture and design methods for a low power SHA-1 circuit are required. In this paper, we present a novel and efficient hardware architecture of a low power SHA-1 design for TMP. Our low power SHA-1 hardware can compute a 512-bit data block using fewer than 7,000 gates and draws a current of about 1.1 mA on a 0.25μm CMOS process.
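
    The SHA-1 primitive at the heart of such an engine consumes messages in 512-bit (64-byte) blocks after padding; a software reference is handy for checking a hardware design against known test vectors. A quick sanity check in Python (the "abc" vector is from the FIPS 180 Secure Hash Standard):

```python
import hashlib

# hashlib handles the message padding and 512-bit block schedule internally.
digest = hashlib.sha1(b"abc").hexdigest()
print(digest)  # → a9993e364706816aba3e25717850c26c9cd0d89d
```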

  2. An Evaluation of Architectural Platforms for Parallel Navier-Stokes Computations

    NASA Technical Reports Server (NTRS)

    Jayasimha, D. N.; Hayder, M. E.; Pillay, S. K.

    1996-01-01

    We study the computational, communication, and scalability characteristics of a computational fluid dynamics application, which solves the time accurate flow field of a jet using the compressible Navier-Stokes equations, on a variety of parallel architecture platforms. The platforms chosen for this study are a cluster of workstations (the LACE experimental testbed at NASA Lewis), a shared memory multiprocessor (the Cray YMP), and distributed memory multiprocessors with different topologies - the IBM SP and the Cray T3D. We investigate the impact of various networks connecting the cluster of workstations on the performance of the application and the overheads induced by popular message passing libraries used for parallelization. The work also highlights the importance of matching the memory bandwidth to the processor speed for good single processor performance. By studying the performance of an application on a variety of architectures, we are able to point out the strengths and weaknesses of each of the example computing platforms.
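
    Scalability results like these are often read against the ideal predicted by Amdahl's law. As a generic illustration (not a result from the paper), the attainable speedup for a code with a given parallel fraction can be sketched as:

```python
def amdahl_speedup(parallel_fraction, n_procs):
    """Ideal speedup on n_procs when only parallel_fraction of the work parallelizes."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_procs)

# Even a 90%-parallel code tops out well below linear speedup:
print(round(amdahl_speedup(0.9, 8), 2))     # → 4.71
print(round(amdahl_speedup(0.9, 1024), 2))  # approaches the 10x asymptote
```

Communication overheads of the kind measured in the paper (network and message-passing library costs) only push real speedups further below this ideal curve.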

  3. Parallelizing Navier-Stokes Computations on a Variety of Architectural Platforms

    NASA Technical Reports Server (NTRS)

    Jayasimha, D. N.; Hayder, M. E.; Pillay, S. K.

    1997-01-01

    We study the computational, communication, and scalability characteristics of a Computational Fluid Dynamics application, which solves the time accurate flow field of a jet using the compressible Navier-Stokes equations, on a variety of parallel architectural platforms. The platforms chosen for this study are a cluster of workstations (the LACE experimental testbed at NASA Lewis), a shared memory multiprocessor (the Cray YMP), and distributed memory multiprocessors with different topologies - the IBM SP and the Cray T3D. We investigate the impact of various networks connecting the cluster of workstations on the performance of the application and the overheads induced by popular message passing libraries used for parallelization. The work also highlights the importance of matching the memory bandwidth to the processor speed for good single processor performance. By studying the performance of an application on a variety of architectures, we are able to point out the strengths and weaknesses of each of the example computing platforms.

  4. A Web Tool for Research in Nonlinear Optics

    NASA Astrophysics Data System (ADS)

    Prikhod'ko, Nikolay V.; Abramovsky, Viktor A.; Abramovskaya, Natalia V.; Demichev, Andrey P.; Kryukov, Alexandr P.; Polyakov, Stanislav P.

    2016-02-01

    This paper presents a project to develop a web platform, WebNLO, for computer modeling of nonlinear optics phenomena. We discuss the general scheme of the platform and a model for interaction between the platform modules. The platform is built as a set of interacting RESTful web services (the SaaS approach). Users can interact with the platform through a web browser or a command line interface. Such a resource has no analogue in the field of nonlinear optics and is being created for the first time; it will allow researchers to access high-performance computing resources and significantly reduce the cost of the research and development process.

  5. Continuous measurement of breast tumour hormone receptor expression: a comparison of two computational pathology platforms.

    PubMed

    Ahern, Thomas P; Beck, Andrew H; Rosner, Bernard A; Glass, Ben; Frieling, Gretchen; Collins, Laura C; Tamimi, Rulla M

    2017-05-01

    Computational pathology platforms incorporate digital microscopy with sophisticated image analysis to permit rapid, continuous measurement of protein expression. We compared two computational pathology platforms on their measurement of breast tumour oestrogen receptor (ER) and progesterone receptor (PR) expression. Breast tumour microarrays from the Nurses' Health Study were stained for ER (n=592) and PR (n=187). One expert pathologist scored cases as positive if ≥1% of tumour nuclei exhibited stain. ER and PR were then measured with the Definiens Tissue Studio (automated) and Aperio Digital Pathology (user-supervised) platforms. Platform-specific measurements were compared using boxplots, scatter plots and correlation statistics. Classification of ER and PR positivity by platform-specific measurements was evaluated with areas under receiver operating characteristic curves (AUC) from univariable logistic regression models, using expert pathologist classification as the standard. Both platforms showed considerable overlap in continuous measurements of ER and PR between positive and negative groups classified by expert pathologist. Platform-specific measurements were strongly and positively correlated with one another (r≥0.77). The user-supervised Aperio workflow performed slightly better than the automated Definiens workflow at classifying ER positivity (AUC_Aperio=0.97; AUC_Definiens=0.90; difference=0.07, 95% CI 0.05 to 0.09) and PR positivity (AUC_Aperio=0.94; AUC_Definiens=0.87; difference=0.07, 95% CI 0.03 to 0.12). Paired hormone receptor expression measurements from two different computational pathology platforms agreed well with one another. The user-supervised workflow yielded better classification accuracy than the automated workflow. Appropriately validated computational pathology algorithms enrich molecular epidemiology studies with continuous protein expression data and may accelerate tumour biomarker discovery.
Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
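
    The AUC statistic used to compare the two platforms equals the probability that a randomly chosen positive case receives a higher score than a randomly chosen negative one. A minimal rank-based sketch with made-up scores (not the study's data):

```python
def auc(pos_scores, neg_scores):
    """Area under the ROC curve via the Mann-Whitney pairwise comparison."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5   # ties count as half a win
    return wins / (len(pos_scores) * len(neg_scores))

# Hypothetical continuous receptor measurements for pathologist-positive
# and pathologist-negative cases:
print(auc([0.91, 0.84, 0.70], [0.12, 0.35, 0.70]))
```

An AUC of 1.0 means perfect separation of the two classes; 0.5 is chance level, so the reported 0.87-0.97 values indicate strong agreement with the expert pathologist.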

  6. Design of platform for removing screws from LCD display shields

    NASA Astrophysics Data System (ADS)

    Tu, Zimei; Qin, Qin; Dou, Jianfang; Zhu, Dongdong

    2017-11-01

    Removing the screws on the sides of a shield is a necessary process in disassembling a computer LCD display. To solve this issue, a platform has been designed for removing the screws on display shields. This platform uses virtual instrument technology with LabVIEW as the development environment to design the mechanical structure with the technologies of motion control, human-computer interaction and target recognition. This platform removes the screws from the sides of the shield of an LCD display mechanically thus to guarantee follow-up separation and recycle.

  7. Prospects for quantum computing with an array of ultracold polar paramagnetic molecules.

    PubMed

    Karra, Mallikarjun; Sharma, Ketan; Friedrich, Bretislav; Kais, Sabre; Herschbach, Dudley

    2016-03-07

    Arrays of trapped ultracold molecules represent a promising platform for implementing a universal quantum computer. DeMille [Phys. Rev. Lett. 88, 067901 (2002)] has detailed a prototype design based on Stark states of polar ¹Σ molecules as qubits. Herein, we consider an array of polar ²Σ molecules which are, in addition, inherently paramagnetic and whose Hund's case (b) free-rotor pair-eigenstates are Bell states. We show that by subjecting the array to combinations of concurrent homogeneous and inhomogeneous electric and magnetic fields, the entanglement of the array's Stark and Zeeman states can be tuned and the qubit sites addressed. Two schemes for implementing an optically controlled CNOT gate are proposed and their feasibility discussed in the face of the broadening of spectral lines due to dipole-dipole coupling and the inhomogeneity of the electric and magnetic fields.
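
    Independently of the molecular platform proposed here, the CNOT gate's defining action can be checked with textbook linear algebra: a Hadamard on the control qubit of |00⟩ followed by CNOT yields the Bell state (|00⟩+|11⟩)/√2. A small self-contained sketch (generic gate matrices, not the paper's optical scheme):

```python
import math

def apply(mat, state):
    """Multiply a gate matrix by a state vector."""
    return [sum(row[j] * state[j] for j in range(len(state))) for row in mat]

s = 1.0 / math.sqrt(2.0)
# Basis order |00>, |01>, |10>, |11>; the first qubit is the control.
CNOT = [[1, 0, 0, 0],
        [0, 1, 0, 0],
        [0, 0, 0, 1],
        [0, 0, 1, 0]]

# Hadamard on the control applied to |00> gives (|00> + |10>)/sqrt(2):
state = [s, 0.0, s, 0.0]
bell = apply(CNOT, state)  # (|00> + |11>)/sqrt(2), a maximally entangled state
```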

  8. Running SINDA '85/FLUINT interactive on the VAX

    NASA Technical Reports Server (NTRS)

    Simmonds, Boris

    1992-01-01

    Engineering software is typically run in three modes: batch, demand, and interactive. The first two are the most popular in the SINDA world. The third is not so popular, probably owing to users' lack of access to the command procedure files for running SINDA '85, or to unfamiliarity with the SINDA '85 execution process (pre-processing, processing, compilation, linking, execution, and all of the associated file assignments, creations, deletions, and de-assignments). Interactive is the mode that makes thermal analysis with SINDA '85 a real-time design tool. This paper explains a command procedure sufficient (the minimum modifications required in an existing demand command procedure) to run SINDA '85 on the VAX in interactive mode. To exercise the procedure, a sample problem is presented that exemplifies the mode, plus additional programming capabilities available in SINDA '85. Following the same guidelines, the process can be extended to other computer platforms on which SINDA '85 resides.

  9. On the performances of computer vision algorithms on mobile platforms

    NASA Astrophysics Data System (ADS)

    Battiato, S.; Farinella, G. M.; Messina, E.; Puglisi, G.; Ravì, D.; Capra, A.; Tomaselli, V.

    2012-01-01

    Computer Vision enables mobile devices to extract the meaning of the observed scene from the information acquired with the onboard sensor cameras. Nowadays, there is a growing interest in Computer Vision algorithms able to work on mobile platforms (e.g., phone cameras, point-and-shoot cameras, etc.). Indeed, bringing Computer Vision capabilities to mobile devices opens new opportunities in different application contexts. The implementation of vision algorithms on mobile devices is still a challenging task, since these devices have poor image sensors and optics as well as limited processing power. In this paper we have considered different algorithms covering classic Computer Vision tasks: keypoint extraction, face detection, and image segmentation. Several tests have been done to compare the performances of the involved mobile platforms: Nokia N900, LG Optimus One, Samsung Galaxy SII.
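
    Of the classic tasks benchmarked here, image segmentation is the easiest to sketch without platform-specific code. Otsu's method, a standard thresholding baseline (chosen purely for illustration; the paper does not specify its algorithms), picks the grey level that maximizes between-class variance:

```python
def otsu_threshold(pixels, levels=256):
    """Threshold maximizing between-class variance over a grey-level histogram."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * h for i, h in enumerate(hist))
    sum_bg, w_bg = 0.0, 0
    best_t, best_var = 0, -1.0
    for t in range(levels):
        w_bg += hist[t]            # background weight up to grey level t
        if w_bg == 0:
            continue
        w_fg = total - w_bg        # foreground weight
        if w_fg == 0:
            break
        sum_bg += t * hist[t]
        mean_bg = sum_bg / w_bg
        mean_fg = (sum_all - sum_bg) / w_fg
        between = w_bg * w_fg * (mean_bg - mean_fg) ** 2
        if between > best_var:
            best_var, best_t = between, t
    return best_t

# A toy bimodal "image": dark background at grey level 10, bright object at 200.
print(otsu_threshold([10] * 60 + [200] * 40))  # → 10 (pixels <= 10 are background)
```

Histogram-based methods like this suit constrained devices because they need only one pass over the image and constant extra memory.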

  10. Elastic Cloud Computing Architecture and System for Heterogeneous Spatiotemporal Computing

    NASA Astrophysics Data System (ADS)

    Shi, X.

    2017-10-01

    Spatiotemporal computation implements a variety of different algorithms. When big data are involved, a desktop computer or standalone application may not be able to complete the computation task due to limited memory and computing power. Now that a variety of hardware accelerators and computing platforms are available to improve the performance of geocomputation, different algorithms may behave differently on different computing infrastructures and platforms. Some are perfect for implementation on a cluster of graphics processing units (GPUs), while GPUs may not be useful for certain kinds of spatiotemporal computation. The same situation arises in utilizing a cluster of Intel's many-integrated-core (MIC) processors, or Xeon Phi, as well as Hadoop or Spark platforms, to handle big spatiotemporal data. Furthermore, considering the energy efficiency requirement in general computation, a Field Programmable Gate Array (FPGA) may be a better solution when the performance of the computation can be similar to or better than that of GPUs and MICs. It is expected that an elastic cloud computing architecture and system that integrates GPUs, MICs, and FPGAs could be developed and deployed to support spatiotemporal computing over heterogeneous data types and computational problems.

  11. Data Fusion and Visualization with the OpenEarth Framework (OEF)

    NASA Astrophysics Data System (ADS)

    Nadeau, D. R.; Baru, C.; Fouch, M. J.; Crosby, C. J.

    2010-12-01

    Data fusion is an increasingly important problem to solve as we strive to integrate data from multiple sources and build better models of the complex processes operating at the Earth’s surface and its interior. These data are often large, multi-dimensional, and subject to differing conventions for file formats, data structures, coordinate spaces, units of measure, and metadata organization. When visualized, these data require differing, and often conflicting, conventions for visual representations, dimensionality, icons, color schemes, labeling, and interaction. These issues make the visualization of fused Earth science data particularly difficult. The OpenEarth Framework (OEF) is an open-source data fusion and visualization suite of software being developed at the Supercomputer Center at the University of California, San Diego. Funded by the NSF, the project is leveraging virtual globe technology from NASA’s WorldWind to create interactive 3D visualization tools that combine layered data from a variety of sources to create a holistic view of features at, above, and beneath the Earth’s surface. The OEF architecture is cross-platform, multi-threaded, modular, and based upon Java. The OEF’s modular approach yields a collection of compatible mix-and-match components for assembling custom applications. Available modules support file format handling, web service communications, data management, data filtering, user interaction, and 3D visualization. File parsers handle a variety of formal and de facto standard file formats. Each one imports data into a general-purpose data representation that supports multidimensional grids, topography, points, lines, polygons, images, and more. From there these data then may be manipulated, merged, filtered, reprojected, and visualized. Visualization features support conventional and new visualization techniques for looking at topography, tomography, maps, and feature geometry. 
3D grid data such as seismic tomography may be sliced by multiple oriented cutting planes and isosurfaced to create 3D skins that trace feature boundaries within the data. Topography may be overlaid with satellite imagery along with data such as gravity and magnetics measurements. Multiple data sets may be visualized simultaneously using overlapping layers and a common 3D+time coordinate space. Data management within the OEF handles and hides the quirks of differing file formats, web protocols, storage structures, coordinate spaces, and metadata representations. Derived data are computed automatically to support interaction and visualization while the original data is left unchanged in its original form. Data is cached for better memory and network efficiency, and all visualization is accelerated by the 3D graphics hardware found on today’s computers. The OpenEarth Framework project is currently prototyping the software for use in the visualization and integration of continental-scale geophysical data being produced by EarthScope-related research in the Western US. The OEF is providing researchers with new ways to display and interrogate their data and is anticipated to be a valuable tool for future EarthScope-related research.

  12. Leveraging CyVerse Resources for De Novo Comparative Transcriptomics of Underserved (Non-model) Organisms

    PubMed Central

    Joyce, Blake L.; Haug-Baltzell, Asher K.; Hulvey, Jonathan P.; McCarthy, Fiona; Devisetty, Upendra Kumar; Lyons, Eric

    2017-01-01

    This workflow allows novice researchers to leverage advanced computational resources such as cloud computing to carry out pairwise comparative transcriptomics. It also serves as a primer for biologists to develop data scientist computational skills, e.g. executing bash commands, visualization and management of large data sets. All command line code and further explanations of each command or step can be found on the wiki (https://wiki.cyverse.org/wiki/x/dgGtAQ). The Discovery Environment and Atmosphere platforms are connected together through the CyVerse Data Store. As such, once the initial raw sequencing data has been uploaded there is no more need to transfer large data files over an Internet connection, minimizing the amount of time needed to conduct analyses. This protocol is designed to analyze only two experimental treatments or conditions. Differential gene expression analysis is conducted through pairwise comparisons, and will not be suitable to test multiple factors. This workflow is also designed to be manual rather than automated. Each step must be executed and investigated by the user, yielding a better understanding of data and analytical outputs, and therefore better results for the user. Once complete, this protocol will yield de novo assembled transcriptome(s) for underserved (non-model) organisms without the need to map to previously assembled reference genomes (which are usually not available for underserved organisms). These de novo transcriptomes are further used in pairwise differential gene expression analysis to investigate genes differing between two experimental conditions. Differentially expressed genes are then functionally annotated to understand the genetic response organisms have to experimental conditions. In total, the data derived from this protocol is used to test hypotheses about biological responses of underserved organisms. PMID:28518075
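
    The pairwise differential-expression step described above ultimately compares expression between two conditions; real pipelines add dispersion modelling and significance testing, but the core quantity is a log2 fold change. A stripped-down sketch (toy counts, not CyVerse output):

```python
import math

def log2_fold_change(mean_control, mean_treatment, pseudocount=1.0):
    """log2 ratio of treatment to control; the pseudocount guards against zero counts."""
    return math.log2((mean_treatment + pseudocount) / (mean_control + pseudocount))

# A transcript averaging 1 count in control and 3 in treatment:
print(log2_fold_change(1.0, 3.0))  # → 1.0 (a two-fold increase after pseudocounts)
```

Positive values indicate up-regulation in the treatment condition, negative values down-regulation, with zero meaning no change.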

  13. Discovery and analysis of time delay sources in the USGS personal computer data collection platform (PCDCP) system

    USGS Publications Warehouse

    White, Timothy C.; Sauter, Edward A.; Stewart, Duff C.

    2014-01-01

    Intermagnet is an international oversight group that exists to establish a global network of geomagnetic observatories. This group establishes data standards and standard operating procedures for members and prospective members. Intermagnet has proposed a new One-Second Data Standard for that emerging geomagnetic product. The standard specifies that all data collected must have a time stamp accuracy of ±10 milliseconds relative to the top-of-the-second in Coordinated Universal Time. Therefore, the U.S. Geological Survey Geomagnetism Program has designed and executed several tests on its current data collection system, the Personal Computer Data Collection Platform. Tests are designed to measure the time shifts introduced by individual components within the data collection system, as well as the time shift introduced by the entire Personal Computer Data Collection Platform. Additional testing designed for Intermagnet will be used to further validate such measurements. Current results of the measurements showed a 5.0–19.9 millisecond lag for the vertical channel (Z) of the Personal Computer Data Collection Platform and a 13.0–25.8 millisecond lag for the horizontal channels (H and D) of the collection system. These measurements represent a dynamically changing delay introduced within the U.S. Geological Survey Personal Computer Data Collection Platform.
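
    Lags of the kind measured here can be recovered by cross-correlating a reference timing signal against each recorded channel: the sample offset that maximizes the correlation is the delay. A schematic pure-Python version (made-up signals, not USGS test data):

```python
def estimate_lag(reference, recorded, max_lag):
    """Sample offset of `recorded` relative to `reference` maximizing correlation."""
    n = len(reference)
    best_lag, best_score = 0, float("-inf")
    for lag in range(max_lag + 1):
        score = sum(reference[i] * recorded[i + lag] for i in range(n - max_lag))
        if score > best_score:
            best_score, best_lag = score, lag
    return best_lag

# A timing pulse delayed by 2 samples; at a 100 Hz sample rate that is a 20 ms lag.
reference = [0, 0, 1, 0, 0, 0, 0, 0]
recorded  = [0, 0, 0, 0, 1, 0, 0, 0]
lag_samples = estimate_lag(reference, recorded, max_lag=3)
print(lag_samples * 1000 // 100)  # lag in milliseconds at 100 Hz → 20
```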

  14. SU-D-BRD-02: A Web-Based Image Processing and Plan Evaluation Platform (WIPPEP) for Future Cloud-Based Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chai, X; Liu, L; Xing, L

    Purpose: Visualization and processing of medical images and radiation treatment plan evaluation have traditionally been constrained to local workstations with limited computation power and ability of data sharing and software update. We present a web-based image processing and plan evaluation platform (WIPPEP) for radiotherapy applications with high efficiency, ubiquitous web access, and real-time data sharing. Methods: This software platform consists of three parts: web server, image server and computation server. Each independent server communicates with the others through HTTP requests. The web server is the key component that provides visualizations and a user interface through front-end web browsers and relays information to the backend to process user requests. The image server serves as a PACS system. The computation server performs the actual image processing and dose calculation. The web server backend is developed using Java Servlets and the frontend is developed using HTML5, JavaScript, and jQuery. The image server is based on the open source dcm4chee PACS system. The computation server can be written in any programming language as long as it can send/receive HTTP requests. Our computation server was implemented in Delphi, Python and PHP, which can process data directly or via a C++ program DLL. Results: This software platform is running on a 32-core CPU server virtually hosting the web server, image server, and computation servers separately. Users can visit our internal website with the Chrome browser, select a specific patient, visualize images and RT structures belonging to this patient, and perform image segmentation running the Delphi computation server and Monte Carlo dose calculation on the Python or PHP computation server. Conclusion: We have developed a web-based image processing and plan evaluation platform prototype for radiotherapy. This system has clearly demonstrated the feasibility of performing image processing and plan evaluation through a web browser and exhibited potential for future cloud-based radiotherapy.
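
    The three-server split described above communicates purely over HTTP, so each piece can be developed, hosted, and swapped independently. A toy stand-in for a computation service and its client, using only the Python standard library (the endpoint name and JSON shape are invented for illustration):

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class ComputeHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Pretend to run a processing job and return a JSON result.
        body = json.dumps({"status": "done", "path": self.path}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass

# Port 0 asks the OS for any free port; run the server on a background thread.
server = HTTPServer(("127.0.0.1", 0), ComputeHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# The "web server" side: forward a user request to the computation service.
url = f"http://127.0.0.1:{server.server_port}/segment?patient=demo"
with urllib.request.urlopen(url) as resp:
    result = json.loads(resp.read())
server.shutdown()
print(result["status"])  # → done
```

Because the contract is just HTTP plus JSON, the same client code would work whether the computation backend is Delphi, Python, or PHP, which is the portability argument the abstract makes.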

  15. Using SPARK as a Solver for Modelica

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wetter, Michael; Wetter, Michael; Haves, Philip

    Modelica is an object-oriented acausal modeling language that is well positioned to become a de-facto standard for expressing models of complex physical systems. To simulate a model expressed in Modelica, it needs to be translated into executable code. For generating run-time efficient code, such a translation needs to employ algebraic formula manipulations. As the SPARK solver has been shown to be competitive for generating such code but currently cannot be used with the Modelica language, we report in this paper how SPARK's symbolic and numerical algorithms can be implemented in OpenModelica, an open-source implementation of a Modelica modeling and simulation environment. We also report benchmark results that show that for our air flow network simulation benchmark, the SPARK solver is competitive with Dymola, which is believed to provide the best solver for Modelica.

  16. A Novel Approach to model EPIC variable background

    NASA Astrophysics Data System (ADS)

    Marelli, M.; De Luca, A.; Salvetti, D.; Belfiore, A.

    2017-10-01

    One of the main aims of the EXTraS (Exploring the X-ray Transient and variable Sky) project is to characterise the variability of serendipitous XMM-Newton sources within each single observation. Unfortunately, 164 Ms out of the 774 Ms of cumulative exposure considered (21%) are badly affected by soft proton flares, hampering any classical analysis of field sources. De facto, the latest releases of the 3XMM catalog, as well as most analyses in the literature, simply exclude these 'high background' periods from analysis. We implemented a novel SAS-independent approach to produce background-subtracted light curves, which makes it possible to treat the case of very faint sources and very bright proton flares. EXTraS light curves of 3XMM-DR5 sources will soon be released to the community, together with new tools we are developing.
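
    Whatever machinery produces them, a background-subtracted light curve at its simplest scales the counts in a background region by the source-to-background area ratio and subtracts, bin by bin. A schematic version (toy numbers; the EXTraS approach adds far more careful modelling of the flaring background):

```python
def background_subtract(src_counts, bkg_counts, area_ratio):
    """Net counts per time bin: source minus background scaled by the area ratio."""
    return [s - b * area_ratio for s, b in zip(src_counts, bkg_counts)]

# Source-region counts, background-region counts, and the area ratio between them:
net = background_subtract([12, 30, 9], [4, 40, 2], area_ratio=0.5)
print(net)  # → [10.0, 10.0, 8.0]
```

The middle bin shows the problem the project tackles: a proton flare inflates both regions, and naive subtraction of a very bright background from a faint source quickly becomes noise-dominated.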

  17. Estimating frame bulk and shear moduli of two double porosity layers by ultrasound transmission.

    PubMed

    Bai, Ruonan; Tinel, Alain; Alem, Abdellah; Franklin, Hervé; Wang, Huaqing

    2016-08-01

    The acoustic plane wave transmission by water saturated double porosity media is investigated. Two samples of double porosity media assumed to obey the Berryman and Wang (BW) extension (Berryman and Wang, 1995, 2000) of Biot's theory in the low frequency regime are under consideration: ROBU® (pure binder-free borosilicate glass 3.3 manufactured to form the individual grains) and Tobermorite 11Å (the individual porous cement grains show irregular shapes). The de facto gap between theoretical and experimental data can be minimized by adequately modifying two of the parameters estimated from triaxial tests: the frame bulk and shear moduli. The frequency-dependent imaginary parts that necessarily follow from the minimization are related to the energy losses due to contact relaxation and friction between grains. Copyright © 2016 Elsevier B.V. All rights reserved.
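
    The minimization described, adjusting the frame moduli until model and measurement agree, can be caricatured as a one-parameter least-squares search. The model function and numbers below are purely illustrative, not the BW transmission equations:

```python
def fit_parameter(observed, model, candidates):
    """Return the candidate parameter value minimizing the sum of squared residuals."""
    def sse(p):
        return sum((o - m) ** 2 for o, m in zip(observed, model(p)))
    return min(candidates, key=sse)

# Toy "transmission" model linear in a single modulus-like parameter k:
observed = [2.0, 4.1, 5.9]
model = lambda k: [k * x for x in (1.0, 2.0, 3.0)]
best_k = fit_parameter(observed, model, candidates=[1.0, 1.5, 2.0, 2.5])
print(best_k)  # → 2.0
```

In the actual study the "observed" data are measured transmission coefficients and the fitted parameters are the complex frame bulk and shear moduli, but the fitting logic is the same.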

  18. Trade treaties and alcohol advertising policy.

    PubMed

    Gould, Ellen

    2005-09-01

    Restrictions on alcohol advertising are vulnerable to challenge under international trade agreements. As countries negotiate new trade treaties and expand the scope of existing ones, the risk of such a challenge increases. While alcohol advertising restrictions normally do not distinguish between foreign and domestic products, this neutral character does not protect them from being challenged under trade rules. The article analyzes four provisions of trade agreements--expropriation, de facto discrimination, market access, and necessity--in relation to the jeopardy they pose for alcohol advertising restrictions. Key cases are reviewed to illustrate how these provisions have been used to either overturn existing advertising restrictions or prevent new ones from coming into force. The article also reports on the mixed results governments have had in trying to justify their regulations to trade panels and the stringent criteria imposed for proving that a regulation is "necessary."

  19. The tyranny of taste: the case of organic rice in Cambodia.

    PubMed

    Thavat, Maylee

    2011-01-01

    Fair-trade and organic products are often sold at price premiums justified by smaller production volumes that are associated with greater social and environmental responsibility. The consumption of these products confers on the consumer a greater sense of morality – and usually a claim to better taste. This paper tells the story of attempts to promote organic/fair-trade rice production by de facto organic Cambodian farmers for export to North American and European markets in order to assist poor farmers to trade their way out of poverty. It demonstrates that instead of promoting sustainable agriculture and fair trade between developed and developing markets, organic/fair-trade projects may impose First World consumer ideals and tastes that are out of step with the larger realities of agrarian transition in Cambodia and the wider region of developing Southeast Asia.

  20. Life satisfaction and sexual minorities: Evidence from Australia and the United Kingdom

    PubMed Central

    Powdthavee, Nattavudh; Wooden, Mark

    2017-01-01

    Very little is known about how the differential treatment of sexual minorities could influence subjective reports of overall well-being. This paper seeks to fill this gap. Data from two large surveys that provide nationally representative samples for two different countries – Australia and the UK – are used to estimate a simultaneous equations model of life satisfaction. The model allows for self-reported sexual identity to influence a measure of life satisfaction both directly and indirectly through seven different channels: (i) income; (ii) employment; (iii) health; (iv) marriage and de facto relationships; (v) children; (vi) friendship networks; and (vii) education. Lesbian, gay and bisexual persons are found to be significantly less satisfied with their lives than otherwise comparable heterosexual persons. In both countries this is the result of a combination of direct and indirect effects. PMID:29238117

  1. Universal patterns or the tale of two systems? Mathematics achievement and educational expectations in post-socialist Europe

    PubMed Central

    Bodovski, Katerina; Kotok, Stephen; Henck, Adrienne

    2014-01-01

    Although communist ideology claimed to destroy former class stratification based on capitalist labor-market relationships, during socialism one social class hierarchy was de facto substituted for another that was equally unequal. The economic transition during the 1990s increased stratification by wealth, which affected educational inequality. This study examines the relationships among parental education, gender, educational expectations, and mathematics achievement of youths in five post-socialist Eastern European countries, comparing them with three Western countries. We employed the 8th-grade data from the Trends in International Mathematics and Science Study (TIMSS) 1995 and 2007. The findings point to universal associations between parental education and student outcomes, whereas gender comparisons present interesting East-West differences. The theoretical and policy implications of these findings are discussed. PMID:25346564

  2. Digital imaging of root traits (DIRT): a high-throughput computing and collaboration platform for field-based root phenomics.

    PubMed

    Das, Abhiram; Schneider, Hannah; Burridge, James; Ascanio, Ana Karine Martinez; Wojciechowski, Tobias; Topp, Christopher N; Lynch, Jonathan P; Weitz, Joshua S; Bucksch, Alexander

    2015-01-01

    Plant root systems are key drivers of plant function and yield. They are also under-explored targets to meet global food and energy demands. Many new technologies have been developed to characterize crop root system architecture (CRSA). These technologies have the potential to accelerate the progress in understanding the genetic control and environmental response of CRSA. Putting this potential into practice requires new methods and algorithms to analyze CRSA in digital images. Most prior approaches have solely focused on the estimation of root traits from images, yet no integrated platform exists that allows easy and intuitive access to trait extraction and analysis methods from images combined with storage solutions linked to metadata. Automated high-throughput phenotyping methods are increasingly used in laboratory-based efforts to link plant genotype with phenotype, whereas similar field-based studies remain predominantly manual and low-throughput. Here, we present an open-source phenomics platform "DIRT", as a means to integrate scalable supercomputing architectures into field experiments and analysis pipelines. DIRT is an online platform that enables researchers to store images of plant roots, measure dicot and monocot root traits under field conditions, and share data and results within collaborative teams and the broader community. The DIRT platform seamlessly connects end-users with large-scale compute "commons" enabling the estimation and analysis of root phenotypes from field experiments of unprecedented size. DIRT is an automated high-throughput computing and collaboration platform for field-based crop root phenomics. The platform is accessible at http://www.dirt.iplantcollaborative.org/ and hosted on the iPlant cyber-infrastructure using high-throughput grid computing resources of the Texas Advanced Computing Center (TACC).
    DIRT is a high-volume central repository and high-throughput RSA trait-computation platform for plant scientists working on crop roots. It enables scientists to store, manage and share crop root images with metadata and to compute RSA traits from thousands of images in parallel. It makes high-throughput RSA trait computation available to the community with just a few button clicks. As such it enables plant scientists to spend more time on science rather than on technology. All stored and computed data are easily accessible to the public and the broader scientific community. We hope that easy data accessibility will attract new tool developers and spur creative data usage that may even be applied to other fields of science.
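
    The fan-out pattern this record describes (many root images, one trait computation each, executed in parallel) can be sketched in Python. This is a minimal illustration only: `extract_traits` and `analyze_batch` are hypothetical names, not part of the DIRT API, and the dummy trait function stands in for real image analysis.

```python
from concurrent.futures import ThreadPoolExecutor

def extract_traits(image_path):
    # Stand-in for a real trait-extraction routine (root depth, branching,
    # angle, etc.); this dummy version only records the file name.
    return {"image": image_path, "traits": {}}

def analyze_batch(image_paths, workers=8):
    # Fan the per-image work out across workers, the way a platform like
    # DIRT fans it out across high-throughput grid nodes; results come
    # back in input order.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(extract_traits, image_paths))
```

    Because each image is processed independently, the same structure scales from a local worker pool to a grid scheduler without changing the per-image code.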

  3. GenomicTools: a computational platform for developing high-throughput analytics in genomics.

    PubMed

    Tsirigos, Aristotelis; Haiminen, Niina; Bilal, Erhan; Utro, Filippo

    2012-01-15

    Recent advances in sequencing technology have resulted in a dramatic increase of sequencing data, which, in turn, requires efficient management of computational resources, such as computing time and memory, as well as the prototyping of computational pipelines. We present GenomicTools, a flexible computational platform, comprising both a command-line set of tools and a C++ API, for the analysis and manipulation of high-throughput sequencing data such as DNA-seq, RNA-seq, ChIP-seq and MethylC-seq. GenomicTools implements a variety of mathematical operations between sets of genomic regions, thereby enabling the prototyping of computational pipelines that can address a wide spectrum of tasks ranging from pre-processing and quality control to meta-analyses. Additionally, the GenomicTools platform is designed to analyze large datasets of any size by minimizing memory requirements. In practical applications, where comparable, GenomicTools outperforms existing tools in terms of both time and memory usage. The GenomicTools platform (version 2.0.0) was implemented in C++. The source code, documentation, user manual, example datasets and scripts are available online at http://code.google.com/p/ibm-cbc-genomic-tools.
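
    The core primitive this record mentions, set operations between genomic regions, can be illustrated with a naive interval intersection. This sketch is not the GenomicTools API; a real implementation would use sorted sweeps rather than the quadratic loop shown here.

```python
def intersect(regions_a, regions_b):
    # Intersect two lists of (chrom, start, end) half-open regions.
    # Illustrative only: GenomicTools itself is a C++ toolkit, and its
    # operations work on sorted region files for efficiency.
    out = []
    for ca, sa, ea in regions_a:
        for cb, sb, eb in regions_b:
            if ca == cb and sa < eb and sb < ea:  # same chromosome, overlap
                out.append((ca, max(sa, sb), min(ea, eb)))
    return out
```

    Unions, subtractions, and window joins follow the same shape, which is why a small algebra of region operations can cover pre-processing through meta-analysis.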

  4. Three-Dimensional Multiscale, Multistable, and Geometrically Diverse Microstructures with Tunable Vibrational Dynamics Assembled by Compressive Buckling.

    PubMed

    Ning, Xin; Wang, Heling; Yu, Xinge; Soares, Julio A N T; Yan, Zheng; Nan, Kewang; Velarde, Gabriel; Xue, Yeguang; Sun, Rujie; Dong, Qiyi; Luan, Haiwen; Lee, Chan Mi; Chempakasseril, Aditya; Han, Mengdi; Wang, Yiqi; Li, Luming; Huang, Yonggang; Zhang, Yihui; Rogers, John

    2017-04-11

    Microelectromechanical systems remain an area of significant interest in fundamental and applied research due to their wide-ranging applications. Most device designs, however, are largely two-dimensional and constrained to only a few simple geometries. Achieving tunable resonant frequencies or broad operational bandwidths requires complex components and/or fabrication processes. The work presented here reports unusual classes of three-dimensional (3D) micromechanical systems in the form of vibratory platforms assembled by controlled compressive buckling. Such 3D structures can be fabricated across a broad range of length scales and from various materials, including soft polymers, monocrystalline silicon, and their composites, resulting in a wide scope of achievable resonant frequencies and mechanical behaviors. Platforms designed with multistable mechanical responses and vibrationally de-coupled constituent elements offer improved bandwidth and frequency tunability. Furthermore, the resonant frequencies can be controlled through deformations of an underlying elastomeric substrate. Systematic experimental and computational studies include structures with diverse geometries, ranging from tables, cages, rings, ring-crosses, ring-disks, two-floor ribbons, flowers, umbrellas, triple-cantilever platforms, and asymmetric circular helices, to multilayer constructions. These ideas form the foundations for engineering designs that complement those supported by conventional microelectromechanical systems, with capabilities that could be useful in systems for biosensing, energy harvesting, and others.

  5. Computation of dark frames in digital imagers

    NASA Astrophysics Data System (ADS)

    Widenhorn, Ralf; Rest, Armin; Blouke, Morley M.; Berry, Richard L.; Bodegom, Erik

    2007-02-01

    Dark current is caused by electrons that are thermally excited into the conduction band. These electrons are collected by the wells of the CCD and add a false signal to the chip. We present an algorithm that automatically corrects for dark current. It uses a calibration protocol to characterize the image sensor at different temperatures. For a given exposure time, the dark current of every pixel is characteristic of a specific temperature; the dark current of every pixel can therefore be used as an indicator of the temperature. Hot pixels have the highest signal-to-noise ratio and are the best temperature sensors. We use the dark current of several hundred hot pixels to sense the chip temperature and predict the dark current of all pixels on the chip. Dark current computation is not a new concept, but our approach is unique. Advantages of our method include applicability to poorly temperature-controlled camera systems and the possibility of ex post facto dark current correction.
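
    The hot-pixel idea can be sketched as follows. This is an illustrative reconstruction, not the authors' code: match the observed hot-pixel signals against calibration dark frames taken at known temperatures, then use the best-matching calibration frame as the predicted dark frame for the whole chip.

```python
import numpy as np

def predict_dark_frame(calib_temps, calib_frames, hot_idx, observed_hot):
    # calib_frames[k] is the dark frame measured at calib_temps[k];
    # hot_idx selects the hot (high-signal) pixels; observed_hot is their
    # dark signal in the frame to be corrected. A real implementation
    # would interpolate between calibration temperatures rather than
    # picking the nearest one, as done here for simplicity.
    frames = np.asarray(calib_frames, dtype=float)
    # Mean-squared mismatch between the observed hot pixels and each
    # calibration frame's hot pixels.
    errs = ((frames[:, hot_idx] - observed_hot) ** 2).mean(axis=1)
    k = int(np.argmin(errs))
    return calib_temps[k], frames[k]
```

    The inferred temperature falls out as a by-product, which is what makes the method useful for poorly temperature-controlled cameras.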

  6. The performance of low-cost commercial cloud computing as an alternative in computational chemistry.

    PubMed

    Thackston, Russell; Fortenberry, Ryan C

    2015-05-05

    The growth of commercial cloud computing (CCC) as a viable means of computational infrastructure is largely unexplored for the purposes of quantum chemistry. In this work, the PSI4 suite of computational chemistry programs is installed on five different types of Amazon Web Services CCC platforms. The performance for a set of electronically excited state single-point energies is compared between these CCC platforms and typical, "in-house" physical machines. Further considerations are made for the number of cores or virtual CPUs (vCPUs, for the CCC platforms), but no considerations are made for full parallelization of the program (even though parallelization of the BLAS library is implemented), complete high-performance computing cluster utilization, or steal time. Even with this most pessimistic view of the computations, CCC resources are shown to be more cost effective for significant numbers of typical quantum chemistry computations. Large numbers of large computations are still best utilized by more traditional means, but smaller-scale research may be more effectively undertaken through CCC services. © 2015 Wiley Periodicals, Inc.
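
    The cost-effectiveness question this record studies reduces to simple arithmetic. The amortization model and the numbers below are illustrative assumptions, not figures from the study: amortize an in-house machine's purchase cost over the jobs it runs, and compare with the pay-per-hour cloud cost for the same job.

```python
def cost_per_job(hourly_rate, hours_per_job):
    # Pay-as-you-go cost of one computation on a cloud instance.
    return hourly_rate * hours_per_job

def cheaper_platform(cloud_rate, cloud_hours, inhouse_capex, inhouse_jobs):
    # Compare the per-job cloud cost against the per-job share of an
    # in-house machine's purchase price (ignoring power, admin, and
    # queue time, which the real comparison would also weigh).
    cloud_cost = cost_per_job(cloud_rate, cloud_hours)
    inhouse_cost = inhouse_capex / inhouse_jobs
    return "cloud" if cloud_cost < inhouse_cost else "in-house"
```

    The crossover the abstract describes follows directly: many small jobs favor the cloud, while a machine kept busy with large jobs amortizes its cost better in-house.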

  7. A Platform-Independent Plugin for Navigating Online Radiology Cases.

    PubMed

    Balkman, Jason D; Awan, Omer A

    2016-06-01

    Software methods that enable navigation of radiology cases on various digital platforms differ between handheld devices and desktop computers. This has resulted in poor compatibility of online radiology teaching files across mobile smartphones, tablets, and desktop computers. A standardized, platform-independent, or "agnostic" approach for presenting online radiology content was produced in this work by leveraging modern hypertext markup language (HTML) and JavaScript web software technology. We describe the design and evaluation of this software, demonstrate its use across multiple viewing platforms, and make it publicly available as a model for future development efforts.

  8. Unified, Cross-Platform, Open-Source Library Package for High-Performance Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kozacik, Stephen

    Compute power is continually increasing, but this increased performance is largely found in sophisticated computing devices and supercomputer resources that are difficult to use, resulting in under-utilization. We developed a unified set of programming tools that will allow users to take full advantage of the new technology by allowing them to work at a level abstracted away from the platform specifics, encouraging the use of modern computing systems, including government-funded supercomputer facilities.

  9. University Students' Use of Computers and Mobile Devices for Learning and Their Reading Speed on Different Platforms

    ERIC Educational Resources Information Center

    Mpofu, Bongeka

    2016-01-01

    This research was aimed at the investigation of mobile device and computer use at a higher learning institution. The goal was to determine the current use of computers and mobile devices for learning and the students' reading speed on different platforms. The research was contextualised in a sample of students at the University of South Africa.…

  10. Application verification research of cloud computing technology in the field of real time aerospace experiment

    NASA Astrophysics Data System (ADS)

    Wan, Junwei; Chen, Hongyan; Zhao, Jing

    2017-08-01

    According to the requirements of real-time operation, reliability, and safety in aerospace experiments, a single-center cloud computing application verification platform was constructed. At the IaaS level, the feasibility of applying cloud computing technology to aerospace experiments was tested and verified. Based on analysis of the test results, a preliminary conclusion is drawn: a cloud computing platform can support the compute-intensive workloads of aerospace experiments, while for I/O-intensive workloads traditional physical machines are recommended.

  11. Performance optimization of Qbox and WEST on Intel Knights Landing

    NASA Astrophysics Data System (ADS)

    Zheng, Huihuo; Knight, Christopher; Galli, Giulia; Govoni, Marco; Gygi, Francois

    We present the optimization of the electronic structure codes Qbox and WEST targeting the Intel® Xeon Phi™ processor, codenamed Knights Landing (KNL). Qbox is an ab-initio molecular dynamics code based on plane-wave density functional theory (DFT) and WEST is a post-DFT code for excited-state calculations within many-body perturbation theory. Both Qbox and WEST employ highly scalable algorithms which enable accurate large-scale electronic structure calculations on leadership-class supercomputer platforms beyond 100,000 cores, such as Mira and Theta at the Argonne Leadership Computing Facility. In this work, features of the KNL architecture (e.g. hierarchical memory) are explored to achieve higher performance in key algorithms of the Qbox and WEST codes and to develop a road-map for further development targeting next-generation computing architectures. In particular, the optimizations of the Qbox and WEST codes on the KNL platform will target efficient large-scale electronic structure calculations of nanostructured materials exhibiting complex structures and prediction of their electronic and thermal properties for use in solar and thermal energy conversion devices. This work was supported by MICCoM, as part of the Comp. Mats. Sci. Program funded by the U.S. DOE, Office of Sci., BES, MSE Division. This research used resources of the ALCF, which is a DOE Office of Sci. User Facility under Contract DE-AC02-06CH11357.

  12. GATECloud.net: a platform for large-scale, open-source text processing on the cloud.

    PubMed

    Tablan, Valentin; Roberts, Ian; Cunningham, Hamish; Bontcheva, Kalina

    2013-01-28

    Cloud computing is increasingly being regarded as a key enabler of the 'democratization of science', because on-demand, highly scalable cloud computing facilities enable researchers anywhere to carry out data-intensive experiments. In the context of natural language processing (NLP), algorithms tend to be complex, which makes their parallelization and deployment on cloud platforms a non-trivial task. This study presents a new, unique, cloud-based platform for large-scale NLP research--GATECloud.net. It enables researchers to carry out data-intensive NLP experiments by harnessing the vast, on-demand compute power of the Amazon cloud. Important infrastructural issues are dealt with by the platform, completely transparently for the researcher: load balancing, efficient data upload and storage, deployment on the virtual machines, security and fault tolerance. We also include a cost-benefit analysis and usage evaluation.

  13. Assessment of left ventricular size and function by 3-dimensional transthoracic echocardiography: Impact of the echocardiography platform and analysis software.

    PubMed

    Castel, Anne Laure; Toledano, Manuel; Tribouilloy, Christophe; Delelis, François; Mailliet, Amandine; Marotte, Nathalie; Guerbaai, Raphaëlle A; Levy, Franck; Graux, Pierre; Ennezat, Pierre-Vladimir; Maréchaux, Sylvestre

    2018-05-27

    Whether echocardiography platform and analysis software impact left ventricular (LV) volumes, ejection fraction (EF), and stroke volume (SV) by transthoracic tridimensional echocardiography (3DE) has not yet been assessed. Hence, our aim was to compare 3DE LV end-diastolic and end-systolic volumes (EDV and ESV), LVEF, and SV obtained with echocardiography platform from 2 different manufacturers. 3DE was performed in 84 patients (65% of screened consecutive patients), with equipment from 2 different manufacturers, with subsequent off-line postprocessing to obtain parameters of LV function and size (Philips QLAB 3DQ and General Electric EchoPAC 4D autoLVQ). Twenty-five patients with clinical indication for cardiac magnetic resonance imaging served as a validation subgroup. LVEDV and LVESV from 2 vendors were highly correlated (r = 0.93), but compared with 4D autoLVQ, the use of Qlab 3DQ resulted in lower LVEDV and LVESV (bias: 11 mL, limits of agreement: -25 to +47 and bias: 6 mL, limits of agreement: -22 to +34, respectively). The agreement between LVEF values of each software was poor (intraclass correlation coefficient 0.62) despite no or minimal bias. SVs were also lower with Qlab 3DQ advanced compared with 4D autoLVQ, and both were poorly correlated (r = 0.66). Consistently, the underestimation of LVEDV, LVESV, and SV by 3DE compared with cardiac magnetic resonance imaging was more pronounced with Philips QLAB 3DQ advanced than with 4D autoLVQ. The echocardiography platform and analysis software significantly affect the values of LV parameters obtained by 3DE. Intervendor standardization and improvements in 3DE modalities are needed to broaden the use of LV parameters obtained by 3DE in clinical practice. Copyright © 2018. Published by Elsevier Inc.
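
    The bias and limits of agreement quoted in this record are standard Bland-Altman statistics. A minimal sketch of those formulas follows (the standard textbook computation, not the authors' software):

```python
import statistics

def bland_altman(a, b):
    # Bias (mean difference) and 95% limits of agreement between two
    # measurement methods, e.g. LV volumes from two vendors' software.
    diffs = [x - y for x, y in zip(a, b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)
```

    Wide limits with a small bias, as reported here for LVEF, mean the methods agree on average but not reliably for an individual patient, which is why the authors call for intervendor standardization.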

  14. Open chemistry: RESTful web APIs, JSON, NWChem and the modern web application.

    PubMed

    Hanwell, Marcus D; de Jong, Wibe A; Harris, Christopher J

    2017-10-30

    An end-to-end platform for chemical science research has been developed that integrates data from computational and experimental approaches through a modern web-based interface. The platform offers an interactive visualization and analytics environment that functions well on mobile, laptop and desktop devices. It offers pragmatic solutions to ensure that large and complex data sets are more accessible. Existing desktop applications/frameworks were extended to integrate with high-performance computing resources, and offer command-line tools to automate interaction, connecting distributed teams to this software platform on their own terms. The platform was developed openly, with all source code hosted on the GitHub platform, and automated deployment is possible using Ansible coupled with standard Ubuntu-based machine images deployed to cloud machines. The platform is designed to enable teams to reap the benefits of the connected web, going beyond what conventional search and analytics platforms offer in this area. It also has the goal of offering federated instances that can be customized to the sites/research performed. Data are stored using JSON, extending upon previous approaches using XML, building structures that support computational chemistry calculations. These structures were developed to make it easy to process data across different languages, and to send data to a JavaScript-based web client.

  15. Embedded-Based Graphics Processing Unit Cluster Platform for Multiple Sequence Alignments

    PubMed Central

    Wei, Jyh-Da; Cheng, Hui-Jun; Lin, Chun-Yuan; Ye, Jin; Yeh, Kuan-Yu

    2017-01-01

    High-end graphics processing units (GPUs), such as the NVIDIA Tesla/Fermi/Kepler series cards with thousands of cores per chip, have been widely applied in high-performance computing over the past decade. These desktop GPU cards must be installed in personal computers/servers with desktop CPUs, and the cost and power consumption of constructing a GPU cluster platform are very high. In recent years, NVIDIA released an embedded board, called Jetson Tegra K1 (TK1), which contains 4 ARM Cortex-A15 CPUs and 192 Compute Unified Device Architecture cores (belonging to the Kepler GPU family). Jetson Tegra K1 has several advantages, such as low cost, low power consumption, and high applicability, and it has been applied in several specific applications. In our previous work, a bioinformatics platform with a single TK1 (STK platform) was constructed; that work also showed that Web and mobile services can be implemented on the STK platform with a good cost-performance ratio compared with desktop CPUs and GPUs. In this work, an embedded GPU cluster platform is constructed with multiple TK1s (MTK platform). Complex system installation and setup are necessary first steps. Then, 2 job assignment modes are designed for the MTK platform to provide services for users. Finally, ClustalW v2.0.11 and ClustalWtk are ported to the MTK platform. The experimental results showed that the speedup ratios achieved 5.5 and 4.8 times for ClustalW v2.0.11 and ClustalWtk, respectively, when comparing 6 TK1s with a single TK1. The MTK platform is proven to be useful for multiple sequence alignments. PMID:28835734

  16. Embedded-Based Graphics Processing Unit Cluster Platform for Multiple Sequence Alignments.

    PubMed

    Wei, Jyh-Da; Cheng, Hui-Jun; Lin, Chun-Yuan; Ye, Jin; Yeh, Kuan-Yu

    2017-01-01

    High-end graphics processing units (GPUs), such as the NVIDIA Tesla/Fermi/Kepler series cards with thousands of cores per chip, have been widely applied in high-performance computing over the past decade. These desktop GPU cards must be installed in personal computers/servers with desktop CPUs, and the cost and power consumption of constructing a GPU cluster platform are very high. In recent years, NVIDIA released an embedded board, called Jetson Tegra K1 (TK1), which contains 4 ARM Cortex-A15 CPUs and 192 Compute Unified Device Architecture cores (belonging to the Kepler GPU family). Jetson Tegra K1 has several advantages, such as low cost, low power consumption, and high applicability, and it has been applied in several specific applications. In our previous work, a bioinformatics platform with a single TK1 (STK platform) was constructed; that work also showed that Web and mobile services can be implemented on the STK platform with a good cost-performance ratio compared with desktop CPUs and GPUs. In this work, an embedded GPU cluster platform is constructed with multiple TK1s (MTK platform). Complex system installation and setup are necessary first steps. Then, 2 job assignment modes are designed for the MTK platform to provide services for users. Finally, ClustalW v2.0.11 and ClustalWtk are ported to the MTK platform. The experimental results showed that the speedup ratios achieved 5.5 and 4.8 times for ClustalW v2.0.11 and ClustalWtk, respectively, when comparing 6 TK1s with a single TK1. The MTK platform is proven to be useful for multiple sequence alignments.
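
    The reported speedups imply sub-linear scaling across the 6 boards. A quick check of the parallel efficiency (arithmetic only, based on the quoted ratios):

```python
def parallel_efficiency(speedup, n_boards):
    # Fraction of ideal linear scaling achieved by the cluster:
    # 1.0 would mean 6 boards run exactly 6x faster than one.
    return speedup / n_boards

# Reported speedups on 6 TK1 boards versus a single TK1:
eff_clustalw = parallel_efficiency(5.5, 6)    # ClustalW v2.0.11, ~0.92
eff_clustalwtk = parallel_efficiency(4.8, 6)  # ClustalWtk, 0.80
```

    Efficiencies of roughly 92% and 80% are consistent with communication and job-assignment overhead growing as boards are added.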

  17. GPU-based High-Performance Computing for Radiation Therapy

    PubMed Central

    Jia, Xun; Ziegenhein, Peter; Jiang, Steve B.

    2014-01-01

    Recent developments in radiation therapy demand high computational power to solve challenging problems in a timely fashion in a clinical environment. The graphics processing unit (GPU), as an emerging high-performance computing platform, has been introduced to radiotherapy. It is particularly attractive due to its high computational power, small size, and low cost for facility deployment and maintenance. Over the past few years, GPU-based high-performance computing in radiotherapy has experienced rapid development. A tremendous number of studies have been conducted, in which large acceleration factors compared with the conventional CPU platform have been observed. In this article, we first give a brief introduction to the GPU hardware structure and programming model. We then review the current applications of the GPU in major imaging-related and therapy-related problems encountered in radiotherapy. A comparison of the GPU with other platforms is also presented. PMID:24486639

  18. OpenACC performance for simulating 2D radial dambreak using FVM HLLE flux

    NASA Astrophysics Data System (ADS)

    Gunawan, P. H.; Pahlevi, M. R.

    2018-03-01

    The aim of this paper is to investigate the performance of the OpenACC platform for computing a 2D radial dambreak. Here, the shallow water equations are used to describe and simulate the 2D radial dambreak with a finite volume method (FVM) using the HLLE flux. OpenACC is a parallel computing platform based on GPU cores; in this research it is used to reduce the computational time of the numerical scheme. The results show that, using OpenACC, the computational time is reduced. For the dry and wet radial dambreak simulations using 2048 grids, the parallel computational times are 575.984 s and 584.830 s, respectively. These results demonstrate the effectiveness of OpenACC when compared with the serial times of the dry and wet radial dambreak simulations, which are 28047.500 s and 29269.400 s, respectively.
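
    The quoted timings give the speedup directly. A quick check (arithmetic only, using the paper's reported wall-clock times):

```python
def speedup(serial_seconds, parallel_seconds):
    # Ratio of serial to parallel wall-clock time.
    return serial_seconds / parallel_seconds

# Reported timings for the 2048-grid simulations:
dry = speedup(28047.500, 575.984)   # dry radial dambreak, ~48.7x
wet = speedup(29269.400, 584.830)   # wet radial dambreak, ~50.0x
```

    Roughly 49-50x over the serial runs for both cases, which is the improvement the abstract attributes to OpenACC.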

  19. MACBenAbim: A Multi-platform Mobile Application for searching keyterms in Computational Biology and Bioinformatics.

    PubMed

    Oluwagbemi, Olugbenga O; Adewumi, Adewole; Esuruoso, Abimbola

    2012-01-01

    Computational biology and bioinformatics are gradually gaining ground in Africa and other developing nations of the world. However, in these countries, some of the challenges of computational biology and bioinformatics education are inadequate infrastructure and the lack of readily available complementary and motivational tools to support learning as well as research. This has lowered the morale of many promising undergraduates, postgraduates, and researchers, deterring them from aspiring to undertake future study in these fields. In this paper, we develop and describe MACBenAbim (Multi-platform Mobile Application for Computational Biology and Bioinformatics), a flexible, user-friendly tool to search for, define, and describe the meanings of keyterms in computational biology and bioinformatics, thus expanding the frontiers of knowledge of the users. This tool is also capable of visualizing results in a mobile multi-platform context. MACBenAbim is available from the authors for non-commercial purposes.

  20. Workload Characterization of CFD Applications Using Partial Differential Equation Solvers

    NASA Technical Reports Server (NTRS)

    Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)

    1998-01-01

    Workload characterization is used for modeling and evaluating computing systems at different levels of detail. We present a workload characterization for a class of Computational Fluid Dynamics (CFD) applications that solve Partial Differential Equations (PDEs). This workload characterization focuses on three high-performance computing platforms: the SGI Origin2000, the IBM SP-2, and a cluster of Intel Pentium Pro-based PCs. We execute extensive measurement-based experiments on these platforms to gather statistics of system resource usage, which form the basis of the characterization. Our approach yields a coarse-grain resource utilization behavior that is being applied to performance modeling and evaluation of distributed high-performance metacomputing systems. In addition, this study enhances our understanding of the interactions between PDE solver workloads and high-performance computing platforms and is useful for tuning these applications.
